
I've found that if you follow a bunch of people on Twitter who specialize in a niche topic, you'll see a lot of interesting casual conversations there.

But the problem is that you can't start new discussions or ask a question unless those people follow you back, because otherwise no one will see or reply to your tweets.


That's a failed repro then, right?

Below 110 K means below -163.15 degrees Celsius.
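
For reference, that's just the standard Kelvin-to-Celsius conversion:

    T_{\text{C}} = T_{\text{K}} - 273.15 \quad\Rightarrow\quad 110\,\text{K} = -163.15\,^{\circ}\text{C}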

How would that compare to other superconductors?


It is and it isn't. It's at ambient pressure (which is itself useful), and there is something very odd happening much higher up that needs to be explained. They say their sample purity is higher than what the Korean team had, which would normally lead to better yield and easier confirmation of superconductivity. But since other samples do show the Meissner effect at room temperature as well, there is a lot that still needs explaining before we can call it a failed reproduction.


I adore stories that include the phrase "that's odd" or "something odd happens when ..."

Even if we don't get the astonishing result originally claimed by the rogue paper, it's still a triumph of science in my ignorant opinion.


And sometimes 'that's odd' leads to things larger than the original goal. You really don't want to hear those words in the doctor's office, though.


Perhaps an impurity caused the effect they're looking for.


Yes, that's possible, and it has happened before: X-rays and eventually radioactivity were both discovered by accident, through chance observations.


Tc = 110 K would take the #4 spot on https://en.wikipedia.org/wiki/List_of_superconductors


#3 spot at atmospheric pressure


At atmospheric pressure, no less.


> HgTlBaCaCuO

New band name. And band gap.


One of the most well known, YBCO, has a Tc of about 93 K.


Yeah, it looks to me like either a replication failure or even evidence AGAINST a superconducting phase. A superconductor's resistance curve is supposed to show a sharp drop to zero at the transition temperature. If it's a dirty, inhomogeneous sample (e.g. specks of superconductor embedded in non-superconducting material), you get a kink where the curve descends to a non-zero background resistance. In the Southeast University data, there's a smooth curve that goes down until they hit the noise floor. There's no transition.
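
To illustrate the three shapes being distinguished here, a toy model in C++ with invented numbers (not fitted to any real data):

    #include <cstdio>

    // Toy resistance-vs-temperature curves, arbitrary units.
    // Tc and the residual resistance are made-up illustration values.
    const double Tc = 110.0;

    // Clean superconductor: sharp drop to exactly zero at Tc.
    double r_clean(double T)  { return T < Tc ? 0.0 : 1.0 + 0.01 * (T - Tc); }

    // Dirty/inhomogeneous sample: kink at Tc, but a non-zero background
    // resistance survives from the non-superconducting matrix.
    double r_dirty(double T)  { return T < Tc ? 0.3 : 1.0 + 0.01 * (T - Tc); }

    // No transition: resistance just decreases smoothly with cooling.
    double r_smooth(double T) { return 0.002 * T; }

    int main() {
        for (double T = 150.0; T >= 50.0; T -= 20.0)
            std::printf("T=%5.1fK  clean=%.3f  dirty=%.3f  smooth=%.3f\n",
                        T, r_clean(T), r_dirty(T), r_smooth(T));
    }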


technically a failure, but still a strange result


I don't think it's a failure if the original claims are showing some kind of promise, science is being done, and the frontiers of knowledge are being pushed forward.

In other words, not "Eureka!" but "that's weird".


Yeah, to me the block system thing just makes a lot of sense.


Methinks it's all boiling down to some kind of lingua franca for document structure, although things like footnotes and tables of contents won't fit. Not RSN, anyway.


Game engines usually have their own runtime formats and just have an importer.
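
A rough sketch of that pattern in C++, with entirely hypothetical types and paths (no real engine's API):

    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Hypothetical packed runtime format: no parsing at load time,
    // the engine can read this straight into GPU-ready buffers.
    struct RuntimeMesh {
        std::vector<float>    positions;  // xyz triples
        std::vector<uint32_t> indices;
    };

    // Offline import step: parse the interchange format (glTF, FBX, ...)
    // once at build time and emit the packed runtime blob.
    RuntimeMesh import_from_interchange(const char* source_path) {
        RuntimeMesh mesh;
        // ... a real importer would parse JSON/binary chunks here ...
        std::printf("importing %s -> runtime format\n", source_path);
        return mesh;
    }

    int main() {
        // At build time: convert. At runtime: just load the blob.
        RuntimeMesh m = import_from_interchange("assets/model.gltf");
        std::printf("runtime mesh: %zu floats\n", m.positions.size());
    }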


It's fairly easy to load gltf at runtime in UE 4/5.


Obviously it depends on the person, but I did 3 years of an EE degree and then switched to CS, so I've experienced both.

EE was definitely harder for me, mainly because engineering had you taking a lot more credits at once compared to science.

And doing vector calc/electromagnetic field math is way harder than proofs, IMO.

Also, debugging software is so much easier than debugging hardware/circuits. The real world is so much messier than computers, where stuff is cleanly true/false.

You can put a circuit together perfectly, but it turns out some component you ordered was busted, or the tolerance is wrong, or something got fried accidentally, and it can be really painful to track down with a multimeter/oscilloscope.

Whereas that doesn't really happen in software. You don't have a program that works one day and then suddenly the next day "if statements" aren't working. So much more stuff can go wrong in the physical world.

My main job now is pretty high-level C++, but I don't regret the EE part, because it forced me to do a few courses in Verilog, and there's no better way to really understand how a computer works than building a simple CPU on an FPGA.
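
For flavor, here's the fetch/decode/execute shape in a few lines of C++ rather than Verilog; the instruction set is invented purely for illustration:

    #include <cstdint>
    #include <cstdio>

    // Toy ISA (made up): high nibble = opcode, low nibble = operand.
    // Nothing like a real ISA, just the loop you'd build on an FPGA.
    enum Op : uint8_t { LOADI = 0, SUBI = 1, JNZ = 2, HALT = 3 };

    int main() {
        // Program: acc = 5; do { acc -= 1; } while (acc != 0); halt
        const uint8_t program[] = { 0x05, 0x11, 0x21, 0x30 };
        uint8_t pc = 0, acc = 0;
        bool running = true;

        while (running) {
            uint8_t insn = program[pc++];              // fetch
            uint8_t op = insn >> 4, arg = insn & 0xF;  // decode
            switch (op) {                              // execute
                case LOADI: acc = arg; break;
                case SUBI:  acc -= arg; break;
                case JNZ:   if (acc != 0) pc = arg; break;
                case HALT:  running = false; break;
            }
            std::printf("pc=%u acc=%u\n", pc, acc);
        }
    }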


> You don't have a program that works one day and then suddenly the next day "if statements" aren't working.

I know you're talking about the fixed nature of programming logic, but this phrase is surfacing some repressed timezone debugging memories.


Oh yeah, those Verilog courses are like the next level up from nand2tetris for wrapping your head around those kinds of things. I got a lot of mileage out of writing some automated scripts to drive Xcelium's interactive shell so I could get a faster debug loop going.


What do you do now that involves C++, if you don't mind me asking? I too have an EE degree but have been doing web dev for the last few years. I'd like to get back into lower-level software development, or work on something where I can actually use my degree.


I do graphics programming on a game engine


The craziness in software starts at the systems level, especially distributed systems. The madness of debugging distributed systems (at present) shouldn't be underestimated :)


Yup, you just can't beat standardization.

The support is way better because there aren't a million different models.


I think it has more to do with vertical integration of the hardware and software than the number of models.

Supporting Android phones requires collaboration between the chip vendors, hardware manufacturers, and Google, which is difficult.


true


Other people have mentioned Ray Tracing in One Weekend.

Complementary to that, I would recommend TinyRenderer:

https://github.com/ssloy/tinyrenderer/wiki

This one is a CPU-based rasterizing renderer; it gives you a good understanding of what a GPU graphics pipeline does underneath.

In the graphics world, the two common ways of rendering things are rasterization and ray tracing.

Ray tracing is what basically all the movie/VFX/CGI/offline renderers use (although it is also being used for certain parts of real-time rendering in recent years).

Raster is how most real-time renderers, like the ones used for video games, work.

If you're interested in graphics, I'd highly recommend implementing a ray tracer and a rasterizer from scratch at least once to get a good mental model of how they both work.
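
A minimal sketch of the core geometric test in each approach, with made-up scene values (one pixel, one ray, no shading): a rasterizer asks "does this pixel fall inside this triangle?", a ray tracer asks "does this ray hit this object?"

    #include <cstdio>

    struct Vec2 { float x, y; };
    struct Vec3 { float x, y, z; };

    // Rasterization core: the 2D edge function. A pixel is inside a
    // counter-clockwise triangle if it's on the left of all three edges.
    float edge(Vec2 a, Vec2 b, Vec2 p) {
        return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
    }

    bool inside_triangle(Vec2 a, Vec2 b, Vec2 c, Vec2 p) {
        return edge(a, b, p) >= 0 && edge(b, c, p) >= 0 && edge(c, a, p) >= 0;
    }

    // Ray tracing core: ray-sphere intersection via the quadratic formula.
    bool hits_sphere(Vec3 origin, Vec3 dir, Vec3 center, float radius) {
        Vec3 oc = { origin.x - center.x, origin.y - center.y, origin.z - center.z };
        float a = dir.x * dir.x + dir.y * dir.y + dir.z * dir.z;
        float b = 2.0f * (oc.x * dir.x + oc.y * dir.y + oc.z * dir.z);
        float c = oc.x * oc.x + oc.y * oc.y + oc.z * oc.z - radius * radius;
        return b * b - 4 * a * c >= 0;  // real roots => the ray hits
    }

    int main() {
        bool r = inside_triangle({0,0}, {4,0}, {0,4}, {1,1});
        bool t = hits_sphere({0,0,0}, {0,0,-1}, {0,0,-5}, 1.0f);
        std::printf("pixel in triangle: %d, ray hits sphere: %d\n", r, t);
    }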


Out of curiosity, is there a third way? The two you mention are the big ones, I know, but your phrasing implies (perhaps unintentionally) the existence of at least one more.


Well, there's "steer a beam across the screen" as used in CRTs (e.g. older oscilloscopes) and the Vectrex game console.

Maybe one could consider laser shows as falling into this category too?

See "vector display".


vk_mini_path_tracer is a beginner-friendly introduction to writing your own fast, photorealistic path tracer in less than 300 lines of C++ code and 250 lines of GLSL shader code, using Vulkan.

https://nvpro-samples.github.io/vk_mini_path_tracer/index.ht...


woah woah dude, spoilers!


Next year the worksheet is just the rendering equation and a list of vertices/lights/transforms
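
For reference, that equation, in its usual (Kajiya) form:

    L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, \mathrm{d}\omega_i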

