OP doesn't bring up the different temperatures and pressure differentials of heat sources. That's a huge red flag. For all we know the CTO role at that company was entirely focused on keeping the staff laptops running and is largely disconnected from the fundamentals of heat-to-electricity generation. Why the hell would they ask here when they have current or former colleagues in the industry to ask?
I said several times in this discussion that waste heat sources are typically ~600C with flow rates of ~80-100kg/s. Do the math and that's about 100MW of thermal power in a stack. Off-the-shelf Organic Rankine Cycle engines can convert heat at that temperature into electricity at about 20% efficiency.
Of course that varies by ambient temperature and pressure, but not significantly for the purposes of this discussion. A good rule of thumb is that for a given delta-T, current ORC tech can get you about half of Carnot efficiency at the optimal spot on the cost-efficiency curve.
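The arithmetic behind those figures is easy to sketch. This is my own back-of-envelope check; the flue-gas heat capacity and ambient temperature are assumptions, not from the thread:

```python
# Back-of-envelope check of the numbers above. CP_FLUE_GAS and
# T_AMBIENT_C are assumed values, not from the original post.
T_HOT_C = 600.0       # typical waste-heat stack temperature, per the thread
T_AMBIENT_C = 25.0    # assumed ambient temperature
MASS_FLOW = 90.0      # kg/s, middle of the quoted 80-100 kg/s range
CP_FLUE_GAS = 1.15    # kJ/(kg*K), rough value for hot flue gas (assumption)

# Thermal power available if the stream is cooled all the way to ambient
thermal_kw = MASS_FLOW * CP_FLUE_GAS * (T_HOT_C - T_AMBIENT_C)
print(f"Thermal power: {thermal_kw / 1000:.0f} MW")    # → Thermal power: 60 MW

# Carnot limit between source and ambient (temperatures in kelvin)
carnot = 1 - (T_AMBIENT_C + 273.15) / (T_HOT_C + 273.15)
print(f"Carnot limit: {carnot:.0%}")                   # → Carnot limit: 66%
print(f"Half-Carnot rule of thumb: {carnot / 2:.0%}")  # → ~33%
```

With these assumed values the thermal power lands somewhat below the quoted ~100MW, but the order of magnitude holds; the exact figure swings with the assumed heat capacity and how far you actually cool the stream.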
I wouldn't know how to fix a laptop if you offered me a yacht to do it. What I asked about was not waste heat to power tech, but fusion power economics, which I know nothing about.
They ran the 18 day flight test in September, so not the dead of winter, but also not on the longest summer days:
A solar-powered aircraft has completed an 18-day test flight offering hope it could be used to create internet access for billions of unconnected people around the world... The test flight touched down in Arizona on September 13.
All of this comes with a big "Work in progress" caveat. Still, this type of technology has been theorized for decades so it's nice to see Airbus making progress. Let's see if they can get it to the point of commercialization.
When writing code you have a mental model of the machine, the language, and the libraries. Back in the day those mental models could be close enough to reality that writing bug-free code could be a choice. It would take time, but you could essentially run the program in your head.
This is an old timer, back when machines and abstractions were simpler.
The post makes sense in that context, but today you can't have a complete mental model of the thing you're coding. It's impossible. So bugs.
But we HAVE simplified. I can now type 'fetch this data matching these conditions sorted by this' and be confident about not having any bugs in my code, while underneath there are millions of lines and decades of other people's work and solved bugs being run.
"Back in the day" they would spend weeks devising that system from scratch, hand-writing sorting algorithms, etc.
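As a toy illustration (mine, not the commenter's) of the one-liner that hides all of that machinery:

```python
# Hypothetical data; the same one-liner shape appears in SQL, ORMs,
# LINQ, etc. Underneath, decades of sorting and query work get reused.
records = [
    {"name": "ada", "score": 7},
    {"name": "bob", "score": 2},
    {"name": "eve", "score": 5},
]

# "fetch this data matching these conditions sorted by this"
result = sorted(
    (r for r in records if r["score"] > 2),  # the conditions
    key=lambda r: r["score"],                # the sort key
)
print([r["name"] for r in result])  # → ['eve', 'ada']
```

The generator expression does the filtering and `sorted` does the ordering; none of the underlying sort algorithm leaks into the caller's code.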
The mental model of software development is about the problem and domain nowadays, not the low level implementation details. Your job is to solve a business problem, instead of figuring out how to manipulate a CPU and its memory banks to do your bidding.
We have libraries that solve many problems for us.
But those libraries often have parameters, switches, hints, and secret knowledge of the type "do not solve problem X by doing Y because although it seems okay on the surface, the implementation details will increase the complexity from O(N) to O(N^2); use this trick instead".
So we use libraries to solve the problems, but we also need years of experience using those libraries, or we easily create problems we didn't expect. Often there are many alternative libraries for the same task, and the library-specific knowledge becomes obsolete in later versions.
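A concrete (hypothetical) instance of that kind of trap in Python: deduplicating with a list for the membership checks looks almost identical to the set version, but quietly turns O(N) into O(N^2):

```python
def dedup_list(items):
    # Looks fine, but `x not in seen` on a list scans the whole list:
    # O(N) per check, O(N^2) total.
    seen, out = [], []
    for x in items:
        if x not in seen:
            seen.append(x)
            out.append(x)
    return out

def dedup_set(items):
    # Same logic, but set membership is O(1) on average, so O(N) total.
    seen, out = set(), []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```

Both return identical results on small inputs, which is exactly why the slow version survives code review and only blows up at scale.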
You can get that same effect today: just work in the same stack with the same libraries for years and you will learn exactly how everything works. You just need a stack that doesn't make breaking changes that often.
If programming worked like more manual trades, like plumbing, then you'd work 5 years on a stack before they'd call you proficient at it. But nowadays you work 2 years before changing jobs to another stack, and after 5 you are expected to manage people and no longer write code. So the problem is mostly organizational and not technical.
It's not just your tech stack in the narrow sense, your software also can't talk to anything that changes: no browser, no third party service. Systems are rarely that isolated anymore.
Ah, right, I haven't had a regular webdev job. But then that is because your job is to glue together different services rather than implement a complex chunk of code. Most of the jobs I've had were me implementing a lot of low level complex chunks or doing the architecture to make those chunks easy to write, for example when writing the runtime of a ML framework.
So it isn't that times changed; the old jobs where you code still exist. But we added millions of glue-code jobs on top of that, and that changed how people view software development. It is field specific: even if webdev is the most common, you can still work in any of the many other areas where the rules are different.
Edit: And yes, gluing together components is ridiculously hard. It is just hard in a different way, your job then becomes to learn about new things as quickly as possible so you can properly glue them together rather than trying to think about what code to write. That isn't easy at all.
The kind of bugs I write these days are quite different from the ones I made decades ago. My mistakes these days are from high level misunderstandings, rather than low level ones.
You absolutely can have a complete mental model of the thing you're coding. Especially if you properly break it down into pieces (modules, libraries, functions, whatever you want to call them) which can be understood as a unit.
The biggest problem is that many people use libraries as a crutch, without any knowledge of how they work or what they actually do.
Me too... Not 10 years but about 4. I can't imagine not WFH but finally I'm starting to see that I should probably force myself to do some physical interaction. I guess the brain didn't evolve for us to sit alone all the time, as much as I love that
My company has some pretty unique global shutter 360 camera footage, both indoors and out with precision GPS. Going back about 7 years. In all sorts of environments. It's quite a unique dataset and might be interesting to use for your project.
My god it feels like this article was republished from the 1960s but this is actually happening now.