Reminds me of when I was working on the video system for a mast on a submarine 20 years ago.
The customer had an impossible set of latency, resolution, processing, and storage requirements for their video. They also insisted we use the new H.264 standard that had just come out, even though it wasn't a hard requirement.
We quickly found MJPEG was superior for meeting their requirements in every way. It took a lot of convincing though. H.264 was and would still be a complete non-starter for them.
I recently went through 6 weeks of PT for injured tendons / tendinitis in my arms with zero results.
The therapist suggested we try dry needling + electric stimulation for another 6 weeks. So we did that and I recovered 90% in the second 6 weeks of therapy.
There were side effects, but they were minimal and are completely gone now.
I was skeptical but sold on the benefits and relieved to have an effective therapy option to fall back on when it happens again as it does every couple years. Unfortunately, my insurance doesn’t pay for it.
Without a twin with the exact same injury and no intervention, to compare with, we don't know from this whether it was just the six extra weeks of healing that made the difference.
I do wonder if the first 6 weeks did the work, and the results appeared in parallel with the alternate therapy. Of course, this sort of conversation is a prelude to "let's try the alternate therapy first, for science!" with volunteers, so there is benefit.
Indeed, the takeaway I get from this is how we tend to underestimate how long healing takes. People expect major injuries to be healed in 6 weeks, but it often takes that long to simply turn the corner toward full healing.
Yes but some anecdotes are closer to evidence than others. And people seem to be treating the above anecdote like it is evidence. Which we both agree it isn't.
It isn't convincing given the time frame / lack of comparison.
People are adults and can be willing to take chances on anecdotes instead of waiting 30 years for science to maybe fund some studies that end up just as murky.
My friend had a kid with bad eczema. She tried everything. Desperate, she took her to one of these charlatans. He asked the girl to stand on a copper plate. After a few days the eczema disappeared. Now my friend totally believes in all this stuff.
It's probably the stopping of other treatments that fixed it. I had bad eczema and psoriasis. It stopped (after weeks) after I stopped treating it with random creams and taking cool showers. I later found out that the culprit was lidocaine.
Also copper is biocidal, so maybe there's something there.
I have small amounts of eczema on unfortunate spots. It comes and goes usually based on stress and inflammation. Been dealing with it for decades. It stinks.
I’m half tempted to buy myself a copper plate to stand on.
I mean, if I didn’t have anything else I was trying that could plausibly explain it, that’d be really hard to resist accepting as the cause. Totally understand it.
It’s hard to internalize that we see permanent and seemingly arbitrary changes from things like hormone levels. The last teen pimple, for example, isn’t noticeable in the moment as the last teen pimple, because you don’t know the future, etc.
We still don't understand the placebo effect. But definitely better to accept it's a thing and move on than believe grifters actually know what they're talking about.
The placebo effect is unlikely to be important here.
Hormonal changes mean people have permanent differences in their skin at specific points in time. Eczema is known to respond more cyclically with menstrual cycles, which is a lot easier to correlate.
The GP told a good story and was very personable and relatable.
But you can treat their data as garbage, pseudoscience, backed by nothing. Because it is. Any effects are likely to be placebo. Wait for real research. Science isn’t a popularity contest.
My point is that it's tone-deaf to complain about lack of rigor when the first thing the comment says is that it's not meant to be evidence. It's like reading a fictional novel and giving it a negative review for not containing sufficient citations for the events being related.
Even with a twin, you still wouldn’t “know”, because there might have been a difference in either their injury, their ability to heal (people can heal at different rates for many non-genetic reasons), or other, even ‘random’ factors.
There is a well-known case study where a man ‘cracked’ each joint in one hand every day, and never ‘cracked’ any joint in the other hand for many years, to see whether it caused arthritis. He didn’t get arthritis in either hand. The only thing you can take away from that is that cracking the joints doesn’t necessarily cause arthritis for him.
The person posted an anecdote; you don’t have to rely on it, but your dismissal is shallow and unhelpful.
I don't think the dismissal is that shallow. The original anecdote came with a conclusion, the person you replied to seemed to be trying to warn against such conclusions.
So the reason the plural of anecdote is not, in fact, evidence is because science doesn't actually work by piling up data in favor of a hypothesis.
It works by disproving other hypotheses until only one (or more excitingly, zero) is left.
An anecdote like this doesn't disprove the null hypothesis of "the patient just got better after a while, because people frequently just get better after a while". It doesn't matter how many similar anecdotes you stack up, because the null hypothesis still hasn't been disproved. You could have millions of perfectly true, identical anecdotes, and it still wouldn't change the situation, so why should anyone listen to one?
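A toy simulation makes this concrete (the 70% base recovery rate is a made-up number for illustration, not a real clinical figure): if people mostly post anecdotes when they got better, a therapy that does literally nothing still generates an endless supply of truthful, glowing anecdotes.

```python
import random

random.seed(0)

BASE_RECOVERY = 0.7   # assumed: fraction who'd recover in 12 weeks with no therapy
N = 100_000

# Every patient tries a therapy that does nothing at all.
recovered = sum(random.random() < BASE_RECOVERY for _ in range(N))

# Anecdotes are mostly posted by people who got better, so each one reads
# "I tried X and recovered" - perfectly consistent with the null hypothesis.
print(f"{recovered / N:.0%} of patients can truthfully post a success anecdote")
```

Stacking up more of those anecdotes never moves you away from the null; only a comparison arm can.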
(Now, anecdotes are useful for identifying avenues of search, but that means the only thing you should be doing after reading an anecdote like this is running off to do a lit search for any actual studies, not trying it yourself or yes-anding with your own anecdotes.)
On the other hand, there are situations where an anecdote provides ample evidence. If a reiki practitioner walked up to a patient with a complete dissection of the lower spine, verified on X-ray, waved his hands over the patient, and a week later the patient was up and walking - holy shit, reiki works! There is no "people sometimes get better"[0], so the null hypothesis of "the patient will still be paralyzed" would have been disproven adequately by a single anecdote, assuming fraud was ruled out.
[0] I don't actually know for sure that people don't spontaneously get better from such an injury, but it was the clearest example I could think of.
So if you say something is an anecdote, then that anecdote is immune to any discussion or analysis?
How about the idea that some anecdotes are better than others.
E.g. "Anecdotal, but I took paracetamol and found it wasn't helpful for my pain. So I don't think it works."
There's an anecdote for you; maybe you should stop taking paracetamol now. By your logic no one can discuss, analyse, or point out any potential issues with it.
And btw my stance is that the electrotherapy is interesting and plausibly could help. But tendonitis issues can heal with 6 weeks of basically rest, and that should be acknowledged in the discussion. (12 weeks in total, including the 6 weeks with a PT.)
> tendonitis issues can heal with 6 weeks of basically rest
Not disagreeing with your larger point, but at least for triceps tendinopathy (still often called tendonitis of the elbow), based on getting this myself and doing some research online, the consensus is that it generally doesn't heal from just rest, and that although techniques like massage and foam rolling can offer substantial pain relief, this is only short term. My conclusion was that the only effective therapy is doing slow eccentrics -- allowing your initially extended elbow ( = straight arm) to slowly "lose the fight" against a force trying to flex it (trying to move your hand close to your shoulder), and gradually increasing the force (weights, bands, etc.) over 3-4 months as you become able to do so without pain.
I hope this random tidbit helps someone with a sore elbow.
For Achilles tendonitis, I was told the following:
1. It's going to come and go for the rest of your life.
2. Just try to stay off it while it hurts; here's a couple of simple things to try when it's flaring up (e.g., wear shoes or lifts with a > 1 inch difference between the heel and toe) to reduce the pain. Don't worry about it when it's not hurting; feel free to keep running etc.
3. At some point it may stop going away; at that point there's some surgical interventions, but they all have mediocre outcomes so you don't want to try that unless you're out of options.
So far it's been five years of minor flare ups once or twice a year lasting a week or two at a time. Goes away without intervention, doesn't seem to be getting progressively worse or more frequent at this point.
There's another universe I'm living in where I tried some treatment for it, and now I swear by it, running off to get it every time I feel a twinge -- after all, that first round of tendonitis was terrible, I could barely walk, it took several weeks to recover, and all those subsequent flare ups only lasted a week or two, and I can usually hobble through them without too much trouble.
I’m not a doctor. I have been through the wringer with a different injury (slipped disc), and I’ve seen the medical advice change in my lifetime. My conclusion is pretty loosely "it’s going to come back, the cure may be worse than the disease, and the better you take care of it when it’s good, the easier it’ll be when it flares up".
Look at elite athletes - golfers, tennis players, etc. They put their bodies under far greater stresses than we do, pick up "career ending" injuries, and manage to recover from them in many cases.
The person you are replying to isn't saying you should give anecdotes more weight. They are just saying that dismissing something outright because someone used an anecdote is similar in nature to blindly believing in an anecdote.
I don't necessarily think there was a problem with the comment they replied to.
This is easier to test than arthritis. A doctor can make incisions on both arms of the same person at the exact same depth and length, then apply electricity to one of the wounds and monitor healing time.
To misquote the amazing James Randi: if you throw a thousand reindeer off a cliff and none of them fly, you haven't proven that reindeer can't fly. You've proven that those specific one thousand reindeer either can't fly or chose not to fly.
Yeah, that's a "no" from me dawg. My PT stuck the needle in, and I was fine with that. Then he moved it a little, and I turned pale as a ghost and started sweating. Same thing happened when I had my nerve conduction study - never again. Needles going in and out is fine. Needles moving around under my skin ain't gonna happen any more. (Except at the dentist, but that's what the laughing gas is for!)
The whole point of the electro therapy is to make the muscle move though, so this is effectively the same (Galilean relativity) as moving the needle, right?
I had the same thing happen once, and it was as fascinating as it was unsettling. Very slight movement of one needle in what seemed like a pretty inconsequential part of my body produced a near-instantaneous full-body reaction involving many systems.
That's the magic of action potentials. As sodium ions (+1 charge) propagate, they dissipate throughout the cytosol and sometimes leak out of the cell membrane, but they also trigger their own influx of regenerative current by opening voltage-gated ion channels on the cell membrane. Think of it as a "signal repeater".
As long as the initial stimulus is strong enough to trigger an action potential, the signal propagates all the way from the nerve ending to the central nervous system, and whatever response the CNS cooks up always makes it all the way to all the muscles it intends to trigger. Stated another way, the peripheral and central nervous system have enough of these signal repeaters for any signal to travel anywhere.
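A minimal sketch of that all-or-nothing "signal repeater" behavior (the threshold, amplitude, and attenuation numbers here are illustrative, not physiological values):

```python
def propagate(initial_stimulus, n_nodes=100, threshold=1.0,
              full_amplitude=5.0, attenuation=0.5):
    """Toy model of an action potential travelling along a chain of nodes.

    The signal leaks away between nodes (attenuation), but any node that
    still sees a supra-threshold signal regenerates it to full amplitude,
    like opening voltage-gated channels. Result: all-or-nothing propagation.
    """
    signal = initial_stimulus
    for _ in range(n_nodes):
        if signal < threshold:
            return False          # sub-threshold: the signal dies out
        signal = full_amplitude   # regenerated to full strength at this node
        signal *= attenuation     # then decays on the way to the next node
    return True                   # the signal made it the whole way

print(propagate(0.8))   # below threshold: never starts -> False
print(propagate(1.2))   # above threshold: travels all 100 nodes -> True
```

Note the distance travelled doesn't depend on how strong the stimulus was, only on whether it cleared the threshold - which is the "signal repeater" point above.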
I'm usually great with all kinds of pain, but I had to have injured fingernails removed, and they put needles down the sides of my fingers to numb them. Needles against the bone - not a feeling I want to experience again.
I didn't return for the other nail, I preferred to do it at home with a knife, it was less painful.
I had the same problem with my elbow, electrotherapy did not help. Turned out it was systemic inflammation in my body that was preventing it from healing. Change of diet fixed it.
Something that has really helped with inflammation and improved my diet is a Spirulina and Chlorella algae supplement. I take it with Metamucil to help with absorption in the intestines as well. If you use the powder version, Metamucil covers up the umami taste, but the tablets are tasteless.
This is from some research that shows spirulina can absorb toxins, and then when I take Metamucil, it helps to slow my body's processing of the supplement.
What I have noticed is that the increase in nutrients will go through me super fast since I have a lower fiber diet, but when I have taken it with Metamucil, the fiber increase really slows down my bodily function. I figured if the algae is moving more slowly, it should also be able to have a higher absorption rate since it stays in my body longer.
Spirulina had a moment, and I took it, in the UK in the 90s. I'm struggling to remember exactly why I took it or stopped taking it, but, as a veggie, it did seem to offer everything else I might be lacking in my diet.
As I've aged my digestion is no longer as robust as it was, so I'm keen to give spirulina another go, plus the husks (metamucil is especially good, you think?) for a few months and see if I feel any different.
The removal of heavy metals is a plus, though I also am a little skeptical of how essential removal is, or of the genuine serious long term damage caused. I need to read up more.
I stopped eating desserts and sweets, stopped overeating, stopped eating after 7pm, switched to eating more veggies and unprocessed foods. I started taking supplements (ag1 powder, fish oil, turmeric). Once inflammation started to reduce I started to exercise more. It wasn’t just pain in my elbow, it was also in my foot and my shoulder, for many years: all better now.
Tendons take a long time to heal, much longer than skeletal muscle damage. I'm sure electric stimulation helped, but it could have just taken 12 weeks for the tendons to recover.
I've had electro-acupuncture as part of my recovery from shoulder surgery. One possible side effect is that nerves can occasionally misfire or auto-fire. It can manifest as a tic or a twitch, where a specific muscle fires on its own without any stimulus (or with the wrong stimulus). It goes away with extra physical training. I guess it is to be expected, as the needle does cause some minor physical damage on insertion and removal.
Do you have any opinion on tens units? I have found them ineffective, but perhaps one can be modified?
If you happen to be aware of a diy poor man's hack, maybe point me yonder. I gots lots o' problems. I'm also interested in zapping me 'ead, but that's more complicated and... seemingly expensive.
As one of the "skilled electronics engineers" you could count on US soil (whatever that means), I can tell you this article reads very strangely to an EE.
“we were able to take all those designs and spin up our own SMT, it's called Surface Mount Technology”
“run that through our surface mount technology by our line operators”
“meaning the printed circuit board or PCBA assembly”
So, he’s definitely not an EE. No EE talks like this when they are trying to explain the nuts and bolts to a lay person. Either that or the editor took liberties they shouldn’t have.
It's a transcript of an informal podcast interview with - clearly - a marketing guy who may or may not have 'engineer' in his title.
I've worked with dozens of guys like this over the years. They could elegantly bullshit their way through any discussion. They had an answer for every question, even when they didn't.
There's a reason they don't send the design engineers to trade shows.
Steve Jobs was one of these people. A clever marketing guy who relied on others for technical heavy lifting. I suggest going back and re-watching some of his presentations, like the unveiling of the iPhone. Every word he said was meticulously planned and very rehearsed.
Not that any of that matters, because engineering is a team sport, and that's where taking this too literally becomes a problem - just like how a football team is made up of different skills and varying physical builds. The reason they don't send the design engineers to the conventions is that they are too honest and will spill the beans on the product's shortcomings, or inundate the customer with irrelevant details.
> Steve Jobs was one of these people. A clever marketing guy who relied on others for technical heavy lifting. I suggest going back and re-watching some of his presentations, like the unveiling of the iPhone. Every word he said was meticulously planned and very rehearsed.
Before Apple entered its iPod era, Jobs could do a reasonable job of taking questions from a technical audience
No single person on this planet can know everything about a product as complex as a phone or any other modern device, and the expectations some people have of execs, even ones who were engineers, are simply unrealistic.
If you know everything about your product down to the most low-level technical detail, your product is either a brick (and I think even that is too complicated) or you greatly overestimate what you actually know.
> because they are too honest and will spill the beans on the product's shortcomings, or inundate the customer with irrelvant details
Yeah, getting upset that an EE who has the skills to build a cellphone from scratch isn't actually moonlighting as a writer doing a blogspam version of a podcast interview fits that quite well.
Steve Jobs was not a marketing guy. If anything, he was a designer. His technical knowledge was also way beyond most CEOs. He designed his presentations with a high attention to detail just like he designed his products, product ranges and companies. If you watch any one of the many interviews he gave you'll see that he can talk off-the-cuff, in depth on all kinds of subjects. And, unlike many modern CEOs, he pauses to think before opening his mouth.
> Steve Jobs was one of these people. A clever marketing guy who relied on others for technical heavy lifting.
That's the currently fashionable revisionist history. But the truth of the matter, from his contemporaries, was that he knew his stuff. He was also good at marketing.
> I suggest going back and re-watching some of his presentations,
I suggest going back and re-reading some of the print interviews he gave to technical publications. There's no question he knew what he was talking about.
> But the truth of the matter, from his contemporaries, was that he knew is stuff.
Read anything on folklore.org and you can see that's not really the case. He prescribed a lot of stuff that they just had to work around - typical pointy-haired-boss stuff.
I think that's a very myopic view of what Jobs did. I am of the opinion he was one of the greatest designers of all time.
Just because he didn't move pixels across the screen doesn't mean he wasn't setting the design language, defining taste, sweating the details, and holding the vision. No one would suggest that a show-runner didn't make TV, or that a director wasn't a filmmaker. Jony Ive's design changed (and improved) immensely once he was working closely with Jobs. Once Jobs was gone, things drifted. Similarly, Pixar was hyper-focused under Jobs, then began to drift as soon as he was no longer involved.
Visionary and Product Designer are different jobs. Generalize them as the same thing if you like, but he was a CEO and a visionary. He didn't design products, he criticized and made demands of the designers.
You're just jealous. These guys have spun up their own RoHS and are doing a 100% EDA automation with full Verilog over there. By doing the reflow process (it's a way of building integrated circuits) they're able to offer complete impedance right here in the USA.
Before retirement my father was employed in a company certifying medical devices.
Half the descriptions provided by those who made the devices were this sort of word salad because they concerned products which were obvious scams[0].
One person in particular was editing the description on the fly and was looking for a word, so Dad jokingly suggested "impedance". "Yes, thank you!" replied that person - her face lighting up as she added the word.
[0] Like a vacuum cleaner which was supposed to dispense a mist of medication. Initially rejected as there was no dosage control whatsoever, but I heard that eventually somehow it was certified.
It's a complex problem, there's a lot of resistance from consumers who react badly to the price of domestic goods. Maybe tariffs will induce more demand, but I'm not sure the capacity is there in the first place.
It's not really a problem of "resistance"; it's more about purchasing power, common sense, and avoiding feeling ripped off.
People buy stuff competitively and that's it. There are modifiers, notably being rich enough that regular items prices make no difference to you, so you can buy all from your own country without affecting you too much.
But even if you are middle class, buying most items at a higher price just because they are from your country is just a waste of money from an individual utilitarian point of view.
It directly affects people and they always favor that, even if in the long run doing so might have a second order effect that will affect them in worse ways.
Tariffs, taxation, and special legislation are actually the only way to make some products competitive for your own country, especially against a trade participant willing to take a hit just to corner the market. This is basically what China did for many things, so here we are...
Not sure watt you guys are on about. Not to be too negative here or polarise the debate, but I remember the electrifying experience as a child to source local products instead of relying on imports from faraday countries. I guess technology has lost some of its radiance and has just become a mains to an end, to feed the addiction.
I'm hoping it induces a reversal of the reckless culture of consumption and waste, and forces longer end-product life cycles on the companies that design and manufacture them.
You're mixing your processes - is he making his own circuit boards (reflow) or making his own chips (Verilog)? And I have no idea what "complete impedance" even means in this context. HN really needs to stop AI posting here.
We really should not be doing jokes like this in times like these where the US president makes those kinds of remarks on a daily basis while being 100% serious
There's a somewhat better discussion of this phone here.[1] At least the making of the board.
Board manufacture, SMT pick and place, and soldering are all automated, and the equipment is widely available. Everybody does boards roughly the same way.
The assembly problems in phones come from all the non-board parts.
See this iPhone teardown.[2] Look at all those little subassemblies. Some are screwed down. Some use elastic adhesive. Some are held in place by other parts. They're connected by tiny flexible printed circuits. That's the labor-intensive part. Usually involves lots of people with tweezers and magnifiers. They don't show that.
So here's that part of assembly in a phone factory in India.[3] Huge workforce.
For comparison, here's a Samsung plant.[4] More robots, fewer people. Samsung made something like 229 million phones in 2024. If a US company produced phones at Samsung volumes, the price would come down.
There's another way to do it. Here's a teardown of a classic Nokia "brick" phone.[1] That's designed for automated low-cost vertical assembly. The case provides the basic structure, and everything can be put into the case with a vertical push. There are no internal wires to connect. There are simple machines for that kind of assembly. Then everything gets squeezed together, and you have a hard block of an object that's hard to damage.
If you can design something which can be assembled in that simple way, high-volume manufacturing can be automated cheaply. Smartphones are not built from parts intended to be assembled in that way, but that's a decision based on cheap labor, not one that's inherent in smartphone design.
Design for assembly was more of a thing when manufacturing was in the US. The Macintosh IIci was designed for vertical assembly. Everything installed with a straight-down move. The power supply outputs were stakes that engaged clips on the motherboard. No internal wiring.
The trade-offs of the current smartphone assembly process (many parts and many steps) are driven by numerous factors, including cheap labor. It also reflects: incremental design improvement, testing, defects, supply chain, model differentiation, ...
"but that's a decision based on cheap labor, not one that's inherent in smartphone design"
This is the heart of the matter. The US has abandoned skills because of cheap labor in Asia. An example from the story about dealing with touch-screen tests: they're employing disposable workers to toy with pinch-and-zoom testing - something easily automated with a simple machine and image comparisons. How sad. This is an actual regression in technology.
If the US wants to get manufacturing back, the only areas that matter are electronics and, to a lesser extent, machinery. See this chart.[1] That's an achievable goal.
Here's a useful smartphone that could become big:
- Solid state battery that will last at least 5 years.
- 5 year full warranty.
- No connectors. Inductive charging only.
- Screen as unbreakable as possible.
- Sealed unit. No holes in case. Filled with inert gas at factory.
Then Tim Cook gave up on manufacturing. Which was how he saved Apple.
Steve Jobs always had a somewhat fantastical vision of a dark factory. He wasn't able to accomplish that when Apple was still fighting for survival. But now Apple has more cash than it knows what to do with.
A bit of the problem is that modern elements like the display + touch screen require a lot more bandwidth than the 3110's did - the displays require ridiculous bandwidth in comparison to the Nokia's, around 10 gigabit/s for a Samsung Galaxy S25 (basic model, not Plus/Ultra), plus connectors for the cameras.
At the very least you can't really make the screen soldered-on, and the simple connectors used in Nokia might not work out for such high bandwidth use case. Same with cameras.
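The ~10 Gbit/s figure is plausible from back-of-envelope arithmetic. Assuming a 2340x1080 panel at 120 Hz with 10-bit color (those specs are my assumption, not from the thread), the raw uncompressed pixel stream alone is about 9 Gbit/s, before blanking intervals and link-protocol overhead:

```python
# Back-of-envelope check of the ~10 Gbit/s display-link figure.
# Assumed panel specs: 2340x1080, 120 Hz, 10-bit color (30 bits/pixel), uncompressed.
width, height = 2340, 1080
refresh_hz = 120
bits_per_pixel = 30

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"{raw_gbps:.1f} Gbit/s raw")  # prints "9.1 Gbit/s raw"
```

In practice links like this usually use display stream compression, but the connector still has to be designed for multi-gigabit signalling, which is the constraint on simple Nokia-style connectors.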
Thin ribbon connectors are one of the hardest things to automate from what I remember regarding Sony's efforts to automate PS5 manufacture.
> If a US company produced phones at Samsung volumes, the price would come down.
The problem is, there are no Western manufacturers left that have the brand loyalty to bring such a large volume of purchases to the table.
The giants are so giant, it's almost impossible to compete with them in the consumer mass market. The only way you can outcompete the giants is by focusing on tiny small niches where consumers are willing and able to pay a premium - the government (auditable supply chains) and eco-progressives. That's where Tesla started, that's where Purism and Frame.work live.
Your scenario is more like a best-case option, actually. I mean, currently there are only 13M people employed in manufacturing in the US [0], while output is at an all-time high [1]. The vast majority of this manufacturing is dependent on components imported from other countries - which just got much more expensive. So even if employment in manufacturing were to increase by 20% (unrealistic IMO), that would only translate to 2.6M people - while at the same time losing multiples of that in better-paid jobs in other industries, mostly services.
I think you might even lose a bunch of these jobs, at least in the short term, as businesses now need to free up money (they likely hadn't planned to initially) to pay for tariffs before their goods / parts are auctioned off at the port. That's even before consumer spending tightens up due to rising prices, and declining stocks.
Pretty much guaranteed. The goal of modern automation isn't more people, it's fewer. People love to spout "but the industrial revolution just made people able to do more jobs". But the goal of modern automation is to _replace all jobs_ that it can.
Then you hire 4 guys to maintain all the automation between 5 factories they drive between as needed.
Yes, that's what civilization -> industrialization -> automation does: eliminate jobs, which opens up opportunities for new jobs.
you are no longer an animal spending most of your waking life searching for food, nor do you build your own shelter, make your own clothes, construct tools, etc
yes, automation seeks to eliminate factory jobs, most of them are pretty awful anyway. this opens up new options as every step along the way always has
and yes, the change isn't always easy for the folks that have to find something new
The goal isn’t actually specifically employment increases - that’s mostly a marketing strategy; the real goal is national security. The US, Japan, and South Korea seem to have decided enough is enough with Chinese aspirations and threats to Taiwan, so the US has convinced them to build additional capacity in the US and also to have those nations increase defense spending. Notice Japan has started joining NATO command and participating in NATO missions. I predict Japan will be the first "deal" announced by the Trump administration, with South Korea soon afterwards. It makes sense for these allies; the logic is: we should fortify our supply lines, building redundant facilities in the US homeland, which is much harder for China to disrupt and attack; you guys start buying lots of F-47s; we start massive shipbuilding and re-industrialize as rapidly as possible. Then should China try anything and somehow mess you guys up, the US will come back and get you out of it.
That would be sane, but then it makes no sense why Trump is threatening tariffs on Canada or the EU - both places that also need to do the same (move manufacturing out of China).
Sure it does. The strategy is based on chaos and on reminding the whole world, allies included, that the US is in charge. They want some very specific changes from both Canada and the EU: they need them to militarize quickly; the US military is furious that all their allies appear to be almost incapable, with very little naval power specifically. I’d even argue the implied threat to leave NATO, the talk of annexation of Canada and Greenland - it’s all strategic psychological warfare on allies to shock them into action... and it’s working... take a look at Germany’s new military budget and plans. There is also an intentional devaluation of the dollar to assist in re-industrialization. This is all national-security and world-order driven, not economics, and it’s actually the optimal time, during a strong domestic US economy, to try to make these changes.
It's national security to destroy all of your alliances? In that case, what is the reason Russia is exempt from these tariffs? Reverse-psychological warfare?
First of all there is (EDIT: almost) zero trade between the US and Russia currently, same with North Korea. (EDIT: perhaps some token signal of wanting to negotiate over Ukraine? or perhaps even more “sinister” - getting US political opposition to falsely argue Trump is a Russian agent and make themselves look silly)
Second, yes part of the strategy is to force allies to self assess themselves and their dependence on US power. Trump and Nixon had a personal relationship and his fundamental strategy in business is based on creating uncertainty, it’s literally like point 1 of his “Art of the Deal” and however another part of that strategy is being willing to walk away.
We are living through a turning point in history, where the current US administration has reversed the open policy toward China and, for national security reasons, is working to re-industrialize and militarize quickly as a strategy to deter Chinese ambitions.
It’s fine to disagree and argue that the neoliberal strategy of globalism isn’t dead, but politically it is. Of course that world order is fighting to survive where it can, with the UK, France, and Germany all putting up resistance to the rise of neo-mercantilism and nationalism, but we will see whether canceling elections, restricting speech, and jailing politicians will work to block it.
Maybe one change, but there is far too much going on, which divides attention. Pick something and fix it, rather than attempting a million things at once and getting nothing done. (Not that other presidents were better, but part of that is that good change is often slow.)
I don't buy that [3] is a bad example and [4] a good one. That Samsung plant reel doesn't show the same parts of assembly as the first one; I bet those videos are just focusing on different parts of fundamentally identical factories.
I was going to skip this article until I read your post, it got me curious. You're totally right, it does read really weird. It made me laugh a bit, I needed that this morning. Thanks!
I have also "spun up my own SMT". It's a $50 hot air rework station and maybe $20 of consumables in a four-square-meter workshop (I live in Asia). It would be challenging, but possible, for me to assemble the PCBs in their photographs by hand. There are indeed a lot of people like me.
He certainly meant an "SMT line", because phones assembled on a manual station in the USA (outside of shit quality) would cost well in excess of $2000.
They might, if their expectations are as simple as an on ramp to better or more stable things. It would also make sense for those who are using this method for career change.
I have a coworker who "couldn't hack it" as a paralegal and is now working in the line for server assembly. Or another coworker who came from a major daytrading firm to work quality control with me.
That’s not what they do. As Tim Cook has said multiple times, the engineers are needed as floor and line managers, to coordinate parts of the process, to set up new lines quickly, etc. Those are not the ones doing the actual soldering.
It extrapolates broadly. It's kind of a funny thing: when somebody doesn't know much about something but wants to pretend they do, their vocabulary comes off sounding like a thesaurus of vernacular. But when you speak to somebody who genuinely knows something, to the point of having an intuitive feel for it, they can easily explain even the most esoteric topic, at least roughly, in language relatable enough that a high schooler could understand.
Space stuff is another domain that's just chock full of this.
I don't think this is true. Knowing something well and being able to explain it in simple terms are unrelated skills. Plenty of people who know their domain super well just can't explain it to a lay person.
The interviewee is described as "Purism's founder", who even says "we took our own electronics engineers (EEs)", implying (though not explicitly stating) he doesn't include himself in that category.
I do think there's an interesting conversation to have here though about workforce management, as someone who lives in adjacent worlds.
If you are long-term greedy, like China, a great strategy to capture dominance of a discipline would be along the lines of boiling a frog. Start by sending grad students to the top universities, ensuring they work for the PIs for cheap. Bring as many of them back to China as you can, but tolerate a leaky return path so as not to stir up notice. Advertise their high post-training employment rate back to the universities to keep those valves open even as you start developing your universities internally, and eventually throttle down the outbound grad student pipeline. At some point after it's too late, the top universities, and their countries, look around, bemoan the lack of people in their discipline, and then just give up because by now they're old and tired.
Seems like something that has happened in chemistry, physics, and EE for sure. Once you start thinking this way, all sorts of things start making sense. Like maybe they looked at solar as a cheap, low threat point of entry for developing silicon fabrication capabilities. Software engineering, being a relatively soft skill, comes along for the ride.
Not sure about other fields, but if AI can take on a rapidly increasing set of them, you start to see this as how China primarily harvests not IP but workforce training from the global West. Technologies happen to fall out of it, and then one day China has solved for its own graying workforce at the same time it has solved for global economic dominance.
And a non-trivial contributor was the US government's (I blame the states too) defunding of education.
This is an interesting suggestion. I'm curious what you mean by "sending grad students to top universities":
1.) the target universities have to accept the students, right?
2.) This implies some top-level RTS-game-esque control of the grad students when, in reality, they're making independent choices (albeit influenced by many factors, including govt promotion)
3.) Seems like the rational decision for ambitious grad students is to apply to said top universities (which may just happen to be abroad).
Same for "bringing many of them back": I read it at first as akin to some sort of spy agent network, when in reality "bringing back" probably means various incentives, not some forced thing. Carrot instead of stick.
1) target universities have to accept the students
Yes! Which the US incentivizes by a) underfunding K-12 education, reducing the internal applicant pool, and b) structuring grant competition in a way that incentivizes PIs to grasp for cheap labor. The individual states incentivize this too. Look at the UC's own statistics: since 2009 the highest chance of acceptance has gone to foreign, ethnically Asian applicants. https://news.ycombinator.com/item?id=20321493
2) RTS-game-esque control of the grad students
Yes! Having been in the room when one of those grad students got a very stressful call from home (like broke down crying multiple times), it's definitely not all carrots. And the removal of carrots eventually looks like a stick.
3) Seems like the rational decision for ambitious grad students
A quick Google search shows that somewhere in the range of 20,000 electrical engineers graduate from US universities every year. Even if not all of them do electronics, and not all of those are considered "skilled" (by this author's definition), that is hardly a "countable" amount.
My read on this is that they don't mean EEs as in IEEE, but "engineer" as in "sanitation engineer", i.e. the people who assemble electronic devices in factories.
I suspect this is a case of Gell-Mann amnesia. This article is not inconsistent with the quality of articles in blogs, the news etc. I believe you (And I) notice this due to expertise in the area.
Hi, @Timot05. I’m a former EE with 20 years of industry experience. I have designed dozens of very large, complex mixed-signal PCBAs of 4-32 layers, as well as about the same number of large FPGA SoC designs.
I watched the demo video out of curiosity and here’s my 2 cents, though there is a lot to unpack here:
First, if you want to know the current state of “what git looks like in hardware” as far as PCBA design is concerned, look at Altium, which now uses git under the hood to provide a very nice visual way to show differences in both the schematic and the PCB layout between versions. That solves some real pain points for an EE, so EEs actually want it and use it. There are also ways to create reusable, version-controlled sub-circuits that get put into libraries.
Whatever open source you build should be modeled after that.
I found the above very nice after years of manually using git to version control PCBA designs developed in other ECAD tools like PADS, OrCAD, Cadence, etc. I even tried to get a couple of EEs and layout people to use my git version control methods, documented in the README of course, but to no avail: most strictly-hardware EEs (with no FPGA or SW background) either can’t grasp git or don’t see the point in spending the time on it. The same people will quickly pay $10k-15k a seat for Altium to get that slick UI on top of git version control, because it makes things more visual, not less, which, as others have mentioned, is important in PCBA design; I’ll get to that in a second.
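For the curious, the core of a manual git-for-ECAD setup is simply telling git which design files are opaque binaries so it never attempts a textual merge on them. A hypothetical `.gitattributes` for an Altium-style project (the extensions here are examples; adjust for your own ECAD tool's file types) might look like:

```
# Schematic and layout files are binary databases; never diff/merge them as text
*.SchDoc binary
*.PcbDoc binary

# Project files are plain text and diff fine
*.PrjPcb text

# Released fab outputs are better tagged per revision than merged
*.gbr binary
```

From there the workflow is ordinary git, with release tags standing in for the "Rev A / Rev B" folders EEs usually keep by hand.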
But, I understand better what you actually mean because you phrased it another way “how can groups of people coordinate and share their work in hardware?”
That depends, of course: how and with whom are you coordinating and sharing? How are you dividing up the work?
In my experience, even with a very large, complex mixed-signal design, there is one guy per board. In the extremely rare case that a single board is worked on by more than one person, it is usually divided up by discipline: typically RF, analog, and digital. Those people contribute their pages in any number of ways, including using sub-circuit modules out of an Altium library.
And EEs lean heavily on reference designs and eval hardware to do their work. Though they do spend a lot of time reading data sheets, they really don’t have time to read most of them; they reference just the ones they need to integrate existing designs and to handle the new parts of the design (which need to be minimal). They need to get large parts of the working design handed to them: eval hardware they can test and vet on the bench, and reference designs (schematics, layout, firmware, and some documentation) provided and supported, somewhat, by the vendors of the major components. Not by open source developers, who can’t really support the major components in their designs effectively, because they don’t have access to all the information they would need, can’t talk to the fab, etc. They can only talk to FAEs, same as all other engineers outside the vendor.
Also, PCBA design is very process oriented. Even if a company doesn’t have a PCBA design process, this will not slow down an experienced EE who has learned what I call “The Process”, with a capital T. Schematic capture and layout are two very hard and fast steps in that process, with boundaries that have been defined by decades of EE college curriculum and EDA development and that aren’t likely to change in the near future. Though I think the EDA industry is in need of some disruption, it is not here.
I started designing CPLDs and FPGAs right as most people switched from capturing their chip designs in schematic form to capturing them in Verilog and VHDL. It was a disaster. Most EEs in that space (including myself, for a time) were really bad at architecting their modules and then structuring and writing the HDL code so that it made sense and was maintainable. Good documentation and block diagrams were essential for development and maintenance, and after 20 years this is still a huge problem for a lot of designs. I really wish someone would make a good tool to aid in constructing block diagrams from the code of large FPGA designs. Too often I would inherit a legacy design and find myself having to spend a week or two creating block diagrams and other documentation for it.
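A tool like that could start very crudely. Here's a rough sketch of the idea: scrape module instantiations out of Verilog source and emit a Graphviz block diagram. This is not a real parser (it ignores comments, parameters, generate blocks, and hierarchy), and the module/instance names in the sample are made up, but it shows how little it takes to get a first-pass diagram:

```python
import re

def extract_instances(verilog_src):
    """Find instantiations of the form 'module_name instance_name (...' at the
    start of a line and return (module, instance) pairs. Keywords that happen
    to match the pattern (e.g. 'module top(') are filtered out."""
    pattern = re.compile(r'^\s*(\w+)\s+(\w+)\s*\(', re.MULTILINE)
    keywords = {"module", "input", "output", "inout", "wire", "reg",
                "assign", "always", "initial", "if", "for", "case"}
    return [(m, i) for m, i in pattern.findall(verilog_src)
            if m not in keywords]

def to_dot(instances):
    """Emit a Graphviz digraph linking the top level to each instance."""
    lines = ["digraph design {"]
    for module, inst in instances:
        lines.append(f'  top -> "{inst}" [label="{module}"];')
    lines.append("}")
    return "\n".join(lines)

# Made-up sample design for illustration
src = """
module top(input clk, output led);
  uart_rx u_rx (.clk(clk), .rx(rx), .data(d));
  fifo    u_fifo (.clk(clk), .din(d), .dout(q));
endmodule
"""

print(to_dot(extract_instances(src)))
```

Pipe the output through `dot -Tpng` and you have a starting-point diagram; a real tool would of course need a proper HDL front end to handle the full language.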
I shudder to think what it would be like to try to understand and debug a 20-layer PCBA that was just a bunch of code or text and didn’t have a schematic and some decent documentation. It’s bad enough that EEs are frequently handed a large schematic and layout with little or no documentation except a folder full of the data sheets for each component.
Thanks for your detailed answer! Tons of information in there.
To your point about how the work gets divided: I don't think the work will necessarily be distributed amongst many people for a given design, but code gives you a couple of benefits:
- You can branch and merge the changes you make, which means you can design a given feature and have your team review only that change instead of a batch of changes.
- If someone actually does want to pick up your design and contribute to it, code makes it easier for them, because more information is captured in the code, so they get a better sense of what the design requirements are.
To your last point, we don't really see the code becoming the documentation of the module's implementation. Actually, the goal is to do something similar to the IC space or the software space, where the code becomes the implementation of something but the documentation lives somewhere else, perhaps in a datasheet-like document. Currently the schematic bakes both the documentation and the implementation into the same document, which forces you to find a middle ground between those two objectives.
I don't think this is a fundamental, exclusive benefit of "code" (i.e. text). I am skeptical of how textual merging, especially automated by default, can be applied in this domain except in trivial ways. This is usually an easy sell for software because you can structure your system to localize effects (even if it isn't perfect there either). The big guns have very sophisticated workflows and tooling for this.
> code makes it easier for them because more information is captured in the code
I sincerely don't understand. Perhaps it is informative to consider that SPICE has been around for 50 years. The idea of using a textual description for circuit design is not new at all, and such descriptions are used when appropriate. One must ask why the basic workflow is the way it is when the alternatives have always been possible.
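To make the point concrete: a SPICE deck *is* textual circuit capture, and connectivity is trivially machine-readable from it. Below is a toy netlist (an RC low-pass with made-up values) and a few lines of Python that recover the component connectivity from the text. The fact that this has been easy for decades, yet schematics still dominate, suggests the schematic survives for human rather than tooling reasons:

```python
# A minimal SPICE-style netlist: each line is component, two nodes, value.
netlist = """
* RC low-pass filter (values are illustrative)
V1 in 0 DC 5
R1 in out 1k
C1 out 0 100n
.end
"""

def parse(deck):
    """Return {component: (node1, node2, value)} from a simple two-terminal
    netlist, skipping comments ('*') and control cards ('.')."""
    comps = {}
    for line in deck.strip().splitlines():
        line = line.strip()
        if not line or line.startswith("*") or line.startswith("."):
            continue
        name, n1, n2, *value = line.split()
        comps[name] = (n1, n2, " ".join(value))
    return comps

print(parse(netlist))
```

Connectivity falls out of the text for free; what the text does not give you is the at-a-glance spatial grouping a schematic provides.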
> Currently the schematic forces you to have both the documentation and the implementation baked into the same document
> what it would be like to try to understand and debug a PCBA for a 20 layer board that was just a bunch of code or text and didn’t have a schematic and some decent documentation.
That's a seriously important point. Open source is filled with just plain horrible code. EEs, who are not generally trained in software development, can deliver reasonable schematics, but there's the potential for utterly unintelligible designs if they had to create them using software techniques. I have seen FPGA code that is most definitely cringe-worthy and unintelligible without, as you said, stopping to draw a schematic.
One way to think of this --in terms of circuits, not FPGAs-- is that a schematic can stand on its own. A text-based netlist, even a sophisticated one, requires documentation and, more than likely, a schematic.