I think it's more nuanced: I agree there's no clear road to AGI, despite what some would have you believe; on the other hand, there are new and useful techniques that are real.
Those PhDs may not have the stellar careers they might currently expect, but they'll find plenty of work building automation and assistance systems for "boring" existing workflows.
Things like making your IDE a bit better, making some data entry jobs redundant, automating quality checks in increasingly complex cases, assisting in biomedical image interpretation ...
These things provide essentially a lower bound for how bad the "winter" might be.
PhDs in this area are sufficiently embedded in generic industrial technical practice, perhaps unlike in earlier eras.
> assisting in biomedical image interpretation
I suspect there's going to be a (counter-)renaissance of expertise and anti-tech sentiment because of gross technical failures in these areas.
I think the tech budgets going into these projects are presently so massive, and so unlikely to deliver a return, that reputations here are going to be destroyed.
So the question is what the nature of this counter-reaction will be. I suspect that, within the decade, people will be hiding the "AI" on their CVs and describing it all as the python/data-eng work they were doing.