I mean, how did anyone become an expert programmer before? Surely it can't be harder to learn to program with AI than without AI. It's written in the book of ResNet.
You could swap out AI with Google or Stack Overflow or documentation or Unix…
Yes, people underwrite their own debt with their future labor. Economists don't count this leverage on future labor as wealth for the poor, but for some reason they count it for the rich after renaming it to bonds.
Without the USA being the way it is, Australia would be much less prosperous. From the perspective of employers and consumers, labor costs are the same; it's just that in Europe and Australia, taxes are a larger percentage of the cost of labor.
You have it backwards. Layoffs these days increase stock value because everyone is betting that bad job numbers will force the Fed to lower interest rates, something Powell has hesitated to do in order to keep inflation in check.
It's a very screwed-up incentive to be rewarded for breaking the system, but that's 2025 in a nutshell.
If you have some source making the case that the layoff/stock-price correlation has a different cause these days, it would be interesting to read. But I doubt anything has changed.
>If you have some source making the case that the layoff/stock-price correlation has a different cause these days, it would be interesting to read.
The phenomenon is pretty recent, so there won't be any proper studies on it for a while. But look up "Jobless Boom". Here's a piece of what I'm talking about:
>For much of 2025, the job market was described by economists as "no hire, no fire," meaning an environment where workers could count on job security even as hiring around the U.S. cooled. But conditions have changed, and the Federal Reserve cut its benchmark interest rate in both September and October, citing increasing risks to employment growth and with Fed Chair Jerome Powell noting that policymakers are closely watching layoff announcements by big employers.
Personally, I think the AI efficiencies are a smokescreen, but the point about how this job contraction is forcing the Fed's hand is hard to ignore after some two years of holding rates steady.
Intel has announced that Intel 18A manufacturing will take place in Arizona. Salaries are a relatively small share of the total cost of running a fab.
> When it comes to machine learning, research has consistently shown that pretty much the only thing that matters is scaling.
Yes, indeed, that is why all we have done since the '90s is scale up the 'expert systems' we invented ...
That's such an ahistorical take it's crazy.
* 1966: failure of machine translation
* 1969: criticism of perceptrons (early, single-layer artificial neural networks)
* 1971–75: DARPA's frustration with the Speech Understanding Research program at Carnegie Mellon University
* 1973: large decrease in AI research in the United Kingdom in response to the Lighthill report
* 1973–74: DARPA's cutbacks to academic AI research in general
* 1987: collapse of the LISP machine market
* 1988: cancellation of new spending on AI by the Strategic Computing Initiative
* 1990s: many expert systems were abandoned
* 1990s: end of the Fifth Generation computer project's original goals
Time and time again, we have seen that each line of academic research begets a degree of progress, improved by the application of hardware and money, but ultimately it is only a step towards AGI, and it ends with the realisation that there's a missing cognitive ability that can't be overcome by absurd amounts of compute.
Well, expert systems aren't machine learning; they're symbolic. You mention perceptrons, but that timeline is evidence for the power of scaling, not against it: they didn't really start to work until we built giant computers in the '90s, and they have been revolutionizing the field ever since.
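For what it's worth, the modern scaling claim being argued about here is usually stated as an empirical power law in model size and data. This is the Chinchilla-style form from Hoffmann et al. (2022), paraphrased by me, not something either commenter wrote:

  L(N, D) \approx E + \frac{A}{N^\alpha} + \frac{B}{D^\beta}

where L is the training loss, N is the parameter count, D is the number of training tokens, and E, A, B, \alpha, \beta are fitted constants. Loss falls smoothly and predictably as you add parameters and data, which is the whole empirical case for "just scale it".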
If you think scaling is all that matters, you need to learn more about ML.
Read about the No Free Lunch Theorem. Basically, the reason we need to "scale" so hard is that we're building models that we want to be good at everything. We could build models that are as good as LLMs at a narrow fraction of the tasks we ask of them to do, at probably 1/10th the parameters.
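A rough sketch of the theorem (paraphrasing Wolpert and Macready's 1997 formulation; the notation is theirs, not from this thread): for any two algorithms a_1 and a_2, performance summed over all possible objective functions f is identical:

  \sum_f P(d_m^y \mid f, m, a_1) = \sum_f P(d_m^y \mid f, m, a_2)

where d_m^y is the sequence of m cost values the algorithm observes. In plain terms: averaged over every conceivable problem, no learner beats any other. A model only wins by exploiting the structure of the specific problems you care about, which is exactly the narrow-model argument above.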