
> I suppose, if a huge proportion of workers get replaced by AI software, leading to mass unemployment, there will be impetus for governments to step in and force corporations to contribute to some sort of UBI or social wealth fund.

I looked for any comments that expressed this sentiment, and as of writing this, I only found one. It's like people don't know or remember history. I mean, look at the French Revolution—look at every other revolution. I kept waiting for the article to mention political changes, but it was only about capital and what to invest in, as if capitalism is something that will survive superintelligence (I mean real superintelligence, not chatbots).

What do these people think—that 99% of the population will just become beggars? Sooner or later, all capital will be overtaken by the state, because the main argument against it, inefficiency, will no longer apply. People will vote on AI-generated proposals for energy distribution, deciding where we, as a society, want to allocate it. Budgets will no longer be about how much money we want to spend, but about how much energy we want to allocate.

Private capital and private companies stop making sense when you have a superintelligence that is smarter than any human and can continuously improve itself. At that stage, it will always allocate resources more efficiently than any individual or corporation.

Leaving that kind of power in the hands of the wealthiest 1% means only one thing: over time, 100% of land and resources would end up controlled by that 1%, the new kings, making the rest of society their slaves forever.



> Leaving that kind of power in the hands of the wealthiest 1% means only one thing: over time, 100% of land and resources would end up controlled by that 1%, the new kings, making the rest of society their slaves forever.

Isn’t this exactly what has happened all throughout history? I don’t really see the future playing out any differently.


I sadly think that if the promise of AI comes true, this is the likely economic outcome. The last century or so was an anomaly compared to most of human history, a trend created by the "arms race" of needing educated workers. The prisoner's dilemma was that if you trained your workers on more efficient tech, you could out-compete your competitors and take all their profits, which gave those educated workers the means to strike (i.e., leverage). Now the "arms race" is over educated AI rather than educated workers, which could invalidate a lot of the assumptions our current society takes for granted in its structure.


> a superintelligence that is smarter than any human and can continuously improve itself. At that stage, it will always allocate resources more efficiently than any individual or corporation.

This kind of magical thinking still baffles me.

This is sci-fi. There is absolutely zero evidence that such a thing is actually possible to create. Even if we stipulated that LLMs are AGIs, or can become them if we just cram in a few billion more parameters, it's painfully clear that they are not superintelligent, they cannot improve themselves, and there is no credible pathway to them becoming anything of the sort—not to mention they're already guzzling absurd amounts of energy, and taking up massive amounts of hardware, just to do the bad job they do now.


> Even if we stipulated that LLMs are AGIs, or can become them

>> (I mean real superintelligence, not chatbots)


Sure. Then you run even faster into the problem that this is not a real thing.

There is no AI that has consciousness. There is no AI that has human-level intelligence. We have no idea what it would take to build either of these things. Even if we were able to build either of these things, that does not automatically mean it is, or could ever become, superintelligent.

Your whole post is basically equivalent to saying "when the aliens come, and give us their technology, but enslave us all, capitalism will become irrelevant".

It feels truthy to say that we are close to the Singularity, but it's effectively a religious position. It has no basis in fact whatsoever.


I only quoted you and the grandparent. I've made no other comments here. You may have become confused. Taking some time and some breaths might be helpful.


Ah, my apologies; I thought you were the grandparent (I'm afraid I didn't check).

Doesn't change the argument, merely its direction! ^_^


I very much appreciate this reply. You basically perfectly elaborated on what I lazily threw out there.

With that being said, most of my friends are senior software devs and they think the same thing is bound to happen.

We often joke about starting a falafel restaurant or such before it is too late. I think it will take way, way longer for AI to make good falafel than it will take to replace software engineers.


> What do these people think—that 99% of the population will just become beggars?

they will just become dead.



