
I can see this in Minsky-era AI research, but surely with the number of people getting into AI from a purely practical background right now, I would expect that mindset to be diluted. As someone not in the know, I could very well be wrong.

In response to the coming apocalypse: this isn't the first time everyone has had a vague sense of potential doom about the future. I believe this happens during any time of fundamental change, which makes the future uncertain, and we interpret that uncertainty as apocalyptic. Back during the Thirty Years' War, that apocalyptic belief manifested as God being angry with us; today it's attached to the (very real) problems our rapid industrialization has created. Not to minimize the problems that we face - minimizing only in that they probably won't lead to extinction. The various predictable factors mentioned have the potential to make life really shitty and cause massive casualties.

While framing these issues as a matter of extinction may feel like a way of adding urgency to dealing with these problems, it's instead contributing, on an individual level, to fracturing our society - we all "know" an apocalypse is coming, but we're fighting over what is actually causing it. Except that there will be no apocalypse - it's just fear of the unknown. Something is fundamentally changing in the world and we have no idea how the cards will land. It's no different than a fear of the dark.



We accuse GPT of confidently giving answers on things, but man, it learned from the best.

I cannot assure you that we won't have something like a nuclear apocalypse in the next few decades, and yet here you are, certain it's not going to happen. How can you be assured of this future when underlying assumptions like the value of labor will be experiencing massive changes, while asset inflation spirals ever upward?


I think you misread what I said - I was responding to this quote:

> If we don't reach at least Kardashev scale 1 in the next hundred years or so, we're going to go extinct due to several now-predictable factors.

Many people are certain of human extinction for one reason or another; it doesn't sound like you're one of them. I'm saying that we don't know what the future will bring, and that uncertainty manifests as apocalyptic thinking. I also specifically mentioned that we are facing multiple problems that can cause huge devastation, and I'm not making the argument that "Oh hey, everything is ok!" Just that framing things as apocalyptic is contributing to the schism and preventing us from doing anything, because everyone refuses to listen to anyone else since they believe their lives are at stake.

I guess I shouldn't say "it won't be extinction", but that's a way, way lower probability than people think. A massive number of people have thought the world would end many times throughout history, so I'm skeptical of "well this time we're RIGHT".



