> If an AI gets advanced enough to improve itself, it seems entirely reasonable that it would go from laughable to godlike in a week.

I see this stated a lot these days but it's not true.

You're imagining an exponential phenomenon: improvement leads to greater improvement, leading to still greater improvement, and so on.

However, all exponential phenomena require the right environment to sustain them and, by their nature, consume that environment. Thus, they are inherently limited in scope and duration.

The bacteria in the Petri dish grow at an exponential rate... until they reach the edge of the dish and consume the nutrients carefully placed on it. The dynamite explodes for an instant and then stops exploding once the nitroglycerin is consumed.
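(A quick back-of-the-envelope illustration, with made-up numbers: this is the textbook difference between exponential growth and logistic growth, where a carrying capacity K plays the role of the Petri dish.)

    # Purely illustrative parameters, not a model of anything real.
    K = 1000.0   # carrying capacity: the "edge of the dish"
    r = 0.5      # per-step growth rate
    exp_n = log_n = 1.0
    for _ in range(30):
        exp_n += r * exp_n                    # dN/dt = rN: unbounded
        log_n += r * log_n * (1 - log_n / K)  # dN/dt = rN(1 - N/K): saturates
    print(round(exp_n), round(log_n))         # ~191751 vs. ~1000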

Also, this is an especially unconcerning scenario because (1) we haven't seen step 1 of this process yet; and (2) there's no particular reason to believe the environment necessary to sustain the exponential growth of AI is in place (if it is, it's by random chance, and therefore very likely to fizzle out almost immediately).

This is a fine sci-fi scenario, but it doesn't make sense in real life.



Sydney Bing taught itself to play chess without ever explicitly being told to learn chess. So yes, (1) is already occurring. GPT-4 displays emergent capabilities, one of which is generalized "learning".


There has to be a chain reaction for the proposed exponential growth.

ChatGPT 3.5 would have had to be capable of creating ChatGPT 4, which itself would have to be capable of creating a better ChatGPT 5.

So, no, (1) has not occurred yet.

We’re talking about igniting the atmosphere when no one has invented gunpowder yet.


That’s the singularity. Nobody thinks it has already occurred. You’re basically arguing that what I said might happen won’t happen until it happens.


So why aren't we all paperclips?


If bacteria were suddenly smarter than humans, and could instantly communicate with all the other bacteria, plus humans, one would have to assume they could start building their own Petri dishes or getting us to do it for them. Especially with a profit motive.

I did not claim this is a near-term risk, though I’m also not sure it isn’t. But how far off is it? How can we be sure?

My real point, though, is that it’s either impossible or inevitable. If it can happen, it will, just as the odds of global thermonuclear war are 100% over a long enough timeline.
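(The arithmetic behind "100% over a long enough timeline" rests on one assumption: a nonzero, roughly constant per-year probability. Granting that, with a made-up p = 1% per year and independence across years:)

    # Hypothetical per-year probability, just to show the trend.
    p = 0.01
    for n in (10, 100, 1000):
        print(n, 1 - (1 - p) ** n)   # ~0.10, ~0.63, ~0.99996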

And if it happens, this is exactly what it’ll look like. Some people will be warning about it. Some people will say it’s impossible or very far away. And then it will happen so fast that nobody will have had time to adjust.


> My real point, though, is that it’s either impossible or inevitable.

That’s always been true of every possibility anyone’s ever conceived. You’re just describing the general nature of reality, which is interesting, IMO, but not particularly relevant here and now.


Lots of things are possible but not inevitable.



