Hacker News

Sure, all tech has 'real' effects; that's kinda the definition of tech. But all of these concerns more or less fall into the category of "add it to the list of things you have to watch out for living in the 21st century" - to me, this is nothing crazy (yet).

The nature of this tech itself is probably what is getting most people - it looks, sounds and feels _human_ - it's very relatable and easy for a non-tech person to understand, and thus to get creeped out by. I'd argue there are _far_ more dangerous technologies out there, but no one notices and/or cares because they don't understand the tech in the first place!



>to me, this is nothing crazy (yet)

The "yet" is carrying a lot of weight in that statement. It is now five years since the launch of GPT-2, nearly four years since the launch of GPT-3 and less than 18 months since the launch of ChatGPT. I cannot think of any technology that has improved so much in such a short space of time.

We might hit an inflection point and see that rate of improvement stall, but we might not; it's hard to say where that point might lie, because there is likely still a reasonable amount of low-hanging fruit in algorithmic and hardware efficiency. If OpenAI and their peers can maintain a reasonable rate of improvement for just a few more years, then we're looking at a truly transformational technology, something like the internet, with vast repercussions that we can't begin to predict.

The whole LLM thing might be a nothingburger, but how much are we willing to gamble on that outcome?


If we decide not to gamble on that outcome, what would you do differently from what is being done now? The EU has already approved the AI Act, so on the legislation side the problem is already being tackled.


The EU AI Act - like all laws - only matters to those who are required to follow it.


Yes, but it's really hard to see a technical solution to this problem, short of having locked-down hardware that only runs signed, government-approved models, and giving unlocked hardware only to research centers. Which is a solution I don't like.
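To make the "only runs approved models" idea concrete, here is a minimal toy sketch of the enforcement side: a loader that refuses any model blob whose digest isn't on an allowlist. Everything here (the allowlist contents, the loader name) is illustrative, not any real scheme; a real system would use cryptographic signatures and hardware attestation, not a bare hash check:

```python
import hashlib

# Hypothetical allowlist of approved model digests. In this toy example the
# "approved model" is just a placeholder byte string.
APPROVED_DIGESTS = {
    hashlib.sha256(b"approved-model-weights").hexdigest(),
}

def load_model(blob: bytes) -> bytes:
    """Return the model blob only if its SHA-256 digest is on the allowlist."""
    digest = hashlib.sha256(blob).hexdigest()
    if digest not in APPROVED_DIGESTS:
        raise PermissionError("model not on the approved list")
    return blob  # a real loader would deserialize the weights here

load_model(b"approved-model-weights")  # accepted
try:
    load_model(b"fine-tuned-unapproved-weights")
except PermissionError:
    pass  # rejected
```

Even this toy version shows the problem: any fine-tune changes the digest, so the scheme only works if unlocked hardware is genuinely unavailable - which is exactly the part that's hard to like.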



