Hacker News

I have some meticulous API docs I've written, which I tried to get ChatGPT to convert into swagger

It failed spectacularly

I wonder if it's because the API is quite large, and I had to paste in ~10 messages worth of API docs before I was finished.

It kept repeating segments of the same routes/paths and wasn't able to provide anything cohesive or useful to me.

Was your API pretty small? Or were your docs pretty concise?
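One workaround people suggest for large docs is to convert the spec one endpoint at a time, so each request stays well under the model's context limit, then merge the fragments afterwards. A minimal sketch of the splitting/batching side (the endpoint-heading regex and size budget are illustrative assumptions, not something from this thread):

```python
import re

# Rough budget per request: ~3k tokens at the common ~4 chars/token estimate.
MAX_CHARS = 12000

def split_by_endpoint(docs: str) -> list[str]:
    """Split docs at lines that look like 'METHOD /path' headings."""
    parts = re.split(r"(?m)^(?=(?:GET|POST|PUT|PATCH|DELETE)\s+/)", docs)
    return [p for p in parts if p.strip()]

def batch(chunks: list[str], limit: int = MAX_CHARS) -> list[str]:
    """Greedily pack endpoint chunks into requests under the size limit."""
    batches, current = [], ""
    for chunk in chunks:
        if current and len(current) + len(chunk) > limit:
            batches.append(current)
            current = ""
        current += chunk
    if current:
        batches.append(current)
    return batches
```

Each batch would then be sent as its own conversion request, and the resulting OpenAPI `paths` fragments merged by hand or with a small script.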



ChatGPT has a token limit. If you exceeded it, it would have no way of delivering a good result, because it would simply have forgotten what you said at first. My API was not huge, about 8 endpoints.


It can accept about 4k tokens, roughly 3,000 to 3,500 words.

GPT-4 can now accept 8k or 32k. The 32k version is 8 times larger than the one you tried.
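As a rough sanity check before pasting a document in, you can estimate whether it fits a given context window using the common ~4 characters per token rule of thumb (an approximation; actual counts depend on the tokenizer, and the model names below are just the variants mentioned above):

```python
# Rough token estimator using the ~4 characters/token rule of thumb.
# This is an approximation; real tokenizer counts vary by content.
CHARS_PER_TOKEN = 4

CONTEXT_LIMITS = {
    "gpt-3.5-turbo": 4096,
    "gpt-4": 8192,
    "gpt-4-32k": 32768,
}

def estimate_tokens(text: str) -> int:
    """Estimate token count from character length."""
    return len(text) // CHARS_PER_TOKEN

def fits(text: str, model: str = "gpt-3.5-turbo") -> bool:
    """Check whether text likely fits in the model's context window."""
    return estimate_tokens(text) <= CONTEXT_LIMITS[model]

docs = "GET /users\nReturns a list of users.\n" * 500
print(estimate_tokens(docs), fits(docs), fits(docs, "gpt-4-32k"))  # → 4500 False True
```

Note this ignores the tokens reserved for the model's reply, so in practice you would want to leave some headroom below the limit.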

And these advances have come in a matter of a few months.

Over the next several years we should expect at least one, quite possibly two or more, orders of magnitude of improvement.

I don't believe that this stuff can necessarily get a million times smarter. But 10 times? 100? In a few months the context window increased by a factor of 8.

Pretty quickly we are going to get to the point where we have to question the wisdom of every advanced primate having a platoon of supergeniuses at their disposal.

Probably as soon as the hardware scales out, or we get large-scale memristor systems or whatever the next thing is, which will be 1,000 times more performant and efficient. Without exaggeration, within about 10 years.


So people want to build a nuclear reactor on the moon. I think these things should probably live on the moon, or better yet Mars.

That should be the place for experiments like this.

Low-latency links back to Earth, and first see how it goes.

Also, don't you think there will be resource constraints at some stage? It's funny that we yelled at people for Bitcoin, but when it's ChatGPT, it's fine to run probably tens of thousands of GPUs? In the middle of a climate crisis? Not good.


Personally I don't think AI tools' energy usage is comparable to BTC's yet.

Also, BTC literally burns energy in an unproductive way for "improved security". It's like lighting a forest on fire to keep warm.

All the AI tools combined, last I heard, aren't consuming even 0.5% of the world's energy. And even if they were, it would be absolutely bonkers to argue we should keep doing that when there are alternatives that accomplish similar goals without the energy usage (proof of stake).


So, there is money to be made from LLMs; there will be advertising injected into the models' responses, etc.

It's really early days, but there's no way energy consumption won't grow exponentially now that there is potential for earning money.



