Hacker News

LLMs don't make for a particularly good database, though. The "compression" isn't very efficient when you consider that e.g. the entirety of Wikipedia - with images! - is an order of magnitude smaller than a SOTA LLM. There are no known reliable mechanisms to deal with hallucinations, either.

So, no, LLMs aren't going to replace databases. They are going to replace query systems over those databases. Think more along the lines of Deep Research etc, just with internal classified data sources.



Maybe query UIs, but RAG systems like Deep Research depend on old-fashioned query systems underneath.


You're right, "subsume" would be a better word here. Though vector search is also something I'd put in the AI bucket, especially given that SOTA embedding models are increasingly built on general-purpose LLMs.
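To make the distinction concrete: in the RAG pattern described above, a conventional store still holds the data, vector search picks the relevant rows, and only then does an LLM see them. Here is a minimal sketch of that retrieval step with toy 3-d embeddings and made-up document IDs (a real system would use an embedding model and a vector index, not hand-written vectors):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# A plain old database of documents with precomputed (toy) embeddings.
# IDs, vectors, and texts here are illustrative, not from any real corpus.
docs = {
    "wiki:paris":  ([0.9, 0.1, 0.0], "Paris is the capital of France."),
    "wiki:python": ([0.1, 0.9, 0.1], "Python is a programming language."),
    "wiki:llm":    ([0.0, 0.2, 0.9], "LLMs are trained on large corpora."),
}

def retrieve(query_embedding, k=1):
    """Vector search: rank stored rows by cosine similarity to the query.

    The LLM never replaces this store; it only consumes what this returns.
    """
    ranked = sorted(docs.items(),
                    key=lambda kv: cosine(query_embedding, kv[1][0]),
                    reverse=True)
    return [(doc_id, text) for doc_id, (_, text) in ranked[:k]]

# A query embedding that happens to sit close to the "paris" document.
top = retrieve([0.8, 0.2, 0.05], k=1)
print(top[0][0])  # → wiki:paris
```

The database (here a dict) and the similarity search are the "old-fashioned" query system; the LLM's only job is to turn the retrieved text into an answer.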



