Exactly. People say, "we've invented X (LLMs); now if we just invent Y (reasoning AGI), all of X's problems will be solved." The problem is, there's no indication that Y is close at hand, or even that it's remotely related to X!

