I think it's a fundamental limitation of how context works. Information supplied as context remains just that: context. The LLM isn't going to "learn" any meaningful lesson from it.

You can only put information into context; the model struggles to learn lessons or wisdom from it.
