
Could this be a good example of Goodhart's law? LLMs are designed to talk like humans, or at least like the texts they are trained on. It shouldn't be a big surprise that they become harder to distinguish from humans.

