Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

> prompt injection is generally an unsolved problem

No, with the way these LLM/GPT technologies behave, at least in their current shape and form, "prompt injection" is an unsolvable problem. A purist would even say that there is no such thing as prompt injection at all.



Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: