Hacker News
new
|
past
|
comments
|
ask
|
show
|
jobs
|
submit
login
cjonas
on May 13, 2023
|
parent
|
context
|
favorite
| on:
Delimiters won’t save you from prompt injection
In my testing with GPT-4 (via API), it handles this fine if I split the prompt into multiple user inputs and provide clear instruction (provide a summary and nothing else). But ya, there are an infinite way to attack this.
Guidelines
|
FAQ
|
Lists
|
API
|
Security
|
Legal
|
Apply to YC
|
Contact
Search: