
I don't think making the prompt smaller is the only goal. Instead of having 1000 tokens of general prompt instructions, you could have 1000 tokens of specific prompt instructions.

There was also a paper that went by showing that model performance drops when extra unrelated information is added to the context. That must be happening to some degree here too with a prompt like that.


