Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

It only tells you that you can't secure a system using an LLM as a component without completely destroying any value provided by using the LLM in the first place.

Prompt injection cannot be solved without losing the general-purpose quality of an LLM; the underlying problem is also the very feature that makes LLMs general.



Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: