logoalt Hacker News

TeMPOraLtoday at 1:40 AM0 repliesview on HN

It only tells you that you can't secure a system using an LLM as a component without completely destroying any value provided by using the LLM in the first place.

Prompt injection cannot be solved without losing the general-purpose quality of an LLM; the underlying problem is also the very feature that makes LLMs general.