logoalt Hacker News

zahlmanyesterday at 11:31 PM0 repliesview on HN

If you could influence the LLM's actions so easily, what would stop it from equally being influenced by prompt injection from the data being processed?

What you need is more fine-grained control over the harness.