Hacker News

TeMPOraL · today at 1:45 AM

Airbags, yes. But you can't make it provably impossible for a car to crash into something and hurt or kill its occupants, short of not building the car in the first place. Same with LLMs: you can't secure them like regular programs without destroying the utility they provide, because their power comes from the very thing that makes them vulnerable.


Replies

yencabulator · today at 3:37 AM

I see you've given up. I haven't. An LLM inside deterministic guardrails is a pretty good combo.
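A minimal sketch of what "deterministic guardrails" could mean in practice: the LLM only *proposes* an action, and ordinary deterministic code decides whether it actually runs. The action names, allowlist, and `guarded_dispatch` helper below are illustrative assumptions, not something from the thread.

```python
# Hypothetical guardrail pattern: LLM output is data, never directly executed.
# The allowlist and path checks are deterministic and auditable.

ALLOWED_ACTIONS = {"read_file", "list_dir"}  # assumed fixed allowlist


def guarded_dispatch(llm_proposal: dict) -> str:
    """Validate an LLM-proposed action deterministically before running it."""
    action = llm_proposal.get("action")
    if action not in ALLOWED_ACTIONS:
        return f"refused: {action!r} is not on the allowlist"
    path = llm_proposal.get("path", "")
    # Reject path traversal and absolute paths regardless of what the LLM says.
    if ".." in path or path.startswith("/"):
        return "refused: path escapes the sandbox"
    return f"ok: {action} on {path!r}"


# A prompt-injected proposal is refused by plain code, not by the model:
print(guarded_dispatch({"action": "delete_file", "path": "x"}))
print(guarded_dispatch({"action": "read_file", "path": "notes.txt"}))
```

The point of the pattern is that the security boundary lives in the deterministic layer, so a compromised or manipulated model can still only choose among pre-approved actions.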