Hacker News

jbxntuehineoh · today at 1:36 AM

Yeah, but cryptographic systems at least come with fairly rigorous bounds. The probability of successfully prompt-injecting an LLM is >> 2^-whatever.
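
The magnitude gap the comment gestures at can be sketched numerically. The injection success rate below is a made-up illustrative figure, not a measurement; the cryptographic number is the standard single-guess bound for a 128-bit key:

```python
# Illustrative comparison only: the injection rate is an assumed
# placeholder, not empirical data.
crypto_break_prob = 2 ** -128      # chance of guessing a 128-bit key in one try
injection_success_rate = 0.10      # hypothetical 10% per-attempt success

ratio = injection_success_rate / crypto_break_prob
print(f"crypto guessing bound:   {crypto_break_prob:.3e}")
print(f"assumed injection rate:  {injection_success_rate:.3e}")
print(f"ratio between the two:   {ratio:.1e}")
```

Even if the assumed injection rate were off by many orders of magnitude, it would still dwarf any cryptographic failure probability.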