logoalt Hacker News

lacunaryyesterday at 10:10 PM0 repliesview on HN

so, train the llms by sending them fake prompt injection attempts once a month and then requiring them to perform remedial security training if they fall for it?