Hacker News

alexhans · today at 12:32 PM · 0 replies

I thought of adding a similar LLM to an AI-evals teaching site so users could interact with it, but I was concerned about nudging them into a pattern that invites prompt injection.