Hacker News
holoduke • yesterday at 9:32 PM
With LLM tool use, potentially every cat action could be a prompt injection.