Hacker News

vmg12 · last Thursday at 1:05 AM

It's pretty simple: don't give LLMs access to anything you can't afford to expose. Treat the LLM as if it were the user.
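One way to read "treat the LLM as the user" is to route every tool call through the same deny-by-default permission check you'd apply to any untrusted principal. A minimal sketch of that idea (the `Principal`, `check_access`, and `handle_tool_call` names are hypothetical, not from any real framework):

```python
from dataclasses import dataclass, field

@dataclass
class Principal:
    """The LLM session is just another principal, with its own scopes."""
    name: str
    scopes: set = field(default_factory=set)

def check_access(principal: Principal, action: str) -> bool:
    # Deny by default: the LLM only holds scopes you can afford to expose.
    return action in principal.scopes

def handle_tool_call(principal: Principal, action: str, payload: str) -> str:
    # Every tool invocation goes through the same gate a human user would.
    if not check_access(principal, action):
        return f"denied: {principal.name} lacks scope '{action}'"
    return f"ok: {action}({payload})"

llm = Principal("llm-session", scopes={"read:public_docs"})
print(handle_tool_call(llm, "read:public_docs", "faq"))   # allowed
print(handle_tool_call(llm, "read:customer_pii", "all"))  # denied
```

The point is that the LLM never receives a broader credential than the permission set you'd hand an anonymous user of the same feature.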


Replies

solid_fuel · yesterday at 12:10 AM

> You treat the llm as if it was the user.

That's not sufficient. If a user copies customer data into a public Google Sheet, I can reprimand or otherwise restrict that user. An LLM cannot be held accountable and cannot learn from its mistakes.

rdli · last Thursday at 1:28 AM

I get that, but it's not entirely obvious how you do that for Notion AI.
