It's pretty simple: don't give LLMs access to anything you can't afford to expose. You treat the LLM as if it were the user.
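In practice, "treat the LLM as the user" means enforcing the restriction in the tool layer, not in the prompt. A minimal sketch of that idea (all names here are hypothetical, not any real product's API):

```python
# Hypothetical sketch: scope what an LLM-invoked tool can touch before
# the call ever reaches real data, the same way you'd scope a user account.

ALLOWED_TABLES = {"public_docs", "product_faq"}  # data you can afford to expose

def llm_read_table(table: str) -> str:
    """Tool exposed to the LLM: refuses anything outside the allowlist."""
    if table not in ALLOWED_TABLES:
        raise PermissionError(f"LLM may not read {table!r}")
    return f"rows from {table}"  # stand-in for a real query

print(llm_read_table("product_faq"))
try:
    llm_read_table("customer_pii")
except PermissionError as exc:
    print("blocked:", exc)
```

The check lives in code the model cannot talk its way around, so even a prompt-injected model can only reach what the allowlist permits.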
> You treat the llm as if it was the user.
That's not sufficient. If a user copies customer data into a public Google Sheet, I can reprimand and otherwise restrict that user. An LLM cannot be held accountable, and cannot learn from its mistakes.
I get that, but it's just not entirely obvious how you do that for Notion AI.