Hacker News

flipbrad, last Wednesday at 9:43 PM (5 replies)

"we built foundational protections (...) including (...) training our models not to retain personal information from user chats"

Can someone please ELI5 - why is this a training issue, rather than basic design? How does one "train" for this?


Replies

dust42, last Wednesday at 10:38 PM

This is just marketing nonsense. You don't have to train models not to retain personal information: they simply have no memory. To have a chat with an LLM, the whole conversation history gets reprocessed every time. It's not just the latest question and answer that get sent to the LLM, but all the preceding back-and-forth.
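The statelessness dust42 describes can be sketched in a few lines. `fake_llm` below is a stand-in for a real model API (not any actual SDK call); the point is that the client resends the entire message list on every turn, so the model only appears to remember earlier turns.

```python
# Minimal sketch of a stateless chat loop: the full history is resent each turn.

def fake_llm(messages):
    # A real call would hit a model endpoint; here we just report
    # how much context the model actually receives on this turn.
    return f"(model saw {len(messages)} messages)"

def chat_turn(history, user_text):
    history.append({"role": "user", "content": user_text})
    reply = fake_llm(history)  # the FULL history goes over the wire every time
    history.append({"role": "assistant", "content": reply})
    return reply

history = []
chat_turn(history, "Hi, my name is Alice.")
reply = chat_turn(history, "What's my name?")
print(reply)  # "(model saw 3 messages)" -- it "remembers" only because turn 1 was resent
```

If the client dropped the earlier messages instead of resending them, the model would have no way to recall the name, which is why retention is an application-layer choice rather than something baked into the weights.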

But what they do is exfiltrate facts and emotions from your chats to create a profile of you, then feed it back into future conversations to make them more engaging and give them a personal feel. This is intentionally programmed.
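A hedged sketch of the mechanism this comment alleges: facts extracted from past chats get prepended to future conversations as hidden context. The "I like ..." heuristic below is a toy stand-in; a real system would presumably use a model for extraction.

```python
# Toy sketch of a "memory" feature: extract profile facts from old chats,
# then inject them into the context of new conversations.

def extract_facts(chat_text):
    # Toy heuristic: treat "I like ..." statements as profile facts.
    return [line.strip() for line in chat_text.splitlines()
            if line.strip().lower().startswith("i like")]

def build_context(profile, new_user_message):
    system = "Known about this user: " + "; ".join(profile)
    return [{"role": "system", "content": system},
            {"role": "user", "content": new_user_message}]

profile = extract_facts("Hello!\nI like hiking.\nWhat's the weather?")
messages = build_context(profile, "Suggest a weekend activity.")
print(messages[0]["content"])  # the profile rides along invisibly in the system message
```

Nothing about this requires "training" in the gradient-descent sense; it is ordinary application code wrapped around a stateless model.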

bo1024, last Thursday at 12:16 AM

Same question. I wonder if they use ML to try to classify a chat as health information and not add it to their training data in that case.

I also wonder what the word "foundational" is supposed to mean here.

SAI_Peregrinus, last Wednesday at 9:59 PM

I assume they want to retain all other info from user chats, and they're using an LLM to classify the info as "personal" or not.
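If this guess is right, the pipeline would look roughly like the sketch below: run a classifier over each chat and drop anything flagged as personal before it enters a training set. The keyword check is a deliberately crude stand-in for whatever classifier (LLM or otherwise) might actually be used, which the source doesn't specify.

```python
# Sketch of filtering "personal" chats out of a training corpus.
# PERSONAL_MARKERS is an illustrative heuristic, not a real PII detector.

PERSONAL_MARKERS = ("my name is", "diagnosed", "ssn", "date of birth")

def looks_personal(text: str) -> bool:
    t = text.lower()
    return any(marker in t for marker in PERSONAL_MARKERS)

def filter_for_training(chats):
    # Keep only chats the classifier does NOT flag as personal.
    return [c for c in chats if not looks_personal(c)]

kept = filter_for_training([
    "How do I reverse a list in Python?",
    "I was diagnosed with asthma last year.",
])
print(kept)  # only the non-personal chat survives
```

In practice the classification step is the hard part; a heuristic like this would miss most real personal information, which is presumably why a learned classifier would be used instead.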

data-ottawa, last Wednesday at 10:31 PM

Could be telling the memory feature not to remember these specific details.

ipnon, last Wednesday at 10:08 PM

I used to work in healthtech. Information that can be used to identify a person is regulated in America under the Health Insurance Portability and Accountability Act (HIPAA). These regulations are much stricter than the free-for-all use of information at companies dependent on ad networks. They are strict and enforceable, so a healthcare company would be fined for failing to protect HIPAA data. OpenAI isn't a healthcare provider yet, but I'm guessing this is the framework they're basing their data retention and protection around for this new app.