Hacker News

jacquesm today at 5:30 AM

That's because there is no lock-in in the current AI ecosystems. Yet. But once AIs become your lifetime companion, one that knows everything there is to know about you, the lock-in will be maximized (leaving your AI provider will be something like a divorce, with you losing half your memory), and these parties will flock to it.

The blessing right now is the limit to contextual memory. Once those limits fall away and all of your previous conversations are made part of the context I suspect the game will change considerably, as will the players.


Replies

visarga today at 8:09 AM

Export your old chats and put them in a RAG system accessible to the new LLM provider. I did it. I made my chat history into an MCP tool I can use with Claude Desktop or Cursor.

Ever since I started taking care of my LLM logs and memory, I've had no issues switching model providers.
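A minimal sketch of the retrieval half of this setup, assuming chats have already been exported as a list of messages. Everything here is made up for illustration: a real system would use embeddings rather than keyword overlap, and would expose `search_history` as an MCP tool via the MCP SDK instead of calling it directly.

```python
import re
from collections import Counter

def tokenize(text):
    """Lowercase and split into alphanumeric tokens."""
    return re.findall(r"[a-z0-9]+", text.lower())

def search_history(messages, query, top_k=3):
    """Rank exported chat messages by keyword overlap with the query
    and return the top_k best-matching message texts."""
    query_counts = Counter(tokenize(query))
    scored = []
    for message in messages:
        # Score = number of shared token occurrences with the query.
        overlap = sum((Counter(tokenize(message["text"])) & query_counts).values())
        if overlap:
            scored.append((overlap, message["text"]))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [text for _, text in scored[:top_k]]
```

The new provider's model never needs the full history in context; it just calls the tool and gets back the handful of past messages relevant to the current question.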

IceHegel today at 5:42 AM

There's a chance this memory problem is not going to be that easy to solve. It's true context lengths have gotten much longer, but not all context is created equal.

There's a significant loss of model sharpness as context goes over 100K tokens. Sometimes earlier, sometimes later. Even using context windows to their maximum extent today, the models are not always especially nuanced over long contexts. I compact after 100K tokens.
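"Compacting" here could look something like the sketch below: once the estimated token count passes a threshold, older messages are collapsed into a single summary and only recent turns are kept verbatim. The word-count token estimate and the `summarize` callback are stand-ins for illustration; in practice the summary would come from an LLM call.

```python
def compact(messages, max_tokens=100_000, keep_recent=20, summarize=None):
    """If the conversation exceeds max_tokens (crudely estimated by word
    count), replace everything but the last keep_recent messages with a
    single summary message."""
    estimated = sum(len(m.split()) for m in messages)
    if estimated <= max_tokens or len(messages) <= keep_recent:
        return messages  # still fits: no compaction needed
    older, recent = messages[:-keep_recent], messages[-keep_recent:]
    if summarize is not None:
        summary = summarize(older)  # hypothetical LLM-backed summarizer
    else:
        summary = "[summary of %d earlier messages]" % len(older)
    return [summary] + recent
```

The trade-off is exactly the one described above: the model stays sharp on the recent window, at the cost of losing nuance from everything that got folded into the summary.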

sebastianz today at 7:49 AM

> But once AIs become your lifetime companion that know everything there is to know about you and the lock-in is maximized

Why? It's just a bunch of text. They are forced by law to allow you to export your data, so you just take your life's "novel" and copy-paste it into their competitor's robot.

paool today at 6:20 AM

In order to get that lifetime companion, we'll need a leap in agentic memory.

How do you know memory won't be modular and avoid lock-in?

I can easily see a decentralized solution where the user owns the memory, and AIs need permission to access your data, which can be revoked.

zwnow today at 6:06 AM

Who even wants all their previous conversations taken into account for everything they do? How do you grow if nothing, including your mistakes, is ever forgotten? This is highly dystopian, and I sure hope it will forever remain a fantasy.

diffeomorphism today at 6:14 AM

Basic questions: what would a GDPR data-export request get you? And wouldn't competing providers want to make it easy for you to switch to them?

Just look at the smartphone market.

lelanthran today at 6:03 AM

> Once those limits fall away and all of your previous conversations are made part of the context I suspect the game will change considerably, as will the players.

I dunno if this is possible; it sounds like an informally specified, ad hoc statement of the halting problem.