Hacker News

Workaccount2, yesterday at 3:15 PM

I think a hidden problem, even if we solve memory, is the curation of what gets into memory and how it is weighted. Even humans struggle with this: it's easy to store things and then forget, or misjudge, the credibility of the source.

I can envision LLMs getting worse once given memory, until they figure out how to properly curate it.
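One way to picture the curation problem is a memory store that tracks a credibility score per source and uses it both at write time (dropping low-trust items) and at recall time (ranking). This is only a toy sketch of the idea being discussed, not any real LLM memory system; the `Memory` class, the 0.0–1.0 credibility scale, and the threshold are all illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class Memory:
    text: str
    credibility: float  # assumed 0.0-1.0 trust score for the source


class MemoryStore:
    """Toy memory with credibility-weighted curation (illustrative only)."""

    def __init__(self, min_credibility: float = 0.3):
        # Curation policy: anything below this trust level never gets stored.
        self.min_credibility = min_credibility
        self.memories: list[Memory] = []

    def add(self, text: str, credibility: float) -> None:
        # Curate at write time: refuse to memorize low-credibility claims.
        if credibility >= self.min_credibility:
            self.memories.append(Memory(text, credibility))

    def recall(self, keyword: str) -> list[Memory]:
        # Weight at read time: matching memories come back most-trusted first,
        # so a dubious source can't outrank a reliable one.
        hits = [m for m in self.memories if keyword in m.text]
        return sorted(hits, key=lambda m: m.credibility, reverse=True)


store = MemoryStore()
store.add("Python was first released in 1991 (official docs)", 0.9)
store.add("Python was first released in 1989 (random forum post)", 0.4)
store.add("Python is a hoax (spam email)", 0.1)  # filtered out at write time

results = store.recall("Python")
```

The hard part the comment points at is, of course, the scores themselves: a real system has to estimate credibility, which is exactly where humans get fooled too.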


Replies

djmips, yesterday at 5:21 PM

Yes, humans can be prompt-injected / mind-poisoned too. A good sales campaign is something like this. Propaganda.