I think a hidden problem, even if we solve memory itself, is curating what gets into memory and how it's weighted. Even humans struggle with this: it's easy to store a fact and forget (or misjudge) the credibility of its source.
I can envision LLMs getting worse once they're given a memory, until they figure out how to curate it properly.
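To make the weighting idea concrete, here's a minimal sketch (all names and the scoring scheme are hypothetical, not any real memory system): each stored memory carries a source-credibility weight that scales its retrieval score, so poorly-sourced memories are less likely to surface.

```python
# Toy credibility-weighted memory store. Hypothetical sketch only:
# real systems would use embeddings for relevance, not word overlap.
from dataclasses import dataclass, field

@dataclass
class MemoryEntry:
    text: str
    credibility: float  # 0.0 (untrusted) .. 1.0 (verified)

@dataclass
class Memory:
    entries: list = field(default_factory=list)

    def add(self, text, credibility):
        self.entries.append(MemoryEntry(text, credibility))

    def recall(self, query, top_k=2):
        # Toy relevance: count of words shared with the query.
        def relevance(e):
            return len(set(query.lower().split()) & set(e.text.lower().split()))
        # Credibility scales relevance, so dubious sources rank lower
        # even when their content looks equally relevant.
        scored = sorted(self.entries,
                        key=lambda e: relevance(e) * e.credibility,
                        reverse=True)
        return [e.text for e in scored[:top_k]]

mem = Memory()
mem.add("The API rate limit is 100 requests per minute", credibility=0.9)  # official docs
mem.add("The API rate limit is unlimited for everyone", credibility=0.1)   # random forum post
print(mem.recall("what is the API rate limit", top_k=1))
```

Without the credibility factor, both entries score identically on relevance and the store has no way to prefer the well-sourced one.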
yes, humans can be prompt-injected / mind-poisoned too. A good sales campaign is essentially this. So is propaganda.