Me too. I already exported my data from all platforms, including HN, and indexed it in a RAG database, but I don't feel like using it much. It's past-oriented, and I need present-oriented stuff. With LLMs I've noticed I don't like it when they use chat-history search or memory functions; it makes them fall into a rut and become less creative.
I even got to the point where I made an "anti-memory system" - an MCP tool that calls a model without the context of the current or past conversations, to get a fresh perspective. I instruct the host model to reveal only part of what we've discussed, explaining that creativity is sparked when an LLM gets to see not too much and not too little - like a sexy dress.
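Roughly, the shape of it is something like this - a minimal sketch using the MCP Python SDK and the OpenAI client, not my exact setup; the tool name, model, and prompt wording are placeholders:

```python
# anti_memory.py - a minimal sketch of an "anti-memory" MCP tool.
# Assumes: `pip install mcp openai` and OPENAI_API_KEY set in the environment.
from mcp.server.fastmcp import FastMCP
from openai import OpenAI

mcp = FastMCP("anti-memory")
client = OpenAI()

@mcp.tool()
def fresh_perspective(hint: str) -> str:
    """Ask a context-free model for a fresh take.

    The host model passes only a partial `hint` of the conversation -
    enough to spark an idea, not enough to anchor it.
    """
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {
                "role": "system",
                "content": "You know nothing about the ongoing conversation. "
                           "React to the hint below with fresh ideas.",
            },
            {"role": "user", "content": hint},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio for the host model to call
```

The point is simply that the inner call never sees the host's conversation history, so whatever comes back isn't shaped by it.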
"anti-memory system" Cool~!
When it comes to stimulating AI creativity, it may indeed be better to impose fewer constraints. However, in most scenarios, problems are likely still solved through simple information aggregation, refinement, analysis, and planning, right?