Hacker News

rafram · last Wednesday at 9:27 PM

> nine months previous

It likely just hallucinated the ADHD thing in this one chat and then made this up when you pushed it for an explanation. It has no way to connect memories to the exact chats they came from AFAIK.


Replies

efilife · last Thursday at 10:14 PM

Or it had this info injected into its system prompt and was doing everything not to reveal it. ChatGPT gets fed your IP address* and approximate location in its system prompt, but it won't ever admit that and will come up with excuses. Just ask it "search the web to find where I'm at" and it will tell you the country you're in, sometimes down to the city. If you follow up with "how did you know my approximate location?", it will ALWAYS say it guessed: based on past conversations (that never happened), based on the way you talk; it can even hallucinate that you told it within this exact conversation.

*Not entirely sure; it seems to frequently hallucinate the address.
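
For illustration, here's a minimal sketch (Python, stdlib only) of how a chat frontend could derive a coarse location from the caller's IP and write it into the system prompt. The geo-IP service, prompt wording, and function names here are my assumptions for the sake of the example, not anything OpenAI has confirmed about its implementation.

    # Sketch of IP-derived location injected into a system prompt.
    # NOT OpenAI's actual implementation; ip-api.com is just one
    # public geo-IP service used here as an example.
    import json
    import urllib.request

    def approximate_location(ip: str) -> str:
        """Resolve an IP to a coarse location via a public geo-IP service."""
        with urllib.request.urlopen(f"http://ip-api.com/json/{ip}", timeout=5) as resp:
            data = json.load(resp)
        # City-level accuracy is unreliable; the country is usually right.
        return f"{data.get('city', 'unknown city')}, {data.get('country', 'unknown country')}"

    def build_system_prompt(ip: str) -> str:
        location = approximate_location(ip)
        # The model only "knows" the location because it is written into
        # the prompt; nothing in the conversation itself revealed it.
        return (
            "You are a helpful assistant.\n"
            f"User metadata: approximate location is {location} (derived from IP).\n"
        )

    if __name__ == "__main__":
        print(build_system_prompt("8.8.8.8"))

If something like this is happening, it would explain the behavior above: the model genuinely has your rough location in context, but has no record of where that context came from, so when pressed it confabulates a source.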
