ChatGPT’s fixation on my past conversations has made it borderline unusable

In the past, I felt like I could count on coming to ChatGPT and, generally speaking, getting the "best" answer when I asked a question or wanted to explore an idea.

For some time now, this is no longer the case. At some point, ChatGPT became so obsessed with everything it knows about me that it no longer seems capable of providing the best answer. It just provides answers that relate to things I've mentioned in the past, answers that include interests I've told it about, answers that continue or build on previous ideas I've explored, etc.

Overly simplified example: going to ChatGPT and asking for the best album or book of 2026, and getting an answer that is entirely based on the literary or musical interests that I've told it about in the past. Sometimes (most times???) I just want answers "in a vacuum" - I don't want my biases or my interests informing the response. It's become a serious problem when you're trying to explore creative ideas or use the tool to think about something in a novel way, and it's just digging at all costs for things to say that relate to your past conversations instead.

It's like a new version of the "people pleasing" that we were all griping about before - maybe now it's dialed back a bit on all the "wow, that's such a genius idea, great question!" type stuff, but it's replaced that problem with a different one: it can't craft a response to a query without obsessing over your interests and your messaging habits (and making sure the response hits on them).

Just wondering if anyone else is feeling this, or if I've somehow stumbled into my own unique hellhole of predetermined conversational focuses.

submitted by /u/EssJayJay
