"Almost, but not quite. Let me reframe it" – is this normal?
I'm wondering whether this is a common issue intrinsic to GPT, or whether my personalisation settings are causing it. I can ask it the most boring, obvious, clear questions, and it will ALWAYS disagree and try to 'refine', 'reframe', 'ti…