"Almost, but not quite. Let me reframe it" – is this normal?

I'm wondering if this is a common issue intrinsic to GPT, or if my personalisation settings are forcing it.

I can ask it the most boring, obvious, clear questions, and it will ALWAYS disagree, and try to 'refine', 'reframe', 'tighten up', or 'correct' me. This is the angle it comes from, every single time. It's pedantic to the point of being unusable.

I could ask it if a football was round, and get something like:

"you're on the right track, but let's reframe and tighten that up a bit. A football is technically made of multiple hexagons stitched together, making it actually more.....which is not traditionally what is implied when using the term 'round'. Hope that clears things up".

It almost feels as if they are trying to combat the problem of hallucinations and misinformation by making it more careful about what it says, but they've taken it to an absurd degree.

I always hear about people saying how LLMs will just agree with whatever you say, but man that could not be further from what GPT is like right now.

submitted by /u/No_Region_4719
