I'm not sure if it's just me, but seriously, what is up with all the AI hallucinations recently? I know it's been a thing for a while, but lately it feels like every other prompt causes GPT to make things up entirely. Even when it's explicitly asked to search the web to confirm a piece of information, it still hallucinates.
I'm genuinely about to switch to Claude.
Edit: obviously AI and LLMs will have their quirks, but why are we paying for a product that's REGRESSING?