I've been coming across it more and more lately: GPT suggesting things about car maintenance that are blatantly incorrect and sometimes dangerous as hell, and rather than admit fault it'll double down.
I see it at my workshop all the time. People come in SURE of their problem because ChatGPT diagnosed it, and oftentimes their car doesn't even have the part GPT claims is the issue, or the actual problem is totally unrelated.
My latest experience was a customer who followed GPT's advice to bleed his brakes and lost all braking, because it never mentioned that he needed a mechanic's OBD2 scanner to open the ABS module.
What's the most dangerous advice you've seen it give?