GPT seems to pretend not to understand just to say I'm wrong
This has to be one of the most infuriating things I've come across lately, more so than the endless bullet points or the "it's not X, it's Y" pattern. Let's say I want a random thought experiment, to see what could be done in a scenario. …