I used ChatGPT for a real-life decision (whether to break up with my girlfriend) and it asked me a question I'd been avoiding for a year

Posting because I don't see this use case talked about much, and I think it's the actual killer app for these models, not coding or copywriting.

Three weeks ago I was on a six-hour train with no signal except patchy wifi, and my girlfriend was flying in from 5000 km away in two days. We'd been together a year and a half by then, and there were a bunch of complications stacked together: my mom had been against the relationship from day one, plus a big age gap and a big culture gap. I had ten days from when she landed to either commit to something serious or end it.

Every person in my life already had a side. My mom said no, my friends who liked her said yes, her family said yes, and there was nobody neutral I could think out loud with. So I opened ChatGPT.

What I was expecting was that it would tell me what to do, weigh the pros and cons, give me a framework.

What actually happened was that it didn't tell me anything; it just kept asking questions for five hours. Some were the obvious ones (what do you each want from this in the next five years), and a couple were ones I'd never considered, including one that was something like "Describe a normal Tuesday in your life five years from now if you stay together." That one took me forty minutes to attempt and I couldn't actually do it, which was the answer.

I broke up with her three days later, we cried, and it was the right call. I still feel terrible about how it ended but not about ending it.

The thing I keep thinking about is that I've talked to actual humans about big stuff before, including friends, family, and even a therapist a few years ago, and nobody asked me that question. It wasn't because they were bad listeners; it was because they all had a relationship with me that made them want me to be okay in a particular way. ChatGPT had no skin in the game, so it could ask the question that exposed the thing.

Curious if anyone else has used it for a non-technical, non-work, real life decision. What did it ask you that surprised you?

*Used GPT for the structuring, that's it.*

submitted by /u/FailOk3553