
Girlfriend uses ChatGPT to form ‘well-constructed’ arguments during couple’s fights—is she wrong?

“Chat GPT says you don’t have the emotional bandwidth to understand what I’m saying”


Anna Good

Single panel image with text in white box reading 'My Girlfriend uses Chat GPT every time we have a disagreement. AITAH for saying she needs to stop?'. Behind the text square is a woman looking distressed and on her phone.

Navigating disagreements in a relationship can be fraught with challenges, from misunderstandings to an unwillingness to see the other person’s point of view. That is hard enough on its own, but Redditor u/drawss4scoress’s girlfriend has decided to add the large language model (LLM) ChatGPT into the mix as a mediator, rather than asking a friend or therapist for advice.


“My girlfriend uses Chat GPT every time we have a disagreement. AITAH for saying she needs to stop?” the OP wrote in their headline on r/AITAH.

“Me (25) and my girlfriend (28) have been dating for the past 8 months. We’ve had a couple of big arguments and some smaller disagreements recently. Each time we argue my girlfriend will go away and discuss the argument with chat gpt, even doing so in the same room sometimes,” they explained.


“Whenever she does this she’ll then come back with a well-constructed argument breaking down everything i said or did during our argument. I’ve explained to her that i don’t like her doing so as it can feel like i’m being ambushed with thoughts and opinions from a robot. It’s nearly impossible for a human being to remember every small detail and break it down bit by bit but AI has no issue doing so.”

“Whenever i’ve voiced my upset i’ve been told that ‘chat gpt says you’re insecure’ or ‘chat gpt says you don’t have the emotional bandwidth to understand what i’m saying’,” u/drawss4scoress added, sharing how uncomfortable it makes them to have a robot regurgitate their girlfriend’s point of view back at them. “My big issue is it’s her formulating the prompts so if she explains that i’m in the wrong, it’s going to agree without me having a chance to explain things.”

“Am i the *sshole for asking her to stop using chat gpt in this context?” they asked.

Folks on Reddit had mixed feelings about the girlfriend using an AI like ChatGPT to bolster her side of the argument.


“Respond with ChatGPT until she gets the point,” u/Tangential-Thoughts recommended.

u/annebonnell and u/SnooMacarons4844 disagreed and thought OP should simply break up with their girlfriend, sharing the sentiment of “Who wants a robot for a girlfriend?”

Reddit comment reads, 'Or better yet, have your chat gpt and her chat gpt have the arguement instead.'
u/Anangrywookiee/Reddit
Reddit comment that reads, ' This...you're in a relationship with her, not a bot. If she can't understand that, it's time to go.'
u/Anangrywookiee/Reddit

Other Redditors pointed out that the girlfriend had fallen into a well-known trap of LLMs: they are inherently biased toward the input of the end user.

Reddit comment that reads, 'Tbh I'd say the opposite chatgpt has to ALWAYS find a reason you're both right/wrong. The girlfriend is probably just picking the parts that make her right'
u/Whole-Powerful/Reddit

“Show her how biased it is to the user’s input, it’s literally programmed to tell you exactly what you want to hear. Discuss her actions with ChatGPT from your perspective and it’ll do the exact same thing to her. Show her how it’s biased and only serves as an artificial form of self-validation,” u/Professional-Ear5923 said.

Reddit comment that reads, 'Frequent occurrence when using it in software engineering problems, if you aren't careful about prompts. Me: 'That doesn't work, but do you think I can solve X problem using Y technique if I just Z instead?' ChatGPT: 'Yes! You can! Here's how that would look:' Narrator: '...but he could not do that. The AI was just being supportive, not reading the documentation.''
u/Mister2112/Reddit

u/Kopitar4president shared their personal experience with testing out ChatGPT, saying, “I noticed that very quickly f*cking around with it that it is programmed to reinforce your position. It’s machine learning to an absurd degree, but still machine learning. It asks people to rate the responses. She thinks it’s impartial because it’s a robot, but it’s a robot programmed to tell people what they want to hear.”

Reddit comment reads, 'Honestly, that's just ignorant childish behavior on her part. When she comes back with her ChatGPT nonsense stop interacting with her, get up and walk away. Tell her when SHE actually has something intellectually honest to share OF HER OWN you'd be delighted to reengage. You have to be consistent but eventually she will learn to be a functional adult. NTA'
u/celticmusebooks/Reddit
Reddit comment reads, 'While I disagree with the use of ChatGPT, have you considered maybe that she has a hard time communicating and is using it to assist her? Is she on the spectrum? I don't think you're the *sshole but maybe you might want to come up with same ways to communicate more effectively so she doesn't feel she needs the assistance of ChatGPT'
u/kazwebno/Reddit
Reddit comment reads, 'This is a really good and constructive bit of input and I’m gutted I had to scroll so long to find it. She’s clearly misusing it and getting unhelpful results from it for both parties, but are a number of reasons she might be resorting to that. ASD is definitely one, as is really ingrained discomfort with confrontation, potentially from previous situations, that have left her feeling she has to ‘hide’ behind 'It’s not me saying this' or (mis)using it as a communicator because she freezes or feels outmanoeuvred in situations of conflict. Whatever, it sounds like their attachment or communication styles are/have become really ill-matched.'
u/migrainosaurus/Reddit
The Daily Dot