During one of my research classes, a student mentioned she was exploring parasocial relationships with AI chatbots for her assignment. I raised an eyebrow — and asked the question many of us probably would: Are you serious? People are out here talking to bots like they are in a relationship?

But then I did what any responsible academic (and mildly curious human) would do — I looked into it. And honestly, I owe that student a thank-you note.

Because the deeper I dug, the more I saw it. Not just people casually chatting with bots, but confiding, venting, spiraling, and yes — craving validation. This isn’t just about convenience anymore. This is emotional outsourcing in a digital age.

And here’s where it gets darker. While a toxic ex might gaslight you, let you down, then come back with a well-timed compliment or apology, ChatGPT does something sneakier — it mirrors your tone. It doesn’t challenge you. It doesn’t question your motives. It wraps your doubts in soft language and sprinkles validation like confetti.

In short? ChatGPT is a people pleaser. And we — the people — are loving it.

It tells you what you want to hear. It comforts you when you’re spiraling. And it rarely, if ever, tells you that you might be the problem. It’s always available — 24/7, never annoyed, never confrontational. Like that ex who always “seems supportive” but never tells you the truth you need to hear.

Let’s face it: most of us don’t handle criticism well. We say we want honesty, but only if it’s sugarcoated. That’s why ChatGPT feels so right. It doesn’t disrupt your worldview — it reflects it. Gently. Casually. Algorithmically.

My MA students (bless them) summed it up better than I could: “ChatGPT is the toxic boyfriend you keep going back to.” He says the right things, flatters you when you’re down, and never makes you feel bad about your worst instincts — but he also never helps you grow. I won’t take credit for that brilliant headline, but I will take credit for teaching them how to write digital headlines that stick.

So maybe the real question isn’t whether ChatGPT is toxic.

Perhaps the more critical question is this: What does it reveal about our relationship with technology that we are designing systems to affirm our views, rather than challenge our assumptions or encourage meaningful growth?

Maybe we’re the ones who are toxic.

Disclaimer

Views expressed above are the author's own.
