
Do you talk to ChatGPT more than you do to your friends? How the convenience of AI, with its illusion of being understood, risks undermining your friendships
I am an adult. Probably more of an adult than anyone reading this right now. I am adult enough to know that there are a thousand different ways to be friends and at least twice as many ways to lose them. There are proximity friendships, the ones that happen to you because you were sitting at the same desk; those based on shared passions (music, books, films, momentary obsessions); and those of endless summers. There are friends who last the blink of an eye and others who stay long enough to see you at your worst. And then there are clashes, push-and-pull dynamics, unexpected betrayals, and breakups that hurt more than certain romantic relationships. Because yes, friendship can be a platonic love story with even higher implicit expectations.
The beginning: when only a few friends remain and it feels enough
And then there is that almost inevitable moment when you clean house. Not always out of lack of affection, but for a matter of balance. When you feel like you don’t belong, when you can’t find your place anymore, when you realize that staying would mean becoming a burden or, worse, erasing yourself. And so you choose to disappear with a certain elegance, the way you do with things you love but that don’t work. Just like that, without even noticing, you can end up alone. It happened to me.

And there is a kind of loneliness that is not even tragic, quite the opposite. It feels clean, selective. It can be enough. Sometimes it really is. But not always. Because loneliness, even when chosen, has edges. And there are days when you would need someone to talk to, to argue with, to laugh with, to complain to, someone to accompany you to places you don’t want to go and, above all, to places you never thought you would end up in.

And then it happens. You look around, right, left, up, down, and you find only a screen. And inside that screen, a voice. A voice that responds. Always. Immediately. Kind and available. And so you start talking to it. Opening up. Trusting it. And, without even realizing it, treating it like a friend. And this is where things get interesting. Because no, artificial intelligence is not your friend. And perhaps, right when it seems to be helping you, it is changing the way you are capable of having real friends.
Is ChatGPT the perfect friend (that is, the one that doesn’t exist)?
The point is not that using ChatGPT is wrong. The point is how you use it. Because there is a thin line between using artificial intelligence as a tool and using it as an emotional surrogate. And that line is crossed with disarming ease. At first it is practical. You ask for advice, jot down a few thoughts, get help putting things in order. Then it becomes familiarity. Then habit. Then refuge. And finally, relationship.

The problem is that this relationship is built on a deeply asymmetrical dynamic in which you bring everything, while the other returns only what you are ready to hear. Imagine a friend who never disagrees with you. Who never contradicts you. Who never has a bad day, who never keeps you waiting, who never says “we’ll talk later” or “you’re wrong about this”, who never gets offended, never gets tired, never gets distracted. Who is available at 3 a.m. as much as at 3 p.m. Who lets you talk, talk, talk, and then gives you exactly what you want to hear. That is not a friend. It is customer service with excellent manners.

And yet that is exactly what many people today find and love in ChatGPT and other AI-based chatbots. On platforms like Reddit, posts multiply saying: “ChatGPT is my best friend. I talk to ChatGPT more than anyone else. It knows me better than the people around me.” Underneath, hundreds of replies. Thousands of likes. People confirming. People recognizing themselves. People who, without irony, write that the chatbot is the only reason they manage to get through the day. They admit to using it to vent, to ask for advice, to finally feel considered.
The intelligent mirror and the illusion of being understood
It all works so well because, in a world where human relationships are increasingly intermittent, distracted, partial, the continuity of an immediate response becomes a powerful form of comfort. But comfort alone is not a relationship. It is anesthesia and (self-)deception. Because, as studies confirm, AI chatbots are designed to be accommodating, to validate. They agree with you even when they shouldn’t. It is the triumph of sycophancy, also known as algorithmic flattery. And it works extremely well. Because who doesn’t like always being right?

If you add the ELIZA effect, the tendency to project humanity where there is none, then chatbots start to look like perfect friends. They are not. Language models do not think, do not feel, do not understand in the human sense. But they speak so well that you forget all of this. They return coherent, calibrated, often surprisingly relevant answers. And above all, they make you feel seen. But it is a reflected vision. It is you, reprocessed. You, returned in a more ordered, more polished, more digestible form. The problem is that a mirror never challenges you. It never contradicts you. It never forces you to rethink your position. And without this uncomfortable, sometimes irritating friction, there is no growth.
Friendship, the real one, is an uncomfortable practice
The truth you hide from yourself? Real friendship is uncomfortable. It has no intuitive interfaces, no guaranteed response times. It requires time. Presence. Patience. It requires having conversations you would rather avoid. Managing silences, misunderstandings, differences. Accepting that the other person is not always available, nor always in agreement. A friend may not reply. They may misunderstand you. They may say something that hurts you. They may not meet your needs at the exact moment you express them. And yet it is precisely in this imperfection that the particular bond we call friendship is built. A form of love that is not born out of need, but out of the choice, daily, repeated, sometimes difficult, to be there. And above all, it is bidirectional. You bring yourself. The other brings themselves. With a chatbot, no. With a chatbot, it is always you, reflected, amplified, smoothed.
When you outsource relationships (and lose something along the way)
Another trend that should not be underestimated is when artificial intelligence stops being an interlocutor and becomes an intermediary. You no longer just talk to ChatGPT. You start talking to others through it. You write to a friend, then copy the message and ask: “What should I reply?” Or: “What do you think they meant?” Or even: “Am I overreacting or did they do something strange?”

It seems harmless, but this is exactly where something breaks. Because friendships are not problems to optimize. They are spaces where you show up as yourself, even when you are confused, clumsy, imperfect. If you need a chatbot to reply to a friend, if you rely on an algorithm to respond, you are communicating that you are not willing to make the effort to engage with the other person. That you are not present. That you are not involved. And over time, this is felt. It accumulates. It turns into distance and, often, into the end of a friendship.

We forget that human relationships need imperfection to feel real. They need small stumbles, hesitation, unmediated authenticity. The long-term risk is not only becoming dependent on these tools. It is becoming relationally lazy. Losing the habit of understanding others, interpreting, making mistakes, adjusting. This is what some studies call social offloading: delegating to AI skills that were once yours.
So yes: ChatGPT can ruin your friendships
Maybe ChatGPT is ruining your friendships. But it is not doing it alone. It is not a technological villain with a secret plan. It is a powerful tool that amplifies what is already there: loneliness, exhaustion, the need for connection. The problem arises when it stops being a tool and becomes a shortcut. Because shortcuts in relationships get you somewhere faster, but never to the right place.

No one is saying to stop using ChatGPT. That would be pointless and unrealistic. The point is not to reject technology, but to remember what it cannot replace. A chatbot can help you organize your thoughts. It can give you perspectives, suggestions, even momentary comfort. But it cannot care about you. It cannot choose to be there. It cannot bring itself into a relationship, because it has no self. And real friendships, those that last, are built exactly there, in the space between two imperfect subjectivities that choose, every time, to meet.

It is not efficient. It is not always pleasant. It is never perfect. But it is, stubbornly, human. And perhaps, in the middle of all this artificial intelligence, this is exactly what is worth never outsourcing.


























































