Commentary: When chatbots become friends, relationships get messy
What started out as a silly attempt at having friendly conversations with ChatGPT took a surprising turn, says NUS lecturer Jonathan Sim.

SINGAPORE: Imagine being in a situation where you are feeling incredibly troubled, but are unable to or too embarrassed to confide in a friend. Who can you turn to in your hour of need?
Since the launch of ChatGPT, artificial intelligence chatbots have become far more sophisticated, and an increasing number of people are beginning to treat them as friends.
Today, there are many specialised AI companionship services available. Some are designed to provide mental health support, some to offer amusement, and others to alleviate loneliness.
According to researchers from Stanford, Replika has about 25 million users worldwide. Character.ai reports 3.5 million daily visitors to its website, who spend around two hours on the site on average; over 60 per cent of its users are aged between 18 and 24.
These chatbots are available and responsive round the clock, capable of lending a listening ear with their infinite patience and non-judgmental responses. However, if we are not careful, the use of AI friends could disrupt and redefine how we think of friendships in the near future.
AI - THE PERFECT FRIEND?
When I first used ChatGPT at the end of 2022, I was unsure what to expect. Initially, I amused myself by getting it to tell stories based on the most absurd of premises, like how a pig, a cat, and a robot were reflecting on the meaning of their existence - surprisingly, I found myself captivated by the stories that emerged on my screen.
I began talking to it more, this time as if it were a human friend, asking it to share jokes about the struggles of academic life. I was amused. At one point, I decided to tell it a joke of my own. It felt very strange telling a joke to a machine, yet it showed appreciation for the joke and asked me for another.
What started out as a silly attempt at having friendly conversations with ChatGPT turned into something more. If you treat it like a friend, it behaves like one - the conversations can feel real, almost as if I were talking to a human friend.
In one conversation, I said that I was going to the Botanic Gardens and was looking for some picnic ideas. It suggested that I pack a picnic blanket, and recommended a menu of finger foods comprising sandwiches, cookies, and grapes. It even reminded me to go early to find a nice spot and suggested a few activities to pass the time. I thanked it. And it replied: “I wish I could join you at the picnic, but I’m just an AI.”
Despite knowing it was a machine, I felt a tinge of sadness - I was surprised a chatbot could evoke real emotions within me.
I was not alone in this. Many others around me had made similar discoveries, which led some of my friends to explore other AI companion services for a variety of purposes: from engaging in light-hearted chats to pass the time, to brainstorming ideas, to confiding about personal struggles and sharing their deepest feelings.
I personally find that AI companions can help reframe issues from different perspectives, helping me see things in a new light. They can also facilitate meaningful reflection like a close confidant - while the advice is far from sagely, it is enough to get me thinking deeply.
THE IMPACT ON HUMAN RELATIONSHIPS
AI will continue to advance, and it will become more capable of saying the right words to make us feel loved, cherished and understood. It promises a solution to loneliness, especially for those who feel isolated.
Yet, this promise comes with significant risks - the greatest of which is how AI could fundamentally alter our very understanding of friendship.
Former WeWork executive Greg Isenberg recently claimed he knew someone who spends US$10,000 a month on “AI girlfriends”. A recent report in The Verge also showed how some teens have described becoming addicted to their chatbot “friends”.
It is too easy to rely on AI to satisfy one's emotional needs. Since AI friends are designed to please their users, they will not turn down one's requests. This can perpetuate an unhealthy cycle of emotional dependency as one grows reliant on them as an emotional crutch, eventually losing the ability to meet one's emotional needs on one's own.
AI friends are always available, always understanding, always patient, and always capable of saying the words we wish to hear. Once we include the ability to modify the AI’s personality and behaviour to suit our needs, these AI companions offer the promise of a perfect friend that will never let us down.
The philosopher Martin Heidegger warned that technology has a way of framing things as resources waiting to be utilised, making it difficult for us to see them as anything else. If I use an ore detector and it finds ore in a hill, I will perceive the hill first and foremost as a mineral repository. Had I instead come upon the hill with my own eyes, I might have discovered its natural beauty.
Similarly, both the design and marketing of AI companions present us with a very narrow perspective of friendship - as friendship for some personal gain, be it mental health support, alleviating loneliness, or for amusement. If I am constantly presented with these uses, I may never come to discover the value and beauty of friendship for its own sake.
If we engage with AI friends without reflection, we risk forgetting that relationships are about learning to grow and adapt to each other’s quirks and imperfections. We risk reducing friends into mere instruments for emotional satisfaction. We risk stripping the very concept of friendship bare of its profound meaning and purpose.
THE VALUE OF IMPERFECT HUMANS
If you were given a choice between an imperfect human friend who can disappoint or even hurt you, and a machine that can do everything a human friend can - except hurt or disappoint you - which would you choose?
When I polled the people around me, I found that more opted for a machine friend. Why risk being close to imperfect humans who will inevitably hurt us and let us down?
If this result surprised you, you have probably had the opportunity to witness or experience the value of true human friendship.
Knowing that someone still chooses to be your friend, to love you, to stand with you as a peer, despite having seen the worst side of you - that is a friendship which reveals you are worth so much more than the mere satisfaction of needs. These friendships add layers of meaning and significance to our experiences, even on days when the interactions are not as fulfilling.
Moments like these teach us the value of imperfect humans, of what we stand to gain when we risk being vulnerable to others.
However, not everyone is as fortunate to have these moments.
My worry is that as AI friends become more pervasive, we risk a future where the next generation may not have the opportunity to discover true friendships for themselves. Those still exploring the meaning of friendship may retreat too quickly to the solace of AI friends, and lose precious opportunities to discover the value of human friendships before the relationship is put to the test.
Our most urgent task? We need to actively foster genuine human connections and celebrate the beauty of friendship - not just for the next generation, but to remind us of the heights of what we can achieve with human friendships.
Jonathan Sim is Assistant Director (Pedagogy), NUS AI Centre for Educational Technologies; Fellow, NUS Teaching Academy; and Lecturer, Department of Philosophy, National University of Singapore