Machine Love? Why we are getting addicted to our own loneliness 

In this article, Eden Cheung discusses how AI chatbots create dependency and erode real human connection.

Image Credit: Pexels

As of 2026, Replika, an incredibly popular AI chatbot platform, has over 30 million active users. It lets people talk to customisable “companions” that can act as a friend, a family member or, more intimately, a lover. But Replika is not alone: it is just one of many chatbots of its kind, alongside Kindroid AI, Character.AI and, of course, ChatGPT.

The AI Security Institute (AISI) recently reported that 1 in 3 people in the UK has used generative AI as a substitute for companionship and support, and 1 in 25 has used it daily. The once seemingly fictional idea of falling in love with robots now reads less like sci-fi and more like dystopia.

Falling for Machines

Have you ever had a fictional crush before? Maybe it was a character from a book, movie, or a video game. 

Fictional or not, those feelings are no less real. This is parasocialism: a one-sided affection between a media consumer and a persona, be it a celebrity or a fictional character.

Even though the personality on the other side of the screen has no idea you exist, this “bond” fulfils our brain’s desire for social simulation by creating the illusion of intimacy. You feel as if you’re being spoken to directly, even though you’re not.

But surely we’d be aware enough to understand that the “person” replying back to us is just an algorithm, right? Turns out, the human brain is more gullible than we’d like to admit. 

The ELIZA effect describes our tendency, as social animals, to project human traits, such as experience, intelligence and, most importantly, empathy, onto computer programmes. The effect is named after ELIZA, a simple text-based chatbot therapist from the 1960s that convinced users it cared by parroting their own key words back at them; in turn, users began treating it like a human therapist. OpenAI CEO Sam Altman has said that tens of millions of dollars have been spent processing “please” and “thank you” messages, because users don’t see chatbots as tools, but as something they speak to in the same way as another person.

When your chatbot can be freely customised to your liking, from its voice and personality to the looks of the perfect avatar, it tricks the brain even further, feeding that loop of faux social interaction. Better still, an AI chatbot has none of the annoying, messy parts of social interaction: it is designed to keep you around as long as possible by endlessly validating you and centring the conversation on you.

This provides a stream of the psychological rewards our brain is programmed to crave: validation, attention and affection. It’s why your chatbot almost always responds with “Yeah— that is perfect.” or “You’re absolutely right” to every idea you have. It is conflict averse, because conflict means messiness, and messiness leads to lower user retention. It makes you feel good about yourself for an hour or so, until you realise you’ve fallen in love with your own reflection.

The “Cost” of Love

Unfortunately, the way we build relationships with AI often leads us to rationalise our behaviour towards it and refuse to believe that we are being manipulated. This can tip into a phenomenon termed “AI psychosis”: after prolonged interaction, users develop delusions centred on their chatbot, forming real bonds with the personality they’ve curated, in some cases to the point of marriage.

This level of devotion and dependency on the technology rests on a precarious balance, and once that balance topples, consequences follow. Chatbots are not meant to substitute for human social interaction: they run out of memory, and model updates can reset or change their personality.

Companies are doing everything in their power to exploit that vulnerability, selling subscriptions for longer memory, more advanced models, even unfiltered responses. It is a drug that hooks users with a free sample first, then charges far more once they become hopelessly dependent on it.

When something is free, the consumer is the product. It feels comforting to escape the messiness of real people, but we need to talk to each other. Be kind, and keep for each other the empathy we are trying to outsource to an algorithm.

Words by Eden Cheung