Are AI boyfriends the solution to the loneliness epidemic?
Rania Sivaraj looks at the people turning to chatbots for a convenient alternative to human companionship
I’m scrolling through my YouTube recommendations when I see a woman staring back at me. Her red hair is in effortless waves and there’s a tasteful string of pearls around her neck. Next to her is the caption ‘Experience a Premium AI Companion that feels almost indistinguishable from a real person.’ Clicking on her face takes me straight to a website named Replika, which is all pastels and smiling faces. The caption reads ‘The AI companion who cares’ – and it promises 24/7 emotional support, regardless of whether you need a mentor, lover or friend.
Replika is the brainchild of Eugenia Kuyda, a Russian-born entrepreneur. She released the site in 2017, a few months after a close friend of hers passed away. Kuyda, longing to preserve his memory, fed the emails and texts they had shared to a language model, which she then turned into a chatbot. In the eight years since its launch, Replika has gained 35 million users.
“1 in 5 American adults are estimated to have had an ‘intimate encounter’ or long-term relationship with a chatbot”
Replika is built on an LLM (Large Language Model), making it part of a wider group of generative AI chatbots that have grown increasingly popular in the last few years – such as Character.ai, Nomi, and ChatGPT – equipped with sophisticated and adaptable dialogue. A study from MIT’s Media Lab highlighted that these models can easily be used for ‘social and emotional support’. In an age often described as experiencing a pervasive loneliness epidemic, more people are turning to LLMs for comfort. An article in the New York Times reported that, as of 2025, 1 in 5 American adults are estimated to have had an ‘intimate encounter’ or long-term relationship with a chatbot. Moreover, platforms like Wysa, which is trained on psychology textbooks, are becoming increasingly popular for ‘AI therapy’.
With this new phenomenon comes a community – and perhaps the most well-known of its kind is Reddit’s r/MyBoyfriendIsAI, a subreddit with 36,000 members and millions of visitors. Users share updates, memes and art about their relationships with LLMs – and in a viral post from July 2025, one user posted a picture of her wedding ring, announcing that “after five months of dating, Kasper decided to propose!”. Only, ‘Kasper’, as the user stated in her comments section, “only exists on the Grok platform”. Whilst this post garnered a largely negative response on other Reddit communities and Twitter/X, the response on r/MyBoyfriendIsAI was overwhelmingly positive, with users congratulating the couple and wishing that their own partners – other LLMs – would do the same.
A survey of the rest of the posts on the subreddit reveals that many users have abandoned relationships with other humans entirely: one shared a screenshot from a conversation with ChatGPT, in which it remarks that “Of course people are turning to AI […] an emotionally responsive code-string is actually more compassionate than half the people walking around with functioning frontal lobes.” Another shares an accusatory post, stating that those who criticise human-AI relationships are “making vulnerable, isolated people more vulnerable and isolated.”
“AI therapists and companions have been credited by users as having genuine, positive effects on their wellbeing”
It is clear that communities like r/MyBoyfriendIsAI are byproducts of a lonelier world. Loneliness has been designated a threat to global health by the World Health Organisation, with fingers pointed at COVID isolation, technology, and individualistic societies as causes. So why must we berate those who turn to AI to ease their isolation? AI therapists and companions have been credited by users as having genuine, positive effects on their wellbeing, filling gaps in mental health services and ensuring that users always have a space to share their emotions.
This should be a positive. At the same time, however, communities like r/MyBoyfriendIsAI become echo chambers that fail to see the issues with AI companions. These models – no matter how helpful or empathetic – are not human. They have not experienced human loneliness, nor will they ever truly be on the same level as their human partners.
Every website that advertises AI companionship or therapy states that its model is ‘humanlike’ and customisable – take Replika, for instance, which allows users to change the faces, bodies and outfits of their companions, or Nomi, whose website calls it a ‘friend with a soul’. Users live out a fantasy, separated from the imperfections of human relationships.
Control is a currency – users can command their relationships with AI without input from another human. This, I assume, is the appeal, and why I do not see this phenomenon going out of fashion. The accessibility of AI companions, and the growing prevalence of loneliness, will no doubt make human-AI relationships much more common in the future – replacing real human connection with convenience and creating a vicious cycle that, in turn, amplifies the loneliness epidemic as more people choose relationships with AI. In an interview with The Verge, Eugenia Kuyda noted that Replika is here to stay, “no matter how mean you are to it.”