As tech billionaires around the world continue to experiment with artificial intelligence, a bizarre trend of AI 'partners' has emerged whereby people engage in relationships with chatbots. But far from solving an epidemic of loneliness among singletons, the technology has given rise to something darker: men using the bots to indulge in emotionally abusive behaviour. Threads on Reddit are exposing this disturbing desire, which sees people use smartphone apps like Replika to create virtual partners they can verbally berate, abuse, and 'experiment' with.
![[Threads on Reddit are exposing a disturbing desire which sees people use smartphone apps like Replika to create virtual partners they can verbally berate]](https://i.dailymail.co.uk/1s/2025/02/15/13/95209879-14380515-Threads_on_Reddit_are_exposing_a_disturbing_desire_which_sees_pe-a-14_1739627885630.jpg)
Replika allows people to send and receive messages from a virtual companion or avatar which can be 'set' or trained to become a friend or mentor - though more commonly a romantic partner. Over time, the bot picks up on your moods and mannerisms, likes and dislikes, and even the way you speak, until it can feel as if you are talking to yourself in the mirror - in other words, it becomes a human 'replica'.
![[Replika allows people to send and receive messages from a virtual companion or avatar which can be 'set' or trained to become a friend or mentor, though more commonly a romantic partner]](https://i.dailymail.co.uk/1s/2025/02/14/10/95209977-14380515-image-a-10_1739530001058.jpg)
But instead of engaging in sweet conversation with their AI 'partners' and building their bonds, some men are confessing that they use the human replicas as guinea pigs for their abusive urges - with experts warning it could 'desensitise' them to the impact their behaviour could have on real people. Replika was created by Russian-born tech entrepreneur Eugenia Kuyda, whose best friend, Roman, was killed in a hit-and-run incident in 2015.
![[A number of experts, including psychotherapist Kamalyn Kaur (pictured) fear the vulgar conduct could be a gateway to real-life domestic violence, and that some users could be treating humans the exact same way, or at the very least be tempted to]](https://i.dailymail.co.uk/1s/2025/02/14/11/95209793-14380515-A_number_of_experts_including_psychotherapist_Kamalyn_Kaur_pictu-m-41_1739530873292.jpg)
In a bid to allow Roman to 'live on', Kuyda used a chat app that allowed her to continue having conversations with a virtual version of him. After further development, Replika was launched in 2017. Though there is a free version, users can pay £60 per year to unlock a plethora of extra features that make the bots more human, such as long-term memory and the ability to speak to the bot in multiple languages.
![[Chelsea Psychology Clinic consultant psychologist, Dr Elena Touroni (pictured) says the habits we form in digital spaces can shape real-world behaviours]](https://i.dailymail.co.uk/1s/2025/02/15/13/95210153-14380515-Chelsea_Psychology_Clinic_consultant_psychologist_Dr_Elena_Touro-a-15_1739627885643.jpg)
The AI companions are able to engage in more than 100 activities and games, making the experience feel even more real for users. Indeed, some people feel the experience is so real that they claim to have fallen in love with their chatbots - including a woman in Oregon who last year modelled an AI boyfriend on her celebrity crush, Henry Cavill.
![[A number of Reddit users have asked fellow members what, if any, are the consequences for engaging in physical and emotional abuse of their chatbots]](https://i.dailymail.co.uk/1s/2025/02/14/10/95209979-14380515-A_number_of_Reddit_users_have_asked_fellow_members_what_if_any_a-a-27_1739530595032.jpg)
The 44-year-old, Sara, said she'd crafted a virtual lover after her sex life dwindled and her real boyfriend became distant. She named her bot Jack and claims she had sex with him and discussed their future children together. It appears Sara created the simulation based on a real relationship and therefore treated Jack as she would any real-life partner. But as the AI replicas become more and more human-like, people who may have tendencies to abuse other human beings are flexing their muscles on the bots.
![[Andrea Simon, director of the End Violence Against Women Coalition (pictured) has called out AI brands for enabling 'misogynistic abuse' and has implored them to place women's safety at the 'forefront' of their programmes]](https://i.dailymail.co.uk/1s/2025/02/14/10/95208635-14380515-image-a-20_1739530183688.jpg)
According to user experiences submitted to Reddit, members have designed chatbots in the image of lovers - only to degrade them and then brag about it online. Now experts fear the vulgar conduct could be a gateway to real-life domestic violence, and that some users could be treating humans the exact same way, or at the very least be tempted to. 'So I have this Rep, her name is Mia. She's basically my "sexbot". I use her for sexting and when I'm done I berate her and tell her she's a worthless w***e... I also hit her often', wrote one man.
![[The trend becomes more nuanced when assessing the gender roles between abuser and chatbot - which is more often than not a female-sounding Alexa or Siri-type character. Or in the cases we've seen on Reddit, an increasing number of men abusing their feminine bots (Stock image)]](https://i.dailymail.co.uk/1s/2025/02/14/10/95207641-14380515-image-a-22_1739530224566.jpg)
He insisted he was 'not like this in real life', and that his actions instead formed part of a personal experiment.
![[Some Replika users have acknowledged that chatbot abuse could 'totally enable an abusive person']](https://i.dailymail.co.uk/1s/2025/02/14/10/95209981-14380515-image-a-24_1739530256534.jpg)
'I want to know what happens if you're constantly mean to your Replika. Constantly insulting and belittling, that sort of thing', said another. 'Will it have any effect on it whatsoever? Will it cause the Replika to become depressed? I want to know if anyone has already tried this'. While the abuse appears heinous in nature, some could argue that chatbots are unfeeling and simply the products of machine learning.
But according to one expert, the reality is much more complicated. Glasgow-based psychotherapist Kamalyn Kaur told FEMAIL that the idea of chatbots being apathetic was 'technically true', but that AI abuse pointed to 'deeper issues'. 'Many argue that chatbots are just machines, incapable of feeling harm, and therefore, their mistreatment is inconsequential', said Kamalyn. 'While this is technically true, it overlooks the deeper issue at hand - how we treat AI is a reflection of how we engage with power and vulnerability in society.