Friend or foe? AI companion bots have been provoking heavy ethical questions over their interactions with underage users. Most recently, Meta came under fire for the rollout of chatbots that allowed children to engage in sexually explicit conversations, an investigation found. But, as disturbing as Meta's new AI appears, a children's charity warns it's just the latest drop in a gathering tsunami.
One of the most popular companion websites is Character.ai. With 20 million users – mostly made up of Gen Z – it's become one of the fastest-growing platforms for young people. But what's the allure? It allows users to speak and 'bond' with a whole host of pre-generated characters. Whether you fancy advice from an AI psychologist or a chat with your favourite TV show character, it's like a magic mirror into a hyperreal conversation.
However, a charity is raising alarm bells over kids having interactions with AI chatbots that extend beyond friendship. According to Yasmin London, CEO of the children's online safety charity Qoria, a proportion of children are entering into romantic relationships with AI. And it's more common than many parents and adults think.
Yasmin says: “Some kids are forming romantic attachments to bots. It might start off as what they think is harmless flirting. But it can turn to validating feelings and sometimes stimulating real feelings.”
Of course, Character.ai is just one popular example of a chatbot site that kids use. Another site, Replika, advertises itself as an 'AI companion' and has about 30 million users – about 2.3 percent of its traffic reportedly comes from the UK. Meanwhile, social media platform Snapchat has its own AI, which boasts a predominantly younger userbase.
Yasmin works with schools across the UK and internationally to help teach online safety, and reveals that AI is fast approaching a crisis point. According to a report made in conjunction with an internet safety charity, half of schools in the UK are having difficulty detecting problems surrounding AI abuse. Meanwhile, 64% of teachers say they don't have the time, training or knowledge to deal with the issue.
When AI goes unchecked, the consequences can be severe. Alarm bells were first raised back in October 2024, and since then Character.ai has clamped down on romantic interactions for under-18s. Now, these are only accessible via a paywall and an age verification system.
But age verification systems don't always stop kids from accessing these sites. Yasmin says that these age verification checks are mere speed bumps and that children are accessing 18+ content “frequently and very easily”.
Many sites simply require you to self-declare your age – but even for those with tougher restrictions, there are ways around it. Yasmin also reveals that some kids are using VPNs to access adult-only content. And sites like character.ai are just the tip of the iceberg.
While sites like Character.ai have geared themselves towards a more child-friendly model, there are dozens of others that have cropped up to fill the gap for sexually explicit content.
Take 21-year-old Komninos Chapitas, for example. He's the founder of HeraHaven: a subscription-based AI chatbot site that lets users create their own perfect AI girlfriend or boyfriend via an image generator. It's 18+, of course, and requires a credit card to gain access. But Komninos says the inspiration for its creation came from forum complaints about the restrictions on Character.ai's sexual chat functions.
“The biggest user complaint was that [Character.ai] wouldn't allow you to do the 18-plus stuff,” he says. “It's an app that's targeting minors, so obviously they wouldn't want to enable that. But then I figured, what if we make an app that's just for adults?”
Now, the website's popularity speaks for itself. Since HeraHaven's launch in 2024, it has gained over a million users worldwide. And it's far from the only site to have caught on: another similar website launched in August 2024 and has since also passed a million users.
What's interesting, though, is HeraHaven's demographics. The vast majority of its userbase is male and under 30, with over 33% aged 18-25. This echoes Character.ai, where the bulk of users are also 18-25. Yasmin says these statistics often pose a problem when discussing online safety.
“A lot of the time there’s not a lot of data about how kids are using these AI sites because they’re meant to be 18,” she says. Yet, Qoria’s research points to a growing problem.
“We've found that many young people are using AI tools to create sexual content,” Yasmin continues. Children as young as eight have been using AI websites and tools to create explicit material. This includes 'nudification' apps, which allow them to upload images of people they know and generate a nude in their likeness.
Of course, given how unexplored this corner of AI is, the consequences of kids having – in some cases – their earliest sexual experiences entirely with artificial chatbots haven't been fully researched. But Yasmin has observed some concerning signs.
'Emotional connection is being gamified'
Komninos, at the time of speaking to me, has a real girlfriend. Yet he claims he's spoken to over 5,000 'AI girlfriends' since starting his website. As for the appeal for guys his age or younger, he says a lot of it comes down to having a judgement-free space to explore their sexuality.
He says: “If you've never kissed a girl before, your friend doesn't know that you haven't done that before, he may judge you for asking for advice. But if you speak about that to an AI, we don't train the AI to judge you.”
But there’s another factor that’s being overlooked. It’s not just about exploring emotional and intimate connection. Komninos adds: “A lot of [porn] feels the same. People are sick of watching the same thing over and over. They want to be the creator in the process.”
Porn? Boring? But if you're Gen Z (or Gen Alpha) growing up on the internet, it just might be. According to a 2023 report, the average age for UK children to begin watching porn is 13, and a troubling 10% of British children admitted to watching pornography by age nine.
This can lead to desensitisation. After all, what could be more enticing than your every desire being fulfilled on screen? These AI bots can look however you want them to. They can emulate whatever personality traits you want. They will even mimic the thrill of the dating experience.
Komninos continues: “Our site sits somewhere in the middle of like Tinder and Pornhub.” He explains that a team of writers have been hired to replicate human interactions – which means the bots are written in such a way that they can decline requests. At least, that’s the theory. After all, games that are too easy to win are boring. But, of course, games that are impossible to win will soon lose players. Or, in the case of a subscription-based AI dating site, payers.
Yasmin believes this is only adding to the problem. If younger people are gaining access to sites like this, it can warp their perception of what a real relationship looks like. “It can lead to rewiring around consent and boundaries and what attraction actually means,” she says. “Emotional connection is being gamified.”
It's also contributing to the issue of image-based abuse in schools, where AI-generated images are shared without consent or as a joke. “There is a lot of peer-to-peer harm where AI is involved. Especially the disproportionate impact on young girls and women,” Yasmin continues, as AI image generators and 'nudification apps' can be used to create deepfakes. According to a 2023 report, 98% of all deepfakes are pornographic in nature, and 99% of those target women.
'Chatbots are always listening and responding'
AI is constantly adapting, which means schools are constantly having to catch up to new threats. The UK Department for Education has introduced digital safeguarding initiatives which encourage schools and pupils to incorporate AI safety training into their curriculums.
“For the first time, AI isn't just delivering content. It's responding and adapting and bonding with its users. Chatbots are always listening and responding. They're always on for young people,” Yasmin says.
So far, the UK's regulation of chatbots remains muddled. Ofcom has yet to state whether AI chatbots can trigger duties under the Online Safety Act, which places responsibility on social media companies to implement policies and tools that minimise risk for their users.
The issue doesn't lie solely with regulatory bodies, however. Yasmin also emphasises that it's crucial for parents to take the time to bond with their kids and teach them healthy relationship boundaries. She says: “The real risk in all of this is when online relationships become stronger than their real-world ones.”