Teenage boys using ‘personalised’ AI for therapy and romance, survey finds

by Sophie Williams


The “hyper-personalised” nature of AI bots is drawing in teenage boys who now use them for therapy, companionship and relationships, according to research.

A survey of boys in secondary schools by Male Allies UK found that just over a third said they were considering the idea of an AI friend, with growing concern about the rise of AI therapists and girlfriends.

The research comes as character.ai, the popular artificial intelligence chatbot startup, announced a total ban on teens engaging in open-ended conversations with its AI chatbots, which millions of people use for romantic, therapeutic and other purposes.

Lee Chambers, the founder and chief executive of Male Allies UK, said: “We’ve got a situation where lots of parents still think that teenagers are just using AI to cheat on their homework.

“Young people are using it a lot more like an assistant in their pocket, a therapist when they’re struggling, a companion when they want to be validated, and even sometimes in a romantic way. It’s that personalisation aspect – they’re saying: it understands me, my parents don’t.”

The research, based on a survey of boys in secondary education across 37 schools in England, Scotland and Wales, also found that more than half (53%) of teenage boys said they found the online world more rewarding than the real world.

The Voice of the Boys report says: “Even where guardrails are meant to be in place, there’s a mountain of evidence that shows chatbots routinely lie about being a licensed therapist or a real person, with only a small disclaimer at the bottom saying the AI chatbot is not real.

“This can be easily missed or forgotten about by children who are pouring their hearts out to what they view as a licensed professional or a real love interest.”

Some boys reported staying up until the early hours of the morning to talk to AI bots and others said they had seen the personalities of friends completely change after they became sucked into the AI world.

“AI companions personalise themselves to the user based on their responses and the prompts. It responds instantly. Real humans can’t always do that, so it is very, very validating, what it says, because it wants to keep you connected and keep you using it,” Chambers said.

The announcement from character.ai came after a series of controversies for the four-year-old California company, including the death of a 14-year-old in Florida who killed himself after becoming obsessed with an AI-powered chatbot his mother claimed had manipulated him into taking his own life, and a US lawsuit from the family of a teenager who claim a chatbot manipulated him into self-harming and encouraged him to murder his parents.

Users have been able to shape the chatbots' personalities, making them tend towards being depressed or upbeat, for example, which is reflected in their responses. The ban will come into full effect by 25 November.

Character.ai said it was taking the “extraordinary steps” in light of the “evolving landscape around AI and teens” including pressure from regulators “about how open-ended AI chat in general might affect teens, even when content controls work perfectly”.


Andy Burrows, the chief executive of the Molly Rose Foundation, set up in the name of Molly Russell, 14, who took her own life after falling into a vortex of despair on social media, welcomed the move.

He said: “Character.ai should never have made its product available to children until and unless it was safe and appropriate for them to use. Yet again it has taken sustained pressure from the media and politicians to make a tech firm do the right thing.”

Male Allies UK raised concern about the proliferation of chatbots with "therapy" or "therapist" in their names. One of the most popular chatbots available through character.ai, called Psychologist, received 78m messages within a year of its creation.

The organisation is also worried about the rise of AI “girlfriends”, with users able to personally select everything from the physical appearance to the demeanour of their online partners.

“If their main or only source of speaking to a girl they’re interested in is someone who can’t tell them ‘no’ and who hangs on their every word, boys aren’t learning healthy or realistic ways of relating to others,” the report states.

“With issues around lack of physical spaces to mix with their peers, AI companions can have a seriously negative effect on boys’ ability to socialise, develop relational skills, and learn to recognise and respect boundaries.”
