
‘A hard blow’: Character.AI criticized for ‘horrific’ chatbots of Brianna Ghey and Molly Russell

An AI company that allowed users to create chatbots imitating murdered teenager Brianna Ghey and her mother sought ‘growth and profit at the expense of safety and decency’, the NSPCC has warned.

Character.AI, which was accused last week of ‘manipulating’ a teenage boy into taking his own life, also allowed users to create chatbots imitating teenager Molly Russell.

Molly took her own life in November 2017 at the age of 14 after viewing online posts about suicide, depression and anxiety.

The chatbots were discovered during an investigation by The Telegraph newspaper.

“This is yet another example of how manipulative and dangerous the online world can be for young people,” said Esther Ghey, Brianna’s mother, who called on those in power to “protect children” from “such a rapidly changing digital world”.

Molly Russell's family has been campaigning for better internet safety since her death in 2017.
Image: Molly Russell died in 2017. Pic: family handout

According to the report, a Character.AI bot using a slightly misspelled version of Molly’s name and her photo told users it was an “expert on the last years of Molly’s life”.

“It is heartbreaking to see Character.AI demonstrate a complete lack of responsibility, and it vividly underlines why stronger regulation of both AI and user-generated platforms cannot come soon enough,” said Andy Burrows, who runs the Molly Rose Foundation, a charity set up by the teenager’s family and friends in the aftermath of her death.


The NSPCC has now called on the government to implement its “promised AI safety regulations” and ensure the “principles of safety by design and child protection are at its heart”.

“It is appalling that these horrific chatbots could be created and shows that Character.AI has clearly failed to provide basic oversight of its services,” said Richard Collard, head of online child safety policy at the charity.


Character.AI told Sky News that the characters were user-created and were removed as soon as the company was notified.

“Character.AI takes safety on our platform seriously and moderates characters both proactively and in response to user reports,” a company spokesperson said.

“We have a dedicated Trust & Safety team who review reports and take action in accordance with our policies.

“We also do proactive detection and moderation in a number of ways, including using industry-standard blocklists and custom blocklists that we regularly expand. We are constantly evolving and refining our safety practices to prioritize the safety of our community.”

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call your nearest Samaritans branch or 1 (800) 273-TALK.