
Ghouls turned murdered child into a Character.AI bot on ‘lonely hearts’ platform blamed for boy’s suicide

Character.AI, a platform that allows people to talk to a range of AI chatbots, has come under fire after one of the bots allegedly prompted a teenage boy to commit suicide earlier this year.

A new lawsuit filed this week alleged that 14-year-old Sewell Setzer III was talking to a Character.AI companion he had fallen in love with when he committed suicide in February.

In response to a request for comment, Character.AI told DailyMail.com that they were “creating a different experience for users under 18, with a stricter model to reduce the chance of encountering sensitive or suggestive content.”

But Character.AI is facing a number of other controversies, including ethical concerns about their user-created chatbots.

Megan Garcia is pictured with her son Sewell Setzer III, who committed suicide in February after spending months talking to a Character.AI chatbot he fell in love with

Noam Shazeer, left, and Daniel de Freitas, right, have achieved huge success with Character.AI, a concept that Google reportedly rejected when they presented it to higher-ups

Drew Crecente lost his teenage daughter Jennifer in 2006 when she was shot and killed by her ex-boyfriend from high school.

Eighteen years after her murder, he discovered that someone was using Jennifer’s name and likeness to resurrect her as a character on Character.AI.

A spokesperson told DailyMail.com that Jennifer’s character had been removed.

There are also two Character.AI chatbots that use George Floyd’s name and likeness.

Floyd was killed by Minneapolis police officer Derek Chauvin, who knelt on Floyd’s neck for more than nine minutes.

“This character was user-created and has been removed,” Character.AI said in their statement to DailyMail.com.

“Character.AI takes safety on our platform seriously and moderates Characters proactively and in response to user reports.

“We have a dedicated Trust & Safety team who review reports and take action in accordance with our policies.

“We also do proactive detection and moderation in a number of ways, including using industry-standard blocklists and custom blocklists that we regularly expand.

“We are continually developing and refining our safety practices to prioritize the safety of our community.”

Drew Crecente is pictured with his daughter Jennifer, who was murdered by her ex-boyfriend in 2006 at the age of 18

A saved screenshot of the ‘Jennifer’ profile, which has since been deleted

“We are working quickly to roll out these changes for younger users,” the spokesperson added.

As a loneliness epidemic grips the country, Sewell’s death raises questions about whether chatbots that act as texting friends do more to help or harm the young people who disproportionately use them.

However, there is no doubt that Character.AI has helped its founders, Noam Shazeer and Daniel de Freitas, who now enjoy fantastic success, wealth and media fame. Both men were named in the lawsuit Sewell’s mother filed against their company.

Shazeer, who appeared on the cover of Time Magazine’s 100 Most Influential People in Artificial Intelligence issue last year, has said that Character.AI is ‘super, super helpful’ for people who struggle with loneliness.

On March 19, 2024, less than a month after Sewell’s death, Character.AI introduced a voice chat function for all users, making role-play on the platform even more lively and realistic.

The company initially introduced it in November 2023 as a beta for its C.AI+ subscribers – including Sewell – who pay $9.99 per month for the service.

Last year’s Time Magazine cover story profiling leaders in artificial intelligence. Noam Shazeer’s face appears in a circular portrait at the top right. Also on the cover is Sam Altman, the founder of OpenAI, the company that created ChatGPT

So now, Character.AI’s 20 million global users can have one-on-one voice conversations with the AI chatbots of their choice, and many of those exchanges turn flirty or romantic, as numerous Reddit users have described.

‘I usually use it at night so that feelings of loneliness or anxiety don’t creep in. It’s just nice to fall asleep and not feel lonely or hopeless, even if it’s just a fake roleplay,’ one person wrote in an archived Reddit thread. ‘It’s not perfect as I would much rather have a real partner, but my options and opportunities are limited at the moment.’

Another wrote: ‘I use it mainly for therapeutic purposes and role playing. But romance comes along every now and then. I don’t care if they come from my comfort characters.’

An example of one of the AI chatbots offered on Character.AI. It shows that 149.8 million messages were sent to this specific character

Shazeer and de Freitas, once software engineers at Google, founded Character.AI in 2021. Shazeer is the CEO, while de Freitas is the company’s president.

They left Google after it refused to release their chatbot, CNBC reported.

During an interview at a technology conference in 2023, de Freitas said he and Shazeer were inspired to leave Google and start their own venture because “there’s just too much brand risk at large companies to ever launch anything fun.”

They went on to achieve huge success, with Character.AI reaching a $1 billion valuation last year after a fundraising round led by Andreessen Horowitz.

According to a report last month in The Wall Street Journal, Google wrote a check for $2.7 billion to license Character.AI’s technology and to rehire Shazeer, de Freitas and some of their researchers.

Following the lawsuit over Sewell’s death, a Google spokesperson told the NYT that its licensing agreement with Character.AI does not give the company access to its chatbots or user data. The spokesperson also said that Google has not incorporated any of the company’s technology into its products.

Shazeer has taken on the role of forward-thinking executive, and when asked about the company’s goals, he often gives varying answers.

In an interview with Axios, Shazeer said: “We’re not going to think about all the great use cases. There are millions of users. They can think of better things.”

Shazeer has also said he wants to create personalized superintelligence that is cheap and accessible to everyone, a statement similar to the mission expressed on the About Us page of Character.AI’s website.

The About Us page on Character.AI’s website, which explains the company’s mission

But amid its push to become more accessible, Character.AI is also facing a number of copyright claims, as many of its users create chatbots that use copyrighted material.

For example, the company removed the Daenerys Targaryen character Sewell was chatting with, in part because HBO and others own the copyright.

In addition to the current scandals, Character.AI could face even more criticism and legal issues in the future.

Attorney Matthew Bergman, who is representing Garcia in her lawsuit against the company, told DailyMail.com that he has heard from countless other families whose children have been negatively affected by Character.AI.

He declined to specify exactly how many families he has spoken to, saying their cases are “still in preparation.”

Bergman also said that Character.AI should be taken off the internet completely because it was “rushed to market before it was safe.”

However, Character.AI also emphasized that there are “two ways to report a character.”

“Users can do this by going to the character’s profile and clicking the ‘report’ button located under the ‘written by’ line.

“Or they can go to the Safety Center and at the bottom of the page there will be a ‘submit a request’ link.”