Lawsuit claims harmful content on TikTok contributed to the deaths of two teens

What just happened? The impact social media platforms have on the mental health of young users is coming under renewed scrutiny after the families of seven teenage girls in France filed a lawsuit against TikTok. They claim the platform exposed their children to harmful content that led two of them to take their own lives at age 15.

The lawsuit, filed in the Créteil court, claims that TikTok’s algorithm recommended videos to the teens promoting suicide, self-harm, and eating disorders.

“The parents want TikTok’s legal liability to be recognized by the court,” lawyer Laure Boutron-Marmion told broadcaster franceinfo. “This is a commercial company that offers a product to consumers who are also minors. They must therefore answer for the shortcomings of the product.”

In September 2023, the family of 15-year-old Marie filed criminal charges against TikTok following her death, accusing the platform of “inciting suicide,” “failing to assist a person in danger,” and “promoting and advertising methods of suicide,” writes Politico. TikTok’s algorithm is said to have trapped Marie in a bubble of toxic content related to the bullying she experienced over her weight.

TikTok is facing numerous lawsuits in the US over claims that it is damaging the mental health of young people. In 2022, the families of several children who died while attempting a dangerous TikTok challenge sued the company and its parent, ByteDance, after the app allegedly recommended videos of the “blackout” choking challenge to the minors, all of whom were 10 years old or younger.

Last month, a group of fourteen attorneys general filed lawsuits against TikTok, accusing the company of harming children’s mental health and violating consumer protection laws. The suits allege that TikTok uses manipulative features to keep young users on the platform longer, including endless scrolling, autoplay videos, and frequent push notifications.

It’s not just TikTok in the spotlight for the perceived harm it can cause to young people; all social media platforms face the same scrutiny. In October last year, the attorneys general of more than 40 US states sued Facebook owner Meta for harming children’s mental health.

During a Senate hearing on online child safety in January, Meta CEO Mark Zuckerberg apologized to parents in the audience who said Instagram contributed to their children’s suicides or exploitation.

The impact of social media on the mental health of not only children but also adults led the US Surgeon General to urge Congress to mandate cigarette-style labels on these sites and apps warning users of the potential harm they cause.

Social media companies typically hide behind Section 230 of the Communications Decency Act of 1996, which protects them from liability for content posted by users.

TikTok still faces a possible ban in the US. Amid national security concerns over its Chinese ownership, President Joe Biden signed legislation in April requiring ByteDance to divest TikTok’s US operations by January 19, 2025, or face a nationwide ban.