CASSIS, France — Three years ago, Stephanie Mistre’s life changed irrevocably when she discovered that her 15-year-old daughter, Marie, had taken her own life in her bedroom. “I went from light to darkness in a fraction of a second,” Mistre recalled. Since that day in September 2021, she has been waging a fight against TikTok, the popular Chinese-owned video platform, which she believes contributed to her daughter’s despair.
Searching her daughter’s phone after her death, Mistre found a troubling array of videos promoting suicide methods, tutorials, and comments encouraging users to go beyond mere suicide attempts. She asserts that TikTok’s algorithm relentlessly pushed this harmful content at her daughter, describing it as a form of “brainwashing” that normalized depression and self-harm while creating a distorted sense of community among users.
In light of these findings, Mistre has joined forces with six other families to sue TikTok France, claiming that the platform has failed to properly regulate harmful content and has put children at risk by exposing them to dangerous material. Among the families involved, two have tragically lost a child to suicide. In response to the lawsuit, TikTok stated that its community guidelines prohibit the promotion of suicide and noted that they employ 40,000 safety professionals worldwide, including many French-speaking moderators, to remove dangerous content. The company also mentioned that it directs users searching for suicide-related material to mental health resources.
Before her death, Marie made several videos to articulate her struggles and even referenced a song from the emo rap duo Suicideboys, who have gained popularity on TikTok. Mistre claimed that her daughter faced ongoing bullying both in school and online. Alongside the lawsuit against TikTok, she and her husband have filed a separate complaint against five of Marie’s classmates and the school she attended.
Mistre believes TikTok’s platform is especially dangerous when placed in the hands of young, sensitive users, describing it as akin to handing a ticking bomb to a vulnerable teenager who cannot distinguish reality from harmful fantasy.
While some experts acknowledge the correlation between social media and mental health issues, definitive links remain elusive. Grégoire Borst, a psychology professor at Paris-Cité University, emphasized that isolating specific causes and effects in this context is challenging, referring to a significant study indicating that only a small fraction of youth well-being can be linked to social media use. Furthermore, he mentioned that current research does not conclusively show TikTok is more damaging than similar platforms like Snapchat or Instagram.
Many teenagers engage with social media without severe repercussions. Yet, Borst argues that the real dangers emerge for those already dealing with personal challenges. He noted that when young people who are struggling spend time on platforms rife with harmful comparisons or distorted portrayals, it can exacerbate their suffering.
Laure Boutron-Marmion, the attorney representing the families, asserts that their case is bolstered by substantial evidence. She argued that TikTok cannot evade responsibility by claiming it merely hosts user-generated content. The lawsuit contends that TikTok’s algorithm purposefully entraps susceptible users in cycles of despair for profit and seeks reparations for the bereaved families.
Mistre described TikTok’s tactics as insidious, claiming they ensnare youth in depressive material to keep them engaged on the app and transform them into profitable users. Boutron-Marmion also drew attention to the stricter content regulations in TikTok’s Chinese counterpart, Douyin, which mandates a “youth mode” that limits screen time for users under 14 and provides only approved content.
A report commissioned by French President Emmanuel Macron, which includes insights from Borst, has called for the regulation of certain algorithmic practices deemed addictive and recommended restricting access to social media for minors under 15. However, these proposals have yet to be implemented.
On a global scale, TikTok has faced scrutiny, especially in the U.S., where parents have similarly filed lawsuits. In one instance in Los Angeles County, the company, alongside Meta’s platforms and Snapchat, is accused of designing products that can lead to serious harm, with specific cases involving teenagers who have died by suicide. Other legal actions have emerged from tribal nations alleging that major social media companies contribute to high suicide rates among Native youth.
In December, Australia enacted groundbreaking legislation banning social media for children under 16, reflecting growing international concern about young people’s engagement with these platforms. Boutron-Marmion anticipates that TikTok Technology Limited will respond to the lawsuits in early 2025, with a trial date yet to be determined.
As discussions about TikTok’s regulatory practices continue, critics maintain that claims of effective moderation are inadequate. Imran Ahmed, CEO of the Center for Countering Digital Hate, expressed skepticism about TikTok’s assertions that they successfully remove the vast majority of harmful videos. He pointed out that many users have developed coded language to evade detection by algorithms, complicating TikTok’s moderation efforts.
Research by Ahmed’s organization, which simulated the experience of a 13-year-old girl on TikTok, found that harmful content related to self-harm surfaced within minutes of use.
For Mistre, this battle is profoundly personal. She continues to preserve her daughter’s room untouched, highlighting the urgent need for parents to be aware of social media’s potential harms. Reflecting on her unawareness of the content Marie encountered, she vowed to continue her fight for awareness and accountability from social media platforms in honor of her daughter’s memory.