Families sue TikTok in France over teen suicides they say are linked to harmful content

In the moment when her world shattered three years ago, Stephanie Mistre found her 15-year-old daughter, Marie, lifeless in the bedroom where she died by suicide.
“I went from light to darkness in a fraction of a second,” Mistre said, describing the day in September 2021 that marked the start of her fight against TikTok, the Chinese-owned video app she blames for pushing her daughter toward despair.
Delving into her daughter’s phone after her death, Mistre discovered videos promoting suicide methods, tutorials and comments encouraging users to go beyond “mere suicide attempts.” She said TikTok’s algorithm had repeatedly pushed such content to her daughter.

Now Mistre and six other families are suing TikTok France, accusing the platform of failing to moderate harmful content and exposing children to life-threatening material.

Asked about the lawsuit, TikTok said its guidelines forbid any promotion of suicide and that it employs 40,000 trust and safety professionals worldwide — hundreds of whom are French-speaking moderators — to remove dangerous posts.