Meta, Snap and TikTok have founded a new program called Thrive to help stop the spread of graphic content that depicts or encourages self-harm and suicide. Thrive allows participating companies to share “signals” to alert each other about violative content on their platforms.
Thrive was created in collaboration with the Mental Health Coalition, a charity that says it works to remove the stigma around discussions of mental health. Meta says it provides the technical infrastructure behind Thrive that enables "safe signal sharing," using the same cross-platform signal-sharing technology as the Lantern program, which is designed to help combat online child abuse. Participating companies share hashes of offending media with one another so each platform can detect the same content.
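The hash-sharing mechanism described above can be sketched roughly as follows. This is an illustrative toy, not Thrive's actual API: `media_hash` and `SignalRegistry` are hypothetical names, and real deployments typically use perceptual hashes (so near-duplicate media still match) rather than the exact SHA-256 digest used here for simplicity.

```python
import hashlib

def media_hash(media_bytes: bytes) -> str:
    """Return a hex digest identifying a piece of media.

    Real systems usually use a perceptual hash so that resized or
    re-encoded copies still match; SHA-256 is used here only to keep
    the sketch self-contained.
    """
    return hashlib.sha256(media_bytes).hexdigest()

class SignalRegistry:
    """Hypothetical shared store of hashes flagging violative media."""

    def __init__(self) -> None:
        self._hashes: set[str] = set()

    def share(self, media_bytes: bytes) -> str:
        """A platform flags media by contributing its hash."""
        digest = media_hash(media_bytes)
        self._hashes.add(digest)
        return digest

    def matches(self, media_bytes: bytes) -> bool:
        """Another platform checks incoming media against shared signals."""
        return media_hash(media_bytes) in self._hashes

registry = SignalRegistry()
registry.share(b"flagged-media-content")            # platform A shares a signal
print(registry.matches(b"flagged-media-content"))   # platform B finds a match: True
print(registry.matches(b"unrelated-content"))       # no signal for this media: False
```

The key design point is that only hashes cross platform boundaries, so companies can alert each other to known content without transmitting the media itself.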
Meta says it has already made such content harder to find on its platform, but it is trying to leave room for people to share their stories about mental health, suicide and self-harm, as long as those discussions don't promote self-harm or include graphic descriptions.
According to Meta's data, the company takes action against millions of pieces of suicide and self-harm content every quarter. Last quarter, it restored about 25,000 of those posts, most of them after a user filed an appeal.