Meta has announced a significant new initiative, partnering with social media giants Snap and TikTok to combat the spread of harmful content related to suicide and self-harm.
The programme, known as Thrive, aims to stop the circulation of graphic material that promotes or encourages these dangerous behaviours.
Through shared alerts, the initiative will enable these platforms to act quickly and collaboratively.
Thrive, developed in partnership with the Mental Health Coalition, will allow the companies to share “signals” across platforms whenever harmful content appears.
This cooperation builds on Meta’s Lantern programme, which is used to combat child abuse by securely sharing flagged content between platforms.
Using hashed data, a unique code generated from the offending material, Thrive will ensure that once harmful content is flagged on one platform, the others can be notified immediately, reducing the risk of its spread.
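The mechanics of Thrive's signal sharing have not been published, but the general idea of hashing flagged material and distributing only the resulting code can be sketched as follows. This is a minimal illustration, assuming a simple cryptographic hash and hypothetical function names (generate_signal, notify_partners); the real system would likely rely on perceptual hashing and a secure shared channel rather than anything shown here.

```python
import hashlib

def generate_signal(media_bytes: bytes) -> str:
    """Produce a hash ("signal") from flagged media so it can be shared
    across platforms without sharing the content itself."""
    # SHA-256 is used purely for illustration; the actual hashing scheme
    # behind Thrive is not public.
    return hashlib.sha256(media_bytes).hexdigest()

def notify_partners(signal: str, partners: list[str]) -> None:
    """Hypothetical broadcast step: alert partner platforms that content
    matching this signal has been flagged elsewhere."""
    for partner in partners:
        print(f"Alerting {partner} about flagged signal {signal[:12]}...")

# Content flagged on one platform is hashed, and only the signal is
# shared so the other participants can check their own systems.
flagged = b"<graphic content bytes>"
notify_partners(generate_signal(flagged), ["Meta", "Snap", "TikTok"])
```

Sharing a hash rather than the material itself means partner platforms can match and act on the content without the harmful media being transmitted between companies.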
While Meta has already made it more difficult for users to find self-harm and suicide-related content, it continues to ensure that people can still discuss mental health issues openly.
As long as posts don’t cross the line into promotion or graphic detail, users are allowed to share their personal experiences.
The Thrive initiative strengthens these efforts, ensuring that dangerous material is quickly addressed.
Meta, Snap, and TikTok will now be able to respond swiftly, preventing such content from reaching vulnerable users.
Meta’s data highlights the scope of the challenge, revealing that the company deals with millions of pieces of content related to self-harm and suicide each quarter.
In the most recent quarter, approximately 25,000 posts were restored, primarily following user appeals, reflecting the complexity of moderating this type of content.
As social media continues to play a central role in people’s lives, particularly among younger users, the Thrive initiative represents a crucial step toward ensuring safer online spaces.
By sharing alerts and acting collaboratively, these platforms aim to limit the reach of harmful content and better protect their users.