
Meta, TikTok, Snap, YouTube, Roblox, and Discord will participate in an external grading process that evaluates social platforms on how well they protect adolescent mental health.
The program was created under the Mental Health Coalition’s Safe Online Standards (SOS) initiative, which comprises roughly two dozen standards covering platform policy, functionality, governance, transparency, and content oversight.
The SOS initiative is led by Dr. Dan Reidenberg, Managing Director of the National Council for Suicide Prevention. Participating companies will voluntarily submit documentation on their policies, tools, and product features, which an independent panel of global experts will then evaluate.
The SOS initiative aims to produce user-informed data on how digital platforms design their products, protect users aged 13–19, and address exposure to suicide and self-harm content.
Platforms will receive one of three ratings after evaluation:
- “use carefully” is the highest rating and comes with a blue badge for compliant platforms. Requirements include accessible, easy-to-use reporting tools; clear, easy-to-set privacy, default, and safety functions for parents; and filters that help reduce exposure to harmful or inappropriate content.
- “partial protection” indicates that some safety tools exist but may be difficult to find or use.
- “does not meet standards” applies when filters and content moderation do not reliably block harmful or unsafe content.
The Mental Health Coalition, founded in 2020, has collaborated with Meta (formerly Facebook) since its early days.
- In 2021, the organization announced plans to partner with Facebook and Instagram to destigmatize mental health and connect individuals to resources during the COVID-19 pandemic.
- In 2022, the nonprofit published a case study supported by Meta. The study found that mental health content on social media can reduce stigma and increase the likelihood of individuals seeking resources.
- In 2024, the MHC, in partnership with Meta, launched the Time Well Spent Challenge, a campaign encouraging parents to talk with teens about healthy social media use and to keep on-platform time “well spent”: reducing screen time, using social media for good, and reviewing feeds together.
- Also in 2024, the MHC partnered with Meta to establish “Thrive,” a program that lets tech companies share data on material that violates self-harm or suicide content guidelines.
The Mental Health Coalition website lists Meta as a “creative partner.”
Last year, allegations surfaced that Meta concealed internal research, known as “Project Mercury” (begun in 2020), showing negative effects of its products on users’ mental health. Meta has since introduced measures such as Instagram teen accounts, and it currently faces a class-action lawsuit in California over allegations that its addictive products harm children.
Roblox has likewise faced accusations over child well-being on its platform, and Discord has tightened its age-verification processes amid child endangerment concerns.