It’s about to get even harder for teens to view harmful content online.
Meta announced in a blog post today that it will automatically place all teen-owned Instagram and Facebook accounts under the most restrictive content control settings available, and will prompt all teen accounts to update their privacy settings. The company already applied its most restrictive controls to teens joining Instagram or Facebook, but is now extending them to teens already using the apps.
The company is also blocking these accounts from viewing posts about suicide, self-harm, graphic violence and eating disorders. If a post touches on one of these topics, a teen won’t be able to see it, even if it was shared by someone they follow. The new rules are rolling out now to users under 18 and will be fully in place “in the coming months.”
“While we allow people to share content discussing their own struggles with suicide, self-harm and eating disorders, our policy is not to recommend this content and we have focused on ways to make it harder to find,” Meta said in the blog post. “Now, when people search for terms related to suicide, self-harm and eating disorders, we’ll start hiding these related results and will direct them to expert resources for help. We already hide results for suicide and self-harm search terms that inherently break our rules and we’re extending this protection to include more terms.”
This comes at a time when Meta remains under fire for its failure to keep young people safe on its platforms.
Meta is expected to testify before the Senate on child safety later this month, alongside TikTok, Snap, Discord and X. Meanwhile, more than 40 states are suing Meta for contributing to youth mental health problems, alleging that the company “profoundly altered the psychological and social realities of a generation of young Americans.” Time will tell whether this update quells those growing concerns.