SAN FRANCISCO — Meta Platforms said that it would hide more content from teens on Instagram and Facebook, after regulators around the globe pressed the social media giant to protect children from harmful content on its apps.
All teens will now be placed into the most restrictive content control settings on the apps, and additional search terms will be limited on Instagram, Meta said in a blog post.
The move will make it more difficult for teens to come across sensitive content related to suicide, self-harm and eating disorders when they use features like Search and Explore on Instagram, according to Meta.
The company said, according to Reuters, that the measures, expected to roll out over the coming weeks, would help deliver a more “age-appropriate” experience.
Meta is under pressure both in the United States and Europe over allegations that its apps are addictive and have helped fuel a youth mental health crisis.
Attorneys general of 33 US states including California and New York sued the company in October, saying it repeatedly misled the public about the dangers of its platforms.
In Europe, the European Commission has sought information on how Meta protects children from illegal and harmful content.
The regulatory pressure followed testimony in the US Senate by a former Meta employee who alleged the company was aware of harassment and other harms facing teens on its platforms but failed to act on them.
The employee called for the company to make design changes on Facebook and Instagram to nudge users toward more positive behaviors and provide better tools for young people to manage unpleasant experiences.
Children have long been an appealing demographic for businesses, which hope to attract them as consumers at ages when they may be more impressionable and solidify brand loyalty.
For Meta, which has been in fierce competition with TikTok for young users in the past few years, teens may help secure more advertisers, who hope children will keep buying their products as they grow up.