Meta Platforms to Introduce Stricter Content Controls for Teen Users on Facebook and Instagram
Meta Platforms, the parent company of Facebook and Instagram, announced on Tuesday that it will implement new measures to protect teen users by hiding “age-inappropriate” content on its platforms. The move comes as Meta faces increasing scrutiny and legal challenges over the impact of its apps on young users’ mental health.
In a blog post, Meta outlined the upcoming changes, stating that it will soon begin removing “age-inappropriate” content on sensitive topics such as self-harm, suicide, and eating disorders from the feeds of teen users on both Facebook and Instagram. The changes are expected to roll out fully in the coming months.
To further strengthen protections for users under 18, Meta said accounts belonging to this demographic will automatically be placed in the most restrictive content control settings. The company is also making it harder for teens to search for content on sensitive topics.
The announcement of these new restrictions coincides with Meta’s ongoing legal battles, including a lawsuit filed in October by a coalition of 33 states. The suit alleges that Meta’s platforms, including Facebook and Instagram, have had a profound and harmful impact on the psychological and social well-being of a generation of young Americans.
The legal action contends that Meta misled users by downplaying the prevalence of harmful content while being fully aware that its platform features caused significant physical and mental harm to young users. Specifically, the lawsuit claims that Meta’s recommendation algorithm encourages compulsive use, which the company fails to disclose. It also asserts that features like “Likes” contribute to mental health issues, while visual filters promote body dysmorphia and eating disorders.
In response to the lawsuit, Meta expressed disappointment, stating, “We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” as reported by Reuters.
The tech giant is not the only platform facing legal challenges over the impact of its services on young users. Google’s YouTube and TikTok, owned by Chinese company ByteDance, have also been targeted in numerous lawsuits alleging that their platforms are addictive and contribute to mental health concerns among children and teenagers.
As Meta rolls out its new content controls, it remains to be seen whether these measures will satisfy the concerns raised by regulators and resolve the ongoing legal battles over social media’s impact on the well-being of young users.