Voltaire Staff

Meta to filter teens' access to 'self-harm,' 'violent' content


[Image: a person holding a phone with the Instagram login page open]


The major social media company Meta is changing how it filters content based on age to safeguard its teenage users, with the changes likely to come into effect over the next few weeks.


Meta plans to automatically limit harmful content, such as videos and posts related to self-harm, violence, and eating disorders, on teens' Instagram and Facebook accounts.


According to The Wall Street Journal, this is the most significant step the tech giant has taken to ensure younger users have a more suitable experience on its social media sites. The move comes against the backdrop of lawsuits from more than 40 states accusing the company of misleading the public about the dangers its platforms pose to young people.


Some of these dangers were exposed in a 2021 Wall Street Journal series, The Facebook Files, which revealed that the company's own research showed Instagram was harmful to many teen girls.


Previous lawsuits against Meta


In their October lawsuit against Meta, state attorneys general cited internal documents indicating that the company designed its products to exploit young users' susceptibility to peer pressure and potentially risky behaviour. In November, Meta denied designing its products to be addictive for teens.




New restrictions will automatically apply to teen accounts, placing them in the most restrictive content settings. Teens under 16 won't see sexually explicit content.


Previously, teens could choose less strict settings, but they can't opt out of the new settings. Teens won't be able to see or search for harmful content, even if shared by friends. For instance, posts about dieting by a teen's friend will no longer be visible. However, content related to a friend's recovery from an eating disorder might still be seen.


Meta consulted experts in adolescent development to determine what content is inappropriate for teens. Its algorithms already avoid recommending harmful content to teens in Reels and on the Explore page; with the changes, such content won't appear in teens' Feeds and Stories either. The changes will be applied automatically to existing teen accounts this week, and newly created accounts will also be restricted to age-appropriate content.


Meta is also introducing a tool that steers teens on Instagram toward more private sharing settings. A notification will prompt teen users to "turn on recommended settings" when they interact with accounts they don't know.


Once these settings are activated, the account limits who can repost the teen's content, tag or mention them, or include their content in Reels Remixes. With these settings, only a teen's followers can message them.


