Seeking to rebuild trust with parents and regulators, Meta is giving Instagram a significant safety makeover by introducing a PG-13-style rating system for teenage users. The new framework will automatically apply stricter content filters to the accounts of all users under 18.
The system’s design makes safety the default. The “13+” setting will be the standard for all teens, and loosening these restrictions will not be a unilateral decision for the young user. Instead, it will require a parent to actively approve the change, ensuring they are part of the conversation.
This PG-13 version of Instagram will filter a wider array of content. The company has specified that it will hide or down-rank posts with strong language, risky physical challenges, and content that appears to promote harmful behaviors. It will also block searches for sensitive words, closing a common loophole for accessing restricted material.
This move is a direct response to a storm of criticism, including a recent independent report that concluded Instagram’s existing safety features were ineffective. The pressure from this report, along with demands from regulators for a “safety-first” approach, has clearly pushed Meta to take more decisive action.
The feature will launch initially in four countries, including the US and UK, with full global deployment planned for early next year. Yet skepticism persists among child safety groups, who insist that true progress can only be measured through transparent, independent testing of the new system's effectiveness.
In a Bid for Trust, Meta Gives Instagram a PG-13 Makeover for Teens