In a move clearly aimed at appeasing critics and reassuring parents, Meta is imposing a PG-13-style content setting as the new default for all teenage Instagram users. This marks one of the company’s most significant steps yet to address concerns about child safety on its platform.
The “13+” setting will be automatically activated for all users under the age of 18. This protective measure can only be lifted if a parent or guardian gives their explicit permission, a mechanism designed to ensure adult supervision over the content a teen consumes.
The new rules tighten existing restrictions. Instagram will now hide, or decline to recommend, posts that contain strong language, dangerous stunts, or themes that could promote harmful activities. It will also block teens from searching for a predefined list of sensitive keywords.
The backdrop to this announcement is a recent, damaging report from independent researchers who found that two-thirds of Instagram’s safety tools were ineffective. The report, co-authored by a former Meta engineer, concluded that kids were not safe on the platform, putting immense pressure on the company to act.
The new PG-13 system will roll out first in the US, UK, Australia, and Canada, with a wider global rollout planned. However, advocacy groups remain skeptical, arguing that Meta’s history of “PR announcements” means these new safety claims must be rigorously and independently verified.