Instagram to Default Teens to PG-13 Content, Requiring Parental Approval for Changes
Meta, the parent company of Instagram, announced Tuesday that it will restrict teenagers on the platform to content similar to what they would see in a PG-13 movie. The move aims to provide a safer online environment for young users by limiting their exposure to certain types of content.
Under the new policy, teen accounts will automatically be set to filter out posts featuring strong language, risky stunts, or content that could encourage potentially harmful behavior, such as depictions of marijuana use.
Anyone under 18 who creates an Instagram account will be placed in the restricted setting by default and will not be able to change it without permission from a parent or guardian. The platform already uses artificial intelligence to identify accounts where users have misrepresented their age, and the new policy adds another layer of protection.
Meta is also introducing an even stricter setting that parents can implement for their children, which further limits the content they can view.
The changes come as social media companies face increasing scrutiny over their impact on young users. Meta says it will not show teens inappropriate content, such as posts about self-harm, eating disorders, or suicide, and that the new policy goes beyond its previous safeguards. Teens will no longer be able to follow accounts that regularly share age-inappropriate content, and those accounts will not be able to follow teens.
Meta also says it will block a broader range of search terms related to sensitive topics, such as alcohol or gore. The PG-13 update will also apply to artificial intelligence chats and experiences, ensuring that AI responses are appropriate for a younger audience.
While some critics believe Meta’s announcement is a public relations move, others see it as an opportunity for parents to engage with their teens about responsible social media use.


