Meta, the parent company of Instagram, announced significant updates aimed at ensuring the safety and well-being of its teenage users. Starting Tuesday, Instagram will begin applying new content settings to users under the age of 18, modeled on the standards of PG-13 rated media. The company expects to roll out these automatic protections gradually, with full implementation by the end of the year.
In a statement, Meta expressed its commitment to providing a safer online environment for teens. “We hope this update reassures parents that we’re working to show teens safe, age-appropriate content on Instagram by default, while also giving them more ways to shape their teen’s experience,” the company said.
One of the primary changes automatically places the accounts of Instagram users under 18 into a 13+ content setting; opting out will require parental consent. The measure aims to restrict teens’ exposure to potentially harmful or inappropriate content.
Under the new guidelines, teen users will no longer see search results related to sensitive subjects such as alcohol and gore, adding to existing restrictions on topics like suicide, self-harm, and eating disorders. Teens will also not be permitted to follow accounts that frequently post age-inappropriate content. For those who already follow such accounts, Meta will limit interaction, preventing them from viewing, commenting on, or sending direct messages to those accounts.
Meta is also incorporating these new rules into its artificial intelligence systems, ensuring that AI responses do not include material unsuitable for a PG-13 audience.
Moreover, the company is introducing a “Limited Content” setting designed specifically for parents seeking a more stringent filtering experience for their children. This setting will further restrict the types of content visible in teens’ feeds and limit their engagement with posts by curbing their ability to see, leave, or receive comments.
These latest updates build on Meta’s earlier “Teen Accounts” initiative, designed to safeguard younger users. The move follows similar recent announcements from other platforms, including YouTube and OpenAI, as they too recognize the importance of protecting minors in digital spaces.