Instagram has launched a new feature called “teen accounts,” designed to offer a safer online experience for younger users. These accounts, automatically assigned to users under 18, come with built-in protections that limit interactions and filter content to reduce exposure to inappropriate or harmful material. Teens under the age of 16 need parental approval to weaken any of these protections, ensuring that they remain within Instagram’s most protective default settings.
In addition to privacy controls, Instagram’s teen accounts offer features that let teens explore content in a more positive, customized way. Through the “Explore” section, teens can select topics they want to see more of, such as hobbies, sports, or entertainment. The goal is a more engaging and safer online environment that encourages constructive interactions while keeping negative influences at bay.
One of the most notable aspects of the new system is the introduction of parental supervision tools. Parents can now see who their children are interacting with and limit Instagram usage by setting daily time limits. For teens under 16, parents must approve any change that would weaken the built-in protections; older teens (16+) can also be placed under these controls if their parents opt to enable them.
This update is part of Meta’s broader effort to improve child safety online. The changes will first be rolled out in the U.S., UK, Canada, and Australia within 60 days, with a gradual expansion to the European Union by the end of 2024. Global implementation is expected to follow in early 2025. Meta also plans to extend these protective measures to its other platforms next year.
To further secure the platform, Instagram is requiring more robust age verification, such as a government ID or video selfie, when a teen attempts to set up an account with an adult birth date. Meta is also developing AI technology to identify users who claim to be adults but are likely teenagers, so they can be moved to the correct account type. This proactive approach aims to safeguard teens even when they attempt to bypass age restrictions.
These updates reflect growing regulatory pressure on social media platforms to protect minors from harmful content and online predators. Meta’s efforts are part of a broader trend in the tech industry, where companies like YouTube have also introduced features to limit exposure to harmful content for teens.