LONDON — Instagram has introduced new safety measures that give parents more control over teenagers’ online activity. Under the new rules, users under 16 will need parental permission to livestream or to unblur nude images received in direct messages. The move by Meta Platforms is intended to strengthen protections amid growing concern about social media’s impact on younger users.
The company is also extending these protections to users under 18 on Facebook and Messenger. The move builds on the teen account program Meta launched on Instagram in September, which is intended to give parents greater oversight of their children’s activity on its platforms.
The additions will initially be available to users in the United States, Britain, Canada, and Australia, with a global rollout planned in the coming months. Under the changes, teens under 16 will be barred from using Instagram Live without parental consent, and parental approval will be required to turn off the feature that automatically blurs images suspected of containing nudity in direct messages.
The Instagram safeguards being extended to Facebook and Messenger include making teen accounts private by default, blocking unsolicited messages from unknown contacts, limiting exposure to sensitive content such as violent videos, reminding teens to log off after an hour of use, and silencing notifications overnight to encourage better sleep habits.
Meta said Teen Accounts on Facebook and Messenger will offer similar automatic protections designed to limit inappropriate contact and ensure that young users’ time on the platforms is healthy. Since the program began in September, more than 54 million teen accounts have been created, reflecting the scale of the rollout.