Why Meta Has Blocked Livestreaming by Teenagers on Instagram
News Update April 08, 2025 11:24 PM

Meta has expanded its safety measures for teens on Instagram by blocking access to the platform’s Live feature for users under 16 unless they have parental permission, foreign media reported. As part of the change, which is aimed at protecting young users from potential online risks, teens will also need parental approval to turn off a feature that blurs images containing suspected nudity in their direct messages, the report said.

Instagram’s Teen Accounts and Extended Protections
Along with these changes, Meta is also extending its teen account system on Instagram to Facebook and Messenger. Teen accounts were introduced last year and automatically apply certain safety features, including:

  • Allowing parents to set daily usage limits
  • Blocking access to Instagram at specific times
  • Letting parents see which accounts their child is messaging

These measures will be rolled out to Facebook and Messenger in the US, UK, Australia, and Canada, the report said, adding that users under 16 will need parental permission to change the default settings, while those aged 16-17 can adjust them on their own.

According to the report, Instagram’s teen account system is now used by 54 million under-18s globally, and more than 90% of 13- to 15-year-olds have kept their default safety settings. The NSPCC, a UK child protection charity, welcomed the new safety measures but urged Meta to do more. Matthew Sowemimo, associate head of policy for child safety online at the NSPCC, told The Guardian that “for these changes to be truly effective, they must be combined with proactive measures, so (that) dangerous content doesn’t proliferate on Instagram, Facebook and Messenger in the first place.”

Online Safety Act and Child Protection Laws

The development came as the UK began enforcing its new Online Safety Act, a law that requires platforms such as Instagram and Facebook to prevent harmful content, including child sexual abuse material and self-harm material, from appearing. If such content does appear, it must be removed.

The Act also requires platforms to take steps to protect minors from harmful material, such as content promoting suicide. However, there are concerns that parts of the law could be weakened in trade talks between the UK and the US, a move that child safety groups have strongly opposed, the report stated.
