Meta Platforms Inc. is stepping up efforts to protect teenage users across Instagram, Facebook, and Messenger, introducing new restrictions aimed at shielding them from harmful content and online risks.
Without parental approval, users under 16 can no longer host live videos or send images flagged for possible nudity through direct messages. The company is also expanding its “Teen Accounts” feature, which launched on Instagram last year, to Facebook and Messenger. These accounts apply stricter privacy defaults for users under 18.
Teen Accounts block access to sensitive content, prevent certain private messages, and limit public discoverability. Users aged 13–15 cannot adjust these settings without a guardian’s permission. Meta reports that 97% of young teens have kept these protections enabled, with 54 million teens now on Teen Accounts.
The company also clarified that content permitted under its new, more lenient hate speech rules, such as controversial language directed at transgender and non-binary individuals, will not be shown to teen users. Additionally, policies against content that promotes self-harm, eating disorders, or child exploitation remain unchanged for underage accounts.
The move comes as Meta faces ongoing backlash over teen safety, especially after past revelations showed its platforms could negatively affect young users’ mental health. These latest safeguards are part of a broader push to demonstrate stronger accountability and user protection.
#TeenSafetyOnline
#MetaPrivacyUpdate
#SocialMediaReform
#ProtectingYouth