If your child uses Instagram, Facebook or Messenger, you’ve likely worried about what they see, who they connect with and how much of their private lives end up online. Meta—the company behind these platforms—has recently announced sweeping changes aimed at making its apps safer for young users.
These moves include deleting more than 635,000 accounts linked to sexualized activity toward children, rolling out new teen safety features and expanding protections across Facebook, Instagram and Messenger.
Here’s what parents need to know about the changes, how they work and where family involvement is still crucial.
635,000 accounts removed: A massive purge
In July 2025, Meta announced it had taken down 135,000 accounts that posted sexualized comments on, or requested inappropriate images from, adult-run accounts featuring children under 13. Another 500,000 accounts linked to those offenders were also deleted.
Why does this matter for parents? These accounts often target younger users through comments, direct messages or algorithmic recommendations. By removing them in bulk, Meta reduced opportunities for predators to contact kids directly.
Still, no system is perfect. Parents should continue encouraging their teens to report suspicious accounts and block strangers who reach out unexpectedly.
Stronger safety features for teens
Meta has also introduced several in-app protections designed specifically for young users. These updates can raise teens’ awareness and reduce risky interactions.
1. Context in direct messages
When a teen receives a message, they now see when the account was created and where it’s based. If the sender is new or appears to be in another country, the teen gets a safety notice. This can help kids spot scams, sextortion attempts or fake profiles more easily.
2. Nudity protection
By default, images suspected to contain nudity are now blurred in teens’ inboxes. Teens receive a warning before deciding whether to view or forward the image. According to Meta, 40% of blurred images were not shared after teens saw the warning, evidence that even a brief pause can prevent a risky decision.
3. Easier block-and-report tools
A one-tap option to block and report an account now appears directly in safety notices. This simplifies the process for teens who may otherwise hesitate or be unsure about what to do.
4. Protection for younger children
Adult-run accounts that feature children, such as family influencer pages, now receive stricter protections. Offensive comments are hidden automatically, and suspicious accounts are blocked from sending messages.
Teen accounts across Instagram, Facebook and Messenger
Meta has extended its Teen Account system to all major platforms.
- Who qualifies? Users under 18 are automatically placed into Teen Accounts.
- What does it do? It sets stricter privacy defaults, limits livestreaming, blurs suspected nudity in DMs and requires teens under 16 to get a parent’s permission before loosening any of these settings.
- How many are affected? Over 54 million teens are now enrolled in Teen Accounts on Instagram. Surveys show 97% of 13–15-year-olds kept the protections, and 94% of parents approved of the safeguards.
This means your child’s experience on Facebook or Messenger should now mirror the safer settings already seen on Instagram.
Why Meta is making these moves
The changes come amid intense legal and public pressure. Dozens of U.S. states have sued Meta, alleging that its apps contribute to teen addiction, anxiety and exposure to harmful content. Lawmakers are also debating the Kids Online Safety Act (KOSA), which could require all platforms to provide stronger default settings for minors.
Meta’s new measures may be both a response to this scrutiny and a way to rebuild trust with parents who have grown skeptical of the company’s commitment to safety.
What parents can do now
Even with these new protections, no filter or AI system can fully replace parental involvement. Here are practical steps you can take:
- Talk about online interactions. Ask your teen about who they chat with, how they handle unwanted messages, and whether they’ve seen the new safety pop-ups.
- Review privacy settings together. Check that their Teen Account protections are active. Show them how to block and report accounts.
- Set family boundaries. Discuss screen time, app limits and rules regarding the sharing of photos or personal details.
- Stay informed. Social media evolves quickly. Regularly check for updates to safety tools, both from Meta and other platforms your child uses.
- Keep communication open. Teens are more likely to report uncomfortable encounters to their parents if they know they won’t face consequences for being honest.
The bottom line
Meta’s latest steps—wiping out hundreds of thousands of predatory accounts, strengthening safety notices, and expanding teen protections—represent a significant shift in how the company approaches youth safety.
For parents, this is encouraging news, but it’s not a green light to step back. Social media safety still works best when technology and parenting work hand-in-hand. By combining Meta’s tools with open family conversations, parents can give their teens both freedom and protection in the digital world.

