
The Federal Trade Commission accused Meta of failing to protect minors on Instagram, but can the company prove its commitment to user safety?
Quick Takes
- The FTC presented evidence focusing on Instagram’s child protection issues in an antitrust trial against Meta.
- A 2019 internal report revealed that Instagram’s algorithms suggested minors’ accounts to potential predators.
- Per the report, 27% of follow recommendations served to flagged “groomer” accounts were minors, amounting to nearly two million minor accounts in three months.
- Emails indicate Mark Zuckerberg may have withheld security resources for Instagram.
FTC’s Case Against Meta
The Federal Trade Commission presented a forceful case against Meta, focusing on serious safety issues involving minors on Instagram. Central to its presentation was a 2019 internal report, “Inappropriate Interactions with Children on Instagram,” which revealed shortcomings in Meta’s approach to child safety, with Instagram allegedly suggesting minors’ accounts to known predators. These findings form a crucial part of the FTC’s ongoing antitrust proceedings against Meta.
Facebook’s role in underfunding Instagram’s safety measures was also raised. According to the internal report, 27% of follow suggestions served to accounts flagged as potential groomers were for minors, with nearly two million minor accounts recommended in that period. The findings raised alarm over potential grooming activity occurring via the platform and pointed to a critical oversight in how Instagram’s recommendation algorithms handled child protection.
Instagram platform used automated algorithms that suggested children for groomers and predators to follow on the app, according to a 2019 internal company document presented by the FTC during the ongoing Meta antitrust trial. https://t.co/agBuaRinUR
— Breitbart News (@BreitbartNews) May 7, 2025
Internal Struggle and Leadership Decisions
Emails and testimony presented at trial suggest Mark Zuckerberg did not allocate sufficient resources to Instagram’s security. Zuckerberg reportedly worried that investing too heavily in Instagram might allow it to overshadow Facebook. Executives acknowledged that Instagram lagged behind Facebook in addressing critical issues like child exploitation, exposing an internal struggle between business metrics and user-protection responsibilities.
Meta’s acquisition of Instagram is a point of contention. Critics argue it led to reduced investment in user safety, to the detriment of consumers. Internal documents support this view, showing that Instagram’s safety teams were understaffed and lacked the resources to address significant risks effectively.
🇺🇸 FTC: 2 MILLION TEENS SERVED TO GROOMERS BY INSTAGRAM’S ALGORITHM
A 2019 internal report revealed Instagram’s algorithm was suggesting minors to adult predators – 2 million teen accounts were shown to flagged “groomers.”
That’s 27% of follow recommendations going to creeps.… https://t.co/6qMGsjBRNZ pic.twitter.com/b105bC8vrz
— Mario Nawfal (@MarioNawfal) May 7, 2025
Meta’s Defense and Safety Claims
Meta countered through a spokesperson, asserting that Instagram has implemented safety protocols such as making teen accounts private by default and limiting their interactions with adults. Meta says that since 2018 it has focused on child safety, including restricting risky recommendations and endorsing updated child-protection laws. Whether these measures answer the challenges presented in the FTC trial remains to be seen.
Meta’s current legal battle raises important questions about platforms’ responsibility to their users, especially regarding the protection of minors. While the outcome remains undetermined, the trial is laying bare the ramifications of business strategies that prioritize growth over safety in the tech industry.
Sources:
- Meta Antitrust Trial: FTC Says Instagram Urged ‘Groomers’ to Connect With Minors – Bloomberg
- FTC Describes Instagram as a Groomer’s Paradise at Meta Antitrust Trial