Families of teenage boys who died by suicide have filed lawsuits against Meta Platforms, the parent company of Instagram and Facebook, accusing it of failing to stop sextortion schemes operating on its platforms. The lawsuits claim Meta ignored repeated warnings and reports, allowing online predators to target minors with devastating consequences. The cases intensify scrutiny of Big Tech’s role in protecting children online.
According to court filings, the teenage boys were contacted by scammers posing as young women. After persuading the teens to send explicit images, the scammers allegedly demanded money and threatened to share the photos with family members, friends, or classmates.
The families argue Meta knew its platforms were being used for sextortion but failed to act quickly or effectively. The complaints state that reports flagging the extortion accounts were ignored or processed too slowly to prevent harm. The lawsuits allege Meta prioritized user growth and engagement over basic safety measures for minors.
One case involves a 14-year-old boy who was allegedly targeted on Instagram and pressured to send money within hours. The lawsuit says the threats escalated rapidly, leaving the teen terrified of public humiliation. Similar patterns appear across multiple cases cited in the legal filings.
Fox Business reports that many sextortion operations originate overseas and rely on speed and psychological pressure. Law enforcement agencies have warned that teenage boys are increasingly targeted and that the emotional impact can be severe.
Meta has said it is working to combat sextortion through artificial intelligence tools, account removals, and cooperation with law enforcement. Critics argue those steps have not kept pace with the scale of abuse.
The lawsuits add momentum to calls from parents, lawmakers, and faith leaders for stronger accountability. Advocates argue that protecting children online requires enforceable standards, not voluntary promises from tech companies.