Lawsuit says Facebook, Snapchat, Instagram Knowingly Endanger Children

A woman who says she was raped as a child by a man she met on Facebook is demanding unspecified damages against several social media companies in a lawsuit that says they put profit above their duty to protect young users from sex predators and other online dangers.

The 80-page personal injury lawsuit, filed Friday in U.S. District Court in San Francisco, claims the platforms designed defective and dangerous products with algorithms that, among other problems, “attract, enable and facilitate child predators’ recruitment of unsuspecting child users.”

Citing the Children’s Online Privacy Protection Act, which prohibits the collection of children’s personally identifiable information without parental consent, lawyers for the woman say Meta — which says children under the age of 13 cannot have accounts — “knowingly lacks age-verification protocols” and had at least 600,000 underage users in 2021.

The woman, identified only as “D.H.,” said she began using Facebook at the age of 10, made a profile without parental consent and soon became addicted to social media, fixating on increasing the number of friends on her profile, the lawsuit stated.

She then created her first Instagram account at the age of 12 and started interacting with strangers online, revealing private information about herself, eventually selling explicit pictures of herself through the platforms, according to the lawsuit.

At the age of 13, she attended a party at the invitation of a 30-year-old man she met on Facebook and was sexually assaulted after being given alcohol, according to the lawsuit. Two years later, she was hospitalized for mental health treatment after revealing her social media use to her parents.

Shortly before she was hospitalized, an individual she interacted with on one of the platforms sent some of her explicit images to her immediate and extended family members, the lawsuit added.

Lawyers for the woman argue that Meta and Snap Inc. designed their products to “be addictive and take advantage of the chemical reward system of users’ brains (especially young users) to create addiction and additional mental and physical harm.”

“Defendants’ social media products, as designed, were unreasonably dangerous, posed a substantial likelihood of harm, and were therefore defective because of reasons enumerated in the complaint, including, but not limited to, risks of social media addiction, depression, body dysmorphia, anxiety, suicidal ideation, self-harm, thoughts of self-harm, insomnia, eating disorder, anorexia nervosa, bulimia nervosa, death by suicide, death by eating disorder, lack of focus, ADHD, difficulty sleeping, fatigue, headaches, migraines, loss of vision, eye strain, increased risky behavior, exposure to child predators, sexual exploitation, among other harmful effects,” the suit reads.

Meta, which runs Instagram and Facebook, and Snap Inc., which runs Snapchat, could not immediately be reached for comment on Friday.

Reporting by The San Francisco Chronicle.