New Twitter owner Elon Musk says child sexual exploitation content is the first problem the social media platform will tackle under his authority.
- Elon Musk says addressing the issue of child sexual exploitation content is “Priority #1” for his newly acquired Twitter.
- Previously, the social media platform put forth little real effort into removing content that sexually exploits children.
- The Twitter chief has also criticized previous management for not taking the issue seriously, blaming their inaction for the frequency of such posts on the site.
- Musk explained that the primary hashtags used to spread CSE content have been taken down from the platform, and he plans to do more to ensure Twitter stays clear of such material.
HUMAN TRAFFICKING ADVOCATE ELIZA BLEU ON MUSK’S TWITTER UPDATES ELIMINATING CSE:
“To those who aren’t aware yet, last week Twitter did add a direct reporting option for child sexual exploitation. (ONLY on tweets with content images/videos) this was not previously available and was a separate form that wasn’t easy to find. I’m grateful to see these changes,” Bleu tweeted.
- In August 2022, a lawsuit was filed by a woman who claimed Facebook, Snapchat, and Instagram knowingly endangered children.
- The woman, identified only as “D.H.”, said she was raped as a child by a man she met on Facebook. In the lawsuit, she demanded unspecified damages from several social media companies, saying they put profit above their duty to protect young users from sex predators and other online dangers, according to the San Francisco Chronicle (SFC).
- Citing the Children’s Online Privacy Protection Act, which prohibits the collection of children’s personally identifiable information without parental consent, lawyers for the woman said Meta, which maintains that children under the age of 13 cannot have accounts, “knowingly lacks age-verification protocols” and had at least 600,000 underage users in 2021, SFC went on to report.
- The 80-page personal injury lawsuit filed in the U.S. District Court in San Francisco claimed the platforms designed defective and dangerous products with algorithms that, among other problems, “attract, enable and facilitate child predators’ recruitment of unsuspecting child users.”