National Center on Sexual Exploitation's 'Dirty Dozen' List Includes Apple, Roblox, Spotify
- Michael Foust Crosswalk Headlines Contributor
- Updated Apr 12, 2024
Roblox, Spotify, and Meta’s Instagram are popular among children, and they’re also popular among sexual predators, according to a new report by the National Center on Sexual Exploitation that placed all three companies on its 2024 Dirty Dozen List of mainstream contributors to sexual abuse and exploitation.
The game-playing app Roblox, according to the report, exposes children to “highly sexualized content and themes” and has been used by countless predators to sexually abuse and exploit minors.
In recent years, an 11-year-old New Jersey girl was kidnapped by a predator and a 13-year-old Utah boy fell victim to an abuser -- all because Roblox had insufficient safeguards, according to the report.
The music service Spotify includes “sexually explicit images, sadistic content, and networks trading child sex abuse material,” the report says. “We were genuinely shocked at how pervasive hardcore pornography and sexual exploitation was on the popular streaming app."
Meta’s platforms -- Instagram, Facebook, Messenger and WhatsApp -- have consistently been ranked as “the top hotspots for a host of crimes and harms,” including pedophile networks sharing child sex abuse material, exploitative algorithms promoting children to adults, sex trafficking and image-based sexual abuse, the report says.
Apple also made the list. The company abandoned plans to detect child sex abuse material on iCloud, the report says. Further, it refuses to “turn on Communication Safety for teens by default.”
“While all companies should ensure the safety of its users, these billion-dollar titans have an even greater responsibility to do so,” said Lina Nealon, Vice President and Director of Corporate Advocacy for the National Center on Sexual Exploitation. “Yet instead of dedicating the necessary resources to prevent exploitation of both children and adults, they are prioritizing profit and engaging in an AI arms race.
“... Apple refuses to detect child sexual abuse material on iCloud or to protect teens. Meta’s ‘Family of Apps’ -- especially Instagram -- have been well-established as primary places for a host of crimes and ills,” Nealon said. “But rather than sufficiently addressing these abuses, Meta made the move to enact end-to-end encryption -- effectively blinding itself to the most egregious harms on its platforms. And Microsoft’s GitHub -- though not as well-known -- is the source for the vast majority of deepfake pornography that is so rapidly proliferating.”
The other companies on the list were Cash App, Cloudflare, Discord, LinkedIn, Reddit, Microsoft’s GitHub and Telegram.
“We encourage the public to press on these companies and on policymakers to make the necessary changes with the urgency these harms require. Human dignity must not come at the cost of another dollar,” Nealon said.
The Dirty Dozen also included Section 230 of the Communications Decency Act, which the report says gives “Big Tech blanket immunity for any and all types of abuses they facilitate.”
“Until we amend CDA 230, corporations can’t be held accountable,” the report says.
Image credit: ©Pexels/Amar Preciado
Michael Foust has covered the intersection of faith and news for 20 years. His stories have appeared in Baptist Press, Christianity Today, The Christian Post, the Leaf-Chronicle, the Toronto Star and the Knoxville News-Sentinel.