WASHINGTON (BP) – Many turn to LinkedIn for updates on industry insiders, but alongside its billion professionals and respected companies appear sexual exploitation leaders such as Pornhub and OnlyFans.
CashApp is popular for electronic payments, but a 17-year-old boy died by suicide after he became a victim of sexual extortion, or “sextortion,” by criminals who threatened to ruin his life unless he paid them, with payment demanded exclusively through CashApp.
Nude photos of your daughter are all over the internet, but she pleads innocence. Turns out, her classmates snapped her photo and generated “deepfake” nude images, likely using software shared on Microsoft’s GitHub, where more than 100 million software code writers worldwide collaborate in developing programs.
GitHub’s open-source design allows “anyone to access, use, change, and share software” developed by such giants as Google, Amazon, Twitter, Meta, and Microsoft, NCOSE said, making GitHub the “most prolific space” for AI development and “a major facilitator of the growing crimes of image-based sexual abuse.”
LinkedIn, CashApp and GitHub are among those making the National Center on Sexual Exploitation’s (NCOSE) 2024 Dirty Dozen List for “facilitating, enabling, and even profiting from sexual abuse and exploitation.”
“No corporation should be hosting any type of sexual abuse and exploitation, but we certainly don’t expect places like LinkedIn to be hosting and perpetuating sexual abuse and exploitation,” Lina Nealon, NCOSE vice president and director of corporate advocacy, said April 10 in revealing the list. “So we found that LinkedIn is providing a platform for many exploitative companies, most particularly PornHub. LinkedIn is normalizing them as a job like any other, as a company like any other.”
NCOSE accused the industry leaders of various forms of exploitation, including child sexual abuse, rape, sexual extortion, prostitution, sex trafficking, image-based abuse and other evils, as documented by NCOSE’s staff of researchers and legal experts.
“These (12) entities exert enormous influence and power politically, economically, socially and culturally, with several corporations on this list enjoying more resources and global recognition than entire nations,” Nealon said. “Most of the companies we’re calling out have lofty corporate responsibility statements and have launched ethical AI task forces. We’re challenging them to actually live up to those statements and fulfill their social obligations to do something.”
NCOSE calls out:
- Apple, accusing the tech giant of facilitating abuse by refusing to scan for child sex abuse material, hosting dangerous apps with “deceptive” age ratings and descriptions, and neglecting to set default safety features for teens.
- Cloudflare, “a platform for sex buyers and traffickers” that claims a desire to “build a better internet” but provides services “to some of the most prolific prostitution forums and deepfake sites.”
- Discord, a “hotspot for dangerous interactions and deepfakes.” Exploiters and pedophiles easily contact and groom children on the site, luring them away from home, enticing them into sending sexually explicit images, and sharing those images and deepfakes with each other.
- Meta, with its launch of end-to-end encryption, open-sourced AI, and virtual reality “unleashing new worlds of exploitation.” Meta platforms Facebook, Messenger, Instagram, and WhatsApp “have consistently been ranked for years as the top hotspots for a host of crimes and harms,” NCOSE said, noting pedophile networks where members share child sex abuse material, contact children and promote children to abusers. The sites enable sex trafficking, sextortion, and image-based sexual abuse, NCOSE said.
- Reddit is a hotspot for sexploitation, NCOSE said, citing child sex abuse material, sex trafficking, and image-based sexual abuse and pornography. The content will be further monetized if Reddit succeeds in going public, NCOSE said.
- Roblox, where users with such names as “RaipedLittleGirl” regularly target children among Roblox’s 54 million daily users, bombarding them with sexually explicit content generated through artificial intelligence, grooming them for sexual abuse and luring them from their homes. NCOSE calls out the $2.8 billion platform, popular with preteens, for not embracing “common sense child protection measures.”
- Spotify, a music streaming app that NCOSE said also hosts sexually explicit images, sadistic content and networks trading child sex abuse material, accusing the platform of pervasive hardcore pornography and sexual exploitation.
- Telegram, promoted as a dark web alternative, has instead unleashed a new era of exploitation, NCOSE said, describing the app as a safe haven for criminal communities globally including sexual torture rings, sextortion gangs, deepfake bots and others.
NCOSE encourages public advocacy for change at EndSexualExploitation.org, and notes past successes, including Apple’s cooperation in removing nudity apps when NCOSE flagged them.
The full list and other resources are available at EndSexualExploitation.org.
(EDITOR’S NOTE – Diana Chandler is Baptist Press’ senior writer.)