A child sex abuse survivor’s plea to Elon Musk lays bare how social media giants, including X, continue to fail victims of online exploitation.
*Global tech platforms claim to fight child abuse content, but evidence shows repeat failures to stop its circulation, amplifying trauma for survivors. Image: CH*
Tech Desk – August 26, 2025:
A harrowing plea from a survivor of child sexual abuse has reignited urgent questions about how social media platforms continue to enable the circulation of exploitative content. Zora, an American woman abused more than two decades ago, is calling on Elon Musk to remove links to images of her abuse still being traded openly on X, the platform Musk owns.
Zora’s case is not isolated—it's a brutal reminder of how digital technology has extended the reach of abuse, allowing criminals to repeatedly exploit survivors long after the initial crime. “My body is not a commodity,” she told the BBC. “Those who distribute this material are not passive bystanders, they are complicit perpetrators.”
X, formerly Twitter, claims to have “zero tolerance” for child sexual abuse material (CSAM), yet Zora's images are among thousands reportedly shared via accounts on the platform. An investigation revealed that traders use X to link customers to Telegram, where curated “VIP packages” of abuse content—some featuring children as young as seven—are openly offered for sale.
These traders operate with disturbing boldness. One seller told an Anonymous activist, “I have baby. Kids young 7–12,” before offering thousands of videos for sale. The same trader was linked to over 100 nearly identical accounts on X, each one reappearing after the previous was reported and removed.
The National Center for Missing and Exploited Children (NCMEC) received more than 20 million CSAM-related reports in 2024 alone. Yet platforms continue to act reactively, not proactively.
Lloyd Richardson, Director of Technology at the Canadian Centre for Child Protection (C3P), says account takedowns are the “bare minimum.” The deeper problem lies in how easily abusers re-enter platforms and in the lack of coordinated prevention. “We need better systems to stop repeat offenders and prevent re-registration,” Richardson said.
Telegram, meanwhile, reported banning more than 565,000 CSAM-related channels in 2025, but that hasn’t stopped the spread. The messaging app says it now employs more than 1,000 moderators and proactively scans public content, yet the scale of the trade continues to outpace its moderation capacity.
Zora’s story reveals the devastating personal impact. Her images, once only available on the dark web, now circulate through mainstream platforms. Though her abuser was imprisoned years ago, the footage remains out of her control.
In adulthood, Zora has faced online threats from stalkers who discovered her identity. “I feel bullied over a crime that robbed me of my childhood,” she said. And every time those images are shared, the abuse is renewed.
The mental and emotional toll is incalculable. For Zora and thousands of survivors like her, justice isn’t just about prison sentences—it’s about digital erasure. “If you would act without hesitation to protect your own children,” she told Elon Musk, “I beg you to do the same for the rest of us.”
When Musk took over Twitter in 2022, he promised that removing child abuse content would be a top priority. Yet the system in place appears unable to keep up with the scale and complexity of the problem.
Hacktivist groups like Anonymous have stepped in, attempting to identify and report traders, but their efforts often prove futile: the same perpetrators return under new aliases and with fresh accounts.
Social media platforms, flush with revenue and AI capabilities, have the resources to do far more. Yet survivors and experts argue that without stronger global regulation, coordinated cross-platform action, and real accountability for enabling repeat offenders, this crisis will persist.
Child sexual abuse material is a multi-billion-dollar black market, according to child protection organizations. Its reach spans continents—from the U.S. to Southeast Asia—while its victims, like Zora, suffer silently.
What emerges from this investigation is not only the scale of the crime but the indifference of the systems meant to stop it. Until tech companies treat survivor safety with the same urgency as profit margins, children will remain vulnerable—and the internet will remain complicit.
“The time to act is now,” Zora said. The question is—will the platforms listen?