Musk’s ‘Priority #1’ Disaster: CSAM Problem Worsens While ExTwitter Stiffs Detection Provider
from the not-such-a-priority-apparently dept
One of Elon Musk’s first “promises” upon taking over Twitter was that fighting child exploitation was “priority #1.”

He falsely implied that the former management didn’t take the issue seriously (they did) and insisted that he would make sure it was a solved problem on the platform he now owned. Of course, while he was saying this, he was also firing most of the team that worked on preventing the sharing of child sexual abuse material (CSAM) on the site. Nearly every expert in the field noted that Elon was almost certainly making the problem worse, not better. Early research supported this, showing that the company was now leaving up a ton of known CSAM (the easiest kind to find and block through photo-matching tools).
A few months later, Elon’s supposed commitment to stomping out CSAM was proven laughable when he apparently personally stepped in to reinstate the account of a mindless conspiracy theorist who had posted a horrific CSAM image.
A new NBC News investigation now reveals just how spectacularly Musk has failed at his self-proclaimed “priority #1.” Not only has the CSAM problem on ExTwitter exploded beyond previous levels, but the company has now been cut off by Thorn—one of the most important providers of CSAM detection technology—after ExTwitter simply stopped paying its bills.
At the same time, Thorn, a California-based nonprofit organization that works with tech companies to provide technology that can detect and address child sexual abuse content, told NBC News that it had terminated its contract with X.
Thorn said that X stopped paying recent invoices for its work, though it declined to provide details about its deal with the company citing legal sensitivities. X said Wednesday that it was moving toward using its own technology to address the spread of child abuse material.
Let’s pause on this corporate-speak for a moment. ExTwitter claims it’s “moving toward using its own technology” to fight CSAM. That’s a fancy way of saying they fired the experts and plan to wing it with some other—likely Grok-powered—nonsense they can cobble together.
Now, to be fair, some platforms do develop effective in-house CSAM detection tools, and while Thorn’s tools are widely used, some platforms have complained that they are limited. But these types of systems generally work best when operated by specialized third parties who can aggregate data across multiple platforms—exactly what organizations like Thorn (and Microsoft’s PhotoDNA) provide. The idea that a company currently failing to pay its bills to anti-CSAM specialists is simultaneously building superior replacement technology is, shall we say, optimistic.
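To be concrete about what those third parties actually provide, here is a minimal, hypothetical sketch of hash-list matching in Python. It uses a plain SHA-256 digest as a stand-in for the robust perceptual hashes (such as PhotoDNA) that real systems rely on, and `known_bad_hashes` stands in for the shared, vetted hash lists that organizations like Thorn and NCMEC maintain; nothing here is X’s or Thorn’s actual code.

```python
import hashlib
from pathlib import Path


def file_digest(path: Path) -> str:
    """Hash an uploaded file. Real systems use perceptual hashes
    (e.g., PhotoDNA) so that re-encoded or resized copies still match;
    a plain cryptographic digest only catches byte-identical copies."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def should_block(path: Path, known_bad_hashes: set[str]) -> bool:
    """Return True if an upload matches the shared hash list.
    `known_bad_hashes` stands in for the vetted, industry-shared lists
    (NCMEC, Thorn, etc.); curating that list across platforms is the
    part a single company cannot easily replicate in-house."""
    return file_digest(path) in known_bad_hashes
```

The matching step is trivial. The value lives in the shared, continuously updated hash list, which is exactly the piece a platform loses when it stops paying the people who maintain it.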
The reality on the ground tells a very different story than Musk’s PR spin:
The Canadian Centre for Child Protection (C3P), an independent online CSAM watchdog group, reviewed several X accounts and hashtags flagged by NBC News that were promoting the sale of CSAM, and followed links promoted by several of the accounts. The organization said that, within minutes, it was able to identify accounts that posted images of previously identified CSAM victims who were as young as 7. It also found apparent images of CSAM in thumbnail previews populated on X and in links to Telegram channels where CSAM videos were posted. One such channel showed a video of a boy estimated to be as young as 4 being sexually assaulted. NBC News did not view or have in its possession any of the abuse material.
Lloyd Richardson, director of information technology at C3P, said the behavior being exhibited by the X users was “a bit old hat” at this point, and that X’s response “has been woefully insufficient.” “It seems to be a little bit of a game of Whac-A-Mole that goes on,” he said. “There doesn’t seem to be a particular push to really get to the root cause of the issue.”
NBC’s investigation found that Musk’s “priority #1” has become a free-for-all:
A review of many hashtags with terms known to be associated with CSAM shows that the problem is, if anything, worse than when Musk initially took over. What was previously a trickle of posts of fewer than a dozen per hour is now a torrent propelled by accounts that appear to be automated — some posting several times a minute.
Despite the continued flood of posts and sporadic bans of individual accounts, the hashtags observed by NBC News over several weeks remained open and viewable as of Wednesday. And some of the hashtags that were identified in 2023 by NBC News as hosting the child exploitation advertisements are still being used for the same purpose today.
That seems bad! Read it again: hashtags that were flagged as CSAM distribution channels in 2023 are still active and being used for the same purpose today. This isn’t the kind of mistake that happens when you’re overwhelmed by scale—this is what happens when you simply don’t give a shit.
Look, I’m usually willing to defend platforms against unfair criticism about content moderation. The scale makes perfection impossible, and edge cases are genuinely hard. But this isn’t about edge cases or the occasional mistake—this is about leaving up known, previously identified CSAM distribution channels. That’s not a content moderation failure; that’s a policy failure.
As the article also notes, ExTwitter tried to get praised for all the work it was doing with Thorn, in an effort to show how strongly it was fighting CSAM. This post from just last year looks absolutely ridiculous now that they stopped paying Thorn and the org had to cut them off.

But the real kicker comes from Thorn itself, which essentially confirms that ExTwitter was more interested in the PR value of their partnership than actually using the technology:
Pailes Halai, Thorn’s senior manager of accounts and partnerships, who oversaw the X contract, said that some of Thorn’s software was designed to address issues like those posed by the hashtag CSAM posts, but that it wasn’t clear if they ever fully implemented it.
“They took part in the beta with us last year,” he said. “So they helped us test and refine, etc, and essentially be an early adopter of the product. They then subsequently did move on to being a full customer of the product, but it’s not very clear to us at this point how and if they used it.”
So there you have it: ExTwitter signed up for anti-CSAM tools, used the partnership for good PR, then perhaps never bothered to fully implement the system, and finally stopped paying the bills entirely.
This is what “priority #1” looks like in Elon Musk’s world: lots of performative tweets, followed by firing the experts, cutting off the specialized tools, and letting the problem explode while pretending you’re building something better. I’m sure that, like “full self-driving” and Starships that don’t explode, the tech will be fully deployed any day now.
Filed Under: child safety, csam, elon musk, prevention
Companies: thorn, twitter, x




Comments on “Musk’s ‘Priority #1’ Disaster: CSAM Problem Worsens While ExTwitter Stiffs Detection Provider”
Well duh. The consumers aren’t going to actually go after the content they love.
Elon is friendly with the leader of a political party that often opposes laws banning child marriage and stands by conservative clergy accused of crimes against children, and he’s a billionaire besides, so him not taking the issue of CSAM seriously shouldn’t surprise anyone but people who uncritically kissed his ass.
Re:
He also sure likes accusing other people of being pedophiles.
Re: Re:
*baselessly accusing
As one like yourself is naturally so tired of belaboring the point, I must pedantically do so…
Re: Re:
Given the account he reinstated and what they’d posted to get banned I strongly suspect that’s another case of ‘Every accusation a confession, every self-given label a rejection of.’
Re:
He’s also in the Epstein files. People have even tracked his jet to the island. And with this, he’s all but admitted to it.
The lie they want us to believe is that they have turned off one anti-CSAM system well before the replacement is ready to be turned on.
That is an impressively tone deaf lie.
Telling you more about the person than the platform
Remember this story the next time you see someone claiming that twitter is ‘better than ever’ since Elon took it over, as they might have just told you far more about themselves than they realized.
Republicans only care about children as a means to control women; the instant they’re born, all their professed love dries up and is replaced by a vague anger that poor white kids can’t be used as a servant class to replace the brown people they’re shipping off to concentration camps.
Re:
Republicans only care about children as a means to control women
Unfortunately that’s not the only time/way they care about children; as Stephen noted above, they also ‘care about’ children for other reasons…
Re: Re:
But if the children are being harvested for adrenochrome – FFS, I can’t even.
And just a reminder: if a company does not want its ads appearing next to this content, that should be illegal according to the current party in power…
The problem is that CSAM is not rocket science
Musk believes that firing the workers and getting the best experts together will result in fixing the problem.
But fighting CSAM is not a major accomplishment of human genius over complex subject matter. It is humans against a massive number of humans. Geniuses can overcome humongous factual obstacles. They can’t overcome waves and waves of determined opponents constantly moving the goalposts. For that you need approaches that scale, and that means humans doing the grunt work, and keeping at it.
And even using AI and whatnot, you still need non-homeopathic doses of humans keeping the automated systems primed on the moving problems.
That is not akin to the problem of rocket science in the 21st century, precision-navigating vast swaths of infinite space, but rather to the problem of rocket science in the 24th century, when the main challenge will be weaving through all the space junk without major collisions.
Re:
Musk believes that firing the workers and getting the best experts together will result in fixing the problem.
Except no, because they were some of the first people he fired and, as this article notes, the ones he stopped paying.
A review of many hashtags with terms known to be associated with CSAM shows that the problem is, if anything, worse than when Musk initially took over. What was previously a trickle of posts of fewer than a dozen per hour is now a torrent propelled by accounts that appear to be automated — some posting several times a minute.
Despite the continued flood of posts and sporadic bans of individual accounts, the hashtags observed by NBC News over several weeks remained open and viewable as of Wednesday. And some of the hashtags that were identified in 2023 by NBC News as hosting the child exploitation advertisements are still being used for the same purpose today.
The issue here as it relates to twitter isn’t that the problem is hard to deal with; the issue is that Elon doesn’t seem to consider CSAM a problem worth addressing with anything beyond empty platitudes to begin with (when he’s not reinstating the accounts of those that post it).
Re: Here let me fix it for you…
Should be
This goes -way- deeper than just CSAM…
It includes all sorts of human maliciousness, insanities, and base desires.
Porn, Anti-Semitism, Islamophobia, Germ Denialism, Flat Earth Cults, Racism, Grift, Engagement Farming, and pure gullibility.
Human Nature doesn’t boil down to a neat stack of equations like a rocket trajectory.
For all practical purposes it’s fractal chaos…
It’s not susceptible to the same lines of attack as a technical problem.
And let me end with another note…
… some insight from the folks at Armadillo Aerospace…
They found out that, all things considered, the distinct quality of “rocket science” was NOT that it was a difficult skill to master so much as that it was an uncommonly used skill set.
They found that out when they tried using a common technology to land their rocket.
It turns out that “helicopter science” was much harder…
Yeah, it’ll happen right after all the rest of the users leave, and market incentives force them to course correct.
ESTRAGON: Well, shall we go?
VLADIMIR: Yes, let’s go.
They do not move.
Also worth noting...
…is that misogynist content on Twitter is going through the roof. Not only are the threats against women on the site constant, but there are human and bot accounts posting nonstop advocacy for rape and other forms of violence against women 24 hours a day, fully supported, endorsed, and encouraged by Musk – who hates women just as much as Zuckerberg.
Re:
That’s the dark side of this “free speech absolutism” everybody was fearing.
As for the bright side, there are only promises that last a few months and then get magically forgotten.
Re:
I wouldn’t know, I deleted my account on Twitter the moment I heard Musk had bought it.
Re:
More, perhaps. I don’t know that Zuckerberg hates any of his daughters.
So if someone registered csam.com (though I’ve not checked if it’s available) and actually just made it a redirect to x.com, that would probably be criminal? (and NOT, for example, a mockery of x/twitter)
Re:
To be fair, one could claim that “CSAM” is an initialism for anything other than “child sex abuse material”. But anyone who wants to try that little gag is probably asking for a federal investigation as soon as the credit card transaction goes through, so I wouldn’t recommend it.
Re: Re:
First hit for a DuckDuckGo search indicates “CSAM” is:
Construction Safety Association of Manitoba
www[.]constructionsafety[.]ca
Is searching this going to get the RCMP sent over here?
easy pickings
Assuming the twitter rules are clear on the subject, the material should be gathered under a hashtag (#CSAM). Automated tools should find it easily.
If there is any sort of blocking, then those who do not want that sort of material can avoid viewing it. Think of the Ten Commandments posters in Louisiana, except automated. I do not use twitter and so cannot speak to the effectiveness of any available blocking; neither do I attend grade school in Louisiana to verify the effectiveness of the governor’s advice not to look at the offending posters.
I presume that twitter can restrict users who post such material without including the hashtag, or who include the hashtag without the requisite material.
Thorn is actually a really bad org with an anti-sex-work leaning. They have a notorious history. Ideally, Twitter would develop its own system for handling CSAM rather than outsourcing it to them.
Just blocking hashtags is not a silver bullet, as bad actors can misuse tags for other purposes. This has been a huge issue with the new Twitter, where they’d bluntly block tags (or tweak policies or processes in a manner that hit a lot more legitimate content) rather than identifying duplicate abusive posts (there were dozens or hundreds of posts with the same content); a rough sketch of that duplicate-matching approach is at the end of this comment.
At the same time, Twitter seems to want to have their cake and to eat it too. To use Thorn and child safety as a PR prop. And to project a techno-solutionist bravado.
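For what it’s worth, the duplicate-matching idea mentioned above is not exotic. Here is a rough, hypothetical sketch, not anything X actually runs, of fingerprinting posts and flagging the ones that repeat dozens or hundreds of times, rather than blocking an entire hashtag:

```python
import hashlib
import re
from collections import Counter


def fingerprint(post_text: str) -> str:
    """Collapse trivially identical spam into one fingerprint by
    normalizing whitespace and case before hashing. A real pipeline
    would add near-duplicate matching (shingling, MinHash) to catch
    lightly edited copies."""
    normalized = re.sub(r"\s+", " ", post_text.strip().lower())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


def flag_repeat_spam(posts: list[str], threshold: int = 20) -> set[str]:
    """Return fingerprints seen at least `threshold` times -- the
    'dozens or hundreds of posts with the same content' pattern --
    so moderators can act on the duplicated content itself instead
    of blocking a whole hashtag."""
    counts = Counter(fingerprint(p) for p in posts)
    return {fp for fp, n in counts.items() if n >= threshold}
```

The point is that acting on repeated content targets the actual abuse wave, while blunt tag blocking mostly catches bystanders.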