
Senator Durbin’s ‘STOP CSAM Act’ Has Some Good Ideas… Mixed In With Some Very Bad Ideas That Will Do More Harm Than Good

from the taking-away-230-doesn't-protect-kids dept

It’s “protect the children” season in Congress, with the return of KOSA and EARN IT, two terrible bills that attack the internet and rely on people’s ignorance of how things actually work to pretend they’re making the internet safer, when they’re not. Added to the pile is Senator Dick Durbin’s STOP CSAM Act, which he’s been touting since February, but has only now officially put out a press release announcing the bill (though he hasn’t released the actual text of the bill, because that would actually be helpful to people analyzing it).

CSAM is “child sexual abuse material,” though because every bill needs a dumb acronym, in this case it’s the Strengthening Transparency and Obligation to Protect Children Suffering from Abuse and Mistreatment Act of 2023.

There is a section-by-section breakdown of the bill, though, along with a one-page summary. And, given how bad so many of the other internet “protect the children” bills are, this one is… not nearly as bad. It actually has a few good ideas, but also a few really questionable bits. Also, the framing of the whole thing is a bit weird:

From March 2009 to February 2022, the number of victims identified in child sexual abuse material (CSAM) rose almost ten-fold, from 2,172 victims to over 21,413 victims. From 2012 to 2022, the volume of reports to the National Center for Missing & Exploited Children’s CyberTipline concerning child sexual exploitation increased by a factor of 77 (415,650 reports to over 32 million reports).

Clearly, any child sexual abuse material is too much, but it’s not at all clear to me that the numbers presented here show an actual increase in victims of child sexual abuse, rather than merely a much bigger infrastructure and setup for reporting CSAM. I mean, March of 2009 to February of 2022 is basically the period in which social media went mainstream, and with it came much better tracking and reporting of such material.

I mean, back in March of 2009, the tools to track, find, and report CSAM were in their infancy. Facebook didn’t start using PhotoDNA (which was only developed in 2009) until the middle of 2011. It’s also unclear when Google started using it, but this announcement suggests it was around 2013, noting that the company had “recently” started incorporating “encrypted ‘fingerprints’ of child sexual abuse images into a cross-industry database” (which describes PhotoDNA).
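For those curious about the mechanics, here’s a minimal sketch (in Python) of the match-against-known-fingerprints flow that systems like PhotoDNA automate. To be clear, this is an illustration, not PhotoDNA itself: PhotoDNA’s hash is proprietary and perceptual (it still matches after an image is resized or recompressed), and the SHA-256 stand-in and the example database below are entirely hypothetical.

```python
import hashlib

# Hypothetical stand-in for the cross-industry database of known
# fingerprints. Real systems populate this from shared hash lists
# (via NCMEC and industry partners), not hard-coded placeholder values.
KNOWN_FINGERPRINTS = {
    "9f2b5c...",  # placeholder entry, not a real hash
}

def fingerprint(image_bytes: bytes) -> str:
    # SHA-256 here is an exact-match stand-in. PhotoDNA instead uses a
    # perceptual hash, so near-duplicates (resized or recompressed
    # copies) still match; a cryptographic hash breaks if even one
    # byte of the file changes.
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag(image_bytes: bytes) -> bool:
    # If an upload's fingerprint is in the shared database, the
    # platform flags it for review and reporting to the CyberTipline.
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```

The design choice matters for the statistics discussed above: once a fingerprint enters the shared database, every platform that checks against it can find and report copies automatically, so reported numbers can climb simply because detection got better.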

This is what’s frustrating in all of this. For years, there were complaints that these companies didn’t report enough CSAM, so they built better tools that found more… and now the media and politicians are assuming that the increase in reporting means an increase in actual victimization. Yet, it’s unclear if that’s actually the case. It’s just as (if not more) likely that, since the companies are getting better at finding and reporting, this is simply a more accurate picture of what’s out there, and not any indication of whether or not the problem has grown.

Notice what’s not talked about? There’s no mention of how much law enforcement has done to actually track down, arrest, and prosecute the perpetrators. That’s the stat that matters. But it’s missing.

Anyway, again, stopping CSAM remains important, and there are some good things in Durbin’s outline (though, again, specific language matters). It would make reporting mandatory for youth athletic programs, which is a response to a few recent scandals (though it might also lead to an increase in false reports). It increases protections for child victims and witnesses. Another good thing it does is make it easier for states to set up Internet Crimes Against Children (ICAC) task forces, which specialize in fighting child abuse and can be helpful for local law enforcement, who are often less experienced in dealing with such crimes.

The law also expands the reporting requirements for online providers, who are already required to report any CSAM they come across; this broadens that coverage a bit and increases the amount of information the sites need to provide. It also makes at least some move towards making those reports more useful to law enforcement, by authorizing NCMEC to share a copy of an image from its database with local law enforcement.

Considering that, as we keep reporting, the biggest issue with CSAM these days is that law enforcement does so little with the information reported to NCMEC’s CyberTipline, hopefully these moves actually help on the one area that matters most: having law enforcement bring the actual perpetrators to justice and stop them from victimizing children.

But… there remain some pretty serious concerns with the bill. It appears to crack open Section 230, allowing “victims” to sue social media companies:

The legislation expands 18 U.S.C. § 2255, which currently provides a civil cause of action for victims who suffered sexual abuse or sexual exploitation as children, to enable such victims to file suit against online platforms and app stores that intentionally, knowingly, recklessly, or negligently promote or facilitate online child sexual exploitation. Victims are able to recover actual damages or liquidated damages in the amount of $150,000, as well as punitive damages and equitable relief. This provision does not apply to actions taken by online platforms to comply with valid legal process or statutory requirements. The legislation specifies that such causes of action are not barred by section 230 of the Communications Act of 1934 (47 U.S.C. 230).

Now, some will argue this shouldn’t have a huge impact on big companies that do the right thing, because it only covers those that “intentionally, knowingly, recklessly, or negligently promote or facilitate.” But that’s actually a much, much bigger loophole than it might seem at first glance.

First, we’ve already seen companies that take reporting seriously, such as Reddit and Twitter, get hit with lawsuits making these kinds of allegations. So, plaintiffs’ lawyers are going to pile on lawsuits even against the companies that are trying to do their best on this stuff.

Second, even if the sites were doing everything right, now they have to go through the long and arduous process of proving that in every one of these lawsuits. The benefit of Section 230 is to get cases like this kicked out early. Without 230, you have to go through a long and involved process just to prove that you didn’t “intentionally, knowingly, recklessly, or negligently” do any of those things.

Third, while “intentionally” and “knowingly” are perhaps more defensible, adding in “recklessly” and (even worse) “negligently” makes every lawsuit a massive crapshoot, because every lawyer is going to argue that any site that fails to catch and stop every bit of CSAM is acting “negligently.” And the negligence lawsuits are going to be massive, ridiculous, and expensive.

So, if you’re a social media site — say a mid-sized Mastodon instance — and it’s discovered that someone posted CSAM to your site, the victimized individual can sue, and insist that you were negligent in not catching it, even if you were actually good about reporting and removing CSAM.

Basically, this opens up a flood of litigation.

There may also be some concerns about some of the new reporting requirements. I fear that (just as this very bill misuses the “reported” stats as proof that the problem is growing) the new reports will be used down the line to justify more draconian interventions simply because “the numbers” are going up, when that might just be a result of the reporting itself. I also worry that some of the reporting requirements will lead to further (sketchy) justifications for future attacks on encryption.

Again, this bill has elements that seem good and would be useful contributions. But the Section 230 carveout is extremely problematic, and it’s not at all clear that it would actually help anyone other than plaintiffs’ lawyers filing a ton of vexatious lawsuits.

On top of all that, Durbin’s floor speech introducing the bill was, well, problematic: full of moral panic nonsense mostly disconnected from reality. He goes hard against Section 230, though it’s not clear he understands it at all. Even worse, he talks about how EARN IT and STOP CSAM together would lead to a bunch of frivolous lawsuits, which he seems to think is a good thing:

How can this be, you ask? Here is how. The Communications Decency Act of 1996—remember that year—contains a section, section 230, that offers near-total immunity to Big Tech. As a result, victims like Charlotte have no way to force tech companies to remove content posted on their sites—not even these child sexual abuse horrible images.

My bill, the Stop CSAM Act, is going to change that. It would protect victims and promote accountability within the tech industry. Companies that fail to remove CSAM and related imagery after being notified about them would face significant fines. They would also be required to produce annual reports detailing their efforts to keep children safe from online sex predators, and any company that promotes or facilitates online child exploitation could face new criminal and civil penalties.

When section 230 was created in 1996, Mark Zuckerberg was in the sixth grade. Facebook and social media sites didn’t even exist. It is time that we rewrite the law to reflect the reality of today’s world.

A bipartisan bill sponsored by Senators Graham and Blumenthal would also help to do that. It is called the EARN IT Act, and it would let CSAM victims—these child sexual abuse victims—have their day in court by amending section 230 to eliminate Big Tech’s near-total immunity from liability and responsibility.

There are serious ways to fight CSAM. But creating massive liability risks and frivolous lawsuits that misunderstand the problem, and that don’t even deal with the fact that sites already report all this content only to see it disappear into a black hole without law enforcement doing anything… does not help solve the problem at all.



Comments on “Senator Durbin’s ‘STOP CSAM Act’ Has Some Good Ideas… Mixed In With Some Very Bad Ideas That Will Do More Harm Than Good”

14 Comments
This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

It is time that we rewrite the law to reflect the reality of today’s world.

The people who keep saying shit like this never outline exactly how they’d rewrite 230 in a way that would still protect websites from frivolous lawsuits that threaten to shut down all but the most successful platforms. Then again, drowning all the smaller platforms by allowing people to flood said platforms with lawsuits might be the point.

Cat_Daddy (profile) says:

Re:

The problem is that Congress is approaching digital policy from the wrong, outdated perspective. They assume that altering Section 230 is their golden goose, but that’s far from the truth. It’s built on an assumption that websites should always be guilty of their users’ crimes, which is very backwards. The true answer is to view it through the perspective of a website.

With that being said, I will say that Durbin’s bill is better than EARN IT, but only slightly, mainly because it doesn’t immediately take away liability protection, and that’s about it. It is still a bill made in bad faith, one that will hurt a lot of the internet, made by someone who doesn’t understand the complexities of the internet. Durbin’s bill is a D: slightly better than an F, but still a failing grade.

Anonymous Coward says:

Re: Re:

That was the consensus I came to. It’s better than EARN IT, but only by a slight margin. It still doesn’t give NCMEC any additional funding, opens up the spigot for more reports that could be false positives (we saw this with a parent who sent pictures of their baby to their family doctor, which were mistakenly labeled as CSAM), and opens up 230 to frivolous lawsuits, which Durbin says is a good thing?

If EARN IT is an F, then STOP CSAM is a D-.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Now, some will argue this shouldn’t have a huge impact on big companies that do the right thing because it’s only for those that “intentionally, knowingly, recklessly, or negligently promote or facilitate”

Except they will be hammered for any CSAM that they fail to find, since by reporting some, they must have been negligent not to report it all… At least, that’s the reasoning ambulance-chasing lawyers will use.

This comment has been deemed funny by the community.
Ethin Probst (profile) says:

Can these lawmakers read?

I find it incredibly depressing that lawmakers don’t even know what section 230 contains. Here’s what we need: the LMRTLTABAT Act (the “Lawmakers Must Read The Laws They Amend Before Amending Them Act”). It should say:

Section 3: Definitions.
(a) “Lawmaker” means any elected official or appointed representative who has the authority to propose or amend legislation.
(b) “Amend” means to make changes to an existing law.
(c) “Read” means to thoroughly review and understand the contents of a law.
Section 4: Requirements.
(a) Before proposing any amendment to an existing law, lawmakers must read and fully comprehend the text of the law they intend to amend.
(b) Lawmakers must sign a statement attesting to the fact that they have read and understood the law they are proposing to amend.
(c) The signed statement must be included in the legislative record for the proposed amendment.

That way, these lawmakers actually know what they’re talking about. Oh wait, that would be too much to ask for.

Anonymous Coward says:

Re:

(b) Lawmakers must sign a statement attesting to the fact that they have read and understood the law they are proposing to amend.

You’ve never signed a mortgage or a car loan, or bought a toaster lately, have you? There’s this button you can click – it’s right there … – and you get your car, or your toaster. All those words above it? Those won’t matter. Probably.

Hobbes says:

All of these discussions seem to take it for granted that the lawmakers are operating in good faith — that they’re terribly concerned about the children (or whoever), and they need to tweak 230 to accomplish this truly, obviously, worthwhile goal.

In a word, bullshit.

One of the tells is that these laws never seem to take any other concrete steps toward a solution.

Let me be a little cynical here. The problem they’re trying to solve IS section 230. Imagine how upset they (or rather the people who have bought them) must have been when, JUST as they had succeeded in gaining complete control of the press, and of the national marketplace of ideas, along comes this stupid little clause that allows the rabble to publish — AND AMPLIFY — all these ideas that they have been trying to quash for the last 100 years. Ideas like liberty, like equality, like organizing and building communities. It must be absolutely galling to them.

Too cynical? A little paranoid? I really doubt it.
