Congress Moving Forward On Unconstitutional Take It Down Act

from the what-a-joke dept

Here’s a puzzle: How do you write a law that’s so badly designed that (1) the people it’s meant to help oppose it, (2) the people who hate regulation support it, and (3) everyone involved admits it will be abused? The answer, it turns out, is the Take It Down Act.

The bill started with the entirely reasonable goal of addressing non-consensual intimate imagery online. But then something went wrong. Instead of building on existing successful systems, or within the parameters of the First Amendment, Congress decided to create a new framework combining vague “duty of care” requirements with harsh criminal penalties — a combination that, as we’ve previously detailed, practically begs to be weaponized for censorship.

Most tellingly, Donald Trump — in endorsing the bill during his address to Congress — openly bragged about how he plans to abuse its provisions to censor content he personally dislikes. When the person championing your anti-abuse legislation is promising to use it for abuse, you might have a problem.

The bill is so bad that even the Cyber Civil Rights Initiative, whose entire existence is based on representing the interests of victims of NCII and passing bills similar to the Take It Down Act, has come out with a statement saying that, while it supports laws to address such imagery, it cannot support this bill due to its many, many inherent problems.

While supportive of the bill’s criminal provision relating to authentic nonconsensual intimate images, which closely resembles CCRI’s model federal law and state laws that have survived constitutional challenge, CCRI has serious reservations about S. 146’s reporting and removal requirements. Encouraging speedy removal of nonconsensual intimate imagery from platforms is laudable, but the provision as written is unconstitutionally vague, making it difficult for individuals and platforms to understand what conduct is prohibited or required. The provision is also unconstitutionally overbroad, extending well beyond unlawful imagery. Finally, the provision lacks adequate safeguards against abuse, increasing the likelihood of bad faith reports and chilling protected expression. Such flaws would be alarming under any circumstances; in light of the current administration’s explicit commitment to selectively enforcing laws for political purposes, they are fatal. CCRI cannot support legislation that risks endangering the very communities it is dedicated to protecting, including LGBTQIA+ individuals, people of color, and other vulnerable groups. 

These warnings echo what digital rights groups like the Center for Democracy & Technology and EFF have been shouting for months — only to be completely ignored by Congress. The concerns are not theoretical: the bill’s vague standards combined with harsh criminal penalties create a perfect storm for censorship and abuse.

Yet despite these clear red flags, Ted Cruz announced that the House will take up the Senate’s fatally flawed version of the bill. This comes after leadership dismissed substantive criticisms during markup, including explicit warnings from Alexandria Ocasio-Cortez about the bill’s potential for abuse.

That’s Cruz saying:

I am thrilled that the TAKE IT DOWN Act will be getting a vote on the House Floor early next week.

Thank you to [Speaker Johnson, Steve Scalise, and Brett Guthrie] for their leadership and action to protect victims of revenge and deepfake pornography and give them the power to reclaim their privacy and dignity.

When this bill is signed into law, those who knowingly spread this vile material will face criminal charges, and Big Tech companies must remove exploitative content without delay.

The weird thing about this bill is that we already have systems to handle non-consensual intimate imagery online. There’s NCMEC’s “Take It Down” system, which helps platforms identify and remove this content. There’s StopNCII.org, a non-profit effort that’s gotten virtually every major platform — from Meta to TikTok to Pornhub — to participate in coordinated removal efforts. These systems work because they’re precise, transparent, and focused on the actual problem.
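To make the mechanics a bit more concrete, here is a minimal, hypothetical sketch of the hash-matching approach these systems rely on: a perceptual hash of the image, rather than the image itself, is shared with participating platforms, which then compare new uploads against the hash list. This is not StopNCII’s actual implementation (it reportedly uses Meta’s PDQ hashing); the imagehash library, the threshold, and the helper names below are purely illustrative.

```python
# Rough, hypothetical illustration of hash-based matching in the spirit of
# StopNCII-style systems. Only a perceptual hash of the image leaves the
# victim's device; platforms compare uploads against the shared hash list.
import imagehash
from PIL import Image

# Illustrative threshold: maximum Hamming distance at which two perceptual
# hashes are treated as the same image.
MATCH_THRESHOLD = 8


def hash_image(path: str) -> imagehash.ImageHash:
    """Compute a perceptual hash; visually similar images produce nearby hashes."""
    return imagehash.phash(Image.open(path))


def is_flagged(upload_path: str, shared_hashes: list[imagehash.ImageHash]) -> bool:
    """Check an uploaded image against the list of victim-submitted hashes."""
    upload_hash = hash_image(upload_path)
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(upload_hash - known <= MATCH_THRESHOLD for known in shared_hashes)


# Hypothetical platform-side usage:
# shared_hashes = [imagehash.hex_to_hash(h) for h in registry_hash_list]
# if is_flagged("incoming_upload.jpg", shared_hashes):
#     hold_for_review("incoming_upload.jpg")  # hypothetical moderation hook
```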

But apparently working solutions aren’t exciting enough for Congress. Instead of building on these proven approaches, they’ve decided to create an entirely new system that somehow manages to be both weaker at addressing the real problem and more dangerous for everyone else.

The problem here is pretty simple: If you give people a way to demand content be taken down, they will abuse it. We already have a perfect case study in the DMCA. Even with built-in safeguards like counternotices and (theoretical) penalties for false claims, the DMCA sees thousands of bogus takedown notices used to censor legitimate speech.

The Take It Down Act looks at this evidence of widespread abuse and says “hold my beer.” Not only does it strip away the DMCA’s already-inadequate protections, it adds criminal penalties that make false claims even more attractive as a censorship weapon. After all, if people are willing to file bogus copyright claims just to temporarily inconvenience their opponents, imagine what they’ll do when they can threaten prison time.

And imagine what the current Trump administration would do with those threats of criminal charges over content removals.

CDT’s Becca Branum put out a statement this morning about how stupid all of this is:

“The TAKE IT DOWN Act is a missed opportunity for Congress to meaningfully help victims of nonconsensual intimate imagery. The best of intentions can’t make up for the bill’s dangerous implications for constitutional speech and privacy online. Empowering a partisan FTC to enforce ambiguous legislation is a recipe for weaponized enforcement that risks durable progress in the fight against image-based sexual abuse.”

“The TAKE IT DOWN Act, while well-intentioned, was written without appropriate safeguards to prevent the mandated removal of content that is not nonconsensual intimate imagery, making it vulnerable to constitutional challenge and abusive takedown requests. Moreover, its ambiguous text can be read to create an impossible requirement for end-to-end encrypted platforms to remove content to which they have no access.”

The most baffling aspect of this debacle is watching self-proclaimed progressive voices like Tim Wu and Zephyr Teachout champion a bill that hands unprecedented censorship power to an administration they claim to oppose. This morning, both of them appeared at a weird press conference in support of the bill. While their recent embrace of various unconstitutional and censorial internet regulations is disappointing, their willingness to hand Donald Trump a censorship weapon he’s openly bragging about abusing is genuinely shocking.

The Take It Down Act will likely become law, and then we’ll get to watch as the Trump administration — which has already announced its plans to abuse it — gets handed a shiny new censorship weapon with “totally not for political persecution” written on the side in extremely small print. The courts might save us, but they’re already drowning in unconstitutional nonsense from this administration. Perhaps not the best time to add “government-enabled censorship framework” to their to-do list.

Update: Welp, late today this passed the House overwhelmingly, 409-2. The only two nay votes were from Republicans Thomas Massie and Eric Burlison.


Comments on “Congress Moving Forward On Unconstitutional Take It Down Act”

43 Comments
Anonymous Coward says:

Before anyone panics, I do want to point out that, if it becomes law, it won’t come into force for another year, if I’m reading this right.

(A) ESTABLISHMENT.—Not later than 1 year after the date of enactment of this Act, a covered platform shall establish a process whereby an identifiable individual (or an authorized person acting on behalf of such individual)

https://www.congress.gov/119/bills/s146/BILLS-119s146es.pdf

Not sure if this is true, but I heard the FTC has to bring out guidance in 6 months, and the FTC is a huge mess right now, and this will be taken to court.

Arianity (profile) says:

Re:

Before anyone panics, I do want to point out that, if it becomes law, it won’t come into force for another year, if I’m reading this right.

For that portion on platforms, yes. It’d be unlikely for any fix or repeal from Congress to happen in that time period, though (courts might potentially move faster to an injunction). Practically speaking, it just gives companies time to make a policy.

There are other portions that aren’t aimed at the platforms (i.e., the parts with penalties for individuals who posted it) that kick in immediately.

Anonymous Coward says:

Maybe Congress thinks this will be a distraction from their abject failure to do anything at all (for most of them) about the Executive’s ongoing seizure of powers it’s not entitled to.

Or maybe they just want to stick their heads in the sand and pretend everything is OK, and crapping out stuff and calling it legislation is just another day’s work.

In either case, they should be soundly mocked.

Arianity (profile) says:

These systems work because they’re precise, transparent, and focused on the actual problem….Instead of building on these proven approaches

The reason there’s so much appetite for this is because they’re still pretty flawed, between major platforms not always reliably responding to requests, and more problematically, the fact that many minor platforms don’t use it. They’re obviously a lot better than nothing, but in practice they have pretty gaping holes.

It’s a bad bill, but it’s not at all weird that people are looking for a more robust system considering current failures. Saying the current systems work is not productive in getting people to build on existing approaches; it’s going to push people towards alternatives (including, unfortunately, bad ones).

This comment has been deemed insightful by the community.
David says:

Re:

Saying the current systems work is not productive in getting people to build on existing approaches; it’s going to push people towards alternatives (including, unfortunately, bad ones).

Laws created relying on the premise “surely nobody would be as naughty as to abuse them” are completely pointless, since in the absence of bad-faith actors laws would not be required in the first place.

It is an exercise in idiocy to follow the approach “legislate first, worry about bad consequences later”. Because creating a law is a process that needs to carve out a narrow path of least unfavorable consequences to the desired outcome. You cannot save worrying about the consequences for later: that is like planning a railroad line and worrying about mountains and lakes later.

Arianity (profile) says:

Re: Re:

It is an exercise in idiocy to follow the approach “legislate first, worry about bad consequences later”.

Yeah, the problem is Congress is full of idiots who will follow that approach, and we’re stuck with them. Mike mentioned the DMCA, but he just as easily could’ve included FOSTA/SESTA, (most of) the CDA, etc. Congress is really susceptible when it comes to this sort of thing.

Anonymous Coward says:

Re:

It’s not a requirement, but it’s the only real safe way for a website operator to operate when faced with criminal penalties and a vague law about how they’re applied. Too much risk to allow random uploading. Most sites are already struggling with revenue due to bandwidth costs, and one lawsuit could knock out any of the small to medium players (4chan, Imgur, Reddit) and cause any of the large players (Alphabet, Meta) to reconsider whether they really want to host user-submitted content anymore.

Given that this mostly targets images made public, non-image hosts like pure text blogs are going to be okay, and some sites like Tumblr or X or Bluesky might just deeply restrict image posting. I could also see Snapchat disallowing image uploading and image saving in the app, so that only pictures taken in-app with the camera would post, and they’d automatically disappear due to the architecture, meaning the app becomes self-cleaning without repost bots or saved images.

Stephen T. Stone (profile) says:

Re: Re: Re:

How likely is any of that to happen

Nobody here can tell you, with the certainty of God, what the mathematical odds of anything like this happening are. Someone could say 99% and someone else could say 1%, and there’s no real way to tell you which one is right.

You’re looking for certainty in an uncertain world. I get it, I do. But that is always a fool’s errand because it’s equally possible that I either die of a heart attack ten minutes after I post this comment or I keep on living. The future is not set, nor will it ever be.

Get comfortable with not knowing.

That One Guy (profile) says:

If someone tells you they're a pacifist and then takes a swing at you...

The most baffling aspect of this debacle is watching self-proclaimed progressive voices like Tim Wu and Zephyr Teachout champion a bill that hands unprecedented censorship power to an administration they claim to oppose. This morning, both of them appeared at a weird press conference in support of the bill. While their recent embrace of various unconstitutional and censorial internet regulations is disappointing, their willingness to hand Donald Trump a censorship weapon he’s openly bragging about abusing is genuinely shocking.

What someone says tells you what sort of person they want you to think they are.

What someone does tells you what sort of person they actually are.

In honest and consistent people, words will match pretty closely to actions, as they ‘practice what they preach’, as it were. In dishonest people, their words will clash strongly with their actions, and of the two, actions are the more reliable way to judge a person.

Between groups whose jobs are to fight such content coming out against the bill due to the potential for abuse, and convicted felon Trump outright saying he will use it against those he doesn’t like, anyone who voted in favor of the bill did so knowing that it would be used in such a manner, and as such deserves to be condemned for it rather than allowed to falsely claim down the line that ‘they had no idea’ when, not if, it’s used as yet another censorship tool.

Tdestroyer209 says:

Stupidity is leaking badly, like the Deepwater Horizon spill, in Congress

Ugh, the fact that this badly designed and unconstitutional bill got 409-2 in the House, with all Democrats voting for it, just makes me go WTF were they thinking?

Add in the FTC, where several commissioners have been fired and quite a few workers were too, with potentially more cuts from DOGE in the future, and how in the literal hell is this going to be properly enforced?!

JFC, I think Congress has lost their minds with this “moral crusade” bullshit; they pull this crap because fake “advocacy” groups wanted it while the real groups opposed it.

It’s just so tiring and frustrating seeing the stupidity just gushing out like Deepwater Horizon at an alarming rate, like, yikes.

That One Guy (profile) says:

Re: 'That's a price I'm willing to have them pay.'

Ugh, the fact that this badly designed and unconstitutional bill got 409-2 in the House, with all Democrats voting for it, just makes me go WTF were they thinking?

While I can’t read minds, I suspect I can make a pretty accurate guess: ‘Screw everyone else, so long as I get to issue a PR statement crowing about how this shows how much I care about revenge porn, the fact that I just gave a dictator the power to silence those he doesn’t like, something he’s already admitted he plans on doing, is a price I’m willing to have the public pay.’

This comment has been deemed insightful by the community.
Adrian Lopez says:

YouTube Takedowns

I shudder to think what this will mean for YouTube videos exposing charlatans, grifters, and other liars. The subjects of these videos have no qualms about abusing YouTube’s reporting systems to shut down criticism, and the worst of them are litigious to a fault. This new law will facilitate this kind of censorship, and YouTube being YouTube will simply take down the videos, and possibly shut down the channels, without any chance for appeal because the content has to be removed within 48 hours and deleted.

A new era of censorship awaits us, and I’m not at all confident the courts will overturn this.

Stephen T. Stone (profile) says:

Re: Re:

The people on United Airlines Flight 93 on the morning of 9/11 probably figured out soon after the hijacking that they were all going to die. None of them could’ve prevented their deaths that day. But the brave souls who fought the hijackers for control of the plane still fought anyway.

The fight is everything. Sometimes, it’s all we really have.

Anonymous Coward says:

The weird thing about this bill is that we already have systems to handle non-consensual intimate imagery online. There’s NCMEC’s “Take It Down” system, which helps platforms identify and remove this content. There’s StopNCII.org, a non-profit effort that’s gotten virtually every major platform — from Meta to TikTok to Pornhub — to participate in coordinated removal efforts. These systems work because they’re precise, transparent, and focused on the actual problem.

And that’s the whole problem for MAGAts, isn’t it? These systems are focused on actual problematic content, not content of the “I know it when I see it” or “If it waddles like a duck and honks like a goose, it’s a duck” variety.
