
NetChoice Sues California Once Again To Block Its Misguided ‘Social Media Addiction’ Bill

from the slow-down-california,-and-read-the-constitution dept

Earlier this year, California passed SB 976, yet another terrible and obviously unconstitutional bill with the moral panicky title “Protecting Our Kids from Social Media Addiction Act.” The law restricts minors’ access to social media and imposes burdensome requirements on platforms. It is the latest in a string of misguided attempts by California lawmakers to regulate online speech “for the children.” And like its predecessors, it is destined to fail a court challenge on First Amendment grounds.

The bill’s sponsor, Senator Nancy Skinner, has a history of relying on junk science and misrepresenting research to justify her moral panic over social media. Last year, in pushing for a similar bill, Skinner made blatantly false claims based on her misreading of already misleading studies. It seems facts take a backseat when there’s a “think of the children!” narrative to push.

The law builds on the Age Appropriate Design Code, without acknowledging that much of that law was deemed unconstitutional by an appeals court earlier this year (after being found similarly unconstitutional by the district court last year). This bill, like a similar one in New York, assumes (falsely and without any evidence) that “algorithms” are addictive.

As we just recently explained, if you understand the history of the internet, algorithms have long played an important role in making the internet usable. The idea that they’re “addictive” has no basis in reality. But the law insists otherwise. It would then ban these “addictive algorithms” if a website knows a user is a minor. It also has restrictions on when notifications can be sent to a “known” minor (basically no notifications during school hours or late at night).

There’s more, but those are the basics.

NetChoice stepped up and sued to block this law from going into effect.

California is again attempting to unconstitutionally regulate minors’ access to protected online speech—impairing adults’ access along the way. The restrictions imposed by California Senate Bill 976 (“Act” or “SB976”) violate bedrock principles of constitutional law and precedent from across the nation. As the United States Supreme Court has repeatedly held, “minors are entitled to a significant measure of First Amendment protection.” Brown v. Ent. Merchs. Ass’n, 564 U.S. 786, 794 (2011) (cleaned up) (quoting Erznoznik v. Jacksonville, 422 U.S. 205, 212-13 (1975)). And the government may not impede adults’ access to speech in its efforts to regulate what it deems acceptable for minors. Ashcroft v. ACLU, 542 U.S. 656, 667 (2004); Reno v. ACLU, 521 U.S. 844, 882 (1997). These principles apply with equal force online: Governments cannot “regulate [‘social media’] free of the First Amendment’s restraints.” Moody v. NetChoice, LLC, 144 S. Ct. 2383, 2399 (2024).

That is why courts across the country have enjoined similar state laws restricting minors’ access to online speech. NetChoice, LLC v. Reyes, 2024 WL 4135626 (D. Utah Sept. 10, 2024) (enjoining age-assurance, parental-consent, and notifications-limiting law); Comput. & Commc’n Indus. Ass’n v. Paxton, 2024 WL 4051786 (W.D. Tex. Aug. 30, 2024) (“CCIA”) (enjoining law requiring filtering and monitoring of certain content-based categories of speech on minors’ accounts); NetChoice, LLC v. Fitch, 2024 WL 3276409 (S.D. Miss. July 1, 2024) (enjoining age-verification and parental-consent law); NetChoice, LLC v. Yost, 716 F. Supp. 3d 539 (S.D. Ohio 2024) (enjoining parental-consent law); NetChoice, LLC v. Griffin, 2023 WL 5660155 (W.D. Ark. Aug. 31, 2023) (enjoining age-verification and parental-consent law).

This Court should similarly enjoin Defendant’s enforcement of SB976 against NetChoice members.

As we’ve discussed, the politics behind challenging these laws makes it a complex and somewhat fraught process. So I’m glad that NetChoice continues to step up and challenge many of these laws.

The complaint lays out that the parental consent requirements in the bill violate the First Amendment:

The Act’s parental-consent provisions violate the First Amendment. The Act requires that covered websites secure parental consent before allowing minor users to (1) access “feed[s]” of content personalized to individual users, § 27001(a); (2) access personalized feeds for more than one hour per day, § 27002(b)(2); and (3) receive notifications during certain times of day, § 27002(a). Each of these provisions restricts minors’ ability to access protected speech and websites’ ability to engage in protected speech. Accordingly, each violates the First Amendment. The Supreme Court has held that a website’s display of curated, personalized feeds is protected by the First Amendment. Moody, 144 S. Ct. at 2393. And it has also held that governments may not require minors to secure parental consent before accessing or engaging in protected speech. Brown, 564 U.S. at 799;

So too do the age assurance requirements:

The Act’s requirements that websites conduct age assurance to “reasonably determine” whether a user is a minor, §§ 27001(a)(1)(B), 27002(a)(2), 27006(b)-(c), also violate the First Amendment. Reyes, 2024 WL 4135626, at *16 n.169 (enjoining age-assurance requirement); Fitch, 2024 WL 3276409, at *11-12 (enjoining age-verification requirement); Griffin, 2023 WL 5660155, at *17 (same). All individuals, minors and adults alike, must comply with this age-assurance requirement—which would force them to hand over personal information or identification that many are unwilling or unable to provide—as a precondition to accessing and engaging in protected speech. Such requirements chill speech, in violation of the First Amendment. See, e.g., Ashcroft, 542 U.S. at 673; Reno, 521 U.S. at 882.

It also calls out that there’s an exemption for consumer review sites (good work, Yelp lobbyists!), which highlights that the law targets specific types of content, something the First Amendment does not allow.

California Attorney General Rob Bonta insisted in a statement to GovTech that there are no First Amendment problems with the law:

“SB976 does not regulate speech,” Bonta’s office said in an emailed statement. “The same companies that have committed tremendous resources to design, deploy, and market social media platforms custom-made to keep our kids’ eyes glued to the screen are now attempting to halt California’s efforts to make social media safer for children,” the statement added, saying the attorney general’s office would respond in court.

Except he said that about the Age Appropriate Design Code and lost in court. He said that about the Social Media Transparency bill and lost in court. He said that about the recent AI Deepfake law… and lost in court.

See a pattern?

It would be nice if Rob Bonta finally sat down with actual First Amendment lawyers and learned how the First Amendment works. Perhaps he and Governor Newsom could take that class together so Newsom stops signing these bills into law?

Wouldn’t that be nice?

Companies: netchoice

Comments on “NetChoice Sues California Once Again To Block Its Misguided ‘Social Media Addiction’ Bill”

46 Comments
This comment has been deemed insightful by the community.
williamperry (profile) says:

Oh no! The Fearsome Dopamine Hit!

People who talk about a “Dopamine Hit” are totally misdirecting/sleight-of-handing you: it sounds so much like free-basing that it must be bad for you! But literally everything positive you do in life gives a “dopamine hit.”

Shame, depravity. I’ve become addicted to experiencing things. Why did no one protect me from this?

Tdestroyer209 says:

Can’t say I’m surprised California keeps trying these “think of the children” laws, and they usually get defeated, like they never fucking learn at all.

I’ve been looking into a lot of these “think of the children” laws lately and noticed that Britain and Australia are hellbent on forcing their nanny-state bullshit on others, and of course they want the US to join their crusade to censor the internet in the name of the children.

Of course KOSA had help from Beeban Kidron who is obsessed with the think of the children bullshit and I suspect because the California AADC got blocked she is absolutely hellbent on getting KOSA by any means necessary with other groups/individuals who have the same twisted ideology as her.

I think Kidron should shove her twisted ideology up where the sun don’t shine and keep it there til the end of days.

This comment has been flagged by the community.

Stephen T. Stone (profile) says:

Re: Re: Re:2

It’s hard to do that when the way you express your concerns sounds like a bunch of defeatist bullshit. We get it, you think the sky is going to fall directly on your head and kill you if that bill passes⁠—you don’t need to bring it up on every article. Even if you do feel that compulsion, you don’t need to sound like you’re watching the sky break into pieces while you post comments.

Anonymous Coward says:

Re: Re:

I’m not being a doomer; it stands a chance of passing with a simple majority in the House this Wednesday. My only comfort is that it likely can’t overcome the Senate filibuster, and that maybe Biden would veto it.

Maybe I am dooming by thinking of the worst possible scenario here, but I’m struggling to find any arguments for how this won’t end badly, currently.

I’m not saying a bill like this would spell the end of a free US or any of that doom bullshit, but I do think losing NGOs like these would be a pretty fucking big problem if it happened, y’know?

Anonymous Coward says:

Re: Re: Re:

“I’m not being a doomer, it stands a chance of passing with a simple majority in the house this wednesday. My only comfort is that it likely can’t overcome the senate filibuster, and that maybe Biden would veto it.”
It has to pass the Senate, I’m pretty sure, and even then it could be challenged.

Stephen T. Stone (profile) says:

Re: Re: Re:5

Anyone can tell you that. But would you believe it? The whole point of your whining is to tell yourself “a bad outcome will happen no matter what so I don’t need to bother trying to fight it”. That is defeatism; that is your whole schtick in re: this bill.

I’m going to guess that you keep track of this bill via social media. How many times do you see accounts that keep saying shit like “it’s over” or “say goodbye to the Internet” when they mention that bill? Because that’s as good a measure as any of how stressed you are about this. And make no mistake, doomers and defeatists want you as stressed as them. That’s how they help ensure the bad outcome will happen: They make everyone else feel so stressed that they give up all hope. And when you give up, the fascists you don’t want to win will win.

Look, I’m stressed about the next four years. Anyone who isn’t kissing Trump’s ass probably feels the same way. But what you won’t catch me doing is being so stressed that I shut down and give up. I can worry in the back of my mind about what Trump and his cronies will do while I do what I can with what I have to keep from falling prey to the same doomerism that you embrace. The world isn’t really in a good place right now, sure. But if I sit here worrying all the live-long day about when the sky is going to fall, I’d never get anything done. (I’ve got other excuses for that problem besides.)

Manage your stress and stop letting social media stress you out so much that it ruins your perspective on life. Worry about things like HR 9495, but not so much that it makes you give up trying to do something about it. Maybe you can’t stop HR 9495 from passing⁠—but you can help convince others why it’s a bad bill and ask them to fight it. If there’s nothing you can do but you can’t do nothing, you do the best you can. And if you can’t even bother trying to do that, stop bothering us with your doomer bullshit. You can’t throw a pity party for yourself and expect other people to attend.

Anonymous Coward says:

Re: Re: Re:6

For what it’s worth, I am seeing that efforts to sway House representatives to vote against it are paying off to some degree. (One changed from yes to no so far, at minimum.)

Despite the admittedly mind-clouding anxiety, I do have a feeling this won’t be able to overcome the Senate filibuster anyway.

This comment has been flagged by the community.

Arianity says:

This bill, like a similar one in New York, assumes (falsely and without any evidence) that “algorithms” are addictive.

As we just recently explained, if you understand the history of the internet, algorithms have long played an important role in making the internet usable.

This is disingenuous. What the bill talks about is that some social media platforms have evolved to include addictive features, including the algorithmic delivery of content and other design features. That’s not “algorithms” in general.

The bill is bad enough, you don’t have to misrepresent it to make it sound like they don’t know what an algorithm is.

This comment has been deemed insightful by the community.
Strawb (profile) says:

Re:

This is disingenuous. What the bill talks about is that some social media platforms have evolved to include addictive features, including the algorithmic delivery of content and other design features.

An algorithm that delivers content a certain way isn’t an “addictive feature”.

The bill is bad enough, you don’t have to misrepresent it to make it sound like they don’t know what an algorithm is.

Suggesting that the politicians who wrote this bill don’t know what an algorithm is isn’t a misrepresentation.

This comment has been flagged by the community.

Anonymous Coward says:

Re: Re:

So… gambling isn’t addictive, because money isn’t addictive. A setup that delivers money in a certain way isn’t an “addictive feature”.

I’m not actually taking a particular side on whether or not “social media” has made an “addictive” design. I don’t even use any of these sites – tried a couple out, found ’em incredibly boring and then intrusive; left. But, isn’t it a bit disingenuous to say that because a base item (money, or other people’s posts, or…) isn’t generally addictive in and of itself, it can’t possibly ever be delivered in an addictive way?

You may also think there is no such thing as a “gambling problem” and therefore think that there is no question to answer here. *shrug* I don’t get it either, but then I also don’t gamble: it is likewise boring and unrewarding. So I’m not really the person to speak on that. A lot of people say they have one.

Anonymous Coward says:

Re: Re: Re:

Here’s the thing: gambling can be addictive even without the money element, and I know what I’m talking about because I’m a recovering gambling addict who still gets that hit from free-to-play slot machine apps. I highly doubt that social media algorithms would be addictive without the posts they promote; they would just be a functionless bunch of code.

Strawb (profile) says:

Re: Re: Re:

Something being addictive and people getting addicted are two different problems. Even if the algorithm was changed to exclusively serve up posts in chronological order, there’d be people getting addicted to it, because that’s how dopamine works.

So no, I don’t think that people becoming addicted to their social media feeds inherently makes the algorithm behind them addictive.

Arianity says:

Re: Re:

An algorithm that delivers content a certain way isn’t an “addictive feature”.

Eh, I think it depends on what you mean. You can facilitate access to something that is addictive, and make the addiction worse indirectly. The addiction itself is to the underlying content, but you can make the addiction worse by say, making it easier to indulge.

As an analogy, many alcoholics don’t keep alcohol in the house, because the proximity makes it easier to relapse. (To be clear: this isn’t a substance-abuse issue; I’m just using it as an analogy.) The underlying thing that is addictive is the alcohol itself, but the access does make it harder for them to control the addiction. There’s a reason retention changes depending on which algorithm is used. That is going to affect addicts as well.

There is, of course, a question of how much that actually happens. But addiction isn’t a binary, where someone is an addict or not an addict. Many addicts will be addicted regardless, that’s true, in the same way that many alcoholics still struggle even without alcohol in the house. But that doesn’t imply that it has no impact; making it easier/harder to relapse/indulge can have an impact on marginal addicts.

Suggesting that the politicians who wrote this bill don’t know what an algorithm is isn’t a misrepresentation.

There is nothing wrong with suggesting that politicians who wrote the bill don’t know what an algorithm is. But that can be done without misleadingly misquoting them to do it. The misquoting is the problem, not the criticism.

Strawb (profile) says:

Re: Re: Re:

You can facilitate access to something that is addictive, and make the addiction worse indirectly. The addiction itself is to the underlying content, but you can make the addiction worse by say, making it easier to indulge.

And in what way does an algorithmic feed make it “easier to indulge”?

There is nothing wrong with suggesting that politicians who wrote the bill don’t know what an algorithm is. But that can be done without misleadingly misquoting them to do it. The misquoting is the problem, not the criticism.

How are they being misquoted?

Anonymous Coward says:

Re: Re: Re:

Holy fuck, dude. You literally quoted the text of the bill that proves you wrong.

From the goddamn bill text:

(b) However, some social media platforms have evolved to include addictive features, including the algorithmic delivery of content and other design features, that pose a significant risk of harm to the mental health and well-being of children and adolescents.

Algorithmic delivery of content is the delivery of content via an algorithm. So it definitely fucking says something about “algorithms.” This is just basic reading comprehension. Quit while you’re behind. But you won’t because you’re going to keep getting defensive despite clear proof you’re wrong.
