Congress Wants To Hand Your Parenting To Big Tech
from the ted-cruz-wants-to-be-everyone's-daddy dept
Lawmakers in Washington are once again focusing on kids, screens, and mental health. But according to Congress, Big Tech is somehow both the problem and the solution. The Senate Commerce Committee recently held a hearing on “examining the effect of technology on America’s youth.” Witnesses warned about “addictive” online content, declining mental health, and kids spending too much time buried in screens. At the center of the debate is a bill from Sens. Ted Cruz (R-TX) and Brian Schatz (D-HI) called the Kids Off Social Media Act (KOSMA), which they say will protect children and “empower parents.”
That’s a reasonable goal, especially at a time when many parents feel overwhelmed and nervous about how much time their kids spend on screens. But while the bill’s press release contains soothing language, KOSMA doesn’t actually give parents more control.
Instead of respecting how most parents guide their kids towards healthy and educational content, KOSMA hands the control panel to Big Tech. That’s right—this bill would take power away from parents, and hand it over to the companies that lawmakers say are the problem.
Kids Under 13 Are Already Banned From Social Media
One of the main promises of KOSMA is simple and dramatic: it would ban kids under 13 from social media. Based on the language of the bill’s sponsors, one might think that’s a big change, and that today’s rules let kids wander freely onto social media sites. But that’s not the case.
Every major platform already draws the same line: kids under 13 cannot have an account. Facebook, Instagram, TikTok, X, YouTube, Snapchat, Discord, Spotify, and even blogging platforms like WordPress all say essentially the same thing—if you’re under 13, you’re not allowed. That age line has been there for many years, mostly because of how online services comply with a federal privacy law called COPPA.
Of course, everyone knows many kids under 13 are on these sites anyways. The real question is how and why they get access.
Most Social Media Use By Younger Kids Is Family-Mediated
If lawmakers picture under-13 social media use as a bunch of kids lying about their age and sneaking onto apps behind their parents’ backs, they’ve got it wrong. Serious studies that have looked at this all find the opposite: most under-13 use is out in the open, with parents’ knowledge, and often with their direct help.
A large national study published last year in Academic Pediatrics found that 63.8% of under-13s have a social media account, but only 5.4% of them said they were keeping it secret from their parents. That means over 90% of kids under 13 who are on social media aren’t hiding it at all. Their parents know. (For kids aged thirteen and over, the “secret account” number is almost as low, at 6.9%.)
Earlier research in the U.S. found the same pattern. In a well-known study of Facebook use by 10-to-14-year-olds, researchers found that about 70% of parents said they actually helped create their child’s account, and between 82% and 95% knew the account existed. Again, this wasn’t kids sneaking around. It was families making a decision together.
A 2022 study by the UK’s media regulator Ofcom points in the same direction, finding that up to two-thirds of social media users below the age of thirteen had direct help from a parent or guardian getting onto the platform.
The typical under-13 social media user is not a sneaky kid. It’s a family making a decision together.
KOSMA Forces Platforms To Override Families
This bill doesn’t just set an age rule. It creates a legal duty for platforms to police families.
Section 103(b) of the bill is blunt: if a platform knows a user is under 13, it “shall terminate any existing account or profile” belonging to that user. And “knows” doesn’t just mean someone admits their age. The bill defines knowledge to include what is “fairly implied on the basis of objective circumstances”—in other words, what a reasonable person would conclude from how the account is being used. The reality of how services would comply with KOSMA is clear: rather than risk liability for what they “should have known” about a user’s age, they will require all users to prove their age so they can block anyone under 13.
KOSMA contains no exceptions for parental consent, for family accounts, or for educational or supervised use. The vast majority of people policed by this bill won’t be kids sneaking around. They will be kids following their parents’ guidance, and the parents themselves.
Imagine a child using their parent’s YouTube account to watch science videos about how a volcano works. If they were to leave a comment saying, “Cool video—I’ll show this to my 6th grade teacher!” and YouTube becomes aware of the comment, the platform now has clear signals that a child is using that account. It doesn’t matter whether the parent gave permission. Under KOSMA, the company is legally required to act. To avoid violating KOSMA, it would likely lock, suspend, or terminate the account, or demand proof it belongs to an adult. That proof would likely mean asking for a scan of a government ID, biometric data, or some other form of intrusive verification, all to keep what is essentially a “family” account from being shut down.
Violations of KOSMA are enforced by the FTC and state attorneys general. That’s more than enough legal risk to make platforms err on the side of cutting people off.
Platforms have no way to remove “just the kid” from a shared account. Their tools are blunt: freeze it, verify it, or delete it. Which means that even when a parent has explicitly approved and supervised their child’s use, KOSMA forces Big Tech to override that family decision.
Your Family, Their Algorithms
KOSMA doesn’t appoint a neutral referee. Under the law, companies like Google (YouTube), Meta (Facebook and Instagram), TikTok, Spotify, X, and Discord will become the ones who decide whose account survives, whose account gets locked, who has to upload ID, and whose family loses access altogether. They won’t be doing this because they want to, but because Congress is threatening them with legal liability if they don’t.
These companies don’t know your family or your rules. They only know what their algorithms infer. Under KOSMA, those inferences carry the force of law. Rather than parents or teachers, decisions about who can be online, and for what purpose, will be made by corporate compliance teams and automated detection systems.
What Families Lose
This debate isn’t really about TikTok trends or doomscrolling. It’s about all the ordinary, boring, parent-guided uses of the modern internet. It’s about a kid watching “How volcanoes work” on regular YouTube, instead of the stripped-down YouTube Kids. It’s about using a shared Spotify account to listen to music a parent already approves. It’s about piano lessons from a teacher who makes her living from YouTube ads.
These aren’t loopholes. They’re how parenting works in the digital age. Parents increasingly filter, supervise, and decide together with their kids. KOSMA will lead to more locked accounts, and more parents submitting to face scans and ID checks. It will also lead to more power concentrated in the hands of the companies Congress claims to distrust.
What Can Be Done Instead
KOSMA also includes separate restrictions on how platforms can use algorithms for users aged 13 to 17. Those raise their own serious questions about speech, privacy, and how online services work, and need debate and scrutiny as well. But they don’t change the core problem here: this bill hands control over children’s online lives to Big Tech.
If Congress really wants to help families, it should start with something much simpler and much more effective: strong privacy protections for everyone. Limits on data collection, restrictions on behavioral tracking, and rules that apply to adults as well as kids would do far more to reduce harmful incentives than deputizing companies to guess how old your child is and shut them out.
But if lawmakers aren’t ready to do that, they should at least drop KOSMA and start over. A law that treats ordinary parenting as a compliance problem is not protecting families—it’s undermining them.
Parents don’t need Big Tech to replace them. They need laws that respect how families actually work.
Republished from the EFF’s Deeplinks blog.
Filed Under: brian schatz, coppa, kids, kosma, moral panic, parental controls, social media, ted cruz


Comments on “Congress Wants To Hand Your Parenting To Big Tech”
I’d like to get politicians off social media!
Thank you, you’ve been a great audience, be sure to tip your waitress.
Re:
Two specific bits of technology have made this world demonstrably worse: social media and smartphones.
The first isn’t hard to grasp. The second makes the list because of (A) group chats and (B) easier access to the Internet, including social media services, from basically anywhere. A lot of dumbfuck rich people with smartphones and social media fried their fucking brains, and we’re all paying the price for it.
Re: Re:
Those brains were fried regardless of whether they had smartphones or not, though.
It is getting increasingly obvious that wealth in and of itself, at certain geometric multiples, makes the owner stupid and paranoid, and damages their empathy.
There are ways to do this without wealth bubbles, but wealth, or the desire thereof, is the fastest and most consistent way.
Is it going to be considered censorship of conservative voices when they start blocking people who read and write at a 3rd grade level?
Republicans have a Trump Addiction, so they assume anyone can become addicted to stupid things. What’s the Democrats’ excuse?
Wait, I thought conservatives were all for parental rights…
It was their platform when they cut funding to public schools for private charters, and for homeschooling…
And your right to marry your 14 year old daughter to a 40 year old landowner…
Because they do, in practice? You said so yourself: “Of course, everyone knows many kids under 13 are on these sites anyways.”
It’s both.
The entire reason these discussions exist is that so many don’t actually do that. While parents are aware of their kids being on social media, that doesn’t imply they’re making informed decisions about the risk, or being responsible parents.
That’s the cool thing about liability based incentives, it doesn’t require trust.
EFF’s universal solution to every problem: universal privacy protections! This does nothing to even attempt to fix the actual problem. It’s especially funny given the COPPA mention, which manages to leave out that COPPA already includes privacy protections for kids under 13 (and that those protections go largely unenforced). Wonder why?
Re:
The last federal privacy law passed was the Video Privacy Protection Act in 1988. I think it’s time for some goddamn updates.
Re: Re:
Same. I just don’t think doing the Carthago delenda est bit on every article, regardless of topic, is a good way to get there; I feel like that cheapens it and will make people write it off as a joke.
Re:
I am sure there are a lot of parents who aren’t making informed decisions or being responsible.
Ted Cruz is just about the last person I’d trust to make those decisions for them.
Or, not to put too fine a point on it, for me, because as the article repeatedly notes, these would be blanket, one-size-fits-all legal requirements with no allowance for parents who want to make their own choices about what’s acceptable for their children.
Re: Re:
Yeah, I mean, KOSMA is trash. But it would be nice to know what an EFF-approved version might look like. Allowances for parents are a red line, but I suspect that isn’t the only part they’d want changed. It’s a necessary but probably not sufficient change.
Re:
It’s not really the role of the government to ensure parents are being “responsible parents” (who’s defining “responsible” here, anyway, and how?). There are a large number of parenting decisions I don’t consider responsible, but I don’t want the government to force people to do (or not do) them.
Re:
What actual problem?
ugh.
This article is a bit silly.
Replace the word “Congress” with “Parents” in the title, and you hit the root of the problem.
I work at a school in the tech department, and I am constantly fielding questions from an endless parade of bewildered and defeated parents who express deep concern about what their children are exposed to online. They almost all have the same one-word reaction to any simple inquiry about what methods they have tried to monitor and/or control what their children do online: “Huh?”
As much as Republicans use absurdly inflammatory or terror-inducing language, this article does the opposite. While there are indeed many more children with parent-assisted accounts than not, this article tries to paint a Norman Rockwell picture of a family gathered around the computer, learning and benefiting from the magical wonders of social media, whereas in a small place I like to call “reality,” parents give their kids social media accounts because they honestly don’t seem to understand that there is any other way to get their kids to stop pestering them.
I can tell you from personal experience that most of those “a family making a decision together” accounts are in fact parents making a decision to push their kid’s best interests aside because saying “no” will cause too much trouble.
At the risk of exposing my old-fartiness, back in what I like to call “the day,” any time a kid wanted something expensive, and that kid offered the obvious “but all the other kids have them” argument, they would be shut down with the “well, if all the other kids jumped off a cliff, would you?” response. Now, however, parents seem to be terrified that anything other than complete capitulation to the material desires of their kids will brand them as terrible people who have left their children disadvantaged in some way.
Which leads us to here. I have listened to countless parents with countless “I had no idea that kind of stuff was online, but I caught my daughter looking at…”, “…why are such things allowed on the internet?”, or even “I can’t believe that someone so deranged would tell my 9 year old those things…” stories, and they almost all end with “well, somebody should do something.” Any suggestion that involves a parent… perhaps doing some parenting… either falls on deaf ears or is met with a conversation-ending look of disdainful “how dare you!”, as though I just suggested a beating with a lead pipe.
I’ve said this countless times in my 30 years working with technology in schools, and I’ll say it again: if your child needs to do research for school, the difference between giving them unfettered access to the Internet and dropping them off alone in downtown Washington, D.C. with a map to museums and libraries is…they would be safer alone in D.C.
Also, to the original author of this article: the only times YouTube has ever asked me to log in were when the video had been flagged for adult content, so that example doesn’t really help illustrate a perfectly innocent need for an under-13-year-old’s social media account.
Re:
So the parents you deem incapable of parenting by saying “no” are also the parents who are accurate reporters of real issues caused by the internet and not found in society otherwise. And the government and big-money tech should have control over these things so the parents aren’t the ones who have to say no. By identifying and tracking everyone who uses any internet. Great.
God bless you for wading through YouTube or anywhere else without logging in so you are exposed to the randomly most popular and idiotic stuff.
Re: Re:
I think you kinda misunderstood nearly everything I was trying to say. I never said anything about deeming anyone to be “incapable,” but I’ll try to address your objection to something I didn’t actually say anyway. My point was more about parents being unwilling to lift a finger, not incapable. You imply that these parents are “accurate reporters of real issues caused by the internet and not found in society otherwise.” I partially agree, but I would have worded it more accurately to reflect that they report, but do nothing.

Over the past 30 years or so, there has been a slow but definite shift away from “I found my kid looking at […], how do I block that?” to “I found my kid looking at […], why hasn’t someone blocked that?”. Notice the difference? I keep hearing parents being upset by/concerned with/worried about what their kids are doing with the internet that the PARENTS THEMSELVES provide on the phones/tablets/devices that THE PARENTS THEMSELVES happily give to their children with ZERO filtering, ZERO oversight, and more often than not, little to no actual looking at the phone/tablet/computer to see what their kids are up to.

I have gently suggested on many occasions that parents who are worried about activities or websites that they caught their kids getting into could possibly try taking the devices away from those misbehaving kids, and I am more often than not met with a reaction somewhere between shock and horror that I dared suggest such an injustice and a more defeated admission of “no, that would cause too many problems; they wouldn’t like that.”

So, back to my original point: if parents don’t want their kids to use social media or other obnoxious sites/apps, they have the power, as parents, to make the rules and to filter/monitor/remove content they don’t want their kids to see on their own devices. But more often than not, they just won’t, because they don’t want to be the parent that says “no” to their kids.
About the YouTube thing, your response makes no sense. The author of the original article seems to think that if a class assignment includes a YouTube video, then that necessitates a YouTube account. It doesn’t, unless the video has been flagged as adult content. So… what about what I said was wrong?
“Wading through YouTube or anywhere else without logging in so you are exposed to the randomly most popular and idiotic stuff.”
What the hell are you talking about?
LGBTQ+ kid with bigot parents, atheist kid with fundamentalist ones...
Another problem with requiring parental consent for a kid to use social media is that maybe there are children out there who have a good reason not to want their parents looking over their shoulder and vetting all of their social media activity. In fact, maybe those kids would benefit from being able to socialize without having to, let’s just toss out a hypothetical, ask their abuser(s) for access to information that might help them deal with their abuse or realize they are being abused.