Otherwise Objectionable: The Internet At Risk

from the saving-the-internet dept

Imagine an internet where every website faced an impossible choice: either carefully review every single post before it goes live (making them essentially TV stations), or allow absolutely everything with zero moderation. This nightmare scenario wasn’t hypothetical — it was exactly what the infamous Stratton Oakmont v. Prodigy ruling threatened to create, before Section 230 saved us from that fate.

In the latest episode of our Section 230 podcast series “Otherwise Objectionable,” I spoke with the two people who prevented this disaster — Section 230’s authors Chris Cox and Ron Wyden — along with the lawyer who represented Stratton Oakmont (yes, the “Wolf of Wall Street” firm) in the lawsuit that nearly broke the internet.

This history is often misunderstood. At the time, the internet was starting to fill up with garbage and nonsense (spam), and internet services were trying to figure out how they could create more curated communities, such as Prodigy’s attempt to make a “family friendly” space.

The Stratton Oakmont v. Prodigy ruling threatened all of that, by saying that if you did any moderation at all, then it was possible that you could be held liable for any content you left up.

In that world, you’re left with what’s referred to as “the moderator’s dilemma.” You either very, very carefully review absolutely everything, making you more like a TV station or book publisher, in which case only a few bits of content get out each day. Or you moderate nothing and let there be a total free-for-all of spam, abuse, and harassment.

Cox and Wyden understood that this would be a disaster. They recognized that the value of the internet was that you could have something better: a place where different kinds of communities could be cultivated. You didn’t need it to be a top-down “broadcast” style model like a publisher or a TV station, but also, you didn’t need it to be everything for everyone.

Section 230 created a third way — one that enabled websites to moderate content as they saw fit without facing liability for everything they missed. This wasn’t just a technical legal fix — it was a fundamental recognition that the internet could enable new forms of communication and community that didn’t fit into old regulatory boxes.

Companies: prodigy, stratton oakmont


Comments on “Otherwise Objectionable: The Internet At Risk”

13 Comments
Anonymous Coward says:

Re:

Well, I suppose it won’t matter since they’ll probably not vote it down if it gets put to the floor.

Both because it’s not a direct repeal (more of a delayed one) and because it’s another cyanide pill wrapped in “think of the kids!!” candy.

Really wish there was a way to see how the ENTIRETY of congress’ members feel about it, but even then I’m not sure if I’d like the answer.

I don’t know how to keep going if they get rid of the internet.

Mike Blumenthal (user link) says:

When does a platform become a publisher

I understand and appreciate the relationship of Sec 230 to content moderation and the problem it solves. But it would seem that 1) there are some loopholes that need clearing up, and 2) the boundaries between a platform and a publisher need clarification, as all too often the likes of Google hide behind the former when they are in reality the latter.

As to point 1, an example of this is in the review space where Google isn’t obligated to moderate or remove a review even if it is found to be defamatory.

RE point 2 – Google assembles and mixes and matches data from a multitude of sources, rearranges it in unique ways, and then says that it is content created by others and posted on their platform. An example of this is the local Business Listings. A related, coming example is AI-generated search results.

In both cases there need to be quick and easy avenues to fix the output and prevent additional harm. Currently there is no recourse, and Google doesn’t implement any safeguards because they have Sec 230’s categorical protections.

This might not require a total abandonment of 230, but surely some modifications are in order?
