One law that has been a point of political contention is Section 230, often referred to as the “internet free speech law.”
Some feel it allows companies like Facebook and Twitter to decide what information can and can’t be shared on their platforms, while others see it as an important safeguard that enables the free expression of ideas online.
The law affects everyone who posts anything online, whether on social media, a blogging site, or another platform, as well as the providers of those platforms.
The law can certainly be viewed through different lenses, but what does it actually say, and why is there so much partisan controversy surrounding it?
We’ll break down the facts of Section 230 below, so you can decide for yourself.
What is the “Internet Free Speech” Law?
Section 230 is part of the Communications Decency Act of 1996 (CDA). This legislation was originally enacted to restrict minors’ access to pornography on the internet.
This was in the early days of the internet, when there were few rules or safeguards. Even at the time, despite its good intentions, the law was criticized as a violation of freedom of speech.
Section 230 was part of that law, and though the CDA’s anti-indecency provisions were struck down as unconstitutional, Section 230 was not. It has been upheld as constitutional through several legal challenges over the years.
The 26 Words that Created the Internet
One of the terms used to describe Section 230 is “The 26 words that created the internet.”
Here’s what those 26 words say:
“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
In layman’s terms, this means that Facebook, and any other platform through which people can post information or communicate online, cannot be held legally liable for the content its users post.
You can imagine how fast social media services would shut down if they could be sued for everything their users posted.
You might be thinking, “But doesn’t that encourage free speech?” Yes, some people do see it that way. By shielding platforms that provide channels for communication from liability for what users post, the law helps keep those platforms open.
However, there is more to Section 230.
Why Are People Upset About Section 230?
The subsection containing the 26 words above appears under the heading “(c) Protection for ‘Good Samaritan’ blocking and screening of offensive material.”
What makes some people want to get rid of Section 230 is another protection that same subsection gives platform providers: immunity for censoring content.
The Act states:
“No provider or user of an interactive computer service shall be held liable on account of—
- any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected”
This means that platforms like Twitter, Facebook, and others can take down content they consider offensive without being held legally liable for those moderation decisions, and without the removal being treated as a violation of First Amendment free speech protections.
How platforms choose which content to take down is what the controversy is all about.
How Do You Gauge Screening of Offensive Material?
On one hand, just about anyone would agree that pornographic images are offensive content they would not want their children to see on social media. They might even be upset if a platform failed to remove or block that content.
But what about political hate speech? That’s where the “Good Samaritan” clause can get a little less clear and a lot more controversial.
In short, Section 230 says that:
- Internet platform providers can’t be held liable for offensive or libelous content posted by users on their platforms; and
- They are also allowed to censor that content without it being seen as a violation of constitutionally protected free speech.
What Else Is In Section 230?
Beyond the controversy, there are a few other things contained within Section 230. These include:
- Providers of an “interactive computer service” must notify users of any parental control protections available on the service that can be used to limit access to unwanted material.
- No liability can be imposed under a State or local law that contradicts Section 230.
- The law does not affect the enforcement of other laws, such as federal criminal law, intellectual property law, communications privacy law, and sex trafficking law.
How Secure Are Your Cloud Solutions?
Are the online activities of your users putting your company at risk? C Solutions can help your Orlando area business do a full cloud and web security review to ensure you’re properly protected.
Schedule a free consultation today! Call 407-536-8381 or reach us online.