Section 230 primarily helps large social media platforms.
Section 230 protects internet sites and users by providing a legal basis for organizations of all shapes and sizes to moderate content. It prevents internet service providers (ISPs), internet sites of all sizes, and users from being held liable for objectionable content posted by other users. Section 230 doesn’t just apply to social media platforms.
It also protects online services that provide volunteer community moderation, such as message boards, as well as other organizations including PTAs, schools, and libraries. Without the protection Section 230 provides, many of these organizations could face crippling lawsuits over user-posted content.
Unfortunately, only the largest corporations or organizations could withstand the wave of litigation over user-posted content that could follow if Section 230 is weakened or repealed.
Under Section 230, internet companies don’t have an incentive to moderate user content because it gives them blanket immunity.
Internet companies must still moderate because Section 230 does not provide unconditional legal immunity. Section 230 allows platforms to decide what content appears on their sites and to remove "objectionable" content without fear of legal liability. Because it does not provide blanket protection, companies still need to remove some types of harmful content, such as spam or obscene material, from their platforms to protect users.
Section 230 provides limited immunity to internet sites that allow user-generated content. Since Section 230 went into effect, courts have ruled in multiple cases that there are limits to these protections. For example, providers have no legal immunity if they materially contribute to illegal user content.
Section 230 has failed and needs to be reformed.
The Internet would not be the engine of economic growth and force for bringing the world closer that it has become without the protections of Section 230. Policymaker concerns are rooted in the way some companies have moderated content under Section 230. That does not mean the law has failed. It means the internet community must work constructively to ensure users have a better understanding of the rules of the online services they are using.
Section 230 allows internet companies to innovate and moderate content, helps promote freedom of expression online, and enables online businesses to offer products, services, and features that users expect.
Section 230 does not provide blanket immunity to online businesses. Some kinds of content — including material that’s considered criminal, like child pornography — do not have Section 230 protections. Law enforcement agencies can prosecute online businesses that host illegal content.
Changing Section 230 will only impact the big companies, like Facebook and Google.
Some proposals to overhaul Section 230 could have a devastating impact on millions of small and medium-size internet-based companies by imposing costly and inflexible content moderation requirements. This would have the unintended outcome of further entrenching the largest online platforms.
Big companies can afford to build expensive moderation programs to review user-generated content. Other companies rely on technologies such as filters to flag obscene words and images or content that violates their terms of service. Organizations need flexibility to ensure that the content moderation policies on their sites make sense for their particular focus or service.
Smaller companies cannot afford to defend themselves against lawsuits by users upset about content moderation policies. They may opt instead to change their sites to avoid certain types of user-generated content altogether.
The First Amendment gives people the right to say anything they want on the Internet.
The First Amendment applies to the government, not private organizations or companies. In fact, it empowers private entities to make their own decisions about association. Just as a retail store, as a private business, is free to bar service to someone speaking rudely toward staff or other customers, internet companies are also free to set and enforce standards for appropriate content and behavior on their platforms.