TOPLINES:

  • InternetWorks believes in allowing internet companies to set the standards for how they innovate in content moderation, promote freedom of expression online, and enable online and offline businesses to offer products, services, and features that users have come to enjoy and expect.

 

  • The internet would not be the engine of economic growth and a force for bringing the world closer together without the protections of internet freedom and competition-driven innovation.

 

  • InternetWorks represents a diverse cross-section of internet platforms and organizations that have benefited from both open competition and the flexibility to moderate and curate content; as such, they have built loyal user bases and trust in their brands by consistently delivering high-quality, useful content. Our goals are to ensure our diverse user base is represented in important policy conversations and to preserve the internet as a place of limitless possibility.

 

  • The current policy debate is focused on a few companies; however, it stands to impact online enterprises of all sizes and business models. State policies must recognize there is no one-size-fits-all approach to content moderation and that flexibility has helped the internet provide opportunities for a wide range of companies and organizations.

 

  • Well-intended but overreaching policy efforts will have a disproportionately severe impact on small and mid-sized platforms and early-stage, growing companies that are not as well resourced as the world’s largest tech companies. While the latter have legal and engineering departments with thousands of staffers and the financing to absorb massive regulatory costs, the vast majority of companies do not. Failure to recognize these differences unintentionally widens the competitive gap and benefits a handful of the internet’s largest companies.

 

KEY MESSAGES

Content moderation isn’t easy or inexpensive – it’s complicated. 

There is no one-size-fits-all approach to content moderation. Companies do it differently because they host different sorts of content.

  • A travel site might take down a cookie recipe because it’s not related to travel.

 

  • A PTA message board might take down messages about dating opportunities.

 

  • Or, a neighborhood group sharing locally relevant information might need the authority to moderate content themselves.

 

  • Other examples of harms caused by overreaching policies:

 

  • Online forums couldn’t function; they would be overrun by off-topic posts and spam.

  • Finding a job or a company where you’d love to work would be harder if there were no authentic job reviews (both positive and negative) available.

  • Community-run message boards and many review sites would likely disappear or never launch if volunteer moderators or smaller companies could be sued for every post they curate.

  • Neighborhoods couldn’t stay connected online in real time if moderators had to screen every single post before it appeared. Or they’d have to allow everything, including spam and profanity.

  • Online spaces would be overrun with hateful or explicit content if platforms were liable for every safety choice they made.

  • Building a website would be nearly impossible if web hosting and domain registration platforms were responsible for all the content on every website they hosted.

  • Finding quality products and authentic information online would be harder if businesses could sue every time they received a bad review; honest reviews would disappear.

  • Cloud storage wouldn’t exist if companies were legally responsible for all the personal information and data individuals store. Data sharing would move back to thumb drives sent through the mail.

  • Small businesses and creators couldn’t extend their reach to millions worldwide if marketplaces couldn’t moderate listings and protect consumers.

  • Protecting against cyberattacks would be harder if cybersecurity companies were legally responsible for the customer data that flows across their networks.

  • Planning a vacation itinerary for a place you’ve never visited would be more stressful if you couldn’t read authentic, first-hand reviews from real people who have been there.

  • Sharing ideas and knowledge would change drastically if platforms were responsible for each contribution and idea shared by users.

  • Finding freelancers would be much harder if you couldn’t verify an individual’s previous work through online recommendations and reviews.

  • InternetWorks companies use different methods to moderate content. Several InternetWorks companies find success in community moderation models and consider them a competitive advantage, while others take different approaches. These unique models are competitive business features on which platforms should have the liberty to innovate.

 

  • InternetWorks members make good-faith efforts to moderate their platforms to protect users and ensure they see only relevant content. Our members need the flexibility to present content in the manner that is most meaningful to their users. The modern-day internet is built on this flexibility, and users now expect access to a rich array of diverse content online.

 

Transparency Reporting

  • Some proposals to require precise details on the methods, training, and key terms used to moderate content would impact millions of small and medium-sized internet-based companies by creating costly and inflexible regulations on content moderation.

 

  • This would have the unintended outcome of further entrenching the position of the largest online platforms. 

 

  • Big companies can afford to design expansive moderation programs to review user-generated content. Other companies use technologies, such as filters, to flag obscene words and images or content that violates their terms of service. Organizations need the flexibility to ensure content moderation policies on their sites make sense for their particular focus or service.

 

  • Only a handful of large internet platforms have the resources and existing infrastructure to easily comply with the suggested proposals. Small and mid-sized platforms and organizations cannot afford to defend themselves against a deluge of lawsuits from users upset with thoughtful content moderation policies and actions. Instead, they may change their sites to avoid certain types of user-generated content altogether.

 

  • Requiring every platform that hosts third-party online content to take the same moderation approach risks imposing unnecessary costs that provide minimal benefit to users and hurt innovation. InternetWorks members support the work of millions of entrepreneurs and small businesses.

 

Child-Specific Privacy Policies

  • The online safety of users, children in particular, is paramount to InternetWorks members. Our platforms and organizations prioritize the safety and privacy of children who access their services. We strongly believe children deserve a heightened level of security and privacy, and companies across the industry are working to incorporate protective design features into their websites and platforms.

 

  • Our companies have been at the forefront of raising the standard for teen safety and privacy across our industry by creating new features, settings, parental tools and protections that are age-appropriate and tailored to the differing developmental needs of young people.

 

  • We encourage state lawmakers to be mindful of existing federal and state laws regarding children’s rights, data management, and privacy. Similarly, the ability to operate anonymously online is core to the customer experience in many contexts and should not be restricted.

 

New Challenges

Deplatforming Users

  • Companies and organizations have an obligation to provide a safe and secure environment for their users. Federal law enables platforms to craft their own moderation models to ensure users are protected from objectionable, illegal and dangerous content. 

 

  • In some instances, deplatforming is a common-sense way of preventing platform abuse that could harm other users (e.g., barring a malicious user from posting misleading reviews).

 

  • The policies platforms develop, and how they implement them, are competitive business decisions on which platforms should have the liberty to innovate.

 

Civil Rights Protections and Discrimination

  • Companies and organizations have an obligation to provide a safe and welcoming environment for their users. Section 230’s practical and flexible provisions allow platforms to craft moderation models that ensure users are protected from discriminatory or otherwise objectionable content.

 

How Platforms Handle Disinformation (Vaccines, COVID, etc.)

  • Companies and organizations have an obligation to provide a safe and secure environment for their users. Section 230’s practical and flexible provisions allow platforms to craft moderation models that identify and remove disinformation.

 

Internet Companies Are Violating Users’ First Amendment Rights

  • The First Amendment applies to the government, not private organizations or companies. 

 

  • Just as a retail store is free to refuse service to someone who is rude to staff or other customers, internet companies are free to set and enforce standards for appropriate content and behavior on their platforms.