Technology executives warned lawmakers that popular online social forums like Reddit, YouTube and Instagram would face dire consequences if Congress stripped internet companies of a key liability shield that protects them from lawsuits over user-generated content.
Specifically, Section 230 of the Communications Decency Act exempts providers and users of "interactive computer services" from liability for content published on platforms by third parties. However, the law has proven controversial in recent years, with bipartisan calls for reform. At a joint Congressional subcommittee hearing on Oct. 16, House representatives mostly rejected the idea of an outright repeal of the law but suggested that revisions to existing liability protections may be warranted.
Asked at the hearing what Reddit Inc. would look like if it were held legally liable for the content its users post, CEO Steve Huffman said he was "not sure Reddit as we know it could exist." He suggested that removing the liability shield was likely to result in overly broad mechanisms for monitoring and removing potentially questionable content. Even small changes to the law would have "outsized consequences" for "what little competition remains" in the company's industry, he added.
Huffman said in written testimony that moderating all content on the platform would require tens of thousands of contractors. "Medium, small, and startup-sized companies don't have the resources for this," he said. "This approach has questionable utility anyway since even tens of thousands of contractors don't scale with hundreds of millions of users, let alone billions."
Katherine Oyama, global head of intellectual property policy at Alphabet Inc.'s Google LLC, said in written testimony that a repeal of the law could harm video platforms and search engines and limit their ability to filter content.
"Video platforms like YouTube [LLC] and content-sharing apps like Instagram [LLC] might face legal claims for removing videos they determined could harm or mislead users," she said.
Additionally, without the law, Oyama said "search engines, video sharing platforms, political blogs, startups, and review sites of all kinds would either not be able to filter content at all (resulting in more offensive online content, including adult content, spam, security threats, etc.) or would over-filter content (including important cases of political speech)."
House lawmakers from both major political parties assured company officials that they were not seeking to repeal the law.
Rep. Cathy McMorris Rodgers, R-Wash., ranking member on a consumer protection subcommittee within the full committee, said it should not be any government agency's job to moderate free speech online.
"Misguided and hasty attempts to amend or even repeal Section 230 for bias or other reasons could have unintended consequences for free speech and the ability for small businesses to provide new and innovative services," she said.
However, she said it is incumbent upon policymakers to "have a serious and thoughtful discussion about achieving the balance on Section 230."
Her Democratic counterpart, Rep. Jan Schakowsky, D-Ill., chair of the consumer protection subcommittee, also said she was not interested in eliminating the law but in ensuring it keeps pace with changes in the tech landscape.
When asked for concrete suggestions about what types of changes the committee could explore, Hany Farid, a professor at the University of California, Berkeley's School of Information, said companies should be compelled to show they are performing "reasonable content moderation."
Still, Farid warned that making changes could harm innovation.
"If we start regulating now, the ecosystem will become even more monopolistic," he said. "We have to think about how do we make carveouts for small platforms."