In the wake of a series of mass shootings by perpetrators who posted hateful views online before committing their crimes, Congress is reevaluating the legal protections given to online platforms that host such comments.
Social media is shielded from some of the laws that apply to newspapers by Section 230 of the Communications Decency Act, which exempts providers and users of "interactive computer services" from liability for content published on their platforms by third parties. Any change to the law would significantly affect how social media companies handle user content, and big tech companies have lobbied against such changes even as federal legislators from both political parties have signaled interest in reforming the law.
While Facebook Inc., Twitter Inc. and Alphabet Inc.'s Google LLC have sought to crack down on abusive online posts, incidents such as the livestreaming of a deadly shooting attack on a New Zealand mosque in March highlight the difficulty of keeping such content offline.
At a Sept. 18 hearing of the U.S. Senate Committee on Commerce, Science and Transportation, Sen. Mike Lee, R-Utah, asked representatives of the three tech companies whether it would be more difficult to moderate content without the legal certainty that Section 230 provides. Their response: an unequivocal yes. If anything, they suggested the stakes were even higher than the question implied — not just for the tech sector, but for the American economy.
Derek Slater, global director of information policy for Google, said the Section 230 law is "part of the reason we have been a leader in economic growth and innovation and technological development."
Nick Pickles, public policy director at Twitter, called it a "fundamental part of maintaining a competitive online ecosystem."
Sen. Lee was not the only committee member to question the law at the hearing.
Sen. Richard Blumenthal, D-Conn., asked if tech companies "need more incentives to do more" to stop the spread of online hate speech from fueling acts of physical violence.
Republican Sen. Deb Fischer of Nebraska also wondered if tech companies need more accountability for content moderation.
It is not the first time federal lawmakers have suggested a change could be in the works. In April, House Speaker Nancy Pelosi said in an interview with Recode that it is "not out of the question" that the communications law could be changed. She also suggested that current law allows tech companies to shirk some responsibility for content posted on their platforms.
In the Senate, Josh Hawley, R-Mo., proposed a bill in June that would strip the law's protections from content moderation that is not "politically neutral." He and some other Republicans — including President Donald Trump — have claimed that mainstream tech companies including Facebook, Twitter and Google unfairly moderate content from conservative voices, a charge that company executives deny. Hawley's bill has no co-sponsors and has not yet been brought up for consideration.
Michael Beckerman, president and CEO of the Internet Association — a trade group whose members include Google, Facebook and Twitter — said the bill would force platforms to make the "impossible choice" between hosting speech protected by the First Amendment or losing legal protections that allow them to moderate content.
Beyond the specific bill, Beckerman added in a June opinion column that rolling back Section 230 of the communications law would "do irreparable harm" to the internet and society as we know it.
"Other countries that don't have something like it suffer," said Google's Slater at the Sept. 18 hearing.
