Legal event tackles Section 230's effect on social media moderation

Twenty-five years after Section 230 of the Communications Decency Act became U.S. law, federal legislators on both sides of the aisle agree that the Big Tech liability shield should be reformed — if for different reasons.

The Federal Communications Bar Association will hold the third session of its Online Platform Regulation Series, titled "Social Media Content Moderation," on July 8. The session will explore how Section 230 case law affects the moderation of content on platforms and whether platform algorithms should be regulated. Brendan Carr, a commissioner on the Federal Communications Commission, will be a guest speaker.

Over the years, Section 230 has been upheld as a provision that protects online providers from liability. "The main point was to allow the internet back in 1996 to flourish — not have online service providers be concerned that when a person or entity posts something, that they are going to immediately be liable," said Michelle Cohen, chair of the data privacy practice at the law firm Ifrah Law.

But now, both Democrats and Republicans have argued that the immunity provided to Big Tech companies such as Facebook Inc. and Google LLC goes too far.

Democrats are generally concerned that the law deters companies from efforts to stop the spread of misinformation and harassment online, said David Greene, civil liberties director for the Electronic Frontier Foundation. "They want the companies to be more active in taking things down," Greene said.

Companies do take down a "ton of stuff" that is either legally actionable or is just content they know their users do not want to see, Greene said. "Hate speech is rarely legally actionable, but many users don't want to see it," Greene said. "There are extensive efforts by almost every intermediary."

Republicans have been concerned with social media companies' ability to censor content, especially after former President Donald Trump was de-platformed by Facebook and Twitter Inc., Greene said.

Lawmakers on both sides of the aisle have introduced proposals to change the law.

In February, Sens. Mark Warner, D-Va.; Mazie Hirono, D-Hawaii; and Amy Klobuchar, D-Minn., announced the Safeguarding Against Fraud, Exploitation, Threats, Extremism and Consumer Harms, or SAFE TECH, Act to reform Section 230. The proposal is designed to ensure social media companies are held accountable for enabling cyber-stalking, targeted harassment and discrimination on their platforms.

"Section 230 has provided a 'Get Out of Jail Free' card to the largest platform companies even as their sites are used by scam artists, harassers and violent extremists to cause damage and injury," Warner said in a news release.

In June, Sen. Marco Rubio, R-Fla., introduced the Disincentivizing Internet Service Censorship of Online Users and Restrictions on Speech and Expression, or DISCOURSE, Act. The bill aims to update Section 230 to ensure that a market-dominant firm no longer receives protections when it actively promotes or censors certain material or viewpoints, including through the manipulative use of algorithms.

"Big Tech has destroyed countless Americans' reputations, openly interfered in our elections by banning news stories, and baselessly censored important topics like the origins of the coronavirus," Rubio said in a news release.

Big Tech companies themselves are advocating for more accountability and oversight in content moderation. Facebook CEO Mark Zuckerberg testified before Congress in March that while liability protections should remain intact, online platforms should be held to higher standards for moderating content.

"Instead of being granted immunity, platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it," Zuckerberg said.

Google CEO Sundar Pichai advocated in his testimony for fairer and more effective processes for addressing harmful content.

"Solutions might include developing content policies that are clear and accessible, notifying people when their content is removed and giving them ways to appeal content decisions, and sharing how systems designed for addressing harmful content are working over time," Pichai said.


Federal Communications Commission events
July 8 The Federal Communications Commission's Task Force for Reviewing the Connectivity and Technology Needs of Precision Agriculture in the United States will hold a meeting at 10 a.m.
Industry, legal and think tank events
July 6 The Brookings Institution will host a webinar at 2:30 p.m. titled "Reconciliation 101: An explainer of the budget process."
July 7 NATOA will hold its policy and legal committee meeting at 3 p.m. to discuss issues of interest on Capitol Hill, at the FCC and in the courts.
July 8 The FCBA will host the third session of its Online Platform Regulation Series titled "Social Media Content Moderation."

Stories of note:

US consumers more concerned about data protection amid pandemic – 451 Research

Retailers continue online grocery investments despite slowing sales

US broadband proposal a 'down payment' on closing digital divide, experts say

COVID-19 broadband subsidy program sees 'strong start,' but more outreach needed