Lawmakers are increasingly scrutinizing how companies like Meta Platforms Inc.'s Facebook use algorithms to push certain content into users' newsfeeds, and they are introducing legislation that would hold those platforms accountable for misinformation and posts promoting harm.
The U.S. House Energy and Commerce Committee's subcommittee on communications and technology will hold a hearing on Dec. 1 focused on bills designed to hold Big Tech companies accountable for the effects of those algorithms, including the Justice Against Malicious Algorithms Act of 2021. The bill was introduced in October by lawmakers including Rep. Frank Pallone Jr., D-N.J., chairman of the Energy and Commerce Committee. It is designed to lift the Section 230 liability shield when an online platform knowingly or recklessly uses an algorithm to recommend content that contributes to physical or severe emotional harm.
Another bill, called the Safeguarding Against Fraud, Exploitation, Threats, Extremism, and Consumer Harms Act, or SAFE TECH Act, would reform Section 230 and allow social media companies to be held accountable for enabling cyberstalking and targeted discrimination on their platforms. The Protecting Americans from Dangerous Algorithms Act would remove Section 230 immunity for a platform if its algorithm is used to amplify or recommend content that leads to offline violence.
Opponents of these bills, however, say they are problematic: weakening Section 230 protections could expose online platforms to frivolous lawsuits and discourage startups from entering the industry.
In general, Section 230 allows online platforms — from Facebook to individual bloggers — to avoid liability for what users are posting, said Sophia Cope, senior staff attorney for the Electronic Frontier Foundation, a nonprofit focused on digital rights.
"There are so many small and niche community message boards and websites for hobbies and different health interest groups," Cope said. "Without Section 230, if one commenter defames another commenter, that blogger is going to potentially be on the hook because the commenter said something that is legally actionable. That's going to disincentivize your average person from hosting a blog and letting people comment on it."
If Section 230 immunity goes away, platforms large and small will likely have to defend themselves through lengthy and expensive litigation, she said. The prospect creates an incentive to mitigate the risk of being sued, either by changing the way people connect with each other online or by getting out of the business entirely, Cope said.
"Yes, there's a lot of bad content on the internet, but there's a lot of good content too. And a lot of users do appreciate personalized algorithms because they are finding other groups or other content that is relevant to them that's in line with their interests and beliefs," Cope said.
| Date | Event |
| --- | --- |
| Dec. 1 | The U.S. House Energy and Commerce Committee's subcommittee on communications and technology will host a hearing focused on holding Big Tech companies accountable by reforming Section 230. |

Industry, legal and think tank events

| Date | Event |
| --- | --- |
| Dec. 1 | The Financial Times is hosting an event titled "Maximizing the Value of AI-Human Collaboration in the Workplace." |
| Dec. 2 | The Federal Communications Bar Association, in partnership with several groups, will hold the first session of its third annual women's summit series, titled "Decryption Key: Unlocking Women's Excellence in Cybersecurity Law and Policy." |
| Dec. 2-3 | George Mason University's Center for Intellectual Property x Innovation Policy will present a conference titled "Intellectual Property and Innovation Policy for 5G and the Internet of Things." |
Stories of note:

- Social media critics may have wrong target in federal push for algorithm opt-out
- Hulu's live TV offering to integrate Disney+, ESPN+ streaming services
- Walmart, Target beat Q3 expectations as store, online sales jump before holidays