24 Feb, 2026
As the US federal law that shields online platforms from liability for user-generated content marks its 30th anniversary in February, emerging technologies such as generative AI and e-commerce marketplaces are raising liability questions the statute was not designed to address.
AI companies produce content rather than host it, and marketplace platforms control fulfillment and logistics while maintaining that they merely list third-party products. Neither model fits neatly into Section 230 of the Communications Decency Act, which was designed for platforms that passively host user speech. The law states that no provider or user of an interactive computer service shall be treated as the publisher of information provided by another party, and that providers and users may not be held liable for voluntarily removing objectionable material in good faith.
Eric Goldman, associate dean for research at Santa Clara University School of Law, said the question of AI liability is best understood as a spectrum. If a GenAI model produces output identical to its training data, it is essentially publishing third-party content, and Section 230 could apply. However, if the model fabricates information with no basis in its training material, it may qualify as an information content provider under the law, potentially stripping it of that immunity.
"I don't think we're going to have a single one-size-fits-all answer," Goldman said. "I think that binary is not realistic."
Old vs. new
Jennifer Huddleston, a technology policy research fellow at the Cato Institute, said many scenarios that appear to involve AI and Section 230 do not raise new legal questions. When a user reposts AI-generated content on social media, the platform is still sharing user-generated content, and the standard liability applies. The more difficult question arises only when the AI itself is the speaker.
Huddleston cautioned against legislative proposals to carve out AI from the statute, noting that AI is already embedded in features such as spam filters and search personalization.
"When you start to have an increasing number of carve-outs, it can create an increasingly complicated regime that raises compliance costs, particularly for small platforms," Huddleston said.
Peter Chandler, executive director of Internet Works, a trade association representing midsize platforms including Reddit Inc., Etsy Inc. and Vimeo.com Inc., said the association's members are integrating AI for functions such as improving product listings and matching job seekers with openings. Chandler said those are not high-risk uses of the technology, and that any AI policy affecting Section 230 should reflect that distinction.
"Whatever policymakers decide to do with AI, it needs to be tied to risk," Chandler said.
India McKinney, director of federal affairs at the Electronic Frontier Foundation, said the organization is still determining how Section 230 applies to GenAI. McKinney pointed to unresolved questions about whether responsibility for problematic AI output lies with the model or with users who deliberately circumvent built-in safeguards.
Buyers and sellers
For e-commerce, federal courts are divided on whether platforms that handle warehousing, shipping and payment processing qualify as sellers under state product liability law. Parham Nikfarjam, a senior trial attorney at J&Y Law who has litigated product liability cases against marketplace platforms, said courts are increasingly examining what a platform does rather than what it calls itself.
Nikfarjam pointed to the 2020 Bolger v. Amazon.com Inc. case, in which a consumer was severely burned by a defective laptop battery purchased from a third-party seller on Amazon. A California appellate court reviewed Amazon's operational role in handling payments, shipping, returns and customer communications, and held that the company could be strictly liable for the defective product.
"The court didn't get hung up on what Amazon calls itself. It walked through what Amazon did," Nikfarjam said. "The product moved through Amazon's ecosystem; Amazon handled payment, shipping, returns, and it structured communications so the customer's relationship ran through Amazon rather than the upstream seller. That's the function-based analysis in action. It was Amazon's operational role that created the risk exposure."
Nikfarjam said platforms are already attempting to compartmentalize responsibility by framing product recommendations as "just information" and fulfillment as "just logistics" to avoid liability across both functions.
Chandler said a blanket repeal of Section 230 would punish companies across the industry for concerns primarily directed at a handful of large social media platforms. Some of Internet Works' member companies have fewer employees than the largest tech firms have in their legal departments, he said.
"We take the position of working with us to make sure that you're actually accomplishing the intended goal, not scooping up a whole bunch of companies as collateral damage, because we couldn't take the time to get the details right," Chandler said.
'Perfectly imperfect'
Goldman said AI companies are more likely to seek a federal preemption framework modeled on Section 230 than defend the statute directly. According to Goldman, claiming its protection would require them to acknowledge they are publishing content. "That strikes me as not the kind of arguments they're currently advancing," he said.
McKinney said Congress has more direct tools to address the market power of large tech companies, pointing to comprehensive federal privacy legislation and antitrust enforcement. She noted that Meta Platforms Inc. endorsed Section 230 reform in a full-page ad in The New York Times Co. because the company can absorb the litigation costs of a weakened liability shield while smaller competitors cannot.
The expectation that any statute can eliminate harmful content online misunderstands the problem, she said.
"You can't tech your way out of a problem you didn't tech your way into," McKinney said. "The internet is not a perfect utopia, and it never was going to be, because it's dependent on the users. We are perfectly imperfect, as we are in society, and so is the internet."
What's happening this week?
Below is a list of hearings, webinars and other technology, media and telecom-related events taking place virtually and in person in the nation's capital and beyond this week:
Feb. 24
➤ S&P Global Market Intelligence: The Evolution of Earnings Call Sentiment Analysis from Lexicons to LLMs
➤ Senate Armed Services Committee: Hearings to examine rebuilding American critical minerals supply chains.
➤ House Education and the Workforce Subcommittee on Early Childhood, Elementary, and Secondary Education: Building an AI-Ready America — Teaching in the AI Age
➤ House Foreign Affairs South and Central Asia Subcommittee: Strengthening Export Control Enforcement
Feb. 25
➤ S&P Global Market Intelligence: GenAI VC Funding: A 2026 Outlook
➤ American Enterprise Institute: A Conversation with Darío Gil — AI for Scientific Discovery
Feb. 26
➤ Cato Institute: Section 230 at 30 — The Past, Present, and Future of Online Speech and the 26 Words That Created the Internet
Feb. 27–28
➤ University of Chicago Institute of Politics: Tech for Good Conference