U.K. government plans for tougher regulation of online platforms have this week been criticized by tech giants, but lawyers say the proposals are in line with a Europe-wide movement away from self-regulation.
The Online Harms White Paper, currently under consultation, proposes greater accountability for content-hosting platforms by imposing a "duty of care" on them. Companies such as Facebook Inc. and Alphabet Inc.'s Google LLC would have to take more responsibility for "harm caused by content or activity" on their services.
An independent regulator would be given powers to fine or even ban online platforms that fail to swiftly remove violent or extremist material, posts encouraging suicide or self-harm, child abuse, and misinformation or fake news.
The Internet Association, or IA, a lobby group that represents tech firms including Facebook, Google and Snap Inc., released a report May 29 criticizing the proposals as legally vague and "unmanageable."
The group says the proposals are not "sufficiently targeted or proportionate to the harms they are designed to minimize," and risk undermining users' privacy.
"The proposals in the Online Harms White Paper present real risks and challenges to the thriving British tech sector, and will not solve the problems identified," IA's executive director Daniel Dyball said.
However, lawyers say the U.K.'s approach is broadly consistent with other efforts in Europe to hold companies to account if they fail to remove harmful content.
Peter Wright, managing director and solicitor at Digital Law U.K., said the environment in Europe is increasingly tilting toward online regulation, with the burden falling on tech firms.
"We shouldn't be surprised that the companies themselves are now turning round and saying 'please think twice before you throw any more requirements on top of us,'" Wright said. "Because ultimately more content moderation will require more staff, and that will cut into bottom lines."
Christian Peeters, an antitrust lawyer at DWF, said tech firms could face similar directives throughout Europe as nations come to see online-platform accountability as the best way to protect their citizens.
However, the U.K.'s idea of imposing a duty of care on companies "could be a little more complicated," said Steve Kuncewicz, a partner at law firm BLM. The phrase has a specific legal meaning in the U.K., Kuncewicz said: in tort law, for example, it imposes liability on individuals, who can then be sued for breach of that duty.
"[The government] will want to avoid creating a new reason to sue these tech giants," Kuncewicz said.
What constitutes a breach will likely be defined by the code of practice that results from the consultation on the proposals, Kuncewicz said. "So there's an awful lot left to do [for the proposals to be legally sound]."
Any duty created by the new law is unlikely to be "a new weapon for people to deploy if they want to sue a big tech company," Kuncewicz said. "It's meant to transfer a lot more responsibility back over to the companies."
The IA plans to submit final recommendations when the government's 12-week consultation with affected parties ends July 1.