The livestreaming of the New Zealand mosque shooting on Facebook Inc. and other internet platforms highlights the urgent need to combat the spread of abusive content online, but imposing overly restrictive regulations is not the best approach, legal experts said.
Last week, scenes of a gunman killing at least 50 people and wounding several others at two mosques in Christchurch, New Zealand, were streamed live on Facebook and also posted on Twitter Inc. and Google LLC-owned YouTube LLC before circulating throughout the wider internet. Facebook defended its response to the video uploads, saying in a March 18 blog post that it removed the attacker's video "within minutes" after New Zealand police flagged it and deleted the Facebook and Instagram Inc. accounts of the alleged shooter.
Facebook said it removed about 1.5 million videos within the first 24 hours of the attack and blocked over 1.2 million of those videos at the time of upload. The video was viewed about 4,000 times in total before being removed from Facebook, the company added.
Flowers laid outside the Al Noor mosque in Christchurch, New Zealand.
Source: Associated Press
Mary Anne Franks, a law professor at the University of Miami who specializes in criminal law and First Amendment law, said Facebook and its counterparts have long had a "devil-may-care attitude" about their livestreaming offerings and must address whether the products are doing more harm than good.
"You shouldn't be inflicting [livestreaming services] upon the public unless you can use it responsibly," Franks said in an interview. "And if the answer is [livestreaming] cannot be used responsibly, then it shouldn't exist."
Jeff Kosseff, an assistant professor of cybersecurity law at the United States Naval Academy, agreed that Facebook and other internet giants must step up their content moderation efforts, but noted that it is nearly impossible to keep all offensive content off the platforms.
"We can't just say improving moderation is the magic solution," Kosseff said. "It would be overly optimistic to think that any platform would be able to moderate all harmful content while keeping up all unobjectionable content."
Kosseff added that social media companies' self-regulatory efforts in policing abusive content are a more effective solution than stringent legislation. "I get a little nervous about the government imposing too many content restrictions because that's very contrary to First Amendment values," Kosseff said.
Facebook CEO Mark Zuckerberg, for instance, has touted his company's increased investments in security of late, including developing more advanced artificial intelligence techniques to remove fake accounts and adding more security reviewers, among other measures. Speaking on Facebook's earnings call in January, Zuckerberg acknowledged that the company's increased security investments "has affected our profitability," but added that it is "the right thing to do."
For her part, Lydia de la Torre, a law professor at Santa Clara University with expertise in data protection law, said any "knee-jerk" legislation pressed on internet giants could undermine the benefits those companies also offer consumers.
"We do have unfortunate situations where people abuse the system, but also we have situations where people enjoy sharing things instantly. It's a question of how you balance that," de la Torre said in an interview.
Rep. Bennie Thompson, a Democrat from Mississippi who chairs the House Committee on Homeland Security, wrote a March 19 letter to Zuckerberg, YouTube CEO Susan Wojcicki, Twitter CEO Jack Dorsey and Microsoft Corp. CEO Satya Nadella, requesting a briefing to describe how the New Zealand shooting videos spread on their respective platforms.
"Your companies must prioritize responding to these toxic and violent ideologies with resources and attention," Thompson wrote in the letter. "If you are unwilling to do so, Congress must consider policies to ensure that terrorist content is not distributed on your platforms — including by studying the examples being set by other countries."
Political leaders across the globe have called on Facebook and its peers to accept more responsibility for the extremist material posted on their sites.
New Zealand Prime Minister Jacinda Ardern said in a recent speech to New Zealand's Parliament that social media platforms "are the publisher, not just the postman," of abusive content and must be more accountable for what is shared on their sites.
"There cannot be a case of all profit, no responsibility," Ardern said.