Tech companies Twitter Inc., Facebook Inc. and Google Inc. drew flak from members of the British parliament for repeatedly failing to remove abusive, violent and extremist content on their platforms.
During a Dec. 19 hearing of the House of Commons home affairs committee, Sinead McSweeney, Twitter's vice president of public policy and communications for Europe, the Middle East and Africa, said removing offensive tweets normally takes "a day or two." She could not guarantee, however, that Twitter would ever be free of abuse.
"The world in which we are now operating has become a much, much more challenging place in terms of the different forms of extremism," McSweeney said.
McSweeney and executives from Facebook and Google attended the hearing to testify about their progress in tackling hate speech and illegal content on their platforms.
For his part, Nicklas Berild Lundblad, Google's vice president of public policy and government relations for Europe, the Middle East and Africa, described how the Alphabet Inc. unit has invested in machine learning and artificial intelligence technologies to remove hate speech from platforms such as YouTube.
Lundblad said it is "very important" to address the issue, especially after Google lost revenue from companies that pulled their ads from YouTube after they appeared next to extremist content.
Asked about placing online platforms under stricter regulation by watchdog Ofcom, Lundblad said such oversight might work for YouTube, but he noted that online media are "regulated differently than traditional media."
Meanwhile, Simon Milner, Facebook's policy director for the U.K., Middle East and Africa, outlined his company's efforts to counter online terrorism. These include partnerships to encourage the "positive use" of Facebook, as well as a guide to help keep Muslims safe on the social network.
Milner also shared his thoughts on a recently implemented law in Germany that requires social networks to take down hate-fueled posts within 24 hours after a complaint is lodged.
"One of the concerns is that you are asking our companies to decide 'what is illegal,' rather than the courts. We think that is problematic for national jurisdictions," Milner said.
During a heated exchange, Conservative MP and committee member Tim Loughton hit out at how the social media giants allegedly profit from abusive content.
"You are profiting — I'm afraid — from the fact that people are using your platforms to further the ills of society, and you're allowing them to do it and doing very little proactively to prevent them," Loughton told the representatives present.
Committee Chair Yvette Cooper also called attention to violent tweets hurled at fellow MPs, which were reported months ago but remained on Twitter. The posts included threats against Prime Minister Theresa May as well as anti-Semitic tweets directed at Labour MP Luciana Berger.
"What is it that we have got to do to get you to take them down? It is very hard for us to believe that enough is being done when everybody else across the country raises concerns," Cooper concluded.
The hearing was part of the home affairs committee's continuing inquiry into online abuse, hate and extremism. The committee published a report into the matter in April.
