YouTube LLC and Facebook Inc. should open up their algorithms for scrutiny to ensure they are not contributing to the spread of misinformation, Firefox-maker Mozilla Corp. said.
With social media now the most popular source of news, there is renewed urgency to expose how misinformation and extremist content fuel polarization and radicalization, Mozilla's VP of advocacy Ashley Boyd said at a press briefing in London that coincided with its annual tech festival, Mozfest.
This is best achieved by researchers probing the algorithms that recommend content to users on the world's biggest online platforms, whether it be YouTube's recommendation engine that curates videos or Facebook's news feed, according to Mozilla fellow and privacy advocate Guillaume Chaslot, a former YouTube engineer.
The solution could be to give users more control over the so-called "filter bubbles" that these machine-learning systems allegedly create, he said. Platforms should alert users to their use of artificial intelligence and allow them to exit a recommendation "rabbit hole," he added.
Reacting to the public backlash over the spread of conspiracy theories on its platform, YouTube in January tweaked its algorithm to clamp down on such videos. Typically, its algorithm recommends 200 million videos daily on its home page and is responsible for 70% of what users watch. Recommendations on mobile devices keep users watching for more than an hour at a time on average.
Chaslot conceded that as long as big tech's end goal is digital advertising revenue, driven by higher user engagement, there is no incentive for it to be more transparent.
"There are a lot of groups watching what [these companies] are doing," Boyd said. "We will hold you accountable."
With the 2020 presidential election approaching, online misinformation is on the rise in Europe and the U.S., with increased scrutiny of Facebook's role.
Clara Tsao, an online disinformation researcher and a former chief technology officer of the U.S. government's Countering Violent Extremism Task Force, said her job is to communicate concerns to the big tech companies.
The management at these companies wields enormous power over defining credible content and the limits of free expression, Tsao said. So it is important to probe their backgrounds and the extent and quality of their training for rooting out misinformation, she added.