The Next Phase in Fighting Misinformation

Over the last two years, Facebook has greatly expanded its efforts to fight false news: the company has gotten better at enforcing against fake accounts and coordinated inauthentic behavior; it is using both technology and people to fight the rise in photo and video-based misinformation; it has deployed new measures to help people spot false news and get more context about the stories they see in News Feed; and it has grown its third-party fact-checking program to include 45 certified fact-checking partners who review content in 24 languages. Overall, Facebook is making progress: multiple research studies suggest that these efforts are working and that misinformation on Facebook has been reduced since the 2016 US presidential election.

But misinformation is a complex and evolving problem, and Facebook has much more work to do. With more than a billion things posted to Facebook each day, the company needs to find additional ways to expand its capacity. The work its professional fact-checking partners do is an important piece of Facebook's strategy, but there are scale challenges involved. There simply aren't enough professional fact-checkers worldwide, and like all good journalism, fact-checking takes time, especially when it involves investigating more nuanced or complex claims. Facebook wants to be able to tackle more false news, more quickly.


So today, Facebook is kicking off a new collaborative process with outside experts that will help it home in on new solutions to fight false news at scale. The goal of this process is to arrive at externally vetted, consistent approaches that have the potential to help the company catch and reduce the distribution of greater quantities of misinformation, more efficiently.


Facebook knows this won’t be easy. Whatever it does next, it needs to find solutions that support original reporting, promote trusted information and allow people to express themselves freely. So the question is: how does Facebook come up with a model that serves people by giving them a chance to see the content they want, while also cutting down on misinformation, without Facebook becoming the judge of what is true? How does the company ensure a system that complements its existing fact-checking programs, so that professional journalists can spend their time doing original reporting on the hardest cases? How can it build a system that can’t be gamed or manipulated by coordinated groups of people? How can it avoid introducing personal biases into these systems? And what additional safeguards does it need in place to protect civil rights and minority voices?


Those are some of the issues Facebook will be exploring in the months to come.


A Collaborative Process


As Facebook has worked to expand its misinformation efforts over the past two years, it has also been doing extensive research and talking to outside experts to identify additional approaches that might bolster its defenses. One promising idea the company has been exploring would involve relying on groups of people who use Facebook to point to journalistic sources that can corroborate or contradict the claims made in potentially false content, as discussed in the video below.

Facebook’s head of research for News Feed Integrity, Apala Sabde, and University of Michigan professor Paul Resnick, a consultant to Facebook’s misinformation team and one of the many experts the company is working with on this topic, discuss Facebook’s early explorations into community-driven approaches to misinformation.
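To make the idea concrete, here is a minimal, hypothetical sketch of how community-submitted pointers to corroborating or contradicting journalistic sources might be aggregated into a signal for reducing a post's distribution. This is not Facebook's actual methodology; the data structures, weights, and thresholds are illustrative assumptions only.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical sketch: aggregate community-submitted source pointers into a
# per-post demotion signal. Names, weights and thresholds are illustrative
# assumptions, not Facebook's actual system.

@dataclass
class SourcePointer:
    post_id: str
    user_id: str
    source_url: str    # journalistic source the user points to
    stance: str        # "corroborates" or "contradicts" the post's claim
    user_trust: float  # 0..1, e.g. based on past agreement with fact-checkers

def demotion_scores(pointers, min_distinct_users=5):
    """Return a demotion score in [0, 1] for each post with enough signal.

    A post is only scored once enough distinct users have weighed in,
    which makes it harder for a small coordinated group to game the signal.
    """
    by_post = defaultdict(list)
    for p in pointers:
        by_post[p.post_id].append(p)

    scores = {}
    for post_id, ps in by_post.items():
        if len({p.user_id for p in ps}) < min_distinct_users:
            continue  # not enough independent signal yet
        support = sum(p.user_trust for p in ps if p.stance == "corroborates")
        dispute = sum(p.user_trust for p in ps if p.stance == "contradicts")
        total = support + dispute
        if total == 0:
            continue
        # A higher share of trusted "contradicts" pointers yields a stronger demotion.
        scores[post_id] = dispute / total
    return scores

# Toy usage: six distinct users point to a source contradicting one post.
if __name__ == "__main__":
    pointers = [
        SourcePointer("post1", f"user{i}", "https://example.org/report",
                      "contradicts", 0.8)
        for i in range(6)
    ]
    print(demotion_scores(pointers))  # {'post1': 1.0}
```

In any real system, a signal like this would presumably be only one input alongside professional fact-checks and other integrity measures, rather than a standalone judgment of truth.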


Over the next few months, Facebook is going to build on the explorations it has started around this idea, consulting a wide range of academics, fact-checking experts, journalists, survey researchers and civil society organizations to understand the benefits and risks of ideas like this. The company is going to share the details of the methodology it has been considering with these experts, to help them get a sense of where the challenges and opportunities are and how they can help Facebook arrive at a new approach. Facebook will also share updates from these conversations throughout the process, and find ways to solicit broader feedback from people around the world who may not be in the core group of experts attending these roundtable events.


Taking the fight against misinformation to the next level is an important task for Facebook. There are elections around the world month after month, only adding to the everyday importance of minimizing false news. Facebook plans to move quickly with this work, sharing some of the data and ideas it has collected so far with the experts it consults so that it can begin testing new approaches as soon as possible.


Source: Facebook newsroom
