Facebook is currently fighting a battle against fake news. News items invented to advance a political position, or to line the pockets of opportunists by confirming readers’ world views, have been something of a hot-button issue since Trump took the White House last November. But new research from The Media Insight Project suggests our current strategies may all be for naught, thanks to humans’ preference for friends’ opinions.
That’s heartening in one way, but it has an unfortunate knock-on effect: people are more likely to switch off their critical faculties if someone they respect appears to have done the legwork for them. And that spells trouble for Facebook’s first iterative attempt at slaying fake news. But more on that in a moment.
For the study, the researchers worked with a sample of 1,498 American adults. Each was shown a simulated Facebook health news item shared by one of eight public figures, including Oprah Winfrey, Dr Oz and America’s surgeon general. Half of the participants saw the news shared by the figure they trusted most, while the other half got it from the one they trusted least. On top of this, the article itself was attributed either to The Associated Press or to a fictional news site called DailyNewsReview.com.
51% of participants who received the article from someone they trusted said it was well reported and trustworthy, compared with just 34% who read the same article via an untrusted sharer. That’s interesting in itself, but what’s more insightful is that the news source had only a minor impact on that figure. An item from the AP was rated accurate by 52% of participants when shared by a trusted figure, and by just 32% when shared by an untrusted one. The fictional news source was still believed 49% of the time if the original sharer was trusted.
In other words, a trusted personal endorsement seems to increase trust in the content being read – even when the underlying source is extremely dubious. On top of that, participants who trusted the sharer were more likely to recommend the news source to a friend and follow it on social media.
But back to Facebook’s fake news solution. This week in America, certain Facebook users have started to see warnings when they attempt to share news that independent fact-checkers at Snopes and the Associated Press have found wanting.
That’s the idea anyway. Of course, every action has an equal and opposite reaction.
Lead image: Quim Gil, used under Creative Commons