Fake News is Spreading Thanks to Information Overload
We’re being overwhelmed by social media posts—and our judgment is suffering as a result.
In the beginning, the internet was supposed to be the great equalizer, making knowledge accessible to everyone and giving all citizens a voice.
Today, knowledge is more accessible, and more people have a voice, but it’s also increasingly hard to tell what’s real and what’s not.
Blame social media. On networks like Facebook and Twitter, high- and low-quality information—read: accurate and inaccurate—has an equal chance of success, according to a paper published today by Diego F. M. Oliveira, Filippo Menczer, and a team of researchers.
On social networks, they said, accuracy does not necessarily determine popularity. “Quality is not a necessary condition for virality,” Menczer said.
Our ability to discriminate accurate information can be impeded by things like confirmation bias—where we are more likely to accept information if it matches our own beliefs—and popularity bias—where we view something as high quality because it is shared frequently.
“Even when individual users can recognize and select quality information, the social media market rarely allows the best information to win the popularity contest,” Oliveira said.
It’s a contest that is growing bigger by the day. According to a 2016 Pew Research Center report, 62% of adults in the U.S. get news from social media sites, and 18% do so often. Nearly half the country, 44% of the general population, uses Facebook as a news source in one form or another.
Despite the popularity of social media as a news source, not all popular content on these sites is high-quality, factual information. To see how low-quality information spreads on these sites, Oliveira and Menczer’s paper, published in Nature Human Behaviour, simulated a large social media system. The authors assumed that users were able to discern good-quality information—an ideal scenario for social media usage, though perhaps a bit optimistic. They gave each post a quality rating that stood for factors such as accuracy, trustworthiness, reliability, and relevance. By varying how many pieces of information were produced in the system and how many posts a user could look at before sharing, the team examined the conditions under which quality information went viral.
When they modeled the hypothetical scenario—in which all users can tell good information from bad—the networks struck a balance between quality and diversity of information. In other words, many people’s voices were heard and the overall accuracy of posts on the site remained high.
But when they introduced real-world data and conditions into the model, the volume of posts skyrocketed. The diversity of voices increased, but the quality of popular posts declined. The team discovered that eventually there is “so much information out there that we just can’t keep up,” Menczer said. “It underlines the limits of social media as a platform for news.”
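To make that setup more concrete, here is a minimal sketch of the kind of agent-based model the paper describes: users with limited attention either post new content or reshare something from their feed, and we then check whether the most-shared posts are also the highest-quality ones. The network layout, the parameter names (MU for information load, ALPHA for attention span), and the quality-weighted resharing rule are illustrative assumptions for this sketch, not the authors’ actual code.

```python
# Toy model of quality vs. popularity under information overload.
# Parameters and network structure are illustrative assumptions, not the
# researchers' published code.
import random
import statistics

NUM_USERS = 200   # agents in the toy network
FOLLOWERS = 10    # each user's posts reach this many random followers
ALPHA = 10        # attention span: how many posts a feed can hold
MU = 0.5          # information load: chance a user posts something new
STEPS = 20000     # simulation steps

random.seed(42)

# Each user has a fixed set of random followers; feeds hold at most ALPHA posts.
followers = {u: random.sample([v for v in range(NUM_USERS) if v != u], FOLLOWERS)
             for u in range(NUM_USERS)}
feeds = {u: [] for u in range(NUM_USERS)}
popularity = {}   # post id -> times posted or reshared
quality = {}      # post id -> quality score in [0, 1]
next_id = 0

def push(user, post):
    """Add a post to a user's feed, dropping anything beyond ALPHA items."""
    feeds[user].insert(0, post)
    del feeds[user][ALPHA:]

for _ in range(STEPS):
    u = random.randrange(NUM_USERS)
    if not feeds[u] or random.random() < MU:
        # Create a new post with a random quality score.
        post = next_id
        next_id += 1
        quality[post] = random.random()
        popularity[post] = 0
    else:
        # Reshare: even "ideal" users choose from their feed in proportion
        # to quality, so better posts are favored but not guaranteed to win.
        weights = [quality[p] for p in feeds[u]]
        post = random.choices(feeds[u], weights=weights, k=1)[0]
    popularity[post] += 1
    for f in followers[u]:
        push(f, post)

# Compare the average quality of the most-shared posts with the overall average.
ranked = sorted(popularity, key=popularity.get, reverse=True)
top = ranked[:50]
print("mean quality overall  :", round(statistics.mean(quality.values()), 3))
print("mean quality of top 50:", round(statistics.mean(quality[p] for p in top), 3))
```

In runs of this sketch, raising MU (more new posts competing for attention) or shrinking ALPHA tends to narrow the quality advantage of the most-shared posts, echoing the paper’s finding that overload erodes the link between quality and popularity.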
Challenges of Diversity
“The implication is that the sheer diversity of information can actually have a downside,” John Wihbey, a professor of journalism and new media at Northeastern University who studies virality and information literacy, said of the study’s findings.
“One of the critiques of the era of mainstream media is that we artificially gate-keep, not letting enough diverse voices into the conversation,” he said. “While the critique of high-quality news producers may still be valid, it’s a more complicated picture than maybe we’ve understood. When you have actors with bad intentions who want to do propaganda, misinformation, or just silly stuff…flooding environments with all kinds of clickbait, it doesn’t improve the ecosystem.”
In the past year, fake news has become a global phenomenon. Only 24% of respondents in the 2017 Reuters Institute Digital News Report think social media is successful at separating fact from fiction, compared with 40% who think the same of the news media. The problem appears to be particularly acute in the U.S., which emerged as the second most polarized of the 36 countries surveyed. People reported trusting their favored news sources more than the news media in general by a 15% margin.
But that hasn’t spelled the end of the news media. Online news subscriptions have risen from 9% to 16% this year, and news donations have tripled, according to the recent Reuters report. People under 35 contributed significantly to this increase: the percentage of 18- to 24-year-olds paying for news rose from 4% in 2016 to 18% in 2017.
There are also other signs that internet users and companies are recognizing the problem of fake news. In a study this winter, J. Nathan Matias and other researchers at MIT worked with Reddit to combat fake news by encouraging users to fact-check potentially misleading stories. On average, the intervention doubled the number of comments with links, suggesting that users were backing up their comments with sources. CUNY Graduate School of Journalism recently joined Facebook and organizations like the Knight Foundation to announce the News Integrity Initiative to combat fake news. And on a more extreme level, last week Filipino senator Joel Villanueva proposed a bill that would jail or fine people and companies that spread fake news.
For now, Oliveira and Menczer recommend that social media sites reduce information load by restricting posts from automated bots, which can manipulate sites by producing high volumes of low-quality or inaccurate information. “We’re finding very strong quantitative evidence that social bots are very effective at getting fake news to go viral,” said Menczer.
The group is currently studying how to detect automated bots and filter them out, much as spam filters do for email.
While Menczer and Oliveira say that social media sites bear the brunt of the responsibility for dealing with this issue, they do recommend that individuals carefully check the sources of information and avoid sharing content without some critical examination. They also say people should not unfollow or unfriend those who hold different opinions. “That leads to echo chambers, which make us more vulnerable to misinformation,” Oliveira said.
Wihbey agrees that bots are a significant problem. If platforms like Facebook and Twitter are better able to crack down on bots, he said, high-quality information would have a better chance of spreading. By sharing their data sets, Facebook and other platforms could help researchers understand the relationship between virality and accuracy of information, he said, an increasingly important relationship in our democracy.
Wihbey admits that low-quality information can sometimes be fun and amusing, but he says social platforms and systems should take a harder look at how their algorithms treat credible content.
“We have to acknowledge the limits of social media,” Menczer said.
Photo credit: Japanexperterna.se/Flickr (CC BY-SA)