The United States is facing a crisis over factual evidence. Americans believe in very different realities, depending in large part on their political beliefs. Signs of the growing crisis have been visible for many years. Partisans disagree about the facts concerning climate change, weapons of mass destruction in Iraq, and the birthplace of former President Barack Obama.
Some amount of disagreement is inevitable. In the words of political scientist James Kuklinski, “if a fact is worth thinking about in making a policy choice, it is probably worth disputing.” This simple truth helps explain why so many Americans disagree even about phenomena on which there has been scholarly consensus for decades – as is the case with climate change.
Other beliefs are harder to understand. Despite clear evidence to the contrary, some people believed (and may even continue to believe) that the Pope endorsed presidential candidate Donald Trump, that Vice President Mike Pence called Michelle Obama “the most vulgar first lady we’ve ever had,” and that there was large-scale voter fraud in the 2016 election. If Americans cannot agree on even the most fundamental facts in such realms, the nation’s ability to make sound decisions is obviously in jeopardy.
The proliferation of self-reinforcing partisan “echo chambers” and wide consumption of so-called fake news have often been blamed for sustaining separate realities. But few Americans live in truly politically insular bubbles. The success of fake news is more a symptom than the cause of America’s troubling relationship with facts.
Fighting Falsehoods is Hard
Since the 2016 election, there has been public outcry about Facebook’s role in spreading false information. Indeed, the extent to which Facebook users have engaged with fake news is disturbing. But what is the right response? Proposals from Facebook and scholars who study these problems mostly fall into two approaches, neither likely to close the gap between Americans’ different realities.
Prevent the dissemination of misinformation. This approach assumes people can be systematically dissuaded from sharing false information, but it poses considerable technical and philosophical challenges. It would be no small feat to pinpoint falsehood at the vast scale on which Facebook operates, and political actors who have a vested interest in promoting misinformation will surely try to game any effort to filter out propaganda. An even bigger challenge is drawing lines between outright misinformation and valid alternative interpretations. Whether this is attempted with machines, human editors, or crowdsourcing, there are bound to be errors – either in the direction of censorship or towards condoning propaganda.
Flag falsehoods. The second strategy assumes that inaccurate beliefs persist because people have not yet heard the truth or paid sufficient attention to the credibility of sources. As recent headlines flagging false information show, journalists often blame echo chambers and ignorance of correct information for the growing public acceptance of falsehoods and conspiracy theories. But even though people’s political beliefs color the news they see, it is largely a myth that most Americans live in closed circles fixated on agreeable media.
Much evidence suggests that Americans do not, in fact, systematically avoid information they disagree with. On the left and the right, large majorities get their news from outlets that include multiple perspectives on controversial issues. Audiences for CNN and Fox News are not completely separate; someone who watches one of the two channels is more likely than most other Americans to watch the other as well. The real crisis we face is that, even though most Americans are exposed to a variety of news sources and opinions, they simply write off discordant content as untrue and dismiss those with whom they disagree as stupid or malicious. Because of this, both of the currently touted remedies – preventing misinformation and helping people recognize inaccuracies – will fail to foster a shared reality. Voters will continue to articulate demands and desires based on divergent assumptions about how the world works.
Combat Distrust by Creating a Shared Reality
A deeper challenge must be faced. American society faces a loss of faith in the institutions traditionally responsible for rendering judgments and disseminating factual information. The Internet propels a dizzying array of beliefs. Historical accounts and conspiracy theories, hard news and propaganda, fact checks and rumors – all these forms of information and mistruth coexist, with proponents accusing each other of manipulating facts for political purposes.
Faced with people who believe falsehoods, some are tempted to simply shout facts more loudly, but this will not work. Forcing people to deal with others holding different beliefs or hear contradictory evidence will not create common understanding of reality. The solution is more complicated and starts with reestablishing norms that allow us, as a society, to distinguish truth from fiction. Far too many Americans have given up on traditional markers of credibility.
The ability to think for oneself is essential but limited. Citizens have limited time, expertise, and access to relevant evidence. Although the Internet allows people to dig more deeply into important issues, it does not make every person an expert on every issue. There are times when citizens must place their trust in others. Until Americans reach greater consensus about who can be trusted to distill information into forms that everyday people can use, the worlds that Democrats and Republicans live in will continue to diverge – and U.S. democracy will remain in peril. Step by step, shared credibility must be reestablished by experts and citizens in dialogue.
Read more in R. Kelly Garrett, Brian E. Weeks, and Rachel L. Neo, “Driving a Wedge Between Evidence and Beliefs: How Online Ideological News Exposure Promotes Political Misperceptions,” Journal of Computer-Mediated Communication 21, no. 5 (2016): 331-348; and R. Kelly Garrett, “The ‘Echo Chamber’ Distraction: Disinformation Campaigns Are the Problem, Not Audience Fragmentation,” Journal of Applied Research in Memory and Cognition 6, no. 4 (2017): 370-376.