Executive Summary
The problem of misinformation in modern society has reached epidemic proportions, with intentional disinformation campaigns and their misinformation fallout becoming a weaponized war of words across the globe. In general, the intent of those behind these assaults is to destabilize democratic states by spreading doubt, fueling discontent, and interfering with fact-based decision making. Such tactics have been used by foreign actors against other countries and, increasingly, within a country to further the narrative and political goals of a leader. Intelligence experts agree that attacks on our own election process have taken place and continue to occur as we near another presidential election cycle.
In the modern world of the internet and social media platforms, there are few barriers to entry for a would-be purveyor of misinformation, and we are only beginning to understand the depth and complexity of the problem. The solution is even more elusive, considering the unpredictable dynamics of which meme or conspiracy theory might “go viral” with a particular segment of society. Where are the intervention points? If false or misleading information is intentionally released to infect our public discourse, who is responsible for preventing it, keeping it from spreading, or curing us of its ill effects? The questions related to these problems share a common lexicon: misinformation behaves much like a disease and can be described and understood in these terms. The branch of science dedicated to studying the causes, origins, and spread of disease and other health-related issues is epidemiology, and it serves as a logical lens through which we can view the disease of misinformation.
In its search for a cure, epidemiology starts at the origin of the disease, which at its simplest level needs three things to exist: a disease agent or pathogen, an environment supportive of the agent’s life and reproduction, and a host to carry and eventually spread the infectious agent. Misinformation’s agent can be either the person who creates a false narrative or one who passes it along as truthful. The environment in which misinformation thrives is shaped by both current socio-political divides and the echo chambers that perpetuate them. Fear is a primary driver of human behavior, and the more polarized a population is, the more these fears create an environment that helps misinformation spread. This agent/host/environment construct is central to the epidemiological approach of this thesis, as are the related concepts of building host immunity, reducing agent virulence, and making the surrounding environment less conducive to spreading misinformation.
As the unwitting hosts of misinformation, humans show differing levels of immunity, often correlating positively with exposure to a greater number of information sources. As we experience fewer and more homogeneous narratives and points of view, we become more susceptible to misinformation that either supports our worldview or speaks to something that might threaten those beliefs. Social media platforms help us support and defend the ideas we like, without the discomfort of seeing or trying to understand what we dislike. And the more we engage with social media’s feedback mechanisms, the better their algorithms can reinforce these echo chambers of confirmation bias. To build individual host and collective herd immunity against false or misleading information, we must understand the cognitive shortcuts, or heuristics, our brains use to process and then retain or reject information. If misinformation is rejected by the brain, prevention efforts have been effective. If it is accepted, the epidemiological approach looks for a cure through a process of narrative correction. As with traditional disease models, prevention is preferable to intervening with an already infected host.
Impacting the epidemiological environment is another important strategy in the fight against misinformation, and there is growing sentiment among legislators and citizens in favor of regulating social media companies, the content they allow, and the methods by which they verify the veracity of that content. As with other issues, constitutional concerns enter the discussion when free speech, fiercely protected in the United States, is potentially threatened. At least one social media company has refused to ban false political campaign ads from its platform, citing free-speech concerns. Changing the way social networks do business and make money will be a huge task even without getting entangled with the Constitution, but altering the environmental factor that has most contributed to this new epidemic of misinformation may be the single best point of intervention.
While there is no simple solution to this complex problem, the epidemiological lens does present a new way to understand misinformation and may stimulate further discussion among policymakers and the academic community. The existence of these false and insidious narratives is no longer benign; they pose a true threat to our democracy at the highest levels. Misinformation is, in fact, present in the discussion of many of the issues considered critical in our country and the world. Political corruption, economic inequality, climate change, health care, and a host of other issues have been infected with misinformation, which prevents much-needed, fact-based civil discourse and forward progress.