A post by Daniel Munro
C. Thi Nguyen (2021) describes a strategy someone with nefarious motives might use to manipulate people into believing misinformation. This strategy involves presenting falsehoods in ways that induce an illusory sense of clarity—a mere feeling of possessing understanding and insight when one in fact lacks them. This feeling can stop someone from subjecting a piece of information to scrutiny or attempting to verify it, since one already feels as if one understands.
Nguyen describes several methods for inducing a false sense of clarity. For one thing, work in psychology shows that we often use fluency and ease of processing as heuristics for judging when we’ve successfully understood an idea. In other words, how quickly and easily we cognitively process a piece of information correlates with how likely we are to feel we’ve understood it. While this heuristic is often a good, rough-and-ready guide, it means that presenting misinformation in a way that merely seems familiar, intuitive, and easy to grasp can produce an illusion of understanding.
Nguyen also argues that manipulators can induce illusions of clarity by triggering thought processes that feel like understanding itself. Whereas knowledge can consist merely in possessing individual facts, understanding involves grasping explanatory connections amongst a body of information. For example, it’s one thing to merely know the isolated fact that World War II began in 1939, but it’s another thing to understand why the war began, in the sense that one grasps the causal relations between the events that led up to it. A manipulator could thus induce a sense of understanding in her audience by presenting them with a set of falsehoods that seem explanatorily connected to one another, such that the audience feels as if they grasp these connections.
In what follows, I want to unpack how thinking about the imagination can help us better understand effective strategies for producing illusory feelings of understanding. I’ll argue that manipulators can effectively induce such feelings by capturing their audience members’ imaginations in the right way.
Why, in the first place, should we think that the imagination might be involved in generating feelings of understanding? This has to do with the deep connection between the imagination and processing narratives.
Misinformation which spreads online often takes the form of stories or narratives (Raab et al. 2013; Tangherlini et al. 2020; Lazić and Žeželj 2021). Think, for example, of QAnon-style stories about Donald Trump waging a war against Democrats who are heading up a child-murdering cabal; of stories about how the 2012 Sandy Hook school shooting was a false flag operation orchestrated to push gun control propaganda; or of stories about how Bill Gates is putting microchips in COVID vaccines with the aim of controlling our minds. Both philosophers and psychologists have argued that we mentally process and comprehend narratives in part by imagining the events they describe (Stock 2013; Van Mulukom 2020; Arcangeli 2021). So, we should expect that people often process misinformation they encounter online by imagining the contents of stories they read.
Now, a good narrative is one that presents a set of causally connected events (this might even be part of what it is to be a narrative, by definition). Narratives don’t describe a bunch of disparate, unconnected events. Instead, they describe a series in which earlier events cause later ones: event A happened, which caused B to happen, which caused C, and so on. Prompting someone to imagine the contents of a narrative, then, seems like a good way to generate the feeling that they’ve grasped causal, explanatory connections between the events the narrative describes. In other words, it seems like a good way to generate the feeling of understanding that Nguyen described.
Of course, sometimes this feeling is genuine: to genuinely grasp what led up to World War II, it helps if you can do something like imagining a narrative unfolding, in which interconnected events cause one another. However, if the narrative in question is describing something false (e.g., a far-fetched conspiracy theory), this feeling of understanding might be an illusion.
Hopefully, this gives you a sense of why imaginative engagement with narratives might be one way to induce a false sense of understanding. In the rest of this post, I want to drill down into some ways manipulators can do this most effectively.
So, imagine that you yourself are such a manipulator. What follows is a “how-to guide” for effectively manipulating your audience members’ imaginations. (Disclaimer: Of course, my goal isn’t for anyone to actually follow this guide and start spreading misinformation. My aim is to contribute to our overall understanding of how misinformation spreads, which is a necessary step towards effectively counteracting it.)
Step 1: Look for “ugly truths” that can be replaced with pleasing falsehoods. Some events are extremely unpleasant to imagine. I recently read the book Sandy Hook by Elizabeth Williamson (2022), which details the fight against conspiracy theories about the 2012 school shooting. The book begins with fairly graphic descriptions of what really occurred during the shooting, which were difficult enough to read that I almost put the book away for good. Similarly, it’s unpleasant to imagine the monumental scale of tragedy and death that COVID has inflicted on vulnerable people over the last few years.
This unpleasantness makes it difficult to imagine these things: because it’s so emotionally taxing, it takes effort to turn your attention to imagining these events. Contrast this with imagining fantastical conspiracy narratives about these events. It’s easier to imagine the school shooting being staged by a bunch of actors than it is to imagine children being brutally murdered. And it’s easy to turn your attention away from the tragedies inflicted by COVID and instead imagine that Bill Gates is trying to implant us with microchips—although the latter scenario is unpleasant in a sense, it also has the feel of an entertaining sci-fi thriller, which makes it fun to think about.
Now recall Nguyen’s idea that ease of cognitive processing can contribute to a feeling of understanding. If that’s right, then it’s not hard to see how someone could mistake the ease of imagining conspiracy theories, relative to imagining tragic truths, for genuine understanding. Although this relative ease really stems from the fact that imagining conspiracy theories is less painful, it can easily be mistaken for the ease of processing that accompanies genuine understanding.
When trying to decide what sorts of misinformation to focus on spreading, then, you should look for opportunities to replace hard-to-imagine truths with falsehoods that are more pleasurable to imagine. Your audience will feel as if they can imagine these falsehoods more easily and fluently, which contributes to a feeling of understanding.
Step 2: Craft narratives with concrete, easily imaginable details. The contents of some narratives are more imaginable than others. Compare these two recent examples of conspiracy theories:
(1) Biden and other Democrats are conspiring to take away your gas stoves.
(2) Clinton, Obama, and other Democrats get together to murder children and drink their blood in the basement of a pizza restaurant called Comet Ping Pong.
It’s much easier to imagine the contents of (2). That’s because (1) is fairly vague and contains no concrete details about how Democrats would actually carry out their plan—it might conjure up some vague mental imagery of gas stoves and a scheming Joe Biden, but this imagery doesn’t seem very concrete. In contrast, (2) contains more specific, concrete details. This makes its contents easier to imagine, which in turn makes it more likely that someone who encounters it will end up with imagination-based illusions of understanding. Perhaps this helps to explain why narratives like (2) often have lasting cultural uptake, while narratives like (1) tend to fade from the news more quickly.
So, if you want to craft narratives to manipulate people’s imaginations, you should craft narratives that contain concrete, imaginable details.
Step 3: Try to provoke intense emotional reactions. The examples of misinformation that get the most uptake online are often the juiciest and most emotionally arousing (Vosoughi et al. 2018; van Prooijen et al. 2022). They’re often emotionally arousing simply in that they’re entertaining, which makes it pleasurable to imagine their contents. Sometimes, they’re emotionally arousing because of the subject matter they concern—for example, someone who doesn’t like Democrats might experience a feeling of righteous anger when imagining Clinton and Obama murdering children. Other times, they’re emotionally arousing because of how they’re presented: when the conspiracist influencer Alex Jones delivers content on his show InfoWars, for example, he often does so in a way that’s full of rage.
Emotional reactions contribute to how deeply one becomes imaginatively absorbed in a narrative. And empirical research suggests that we’re more likely to be persuaded by narratives the more deeply we become imaginatively absorbed in them (Hamby et al. 2018). That’s in part because deep, emotionally charged absorption is cognitively taxing, in a way that makes it difficult to subject a narrative to much critical scrutiny. So, once someone has achieved a feeling of understanding, their emotional absorption will preoccupy them and make them less likely to question whether that understanding is genuine.
References
Arcangeli, Margherita. 2021. "Narratives and Thought Experiments: Restoring the Role of Imagination." In Epistemic Uses of Imagination, edited by Christopher Badura and Amy Kind, 183-201. New York: Routledge.
Hamby, Anne, David Brinberg, and James Jaccard. 2018. "A Conceptual Framework of Narrative Persuasion." Journal of Media Psychology 30 (3): 113-124.
Lazić, Aleksandra and Iris Žeželj. 2021. "A Systematic Review of Narrative Interventions: Lessons for Countering Anti-Vaccination Conspiracy Theories and Misinformation." Public Understanding of Science 30 (6): 644-670.
Nguyen, C. Thi. 2021. "The Seductions of Clarity." Royal Institute of Philosophy Supplement 89: 227-255.
Raab, Marius H., Stefan Ortlieb, Nikolas Auer, Klara Guthmann, and Claus-Christian Carbon. 2013. "Thirty Shades of Truth: Conspiracy Theories as Stories of Individuation, Not of Pathological Delusion." Frontiers in Psychology 4: 1-9.
Stock, Kathleen. 2013. "Imagining and Fiction: Some Issues." Philosophy Compass 8 (10): 887-896.
Tangherlini, Timothy R., Shadi Shahsavari, Behnam Shahbazi, Ehsan Ebrahimzadeh, and Vwani Roychowdhury. 2020. "An Automated Pipeline for the Discovery of Conspiracy and Conspiracy Theory Narrative Frameworks: Bridgegate, Pizzagate and Storytelling on the Web." PLOS ONE 15 (6): e0233879.
Van Mulukom, Valerie. 2020. "The Evolution of Imagination and Fiction through Generativity and Narrative." In Evolutionary Perspectives on Imaginative Culture, edited by Joseph Carroll, Mathias Clasen, and Emelie Jonsson, 53-70. Springer.
van Prooijen, Jan-Willem, Joline Ligthart, Sabine Rosema, and Yang Xu. 2022. "The Entertainment Value of Conspiracy Theories." British Journal of Psychology 113 (1): 25-48.
Vosoughi, Soroush, Deb Roy, and Sinan Aral. 2018. "The Spread of True and False News Online." Science 359 (6380): 1146-1151.
Williamson, Elizabeth. 2022. Sandy Hook: An American Tragedy and the Battle for Truth. New York: Dutton.