Knowledge Overconfidence and Anti-Consensus Views
Caveat: I am not a psychiatrist, nor am I the physician treating the individuals discussed here. I am not trying to make diagnoses from online behavior or activity. I am only noting that these specific behaviors align with specific traits enumerated in the DSM-5. I cannot make a statement about any of these people individually.
People who think of themselves as brilliant heterodox thinkers are overcompensating for a significant mismatch between their objective and subjective knowledge.
That is to say, people who hold anti-consensus views have both an inadequate fund of knowledge on these particular topics and an unearned, inflated view of their own expertise. In fact, the larger the gap between their actual, objective knowledge and their subjective assessment of that knowledge, the more strongly they hold anti-consensus views — the most extreme opponents are the most likely to believe that their knowledge ranks among the highest when it actually ranks among the lowest.
The authors of this 2022 article in Science Advances found that “increasing opposition to the consensus is associated with higher levels of knowledge confidence for several scientific issues but lower levels of actual knowledge.”
“Although broadly consistent with the Dunning-Kruger effect and other research on knowledge miscalibration, our findings represent a pattern of relationships that goes beyond overconfidence among the least knowledgeable.”
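To make the shape of that finding concrete, here is a minimal toy simulation (my own illustration with made-up numbers, not the authors’ data, measures, or model), assuming only that opposition scales with the gap between self-assessed and actual knowledge. Under that assumption, the simulated data reproduce the qualitative pattern the authors describe: opposition correlates positively with subjective knowledge but negatively with objective knowledge.

```python
# Toy simulation (hypothetical values, not the authors' data or analysis):
# a single "miscalibration" term is enough to make opposition rise with
# confidence while falling with actual knowledge.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000

objective = rng.normal(size=n)                 # assumed objective knowledge score
miscalibration = rng.normal(size=n)            # assumed over/underconfidence term
subjective = 0.4 * objective + miscalibration  # self-assessed knowledge, only loosely tied to reality

gap = subjective - objective                   # positive when someone knows less than they think
opposition = 0.8 * gap + rng.normal(size=n)    # assumption: opposition scales with that gap

print("corr(opposition, subjective knowledge):", round(np.corrcoef(opposition, subjective)[0, 1], 2))
print("corr(opposition, objective knowledge): ", round(np.corrcoef(opposition, objective)[0, 1], 2))
```

The only point of the sketch is that opposite-signed correlations with confidence and with actual knowledge can fall out of one simple miscalibration assumption; the paper’s actual measures and analyses are, of course, far more involved.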
Can we apply the findings of this research to the current slate of contrarians and grifters who hold anti-science views and anti-consensus perspectives on COVID and COVID vaccines? In fact, the authors lay it out directly in the article:
“As opposition to getting a COVID-19 vaccine increases, both general and COVID-specific objective knowledge decreases.”
This also tracks with public displays of narcissistic traits by this same cohort. More on that later. Let’s examine the article and studies.
The article’s introduction perfectly establishes the premise of the entire argument:
“Uncertainty is inherent to science. A constant striving toward a better understanding of the world requires a willingness to amend or abandon previous truths, and disagreements among scientists abound. Sometimes, however, evidence is so consistent, overwhelming, or clear that a scientific consensus forms.”
What those who hold anti-consensus views are doing falls into the same category as conspiracy theories and logical fallacies: leveraging a partial truth — that uncertainty is inherent to science — to argue falsehoods.
When people with the credentials and training that should give them the ability to better understand science and reality succumb to this anti-science line of thinking, they weaponize their qualifications to legitimize these views. The lay person sees only credentialed experts disagreeing, leading to the belief that there is a larger controversy when, in fact, there is more consensus than equipoise. They are creating a false equivalence where no such equivalence exists. They use their credentials to “launder” disinformation, serving as validation for people who do not share those credentials and further driving the propagation of these theories as misinformation.
Understanding why even highly credentialed people continue to hold these positions may help us combat the spread of misinformation, which harms both scientific inquiry and discourse and the public at large. The authors acknowledge that the consequences of “these anti-consensus views are dire, including property destruction, malnutrition, disease, financial hardship, and death.”
So, what does this article help us understand about misinformation and disinformation?
Prior models attributed opposition to scientific consensus to a lack of scientific knowledge, the so-called “deficit model.” If the deficit model held, we would expect education-based interventions to improve objective scientific knowledge and overall acceptance of scientific consensus; however, these interventions have shown little efficacy, which makes such models insufficient.
Another model, “cultural cognition,” holds that beliefs are shaped more by a person’s cultural values or affiliations, leading to selective acceptance and interpretation of information that conforms to their worldview; evidence suggests that this model is insufficient as well, given the role of objective knowledge in the acceptance of scientific consensus.
Recent evidence suggests a potentially important revision to models of the relationship between knowledge and anti-science attitudes: those with the most extreme anti-consensus views may be the least likely to recognize the gaps in their knowledge, because they tend to be both the least knowledgeable and the most overconfident in what they know.
The authors performed five different studies to assess this hypothesis. In one study, they demonstrate that participants’ assessments of their own knowledge are not simply an expression of cultural cognition or a belief in “alternative facts”; they asked participants to place a monetary bet based on their knowledge of the scientific evidence:
“When the uninformed claim they understand an issue, it is not just cheap talk, and they are not imagining a set of ‘alternative facts.’ We show that they are willing to bet on their ability to perform well on a test of their knowledge.”
In another study, the authors asked participants how much they thought scientists knew about or understood COVID and then asked them to rate their own knowledge in relation to scientists. Those who rated their own knowledge as higher than scientists’ were more opposed to virus mitigation policies and more non-compliant with recommended COVID-mitigating behaviors, while also scoring lower on the objective knowledge measure.
The authors assert that their findings “suggest that knowledge may be related to pro-science attitudes but that subjective knowledge — individuals’ assessments of their own knowledge — may track anti-science attitudes. This is a concern if high subjective knowledge is an impediment to individuals’ openness to new information.” This lack of openness to incorporating new information into one’s knowledge base or mental framework is an important point to keep in mind; while it is directly related to overconfidence in a paradoxically inadequate fund of knowledge, I believe it relates to underlying personality traits as well.
While the findings of these studies are correlational, and all the limitations of conclusions derived from correlational studies apply, I appreciated this point from the authors:
“It is possible, for example, that higher levels of media attention, or even how easy or difficult it is to imagine the harms associated with each scientific issue, could shift how (or whether) people make assessments of their own knowledge.”
I think this point is well made. It relates to the intersection between cultural cognition and the authors’ finding of a mismatch between objective and subjective knowledge, but it also, importantly, incorporates the utility of direct experience, or a willingness to understand the direct experiences of others — what we can refer to as empathy.
One will note that among the crop of COVID contrarians, nearly all of the loudest voices have had little or no experience on the front lines treating patients with COVID, particularly severe infections, from the earlier days of the pandemic through the first Omicron surge. At the same time, their overconfidence in their subjective knowledge, along with what can only be described as a fundamental lack of empathy, limits their willingness or ability to incorporate new information from outside their own experience.
The authors of this article discuss prior models of anti-consensus views and amend them with the added perspective of this mismatch between objective and subjective knowledge; none of these models seems sufficient alone, and each is important to consider in light of the others. The deficit model is insufficient because it is the knowledge deficit combined with overconfidence that strengthens the correlation with the strength of anti-science and anti-consensus views, and the model does not adequately account for personal and cultural factors; the “cultural cognition” model, meanwhile, cannot explain the knowledge deficits or the mismatch between subjective and objective knowledge. What other factors, particularly personal ones, might tie these models together?
I have written previously about the contrarian commitment to the Galileo Gambit and the belief that they alone possess the correct information — information that goes against commonly accepted beliefs or scientific consensus — much like Galileo and his heliocentric model of the solar system. We can refer to this as a grandiose sense of self-importance. Combine that grandiose sense of self-importance with a public platform or megaphone, and you will uncover their need for admiration. They regularly accuse anyone making evidence-based statements on social media of clinging to the “narrative” of the currently available evidence and scientific consensus for likes, follows, or attention, while claiming to be “censored” despite the fact that their anti-consensus views garner more attention, further reinforcing this negative cycle. Taken together with a limited ability to incorporate new information from the experiences of others, or to empathize, this tracks closely with narcissistic traits as enumerated in the DSM-5 criteria.
Understanding that the mismatch between subjective and objective knowledge, together with the associated lack of empathy, may be rooted in narcissism, and understanding how such personality traits can be managed, can provide an important tool for addressing the misinformation space. By directly managing the largest and loudest anti-science and anti-consensus personalities, who weaponize their credentials on large platforms or allow their credentials to be weaponized by others to perpetuate disinformation, we will be better able to limit the harms caused by misinformation.
Addendum: It turns out there is more scientific support for this position than I was aware of when I first wrote this piece: https://www.psychologytoday.com/ca/blog/strange-journeys/202308/why-narcissists-love-conspiracy-theories
