Abstract
Does epistemic confidence affect Americans’ willingness to defend misperceptions in the face of correction? Individuals with excessive confidence in their political knowledge are expected to resist the effects of corrective cues against political misperceptions. In this study, I assess the effects of confidence on skepticism towards five common political misperceptions in observational and experimental settings. In Study 1, I observationally assess the effects of epistemic confidence on resistance to corrective cues. In Study 2, I temper excessive confidence among a random subset of respondents using a specialized experimental treatment, before exposing them to a corrective cue. Together, the results show that corrections can reduce support for misperceptions among those with modest confidence. However, in the presence of excessive epistemic confidence, these treatments are ineffective. The present findings suggest that epistemic confidence complicates the work of fact-checkers and science communicators in modern democratic politics.
Recent studies have revealed that many Americans overestimate their skill in comprehending political phenomena (e.g., Anson 2018; Anspach et al. 2019; Graham 2020; Ortoleva and Snowberg 2015). Many Americans are also highly confident in their ability to navigate the media environment and to spot fake news, despite widespread media illiteracy (Lyons et al. 2021). This unwarranted epistemic confidence leads some Americans to adopt polarized political beliefs and to resist the acquisition of additional political knowledge.
The goal of this study is to determine whether epistemic confidence conditions Americans’ resistance to corrections against misperceptions. Studies of resistance to corrections are valuable because misperceptions are often difficult to correct or combat (e.g., Jerit et al. 2006; Nyhan and Reifler 2010; Thorson 2016; Wood and Porter 2019). Some recent efforts have found success in reducing individuals’ willingness to support falsehoods using specialized treatments (e.g., Nyhan et al. 2019; Bode and Vraga 2018). However, the relative effectiveness of such interventions on individuals with varying levels of epistemic confidence is not yet well understood.
Confidence likely interacts with partisan motivated reasoning to produce especially immutable political misperceptions among confident, committed partisans. However, while many studies posit a role for partisan motivated reasoning in driving misperceptions, confidence is less commonly explored as a distinct concept (but see Graham, 2020; Lyons et al. 2021). I argue that epistemic confidence helps to explain why many Americans, and especially partisans, are unwilling to update their incorrect beliefs. Because highly confident individuals rely on intuition to parse new information, they more readily discount messengers of contradictory information when compared to those with more modest levels of confidence. By overweighting their priors, confident individuals will more readily dismiss accurate information in favor of their own unpopular, but plausible, misperceptions.
Below, I present two studies designed to examine the effects of corrective cues on Americans’ support for factual misperceptions (e.g., Carnahan et al. 2021; Lewandowsky and Van der Linden 2021). In Study 1, I examine the association between confidence and the effectiveness of corrective cues, in a large-scale, observational setting. In Study 2, I investigate the same relationship experimentally. I directly manipulate respondents’ confidence using a specialized survey experimental design. Results from Study 1 show that excessive confidence in one’s political knowledge is associated with lower sensitivity to corrective cues, while in Study 2, tempering confidence increases respondents’ willingness to correct their political misperceptions.
The durability of misperceptions
Political misperceptions can emerge due to a lack of foundational political knowledge (e.g., Berinsky et al. 2012; Motta et al. 2018; Pasek et al. 2015; Thorson 2016). They are also known to occur due to the pernicious effects of partisan motivated reasoning (e.g., Nyhan and Reifler 2010; Schaffner and Roche 2016; Thorson 2016; Wood and Porter 2019). While epistemic confidence is related to both political knowledge and partisan-ideological commitments (Ortoleva and Snowberg 2015), confidence is presently understudied as a core driver of political misperceptions. Especially relevant to misperceptions are those individuals with unwarranted confidence in their own political knowledge (e.g., Anson 2018; Moore and Schatz 2021). Individuals with high confidence but low competence in the political realm often quickly develop strong ideological and partisan attachments, which may further boost their receptivity to (and defense of) partisan-congenial misperceptions. But psychological theories of information acquisition posit a separate role for overconfidence in conditioning misperception receptivity—one that likely affects partisans and political independents alike.
Excessive confidence in one’s objective performance is often labeled the “Dunning–Kruger effect” in educational studies (e.g., Kruger and Dunning 1999). While the assumptions of the Dunning–Kruger framework have more recently been criticized as artifacts of measurement error (e.g., Schlösser et al. 2013), contemporary work in political science has successfully examined the relationship between confidence and susceptibility to misperceptions. Graham (2020) demonstrates that confident responses to political survey questions tend to be more accurate than less-confident ones, a finding that appears to hold regardless of the strength or direction of respondents’ partisan attachments. Lyons et al.’s (2021) important recent study also shows that fake news susceptibility increases as respondents overstate their own skills in navigating the media environment. Despite these advances, it remains unclear how and whether epistemic confidence affects individuals’ willingness to correct their misperceptions.
Why epistemic confidence conditions the durability of misperceptions
Confident individuals may be increasingly willing to defend misperceptions due to their reliance on intuitive cognitive processing (e.g., Ecker et al. 2011). Information processing theories assert that individuals weigh their own priors against new, potentially conflicting information. Once a confident individual develops priors about a subject, conflicting facts will be met with heavier than normal resistance. Among those with justifiable confidence—that is, confident experts who accurately appraise their expertise—this intuition may serve as a reliable method to parse new information. But unwarranted or excessive confidence pairs high self-regard with poor performance. Overconfident individuals incorrectly assume that their intuitions serve as a high-quality method for discriminating true from false, despite the likelihood that these intuitions are biased (due to motivated reasoning) or are generally uninformed.
When confronting corrections to their misinformed beliefs, highly confident individuals will fall back on intuitive processing. Intuition causes individuals to accept a message if it “contains no elements that contradict current knowledge, [if it] is easy to process, and [if it] ‘feels right’” (Lewandowsky et al. 2021). As a result of their overreliance on intuition, the confident may rapidly strengthen their beliefs and commitments after gaining only a cursory amount of information about a political topic (Wolak and Stapleton 2019). The result is a kind of path dependency: Once misinformation becomes embedded in the cognitive schema of a highly confident individual, their unwillingness to engage in the effortful process of squaring corrections with their existing beliefs lends those beliefs endurance. Among those individuals with average or below-average confidence, more effortful information processing strategies might lead them to correct their mistaken beliefs when presented with contradictory or corrective information.
Correcting misinformation: expectations
Corrective cues against misinformation take many forms. Sometimes these cues come in the form of statements from official sources (e.g., Nyhan and Reifler 2010). In other studies, consensus messages are used to help individuals understand the relative unpopularity of their beliefs and attitudes. For example, in Van der Linden et al.’s (2017) study, subjects saw a graphic claiming that 97% of scientists agreed about climate change’s human causes (see also Lewandowsky and Van der Linden 2021).
Epistemic confidence may be especially likely to interfere with receptivity to consensus-based corrective cues, making these treatments a salient test case for the present theory. The confident individual is expected to regard their own intuition as unassailable and to display skepticism towards the claims of others, regardless of those claims’ popularity. This line of reasoning leads to the expectation that unwarranted confidence will depress the effects of a consensus-based corrective cue in a survey experimental setting. This expectation is stated below as H1.

H1: Those individuals with unwarranted confidence will exhibit decreased responsiveness to a corrective misinformation cue relative to those with accurate self-appraisals.
Because excessive confidence is likely correlated with several other relevant political and social predictors of misinformation susceptibility, including partisanship and ideological strength, I account for these factors in the analyses that follow.
Research design
From December 20–21, 2019, I reached a national sample of adults (N = 1021) using the Amazon MTurk platform (hereafter Study 1). From September 22–23, 2021, following a pre-registered study design, I reached a second sample (N = 1209) using the Lucid Theorem marketplace (hereafter Study 2). The latter sample’s composition reflected Census estimates for age, ethnicity, gender, and region (Coppock and McLellan 2019).
Pre-Treatment items
The experimental protocols for both studies are detailed in Figure 1. In each study, the protocol began with the collection of basic demographics and an assessment of general political knowledge. A political knowledge battery (hereafter “PK battery”) consisted of an additive political knowledge scale designed and validated by Anson (2018). The PK battery includes five political knowledge items related to American political institutions, current events, and political ideology. The PK battery was followed by a self-appraisal battery (hereafter “SA battery”), which asked respondents to evaluate their performance on the quiz (on a scale from “poor” to “excellent”). See the SI for question wording and scale validation.

Figure 1. Experimental protocols, Studies 1 and 2.
Study 1: Design
While Study 1 provides an observational examination of the association between epistemic confidence and the effects of corrective cues, its design is based on a corrective-cue experiment. Respondents in Study 1 were randomly assigned to one of three treatment groups. Respondents saw a table containing five common misperceptions alongside the percentage of 1000 (fictional) survey respondents who had recently rated each statement as “false.” See page 3 of the SI for full treatment descriptions. The treatment conditions tested two versions of this corrective cue with varying message strength. In T1 (62–71%) and T2 (88–94%), majorities of increasing size rejected each statement as false. A third group of respondents was assigned to a Control condition with no corrective message.
To test H1, I interact the treatment indicators with respondents’ confidence scores in the models reported below.
Study 2: Design
Study 2 proposes a causally identified test of the role of epistemic confidence in conditioning the effectiveness of corrective cues.
Study 2’s Confidence Adjustment task allows us to compare the observed world against an experimental condition in which all respondents are made aware of their true level of political knowledge, be it high or low. While this treatment necessarily complicates the interpretation of the full-sample treatment effect, it avoids the undesirable practice of deception in experimental research. To study its effects, respondents were randomly assigned to a Control condition, a corrective cue alone (T1), the Confidence Adjustment task alone (T2), or both treatments in combination (T3).
In this way, Study 2 demonstrates whether reduced confidence increases receptivity to corrective cues. Because respondents in T3 received both the Confidence Adjustment task and a corrective cue, experimental contrasts between T3 and the other treatments allow us to evaluate the available evidence for the conditioning role of epistemic confidence.
Dependent variable measurement
In the final phase of both Studies 1 and 2, all respondents evaluated a five-item question battery tapping common misperceptions about topics such as the national debt, federal bureaucracy, crime, foreign aid, and social security (hereafter the “CM battery”). The topics were derived from existing literature on misperceptions, as indicated in Table A2 of the Supplementary Information. For each item, respondents were asked to evaluate the statement’s accuracy on a scale from 1 (not at all accurate) to 5 (very accurate). For ease of interpretation, I rescale respondents’ CM battery scores to a continuous variable with range [0,1].
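The construction of the skepticism score can be sketched as follows. This is a minimal illustration, not the article’s replication code; in particular, it assumes that accuracy ratings are reverse-coded so that higher values indicate greater skepticism towards the misperception statements.

```python
def skepticism_score(ratings):
    """Sketch of the DV construction (coding details are assumptions):
    average five accuracy ratings (1-5), reverse-code so that higher
    values mean more skepticism, and rescale to the [0, 1] interval."""
    mean = sum(ratings) / len(ratings)   # mean accuracy rating, in [1, 5]
    reversed_mean = 6 - mean             # higher = rated less accurate = more skeptical
    return (reversed_mean - 1) / 4       # rescale [1, 5] -> [0, 1]


# A respondent rating every misperception "not at all accurate" (all 1s)
# receives the maximum skepticism score of 1.0.
print(skepticism_score([1, 1, 1, 1, 1]))
```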
I operationalize epistemic confidence through a confidence score, which measures the difference between respondents’ perceived performance (reported in the SA battery) and the quintile in which they scored on the PK battery (Lyons et al. 2021). To facilitate the interpretation of coefficients, I rescale the confidence score to take a range of [-1,1]. See the Supplementary Information for further details.
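Under the coding just described, the confidence score might be computed as below. This is a hypothetical sketch: the exact 1–5 codings of the self-appraisal scale and the quintile variable are assumptions, not taken from the article’s replication materials.

```python
def confidence_score(self_appraisal, pk_quintile):
    """Sketch of the confidence measure (codings are assumptions):
    the gap between a respondent's self-appraisal (1 = "poor" ... 5 =
    "excellent") and the quintile (1-5) of their objective PK battery
    score, rescaled from the raw range [-4, 4] to [-1, 1]."""
    return (self_appraisal - pk_quintile) / 4


# Maximum overconfidence: "excellent" self-appraisal, bottom-quintile score.
print(confidence_score(5, 1))
```

Positive values thus indicate overconfidence, negative values underconfidence, and zero an accurate self-appraisal.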
Results
Table 1. Results of OLS regression models predicting skepticism towards five common misperceptions, Study 1.
Notes: Seven observations missing due to DV battery nonresponse. Standard errors in parentheses. *p < 0.05; **p < 0.01; ***p < 0.001.
Column 1 of Table 1 presents average experimental treatment effects. We see from this model that both the Consensus Cue (T1) and the Large Majority Consensus Cue (T2) increased respondents’ skepticism towards misperceptions in the aggregate. On average, respondents in T1 became around 6 percentage points more skeptical of misinformation compared to the Control group (p < 0.001), while those in T2 saw their skepticism increase by around 5 percentage points (p < 0.01). These results suggest that respondents are at least as responsive to corrective cues from simple majorities as to cues from large majorities, and perhaps more so.
Column 2 of Table 1 describes how political overconfidence conditions this average treatment effect. We see from this model that in the baseline condition, overconfidence is associated with a small decline in skepticism (a 1.4 percentage point decrease; p = 0.26). However, respondents’ confidence accuracy substantially interacts with both the T1 and the T2 treatment effects—providing evidence in support of H1.
Models 3 and 4 show that even when controlling for potential sources of spuriousness like partisanship, ideology, political knowledge, and attentiveness, the results are nearly identical. When Model 3 controls for political knowledge (as measured by the PK battery) and for attentiveness (formal education on a six-point scale, speeding, and performance on a four-item attention check), the interaction of overconfidence with the treatments again decreases skepticism by 5.2 (T1; p < 0.01) and 5.9 (T2; p < 0.001) percentage points, respectively. Results are robust to the further addition of measures of ideological and partisan strength in Model 4, and to three-way interactions between party and ideological strength, the treatments, and confidence accuracy, as seen in Figures A3 and A4 in the SI.
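The specification underlying these models, treatment indicators interacted with the confidence score, can be illustrated with a small simulation. All variable names and coefficient values below are illustrative assumptions, not the article’s data; the sketch only shows the shape of the design matrix and the attenuating interaction.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Simulated stand-in for Study 1 (assumed values, not replication data)
treat = rng.integers(0, 3, n)           # 0 = Control, 1 = T1, 2 = T2
conf = rng.uniform(-1, 1, n)            # confidence score, rescaled to [-1, 1]
d1, d2 = (treat == 1).astype(float), (treat == 2).astype(float)

# DV: baseline skepticism plus treatment boosts, attenuated by overconfidence
y = (0.50 + 0.06 * d1 + 0.05 * d2
     - 0.05 * conf * (d1 + d2) + rng.normal(0, 0.10, n))

# Design matrix mirroring the interactive specification:
# intercept, T1, T2, confidence, T1 x confidence, T2 x confidence
X = np.column_stack([np.ones(n), d1, d2, conf, d1 * conf, d2 * conf])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

for name, b in zip(["const", "T1", "T2", "conf", "T1:conf", "T2:conf"], beta):
    print(f"{name:>8}: {b:+.3f}")
```

In the simulated output, the negative T1:conf and T2:conf coefficients reproduce the attenuation pattern described above: as the confidence score rises, the simulated treatment effects shrink.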
Figure 2 depicts the interactive effects between treatment exposure and political overconfidence from Column 2 of Table 1. Consistent with expectations and the coefficients reported in the table, Figure 2 shows that respondents who received corrective cues in T1 and T2 substantially increased their predicted skepticism. But as respondents’ confidence exceeds their objective political knowledge (i.e., as we move rightwards along the x-axis), treatment effects are rendered nonsignificant.

Figure 2. Interactive predictions from OLS regression model predicting skepticism, Study 1. Notes: Predictions from Column 2 of Table 1. Shaded areas denote 95% confidence intervals surrounding predicted values.
Study 2: Results
Table 2. Results of OLS regression models predicting skepticism towards five common misperceptions, Study 2.
Note: Dependent variable is a continuous skepticism score ranging from 0 to 1. Overconfident respondents are those whose perceived score on the knowledge quiz exceeded their objective score. Standard errors in parentheses. *p < 0.05; **p < 0.01; ***p < 0.001.
The results in Table 2 show that among the overconfident, the corrective cue alone (T1) had little effect on skepticism. Relative to the Control condition, exposure to T1 was associated with an increase in skepticism towards misinformation of only 0.1 percentage points (p = 0.97). Consistent with expectations (because in T2 respondents received no corrective cue), we also see no significant effect for the Confidence Adjustment treatment (T2) relative to the Control (β = 0.001, p = 0.93).
However, in combination (T3), the Confidence Adjustment treatment and the corrective cue had a substantial positive effect on skepticism. Relative to the Control condition, T3 increased skepticism by roughly 4.4 percentage points (p < 0.05). This treatment-control contrast again provides evidence in support of the claim that excessive epistemic confidence blunts the effectiveness of corrective cues.
Discussion
The results of Studies 1 and 2 offer initial evidence that epistemic confidence interferes with Americans’ willingness to correct their misperceptions. When presented with consensus-based corrective cues, confident Americans showed little increased skepticism towards misperceptions in the observational data. Epistemic confidence remained associated with decreased cue effectiveness after controlling for political knowledge, partisanship, ideology, and survey satisficing. In an experimental setting, corrective cues became more effective among the initially overconfident once a specialized experimental treatment tempered their self-appraisals.
While partisan motivated reasoning and political knowledge deficits remain important directions for the study of misperceptions, the present results point to the idea that political overconfidence deserves scrutiny as a complementary (and perhaps exacerbating) driver of factual inaccuracies in public opinion. While for most people, consensus is a useful corrective, some Americans believe themselves to be wiser than the crowds. This may be one important reason why efforts to correct misperceptions in the public are sometimes unsuccessful.
Much more work remains to study the relationships between overconfidence, political knowledge, and political misperceptions, especially in a causally identified manner. In particular, it remains unclear how overconfidence conditions efforts to correct partisan-congenial misperceptions (Anson 2018). Furthermore, it is likely that source cues matter in real-world cueing attempts—especially partisan source cues.
Analyzing the effects of overconfidence in real-world social contexts is another important direction for future research. As we know from existing studies of misperceptions, corrections and fact-checks can diminish misperceptions while failing to influence underlying political attitudes (Nyhan et al., 2019). Coupled with earlier findings that overconfidence leads to increasingly polarized beliefs (Ortoleva and Snowberg 2015), the present findings point to a role for overconfidence in the defense of broader political worldviews. In an era of online radicalization and persistent misperceptions, future investigations of political overconfidence are increasingly urgent.
Supplemental Material
Supplemental material for “Epistemic confidence conditions the effectiveness of corrective cues against political misperceptions” by Ian G. Anson in Research & Politics.
Acknowledgments
The author wishes to thank Brian Guay, Matthew Graham, Logan Dancey, Doug Ahler, Matt Motta, John Kane, participants at the 2020 Southern Political Science Association Annual Meeting panel on “Misinformation, Misperception, and Fake News,” and three anonymous reviewers for their helpful feedback on earlier versions of this paper. The open-access publication of this paper was generously supported thanks to a UMBC Center for Social Science Scholarship Small Research Grant.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.