Abstract
Community water fluoridation (CWF) has become one of the most contested public health interventions despite a robust evidence base supporting its effectiveness, safety, and equity. Drawing on a recent comprehensive review, this perspective argues that persistent controversies revolve not only around the effectiveness, safety, and implementation of CWF but also around a “fourth controversy”: how people evaluate and hold confidence in what they think they know. We discuss CWF as a case of contested science in which polarization and misinformation undermine belief updating. Using insights from metacognition research, we suggest that improving public reasoning requires communication and policy approaches that foster epistemic humility, iterative learning, and evidence-informed revision rather than reliance on evidence accumulation alone.
Community Water Fluoridation as a Contested Scientific Domain
The comprehensive review by Do et al (2025) highlights community water fluoridation (CWF) as an effective, safe, and equitable public health measure. By revisiting 3 major controversies around CWF—its effectiveness, safety, and implementation—the authors reinforce that, after 8 decades, the scientific case for CWF remains robust: it prevents dental caries, alleviates socioeconomic inequalities, and shows no measurable neurodevelopmental harm at optimal fluoride levels. Yet, as the authors note, CWF is one of the most contested interventions in public health.
Why, despite a voluminous evidence base, does public debate persist? In this commentary, we argue that the problem is not only about evidence quantity or quality—the first 3 controversies, or “what we know”—but also about how individuals think about and evaluate what they know: “what we think we know.” This is the domain of metacognition: our awareness of our knowledge and its limits. We propose that this constitutes a “fourth controversy”: not over knowledge itself, but over how knowledge is evaluated.
In that respect, CWF exemplifies a class of contested sciences such as climate change, vaccination, or pandemic response, in which high-quality evidence exists, but misinformation abounds (Torwane et al 2025). In these domains, public beliefs do not necessarily converge toward the scientific evidence. Rather, citizens’ beliefs are polarized by political ideology or worldviews (Hornsey 2021). Research from these domains shows that metacognitive insight—knowing what one knows and what one does not—plays a key role in explaining how people form beliefs, search for information, and act (Fischer and Fleming 2024).
Contested Evidence and Metacognition
Beyond the quantity or quality of evidence lies the “ugliness” of contested science: controversies in these domains are not necessarily—or even typically—resolved by adding more, or better, data. The problem is not a lack of information but differences in how knowledge is weighted and trusted.
Metacognition refers to the capacity to evaluate one’s own knowledge and the confidence one assigns to that knowledge. Good metacognition—metacognition that is sensitive to the accuracy of knowledge—means knowing how well one knows: assigning high confidence when knowledge is accurate and low confidence when only guessing.
Metacognition offers a useful lens for understanding ongoing controversies around CWF. Individuals with more accurate metacognition tend to hold more evidence-based beliefs about climate change (Fischer and Said 2021). Similarly, individuals who are better aware of the accuracy of their interpretation of evidence are less prone to polarized belief-updating about climate change (Said et al 2022) and show more epistemic humility—that is, the willingness to acknowledge the limits of one’s own performance, including the recognition of one’s own reasoning errors (Fischer et al 2025). Further, individuals with stronger metacognitive insight into their knowledge about vaccines and viruses were more likely to adhere to evidence-based public health measures during the COVID-19 pandemic (Fischer et al 2023).
From this perspective, persistent disagreement around CWF is not simply a matter of ignorance or ideology. Rather, it reflects a metacognitive problem: differences in the degree of confidence individuals place in their knowledge and in how much weight and trust they assign to it.
Now What?
If we accept that—unfortunately—evidence alone is insufficient to settle contested issues, we must think about how to build more robust knowledge ecosystems. CWF is not alone in facing persistent public controversy. Research on other contested domains, most notably climate change and vaccination, offers a valuable evidence base for understanding how people come to reject or accept scientific evidence. In these areas, communication strategies that highlight the strength of expert consensus and directly debunk false claims have proven effective in reducing misperceptions and strengthening trust in science (Smith et al 2023; Vraga et al 2023; Bruns et al 2024).
However, these experiences also reveal that the deeper challenge lies in how individuals think about knowledge itself and how confident they are in that knowledge—a metacognitive problem.
Metacognitively aware public health communication could help people reflect not only on what they believe but also on how strongly they should hold those beliefs given the available evidence. To do so, we need to shift from defensive debates to systems of collective intelligence—mechanisms that enable iterative learning, feedback, and adaptation. For CWF, this could mean embedding evidence use within an ecosystem of continuous evaluation. Such a learning-oriented system would explicitly acknowledge the best available evidence while also being transparent about knowledge gaps and uncertainties. Decisions on implementation or deimplementation could then be regularly updated as new evidence emerges, accompanied by new data flows, such as ongoing risk monitoring for adverse outcomes. This model parallels learning health systems in medicine and accepts a basic truth: perfect information does not exist—there is no such thing as “risk-free” in clinical care; what matters is that risks are proportionate to the expected benefits and continually reassessed.

Yet public health interventions such as CWF face an additional complexity: positive externalities. Individuals may bear minimal personal risk in exchange for broader social goods, such as population-wide improvements in oral health that particularly benefit disadvantaged groups. Ethical decision-making thus involves trading off between multiple public goods—the benefit of reducing caries in poorer populations versus the imperative to avoid uncertain or rare side effects.
This approach also requires continuous reflection on research methods and assumptions to ensure that the available evidence is and remains robust, transparent, and decision-relevant. A more pluralistic approach to causal inference that draws on triangulation and mixed methods can strengthen the evidentiary base for such reflection. Ultimately, fluoridation policy and other contested public health measures might operate under a tentative principle of “As Good As Possible Evidence” (AGAPE): recognizing that perfect certainty is unattainable, evidence-informed decision-making processes should transparently integrate and weigh the best available knowledge, which is continuously updated as the evidence evolves. This exemplifies a spirit of epistemic humility—akin to a Bayesian posture of learning and revision.
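The Bayesian posture of learning and revision invoked here can be made concrete. As an illustrative sketch (not a prescriptive model), let H denote a hypothesis such as “CWF at optimal levels is safe,” and E a new piece of evidence. Bayes’ rule prescribes how a prior belief P(H) should be revised:

```latex
P(H \mid E) \;=\; \frac{P(E \mid H)\, P(H)}{P(E \mid H)\, P(H) + P(E \mid \neg H)\, P(\neg H)}
```

The formulation makes the metacognitive point explicit: the strength of the prior governs how much new evidence can move belief. A prior held with near-certainty—P(H) close to 0 or 1—barely updates regardless of the evidence, which is precisely the overconfident, polarized reasoning described above; well-calibrated confidence leaves room for revision.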
Author Contributions
H. Fischer and S. Listl contributed to conception and data interpretation, and drafted and critically revised the manuscript. All authors gave final approval and agree to be accountable for all aspects of the work, ensuring its integrity and accuracy.
Footnotes
Declaration of Conflicting Interests
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The authors received no financial support for the research, authorship, and/or publication of this article.
Data Availability
This perspective article discusses theoretical considerations; therefore, no primary data were generated or analyzed.
