
Open for Debate

Non-ideal epistemology as guidance for inquiry

27 May 2024

In our daily lives, we conduct numerous inquiries. Some of them are aimed at questions that have no significant impact on our lives, such as which restaurant in town has the best sushi, what our friend wants for her birthday, or in which year our favourite band's latest album was released. Others are about issues that are highly relevant to both individuals and society: who to vote for, whether to get vaccinated, whether to sign a petition and participate in a protest for abortion rights, or whether to believe that anthropogenic global warming is real. Studying how we inquire and form beliefs about important topics is not only epistemically but also practically relevant, as such beliefs motivate our actions, influence public policy, and thus have important consequences for society. In this post, I argue that, by applying a non-ideal approach, as outlined by McKenna (2023), to the investigation of inquiry, epistemology can help us better understand what happens when we inquire about significant questions but fail to achieve our goal – knowledge, true belief, or understanding. Non-ideal inquiry epistemology, by avoiding various kinds of idealizations and relying on empirical data, can provide epistemic guidance for conducting better inquiries.

Most theories of traditional analytic epistemology have been predominantly concerned with describing the nature of epistemic phenomena, such as knowledge and understanding, rather than formulating guidance on how to achieve them (Ballantyne, 2019). However, various philosophers argue that epistemology can and should be ameliorative or regulative (Goldman, 1978; Ballantyne, 2019; Bishop & Trout, 2005, 2016; Roberts & Wood, 2007; but see Fantl, 2023) – giving advice on how to conduct good inquiry and how to achieve epistemic goals. If we aim to provide epistemic advice, the focus on inquiry is crucial: debates about the nature of justification and knowledge alone will not help us when we need to decide what evidence is trustworthy, who is a reliable expert, and how to proceed when we receive contradictory information (Ballantyne, 2019).

Inquiry epistemology investigates what constitutes inquiry and what norms should guide it. Inquiry, in a broad sense, includes a wide range of attitudes and activities that occur over a period of time and are aimed at answering a question by using evidence. When we inquire, we try to figure things out, settle a question, extend or refine our knowledge, or gain or improve our understanding. Rather than asking questions related only to justification and knowledge, inquiry epistemologists are interested in what is efficient, reliable, and responsible inquiry and how one should inquire – which inquiries to undertake in the first place, how to gather and evaluate evidence, and when to terminate inquiry (Ballantyne, 2019; Haziza, 2023; Hookway, 1994; Flores & Woodard, 2023; Friedman, 2017, 2019, 2020, forthcoming; Kelp, 2021; Thorstad, 2021).

Epistemological questions, including the ones about inquiry, can be investigated in an idealised or non-idealised way. In the first case, we adopt certain idealizations about inquirers and the world in which they are embedded, and in the second case we try to avoid them (McKenna, 2023). I will focus on two types of idealizations: of the psychology and cognitive capacities of inquirers, and of the epistemic environment in which they are embedded. Following the work of McKenna (2023), I will show that avoiding such idealizations and combining philosophical methods with empirical research is a good starting point for giving epistemic advice that is achievable for ordinary inquirers and improves our epistemic position.

Firstly, non-ideal inquiry epistemology is based on a psychologically realistic conception of inquirers. It draws on empirical data collected by cognitive psychology and other empirical disciplines that show us how human cognitive processes such as judgement and decision making, attention, memory, and reasoning work (Eysenck & Keane, 2020). This means that non-ideal inquiry epistemology does not require inquirers to exhibit logical omniscience, optimally update their beliefs using Bayes’ rule, or have a perfectly coherent system of beliefs (Carr, 2022). It acknowledges that we are often bad intuitive statisticians, performing poorly on probabilistic tasks but doing much better with frequencies, and that the way information is presented affects our risk assessment and decision making (Gigerenzer & Edwards, 2003; Gigerenzer, 2003). It recognises that our inquiries and judgments are often implicit and intuitive rather than deliberate and reflective (Evans & Stanovich, 2013; De Neys, 2018). It takes into account that we are prone to various biases in our reasoning (Kahneman, 2011; Kahneman et al., 1982), such as overconfidence in our abilities (Kruger & Dunning, 1999; Pennycook et al., 2017), and that we often rely on fast and frugal heuristics rather than complex reasoning processes, but that such heuristics frequently lead to correct judgments (Gigerenzer, 2000). It acknowledges that it is questionable whether we actually perform Bayesian updating of probabilities (Bowers & Davis, 2012).

These insights can be fruitfully incorporated into epistemic advice. Instead of advising people to always opt for a complex calculation, we can focus on teaching them how to select reasoning strategies, e.g. how to choose between heuristics and more sophisticated methods. When we present new information about risk – e.g. the risk of complications after vaccination – we can express the numbers in frequencies rather than probabilities. We can acknowledge that many of our judgments are made automatically and intuitively, and avoid limiting our advice to situations where people have the time and opportunity to engage in careful deliberation (see also Ballantyne, 2019). We can recognize that many factors influence how we allocate our limited attention, and that ordinary inquirers cannot be expected to devote enormous resources to investigations about a particular question. We can educate people about the existence of bias and overconfidence and, if possible, how to avoid them.1 In designing interventions to influence sceptics’ beliefs about climate change, we can take into account that people deviate from Bayesian updating and that encountering new evidence does not lead to an immediate change in their beliefs, and so on.

Furthermore, non-ideal inquiry epistemology recognises that we do not inquire in isolation, but are embedded in complex social and epistemic environments, constantly interacting with, and relying on, others. Our social situatedness, such as social identity and role, importantly influences our everyday inquiries (McKenna, 2023). For example, a large body of empirical research suggests that we often engage in politically motivated reasoning (Kahan, 2013, 2016a, 2016b), meaning that we form beliefs that are consistent with the beliefs of other members of a group that defines our values and identity. Similarly, our risk assessments are influenced by what Kahan et al. (2011) refer to as cultural cognition, defined as the tendency to evaluate risks (e.g., from global warming, vaccines, guns, nuclear weapons …) in a way that coheres with our cultural values (Braman et al., 2005; Kahan & Braman, 2003; Kahan et al., 2010). Partisan segregation, which leads to a polarization of opinions, also greatly influences our evaluations of scientific testimony (Funk et al., 2019; Smith et al., 2024). There is ample empirical evidence of the correlation between views on climate change and political values in the USA (Ballew et al., 2019; Dunlap & McCright, 2008; Funk & Kennedy, 2016; McCright et al., 2016).

Considering all of these findings, as Anderson (2011) does, can help us develop epistemic advice that promotes our epistemic goals. If distrust of scientific evidence is influenced in part by perceived threats to a group’s social values, policies and recommendations can be formulated to signal endorsement of the values of different ideological groups and reduce the perception of threat (Anderson, 2011). If people are more trusting of those who share their social and political values (Kahan et al., 2011), it would make sense to try to engage politically diverse groups to advocate for the seriousness of the global warming threat (McKenna, 2023). As Anderson (2011, p. 158) puts it, “Remove the threats, affirm people’s values, and they will be more receptive to an objective assessment of the evidence.”

Secondly, non-ideal epistemology does not focus solely on the characteristics of the inquirers, but also on the features of the epistemic environment. We are often embedded in a polluted epistemic environment that makes it difficult for us to achieve our epistemic goals. In such environments, our false beliefs, lack of knowledge or understanding might not be attributed to our irrationality or our lack of epistemic virtue, but primarily to the features of the environment (Levy, 2021). One of the important aspects of good inquiry is recognizing which sources of information are reliable and trustworthy. How do we know which websites, organizations, or individuals to trust when we, for instance, seek information about climate change? In polluted epistemic environments, criteria for recognizing expertise such as credentials, track record, or intellectual honesty (Anderson, 2011; Goldman, 2001) are imitated by various parties to give the appearance of expertise and therefore cannot help us distinguish real from pseudo-experts. If we want to increase the number of correctly identified experts, we need to address the problem not only by teaching individuals how to inquire virtuously and responsibly, but also by shaping the epistemic environment in a way that will limit misinformation as much as possible (Levy, 2021) and will enable laypeople to give “open-minded, unbiased consideration of the best available scientific information” (Kahan, 2010, p. 3). Levy (2021) suggests improving the environment in a way that facilitates trust in science and in scientific institutions and enables novices to identify reliable sources of information (see also McKenna, 2023). By recognizing that environments are often polluted, inquiry epistemology can work towards better inquiries not only by providing advice for individual inquirers, but also for those who can influence how our epistemic environments are constructed. But how can we improve epistemic environments?
McKenna (2023) suggests that we rely on empirical data on science communication – for example, the use of framing (Badullovich et al., 2020) or prebunking, i.e. the presentation of a common argument against a scientific claim along with its refutation and an explanation of why it is not a good argument (Cook et al., 2017; van der Linden et al., 2017). Furthermore, Anderson (2011) suggests that we rethink the norm of media coverage that requires both sides of a debate to be presented – when this happens in the case of climate change, it can create the false impression that there are equally good arguments for and against it.

To sum up, non-ideal inquiry epistemology adopts a psychologically realistic conception of inquirers, recognizes our social situatedness, and acknowledges how the epistemic environment influences our inquiries. By avoiding idealization and combining philosophical investigations with empirical research, it has the means to provide good epistemic guidance.

1: In philosophy and psychology, there is a debate about the nature of biases, about whether they are irrational or not and whether they are always epistemically bad. See, for example, Gigerenzer (1991), Hahn & Harris (2014), Kahneman & Tversky (1996), Kelly (2022).


Acknowledgments


This blog is the third post of the series “Extreme Beliefs and Behavior.” With thanks to the Extreme Beliefs Project (www.extremebeliefs.com), especially Nora Kindermann, Jakob Ohlhorst, and Rik Peels. Research for this post has been made possible through the project Extreme Beliefs: The Epistemology and Ethics of Fundamentalism, funded by the European Research Council (ERC) in the program Horizon 2020 (851613) and by the Vrije Universiteit Amsterdam.

References

Anderson, E. (2011). Democracy, Public Policy, and Lay Assessments of Scientific Testimony. Episteme, 8(2), 144–164. https://doi.org/10.3366/epi.2011.0013

Badullovich, N., Grant, W. J., & Colvin, R. M. (2020). Framing Climate Change for Effective Communication: A Systematic Map. Environmental Research Letters, 15(12), 123002. https://doi.org/10.1088/1748-9326/aba4c7

Ballantyne, N. (2019). Knowing our Limits. Oxford University Press.

Ballew, M. T., Leiserowitz, A., Roser-Renouf, C., Rosenthal, S. A., Kotcher, J. E., Marlon, J. R., Lyon, E., Goldberg, M. H., & Maibach, E. W. (2019). Climate Change in the American Mind: Data, Tools, and Trends. Environment: Science and Policy for Sustainable Development, 61(3), 4–18. https://doi.org/10.1080/00139157.2019.1589300

Bishop, M. A., & Trout, J. D. (2005). Epistemology and the psychology of human judgment. Oxford University Press.

Bishop, M. A., & Trout, J. D. (2016). Epistemology for Real People. In K. Lippert-Rasmussen, K. Brownlee & D. Coady (Eds.), A Companion to Applied Philosophy (pp. 103–120). Wiley-Blackwell.

Bowers, J. S., & Davis, C. J. (2012). Bayesian Just-So Stories in Psychology and Neuroscience. Psychological Bulletin, 138(3), 389–414. https://doi.org/10.1037/a0026450

Braman, D., Kahan, D. M., & Grimmelmann, J. (2005). Modeling Facts, Culture, and Cognition in the Gun Debate. Social Justice Research, 18, 283-304. https://doi.org/10.1007/s11211-005-6826-0

Carr, J. R. (2022). Why Ideal Epistemology. Mind, 131(524), 1131–1162. https://doi.org/10.1093/mind/fzab023

Cook, J., Lewandowsky, S., & Ecker, U. K. H. (2017). Neutralizing Misinformation through Inoculation: Exposing Misleading Argumentation Techniques Reduces their Influence. PLOS ONE, 12(5), e0175799. https://doi.org/10.1371/journal.pone.0175799

De Neys, W. (2018). Dual Process Theory 2.0. Routledge.

Dunlap, R. E., & McCright, A. M. (2008). A Widening Gap: Republican and Democratic Views on Climate Change. Environment: Science and Policy for Sustainable Development. 50(5), 26-35. https://doi.org/10.3200/ENVT.50.5.26-35

Evans, J. S. B. T., & Stanovich, K. E. (2013). Dual-Process Theories of Higher Cognition: Advancing the Debate. Perspectives on Psychological Science, 8(3), 223-241. https://doi.org/10.1177/1745691612460685

Eysenck, M. W., & Keane, M. T. (2020). Cognitive Psychology: A Student’s Handbook (8th ed.). Routledge.

Fantl, J. (2023). Guidance and Mainstream Epistemology. Philosophical Studies, 180, 2191–2210. https://doi.org/10.1007/s11098-023-01970-2

Flores, C., & Woodard, E. (2023). Epistemic Norms on Evidence-Gathering. Philosophical Studies, 180(9), 2547-2571.

Friedman, J. (2017). Why Suspend Judging? Noûs, 51(2), 302-326. https://doi.org/10.1111/nous.12137

Friedman, J. (2019). Inquiry and Belief. Noûs, 53(2), 296-315. https://doi.org/10.1111/nous.12222

Friedman, J. (2020). The Epistemic and the Zetetic. Philosophical Review, 129(4), 501–536.

Friedman, J. (forthcoming). Zetetic Epistemology. In B. Reed & A. K. Flowerree (Eds.), Towards an Expansive Epistemology: Norms, Action, and the Social Sphere. Routledge.

Funk, C., & Kennedy, B. (2016). The Politics of Climate. Pew Research Center.

Funk, C., Hefferon, M., Kennedy, B., & Johnson, C. (2019). Trust and Mistrust in Americans’ Views of Scientific Experts. Pew Research Center.

Goldman, A. (1978). Epistemics: The Regulative Theory of Cognition. Journal of Philosophy, 75(10), 509-523. https://doi.org/10.2307/2025838

Goldman, A. (2001). Experts: Which Ones Should You Trust? Philosophy and Phenomenological Research, 63(1), 85-110. https://doi.org/10.2307/3071090

Gigerenzer, G. (1991). How to Make Cognitive Illusions Disappear: Beyond “Heuristics and Biases”. European Review of Social Psychology, 2(1), 83-115. https://doi.org/10.1080/14792779143000033

Gigerenzer, G. (2000). Adaptive thinking: Rationality in the real world. Oxford University Press.

Gigerenzer, G., & Edwards, A. (2003). Simple Tools for Understanding Risks: From Innumeracy to Insight. British Medical Journal, 327(7417), 741–744. https://doi.org/10.1136/bmj.327.7417.741

Gigerenzer, G. (2003). Calculated Risks: How to Know When Numbers Deceive You. Simon & Schuster.

Hahn, U., & Harris, A. J. L. (2014). What Does It Mean to be Biased: Motivated Reasoning and Rationality. Psychology of Learning and Motivation, 61, 41-102. https://doi.org/10.1016/B978-0-12-800283-4.00002-2

Haziza, E. (2023). Norms of Inquiry. Philosophy Compass, 18(12), e12952.

Hookway, C. (1994). Cognitive Virtues and Epistemic Evaluations. International Journal of Philosophical Studies, 2(2), 211-227, https://doi.org/10.1080/09672559408570791

Kahan, D. M., Braman, D., Cohen, G. L., Gastil, J., & Slovic, P. (2010). Who Fears the HPV Vaccine, Who Doesn’t, and Why? An Experimental Study of the Mechanisms of Cultural Cognition. Law and Human Behavior, 34(6), 501-516. https://doi.org/10.1007/s10979-009-9201-0

Kahan, D. M. (2010). Fixing the Communications Failure. Nature, 463, 296-297. http://doi.org/10.1038/463296a

Kahan, D. M., Jenkins-Smith, H., & Braman, D. (2011). Cultural Cognition of Scientific Consensus. Journal of Risk Research, 14(2), 147–174. https://doi.org/10.1080/13669877.2010.511246

Kahan, D. M. (2013). Ideology, Motivated Reasoning, and Cognitive Reflection. Judgment and Decision Making, 8(4), 407–424. https://doi.org/10.1017/S1930297500005271

Kahan, D. (2016a). The Politically Motivated Reasoning Paradigm, Part 2: Unanswered Questions. In S. Kosslyn (Ed.), Emerging Trends in Social & Behavioral Sciences (pp. 1-16). Wiley Online Library.

Kahan, D. (2016b). The Politically Motivated Reasoning Paradigm, Part 1: What Politically Motivated Reasoning Is and How to Measure It. In S. Kosslyn (Ed.), Emerging Trends in Social & Behavioral Sciences (pp. 1-15). Wiley Online Library.

Kahneman, D., & Tversky, A. (1996). On the reality of cognitive illusions. Psychological Review, 103(3), 582–591. https://doi.org/10.1037/0033-295X.103.3.582

Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.

Kahneman, D., Slovic, P., & Tversky, A. (1982). Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press.

Kelly, T. (2022). Bias: A Philosophical Study. Oxford University Press.

Kelp, C. (2021). Inquiry, Knowledge, and Understanding. Oxford University Press.

Kruger, J., & Dunning, D. (1999). Unskilled and Unaware of it: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134. https://doi.org/10.1037/0022-3514.77.6.1121

Kunda, Z. (1990). The Case for Motivated Reasoning. Psychological Bulletin, 108(3), 480–498. https://doi.org/10.1037/0033-2909.108.3.480

Levy, N. (2021). Bad beliefs: Why They Happen to Good People. Oxford University Press.

McKenna, R. (2023). Non-ideal epistemology. Oxford University Press.

Pennycook, G., Ross, R. M., Koehler, D. J., & Fugelsang, J. A. (2017). Dunning-Kruger Effects in Reasoning: Theoretical Implications of the Failure to Recognize Incompetence. Psychonomic Bulletin and Review, 24(6), 1774-1784. https://doi.org/10.3758/s13423-017-1242-7

Roberts, R. C., & Wood, W. J. (2007). Intellectual Virtues: An Essay in Regulative Epistemology. Oxford University Press.

Smith, E. K., Bognar, M. J., & Mayer, A. P. (2024). Polarisation of Climate and Environmental Attitudes in the United States, 1973-2022. Npj Climate Action, 3(2). https://doi.org/10.1038/s44168-023-00074-1

Thorstad, D. (2021). Inquiry and the Epistemic. Philosophical Studies, 178(9), 2913-2928. https://doi.org/10.1007/s11098-020-01592-y

van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the Public against Misinformation about Climate Change. Global Challenges, 1(2), e1600008. https://doi.org/10.1002/gch2.201600008


Photo by eberhard grossgasteiger: https://www.pexels.com/photo/road-signage-near-snow-covered-mountains-1699028/