What’s Wrong with the Individualistic Approach to Public Trust in Science?
16 May 2022

While academics and commentators have been worrying about distrust of science for some time now, the COVID-19 pandemic has thrown into stark relief the full scale of the problem as well as its practical consequences. In rich countries, a sizable portion of the eligible population refuses to get vaccinated against COVID-19 despite the widespread availability of free and effective vaccines and the efforts of public health authorities. Vaccine refusers not only expose themselves and those around them to a greater risk of infection, hospitalization, and death, but they also put an unnecessary burden on healthcare systems that are already strained to their limits.
While the pandemic has highlighted the importance and urgency of promoting public trust in science, the dominant individualistic approach to this problem is woefully inadequate for the task, or so I’ll argue here (and I have argued more extensively here). According to the individualistic approach, it is primarily individuals who trust (or distrust) science. On this view, in order to restore trust in science, we need to persuade (or pressure, or nudge) people to trust science more. For the sake of definiteness, I will focus on what I take to be one of the most sophisticated versions of the individualistic approach—the account developed by Naomi Oreskes in Why Trust Science? Oreskes seems to assume that those who mistrust science do so because they do not understand how science works and why that makes it trustworthy. Oreskes’ account of the trustworthiness of science relies on a social account of science, such as the one put forward by Helen Longino. To put it in a nutshell, Longino argues that what makes a scientific community objective is not that individual scientists are unbiased and disinterested seekers of the ultimate truth—it is that individual scientists subject one another’s assumptions, methods, evidence, and results to critical scrutiny, that they are responsive to criticisms from their peers, and that they share a set of standards to settle their disagreements. On Longino’s account, it is this social system of epistemic checks and balances that safeguards the objectivity of science. Building on Longino’s account, Oreskes argues that, whenever a scientific community achieves a hard-won consensus over some hypothesis after subjecting it to this gruelling process of testing and criticism, ordinary people have a good (if defeasible) reason to trust that consensus.
Oreskes’ account of public trust in science faces a number of problems. Here, I’ll focus on three problems that I take to be typical of the individualistic approach in general. First, the individualistic approach tends to idealize science. Even assuming that a scientific community that meets Longino’s ideal of objectivity tends to converge on true (or even just actionable) hypotheses in the long run, this result might be of limited significance for promoting trust in actual scientific communities. For one thing, it is not clear how closely most actual scientific communities approach the Longinean ideal of objectivity. For another, the consensus-forming mechanism works slowly, and, as the pandemic taught us, we don’t always have the luxury of waiting for a consensus to emerge among the experts before making important practical decisions. Moreover, science scholars, including Longino herself, have amassed copious amounts of evidence for thinking that the scientific consensus is sometimes based on the shared prejudices of a scientific community (including sexist and racist prejudices). Finally, in some areas of scientific research, there are genuine concerns about the evidence on which the consensus is based, as in the case of pharmaceutical research, which is typically paid for and overseen by the very companies that stand to gain from its results.
Second, the individualistic approach places a heavy epistemic burden on ordinary citizens. Most people live in an epistemic environment that is heavily polluted with misinformation and disinformation. Oreskes knows this all too well. Merchants of Doubt, a book she co-authored with Erik Conway, tells the story of a handful of scientists, seemingly motivated by a mix of political ideology and personal interest, who went out of their way to convince the public that the science was not settled on practically relevant scientific issues ranging from the negative health consequences of smoking tobacco to the human contributions to climate change. In a polluted epistemic environment, it seems unrealistic to expect ordinary people to have the time, the motivation, or the resources to establish whether a scientific consensus has emerged on each and every scientific issue that might be relevant to their personal or political decisions. Oreskes optimistically suggests that ordinary citizens should live by the maxim that serves as the epigraph to Why Trust Science?—“trust but verify.” Ironically, while Oreskes attributes the maxim to Ronald Reagan, a quick internet search reveals that it is, in fact, a Russian proverb that Reagan merely popularized in the United States. If even a scholar as accomplished as Oreskes cannot verify each and every bit of (mis)information she comes across, how can we expect ordinary citizens to do so in a world in which they face a constant barrage of information and misinformation?
Third, the individualistic approach largely ignores what we might call the social determinants of trust. Whom we trust partly depends on social factors, including the patterns of trust (and distrust) of those around us, the social groups with which we identify, the state of the socio-epistemic infrastructure (i.e., the network of social norms, practices, and institutions that promote the reliable production, transmission, reception, and uptake of information and prevent the spread of misinformation), and the resulting level of epistemic pollution. Let me focus on two ways in which social factors affect people’s trust in science. The first is that, anecdotally, echo chambers seem to play a crucial role in promoting and sustaining various forms of distrust of science. According to C. Thi Nguyen, echo chambers operate through a self-perpetuating epistemic mechanism that promotes trust in insiders while systematically discrediting outside sources. Once we understand how an echo chamber works, it is easy to see why the pro-science messages from official sources (or from well-meaning academics such as Oreskes) are unlikely to be effective. Even when they filter through the walls of the echo chamber, its inhabitants, who distrust the original sources, are likely to perceive them as deceitful propaganda. The second is that people’s attitudes towards science are often shaped by their membership in specific social groups. For example, in most American states, COVID vaccination rates among African Americans and Hispanic Americans are lower than among Whites (and significantly lower than among Asian Americans). This might be partly explained by a higher rate of vaccine hesitancy among African Americans and Hispanic Americans, which is likely to be partially rooted in a history of prejudice and injustice as well as in a lived experience of bias and discrimination at the hands of the healthcare system. The individualistic approach to public trust in science tends to adopt a one-size-fits-all approach to distrust of science, ignoring the specific roots that distrust has in different communities or dismissing the very understandable reasons some communities have to be distrustful of science and associated institutions.
In order to promote warranted public trust in science, we need to abandon the inadequate individualistic approach in favour of a more realistic and nuanced understanding of public trust in science. In my next post, I will sketch a version of the social approach to public trust in science, which, I claim, overcomes many of the limitations of the individualistic approach.
Photo: A group of mask-wearing citizens, Locust Avenue, California, during the flu pandemic of 1918. Photograph: Raymond Coyne/Courtesy of Lucretia Little History Room, Mill Valley Public Library. Photo courtesy of The Annual Dipsea Race — the oldest trail race in the United States — as donated by the Raymond Coyne family from their many adventures to Marin County and the Dipsea Trail.