Open for Debate

What’s The Social Approach to Public Trust in Science?

30 May 2022

In my previous post, I argued that, while the COVID-19 pandemic has thrown into stark relief the practical importance of public trust in science, the dominant, individualistic approach to public trust in science is woefully inadequate. In this post, I provide a (very minimal) sketch of an alternative approach, which I call the social approach (for a fuller sketch see here).

As we have seen in my previous post, the individualistic approach claims that it is primarily individuals who trust (or distrust) science. According to the social approach, on the other hand, it is primarily societies (and social groups within them) that trust (or distrust) science. On this approach, individuals trust (or distrust) science largely because (and insofar as) they live in a society (or belong to a social group) that trusts (or distrusts) science. In fact, according to the social approach, a society that trusts science is not even necessarily one whose members have a positive psychological attitude towards science. Rather, it is a society that collectively defers to science on scientific issues. The case of chlorofluorocarbons (CFCs) is a good example of this sort of collective deference. CFCs are a class of gases that were widely used in a great variety of products ranging from aerosol sprays to refrigerators. In the mid-1970s, scientists started to realize that CFCs released into the atmosphere by those products were rapidly depleting the layer of ozone that protects the Earth from harmful ultraviolet radiation. Once they became aware of this threat, scientists were quick to raise the alarm about the potentially catastrophic effects of ozone layer depletion, politicians were quick to enact international policies that first regulated the use of CFCs and then banned their production, and the affected industries were quick to find an alternative to the use of CFCs.

As this example illustrates, one of the main social benefits of collective deference to science is that it enables an efficient division of epistemic (and decision-making) labour. The more our collective knowledge of the world grows and becomes specialized, the more each of us must outsource our epistemic tasks to others. However, it is epistemically irresponsible to delegate epistemic tasks to people or institutions that are not fully trustworthy and, when the epistemic tasks are practically relevant, it is also practically imprudent to do so. Moreover, when we do entrust something valuable to someone we don't fully trust, we tend to take precautions (such as monitoring their performance), which, in the case of scientific knowledge, tend to be both ineffective (as we usually do not have the resources to assess the performance of experts) and inefficient (as they partly defeat the purpose of outsourcing epistemic tasks in the first place). This is why the more we can trust science (and defer to it epistemically), the more efficient the division of epistemic labour is.

However, while, in theory, complete epistemic deference to science might enable a perfectly efficient division of epistemic labour, there seem to be limits to the extent to which, in practice, absolute trust in science is either attainable or desirable. Here, I briefly focus on three general reasons why this is so. A first reason why absolute trust in science is not desirable is that, in its current form, science is not always worthy of our epistemic trust. Let me mention three aspects of this problem. First, a research environment increasingly dominated by a “publish-or-perish” culture doesn’t foster epistemic trustworthiness. The replication crisis as well as the rise in scientific fraud, retractions, ghostwritten studies, and paper mills are all symptoms of how an increasingly competitive research environment can undermine the trustworthiness of scientific research. Second, the increasing reliance of scientific research, and of the academic institutions that have traditionally supported it, on private funding casts a shadow on the epistemic trustworthiness of research that is conducted under more or less open conflicts of interest. Third, science is not demographically representative of the overall population. For example, according to 2010 US census data, White men, Asian men, and Asian women are overrepresented in science and engineering jobs, while all other demographic groups are underrepresented. As some feminist philosophers of science have argued, this lack of demographic diversity is likely to have epistemic consequences, as, among other things, certain kinds of prejudices and biases, including racist, sexist, and classist prejudices, are more likely to go unchallenged. To be worthy of the trust of the social groups that are targeted by those prejudices and biases, science needs to do much to repair its relationship with those groups, both by becoming more inclusive and by becoming more responsive to the interests and concerns of marginalized groups.

Another reason why an uncritical trust in science does not seem to be desirable is that we often turn to scientists as advisors and, while it might be necessary for an advisor to be an expert in order for them to be worthy of our trust, it is not sufficient. First, philosophers of science increasingly agree that science is not value-free and, in particular, that non-epistemic values play a role even in seemingly purely epistemic decisions. For example, when deciding whether there was enough evidence to sound the alarm about the depletion of the ozone layer, scientists had to weigh the practical consequences of sounding the alarm too early against the practical consequences of sounding it too late, which requires a value judgment. As this example shows, practical and epistemic judgments are often inextricably intertwined, which means that, when it comes to issues of practical relevance, trust in science is never purely epistemic. Second, advice on most scientific issues of practical relevance requires a broader range of expertise than that of any individual scientist or scientific community, so most individual scientists are ill-equipped to give advice on practical matters. For example, a climate scientist might be an expert on what effects a given temperature increase would have on sea levels, but not on the effects it would have on various human activities, or on what the most effective economic policy to reduce carbon emissions would be. Third, while trustworthy advisors need to be knowledgeable about a range of relevant matters, they also need to have other qualities, such as good judgment and having the advice-seeker’s best interests at heart. For example, the doctor who, due to a tight appointment schedule, offhandedly dismisses parents’ worries about vaccines might lead those parents to feel that the doctor doesn’t really care about their children’s well-being and to turn to online anti-vaccine parent groups for advice.

All these issues have been highlighted during the pandemic. On the one hand, there have been repeated calls for policymakers to just “follow the science.” However, even under less uncertain circumstances, the science cannot really tell us what to do, as that always requires making value judgments. On the other hand, politicians have often tried to shift the responsibility for difficult and value-laden policy decisions onto the experts to avoid accountability. However, given that experts are neither elected by the public nor accountable to it, it is not unreasonable for sectors of the public not to fully trust experts whose values and interests might differ significantly from theirs.

Let me now turn to a reason why, even if full deference were desirable, it might still be unattainable. The main problem is that ordinary people don’t get their scientific information directly from scientific sources. They get it indirectly through less trustworthy sources that often distort the information and sometimes spread misinformation. Many people, for example, get their scientific information through private media companies that have strong economic incentives to favour sensational science headlines over high-quality (but less attention-grabbing) science reporting. Or, to pick another example, members of the public often trust public figures (including politicians, celebrities, political commentators, rogue scientists, and industry executives) who have a vested interest in promoting scientific disinformation on specific issues, such as anthropogenic climate change or the safety of vaccines. In such a polluted epistemic environment, ordinary people have a hard time separating the wheat from the chaff, and this makes it harder to collectively trust science even when it is at its most trustworthy.

According to the social approach, most of these issues have a common root in a deteriorated socio-epistemic infrastructure (i.e., the network of social norms, practices, and institutions that promote the reliable production, transmission, reception, and uptake of information and prevent the spread of misinformation). The scientist who works in a “publish-or-perish” culture, the biomedical researcher whose main source of funding is the pharmaceutical industry, the doctor who doesn’t have the time to address the patient’s concerns, the politician who tries to avoid accountability for difficult policy decisions, and the journalist who chooses the more sensational headline all operate within a socio-epistemic infrastructure that gives them incentives that tend to undermine their (actual or perceived) trustworthiness. While the individualistic approach primarily focuses on trying to persuade distrustful individuals to trust science, the social approach encourages us to focus on the ways in which, as a society, we can improve the socio-epistemic infrastructure so as to increase the (actual and perceived) trustworthiness of science and promote a justifiably high level of trust in science.

Photo: Eugène Atget photo of eclipse of April 17, 1912 in Paris https://commons.wikimedia.org/wiki/File:Eug%C3%A8ne_Atget,_Eclipse,_1912.jpg