On many weighty contemporary issues, it is increasingly difficult to know what exactly to believe. These include issues related to, or at the intersection of, politics, morality, religion, medicine, and science. Information about such issues is endless. It points in different and inconsistent directions. And its quality can be extremely difficult to discern.
We might think we can sidestep this problem by simply “trusting the experts” in the relevant domains. Alas, a similar problem often recurs at the level of expertise. Putative experts proliferate. They disagree with each other. Which ones should we trust?
Unable to outsource a solution, we might be tempted by either of two other responses. The first is skepticism. Rather than navigate the precarious epistemic landscape ourselves, we might simply give up trying and suspend judgment about the issues.
There is some evidence that skepticism of this sort is taking hold on a fairly wide scale. According to a recent news report, “47 percent of Americans believe it’s difficult to know whether the information they encounter is true. Just 31 percent find it easy. About 60 percent of Americans say they regularly see conflicting reports about the same set of facts from different sources.” As one person remarked: “There’s so much information that’s biased, that no one believes anything. There is so much out there and you don’t know what to believe, so it’s like there is nothing.”
A different reaction to our epistemic predicament moves in the opposite direction. Faced with the challenge of trying to evaluate voluminous and perplexing arguments and data, we might elect to tune out voices we disagree with, paying serious attention only to sources that tell us what we already believe or want to believe. That is, we might embrace a kind of “tribal epistemology,” which one writer describes thus: “Information is evaluated based not on conformity to common standards of evidence or correspondence to a common understanding of the world, but on whether it supports the tribe’s values and goals and is vouchsafed by tribal leaders. ‘Good for our side’ and ‘true’ begin to blur into one.”
As ways of negotiating the present epistemic landscape, skepticism and tribalism leave much to be desired, both epistemically and democratically. Therefore, I’d like to offer a brief sketch of an alternative approach. Unlike skepticism, it does not represent the path of least resistance. Nor does it, like tribalism, amount to an epistemic shortcut.
The response can be captured in terms of a pair of complementary intellectual virtues: intellectual humility and intellectual persistence. Intellectual humility is a matter of being alert to and “owning” one’s intellectual limitations, weaknesses, and mistakes. Like Socrates, intellectually humble persons are aware of what they don’t know and don’t pretend to be knowledgeable when they are not. Intellectual persistence involves carrying on with a belief or inquiry in the face of obstacles or challenges. The persistent inquirer is slow to give up; she is willing to struggle and fight to achieve her epistemic goals.
As with other character attributes we tend to think of as virtues, it’s possible to go too far (excess) or not far enough (deficiency) with either intellectual humility or intellectual persistence. To illustrate, intellectually arrogant persons clearly are deficient in intellectual humility, whereas persons who exaggerate their intellectual limitations are self-deprecating or servile. Similarly, people who are intellectually lazy tend to be deficient in intellectual persistence, while those who don’t know when to quit or switch course in an inquiry are excessively persistent.
These categories are useful for thinking about the kind of skepticism and tribalism noted above. Skepticism betrays an excess of intellectual humility: it involves concluding, prematurely, that one is incapable of making significant progress toward the truth. And it betrays a deficiency of intellectual persistence because it involves giving up sooner than one should. Tribalism, by contrast, betrays a deficiency of intellectual humility and an excess of intellectual persistence. Tribalists lack intellectual humility because they assume, unreasonably, that they have nothing to learn from people who disagree with them. They are excessively intellectually persistent on account of clinging to their beliefs even when these beliefs are no longer supported by the evidence they possess or ought to possess.
My suggestion is that we attempt to negotiate the information landscape in ways that “hit the mean” along both of these dimensions, that is, in ways that are expressive of both intellectual humility and intellectual persistence. We must be honest with ourselves and not lose sight of our intellectual limitations and vulnerabilities. At the same time, we mustn’t let this fallibility be a toehold for intellectual laziness or despair. With one eye on our intellectual limitations and the other fixed firmly on the goal of truth, we must persist.
What exactly does this look like? The answer will vary at least to some extent from one person to another, depending on how the person is situated, the quantity and quality of her evidence and background knowledge, what her intellectual strengths and limitations are, and so on. Virtues are always situation-relative in this way.
But one thing it does not look like is giving up or calling it quits (skepticism). That’s to regard the challenge as too onerous, the bar as too high. Neither does it look like pretending that the challenge is easy to overcome, that it can be dealt with simply by picking sides or by wrapping oneself in an “epistemic bubble” (tribalism).
Instead, humble persistence involves admitting that the epistemic landscape is precarious, that it is fraught with pitfalls and dead ends. It involves being rigorously mindful of and honest about our own cognitive fallibility, limitations, and less-than-virtuous epistemic habits. And it requires allowing these considerations to inform how we inquire, listen to others, and draw conclusions. At the same time, it involves resisting the impulse to give up: moving forward, plodding on, and using our epistemic resources and abilities as intelligently and competently as possible.
This is no silver-bullet solution, of course. Structural, professional, technological, and other measures are also necessary for combating the epistemic problems posed by our information landscape. Yet, at the end of the day, each of us must decide how to approach and assess the information available to us. At least when compared with skepticism or tribalism, humble persistence seems like a far more promising path, epistemically and otherwise.
Thanks to Josh Dolin, Michael Pace, and Dan Speak for helpful conversations about and feedback on several of the ideas contained herein.