A tweet made it to the front page of Reddit in the United States on July 21st, 2019, posted to the subreddit r/GetMotivated. The tweet, originally from @michaelmiraflor, read, “Unfollow [Instagram] models and influencers. Start following artists and designers. Your entire outlook on life will change”.
The tweet expresses something intuitively true: what you read affects not only your mood but your worldview. Social media offers us a constantly changing stream of content to consume, and so it has great potential to shape our beliefs. As the tweet hints, though, much of the online content we are steeped in can be damaging to us – not only emotionally, but by changing what we believe.
The tweet raises an important epistemic question: how much control do we have over what we end up believing from what we see online? Is there any epistemic risk that comes with being exposed to particular kinds of content, or are we able to pick and choose what stays with us?
Consider your beliefs about “matters of fact”, e.g., that glass is made of sand. It’s unlikely that you’ve had the personal experiences required in order to form this belief on your own; rather, you’ve probably acquired it from someone else. Consider, too, your ethical beliefs, e.g., about the morality of meat eating. Are these the result of personal research and reflection? To some degree, the answer is probably “yes”. To a non-trivial degree, though, the answer is more likely a complicated story of trusting others’ judgement and personal-historical accident. For many of our beliefs, we may not be able to muster any reasons for why we took them on at all.
We can now separate out two issues. The first has already been mentioned: can we will ourselves to believe particular things? Put differently, do we have direct voluntary control over what we believe? The second issue is about the extent to which we’re in control of the evidence or information that we listen to and read. This matters because the content that we have access to shapes the range of possible beliefs that we can take on.
Let’s begin with the second issue. The Internet has empowered us by providing the greatest source of knowledge we’ve ever had access to. Thus, we have more opportunity than ever before to take control of the content that we see. Social media is particularly ripe for this, and it can provide a window into the lives and beliefs of people we never would have otherwise encountered. This is a win for proponents of open-mindedness and intellectual humility.
Of course, a vast amount of content online isn’t epistemically virtuous. It’s telling that some of the most recognisable phrases of the past few years are “fake news” and “alternative facts”. When we find ourselves swamped with false, misleading, or otherwise epistemically poor information, our worldview is at risk of becoming distorted from reality. This leads us to our first issue: Do we have direct voluntary control over the beliefs we form? If we do, then it might not matter so much that we’re exposed to lots of poor quality content online – we can pick and choose what to believe.
However, it’s actually not apparent that we have the ability to choose what we believe. Take your own belief about the morality of eating meat. If you believe it is moral, can you will yourself to believe that it is immoral? Most philosophers agree that we can’t simply will ourselves to believe things. Instead, and in answer to our first issue, we have only indirect voluntary control over the beliefs that we form.
What this usually means is that we have some control over the kinds of evidence or the testimony that we access, and our beliefs follow from exposure to these sources. This might be due in some way to rational processes like weighing the evidence for and against some idea. Oftentimes, though, it’s likely that we take on beliefs for non-rational reasons – maybe emotional reasons, or maybe because we are manipulated.
We generally accept that children’s beliefs work indirectly this way. They take on the beliefs of those who are around them. Children don’t get much of a say about who and what they end up listening to, and thus the beliefs they’ll develop. Our mistake is thinking that we cease to be sponges as we age. What we gain is some control over who we listen to and what we read. We don’t gain the ability to pick and choose what to believe.
As Annette Baier put it when describing the development of personhood more generally, we are essentially “second persons”. We develop gradually over time, depending on other people in a variety of ways at different stages of life. Our personalities and our beliefs about the world develop in this context of dependency on others. As we age, we often develop somewhat reflexively in reaction both to those we’ve depended on and to earlier phases of ourselves. Who we are and what we believe is causally dependent on these relations to other people and our relationship to our own histories.
This is a dynamic that’s familiar to many (myself included!) who have spent time thinking through the complicated nature of autonomy. We like to believe that we are the sole author of our own lives. It’s perhaps especially important to feel that we are the author of the beliefs that form our worldview, given the role that they play in making up our identity. But, I contend, the development of our worldview, and thus our identities, is highly dependent on our social environment.
When we consider the content that we are exposed to on social media, then, we should be wary of thinking that we will be able to see through the fake news, the alternative facts, and the emotionally manipulative content. Unfortunately, we’re not hardwired to believe the true things that we are presented with and ignore the false things – we can’t perceive truth directly. We should be cautious about exposing ourselves to epistemically dubious content under the illusion that we’ll be able to shrug off the false and misleading material and decide to believe the truth.
There’s another take-away too. As producers of online content, we should be aware of how what we share contributes to the epistemic environment that shapes others’ beliefs. Monitoring the sea of content online is a collective epistemic responsibility; we have to be vigilant to avoid the information pollution of our rapidly growing epistemic community. To be responsible epistemic agents we need to be picky listeners, but we should also be picky speakers – what we say online moulds the worldviews of others too.