Social media: a viral promoter of social ills?
8 August 2022

Public discourse is the currency in which we exchange our attitudes and beliefs. Social media has proven a double-edged sword for this exchange. On the one hand, it has enabled debate and contact between citizens on a scale previously unimaginable. On the other, this capacity to exchange ideas has been all too easily subverted by anti-democratic forces. Indeed, the single greatest problem posed today to the democratic norms of free speech and tolerance is the manner, speed and scale in which these forces exploit group-identity cleavages, manipulate prejudice and bias, stoke fear and hatred, spread propaganda and misinformation, and incite harassment and violence. These problems are by no means confined to the online world. But by enabling them to proliferate and amplify, social media acts as a booster of social ills.
There is ample evidence that social media platforms have played a facilitating role in spreading hateful content and oppressive ideologies, thereby contributing to the recent upsurge in extreme nationalist and nativist ideology in mainstream politics. Social media use has been instrumental in stoking group divisions through the spread of hateful content based on group identity (race, gender, ethnicity, nationality, religion, sexual orientation, disability, immigration status, etc.). Misogynistic websites, blogs and forums promoting gender-based hate have grown, and hate crimes related to sexual orientation and gender identity have surged. Notably, the systematic use of propaganda on Facebook was instrumental in the ethnic cleansing of the Rohingya in Myanmar.
Social media has also been instrumental in propagating oppressive ideological narratives. A rise in online harm has been found to correlate with a rise in violence and hate crimes, and the widespread display of bigotry, bullying and harassment on Facebook and Twitter has been linked to social unrest. As Frances Haugen testified, Facebook has failed to stop the spread of hate speech. The Unite the Right rally in Charlottesville (2017), for example, originated as a Facebook event through which fascist and racist hate groups used the platform to incite violence. Less mainstream platforms such as Gab, Parler and Telegram, and imageboards such as 4chan and 8chan, have been a playground for individuals with extreme right-wing views, who openly share hateful content and build communities of hate in discussion fora. The result has been a rapid propagation of extreme viewpoints and radicalisation, which often correlates with violent action (e.g. the mass shootings in Charleston in 2015, Poway in 2019 and Christchurch in 2019, and the 6 January 2021 insurrection at the US Capitol).
There is also evidence that social media platforms have been used as a launchpad for spreading propaganda, mis- and disinformation, fake news, conspiracy theories and political polarisation with a view to shaping and steering public opinion, whether in the context of voting, as in the 2016 US presidential election campaign and the Brexit referendum, or of the Covid-19 pandemic. This is damaging both to public health and to the moral fabric of society.
What are the conditions that make social media platforms into such a turbocharged transmissibility machine?
One way to think about this is that the difference in the medium through which content is transmitted, online versus offline, brings with it certain factors that enable social harms to spread and proliferate at unprecedented speed and scale. In the past, bigotry spread through person-to-person conversation, through print media and through broadcast media, each successive medium reaching a larger number of people. Hence the deployment of print and broadcast media by those seeking to spread hate. Online media, however, offers something extra. What?
First, online media levels speaker authority. Relative to traditional print or broadcast media, a Facebook post or a tweet has the same form and footprint whether it comes from a trusted news source or from a single individual. In earlier times, a racist wanting wide circulation might have had to write a letter to a local newspaper; publication was subject to editorial choice, and the letter was given a different status and prominence from that paper's editorial. This inequality of publication power has been considerably levelled by the internet and social media: a single individual can easily author a post or video using widely available technology. This democratisation of broadcast speech is powerful. It flattens speakers' authority and perceived trustworthiness, making it easier to drown out an authoritative information source, and it promotes the viral spread of bigotry by weakening the authority and the defences provided by the institutions of civil society.
Second, online media, particularly where it involves advertising, incentivises both platforms and posters in ways that prolong exposure to hate speech and increase its intensity. On the platform side, the algorithms used to promote content, making one speech act more prominent than another in your feed, recommend content similar to what has already been consumed. This leads to content homogenisation: someone who has viewed hate speech a few times will rapidly find their feed dominated by it rather than by opposing views. Because these algorithms select for whatever keeps users engaged longer, and longer engagement means more advertising revenue, platforms have an incentive to increase exposure time to hate speech. Posters, in turn, are incentivised to produce speech acts that unleash outrage, since such content, particularly content at the extreme end of the Overton window, circulates faster through networks, is viewed more and is shared more. On some platforms, such as YouTube, posters are also rewarded monetarily, since they are paid according to the number and duration of views of their content.
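To make the homogenisation mechanism concrete, here is a minimal, purely hypothetical sketch in Python. It is not any platform's actual recommendation system; it only illustrates the general logic described above, in which candidate posts are ranked by their similarity to what a user has previously engaged with, so that a few engagements with inflammatory material quickly push similar material to the top of the feed.

```python
# Hypothetical sketch of an engagement-driven feed ranker (illustration only,
# not any real platform's algorithm): posts are scored by topical overlap with
# the user's past engagements, so similar content keeps rising to the top.
from collections import Counter

def rank_feed(candidate_posts, engagement_history):
    """Order candidate posts by overlap with topics the user engaged with before."""
    history_topics = Counter(
        topic for post in engagement_history for topic in post["topics"]
    )

    def score(post):
        # Posts sharing topics with past engagements score higher and surface first.
        return sum(history_topics[t] for t in post["topics"])

    return sorted(candidate_posts, key=score, reverse=True)

# Toy example: after two engagements with outrage-themed content,
# similar posts dominate the ranked feed.
history = [{"topics": ["outrage", "identity"]}, {"topics": ["outrage"]}]
candidates = [
    {"id": 1, "topics": ["outrage", "identity"]},
    {"id": 2, "topics": ["gardening"]},
    {"id": 3, "topics": ["outrage"]},
]
print([p["id"] for p in rank_feed(candidates, history)])  # -> [1, 3, 2]
```

Even in this toy version, the feedback loop is visible: each engagement with a topic raises the rank of further posts on that topic, which in turn invites more engagement with it.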
Third, and most obviously, social media allows the rapid creation of communities that are not geographically based. The network infrastructure creates an auspicious environment for the spread of harmful content, making it easy to assemble the critical mass of people needed to establish a thriving community of bigots. This matters because bigotry does not thrive in isolation; it requires constant reinforcement from a community of bigots. These communities support individuals whose bigoted beliefs are questioned by others, helping them maintain adherence to the bigoted system of beliefs and attitudes.
Finally, the network structure of social media platforms satisfies our preference to belong to an in-group and to mix with those who share our views. Yet this very feature limits the range of voices people hear, alienates and excludes perceived out-groups, and creates a bubble effect in which perceived in-groups reinforce one another's beliefs in the absence of challengers and dissenters. This becomes particularly dangerous in the transmission of harmful content such as hate speech, propaganda and misinformation, because from there it is but a small step from ideas to action.
Picture: Coronavirus, from the CDC Public Health Image Library (PHIL ID #23354)