Unpacking Manipulation for the Digital Age
18 March 2024

Public debate is shaped partly by human social influence, and we routinely distinguish different types of social influence, such as persuasion, coercion, and manipulation.
While persuasion and coercion are reasonably well understood, philosophers have only recently begun to study the nature and ethics of manipulation. More attention to manipulation is essential because new digital technologies amplify the role of manipulation in shaping public debate.
This series, entitled ‘Unpacking Manipulation in the Digital Age,’ consists of six short essays that discuss:
1. the proliferation of problematic forms of social influence in the digital age,
2. the new problem of unwitting and unintentional forms of problematic social influence that results from digital influence,
3. the need to distinguish between different types of social influence, such as persuasion and manipulation,
4. the inadequacy, especially in the digital age, of the continuum model of influence that places manipulation somewhere between persuasion and coercion,
5. problems with two prominent accounts of manipulation, and
6. the indifference account of manipulation as a promising way to make sense of manipulation in the digital age.
Blog I:
‘Navigating the Digital Influence Landscape: Three Aggravating Factors’
In the internet age, which has been called the “epistemological crisis of the 21st century” (Leiter 2022), the prevalence of online manipulation has become a pressing concern.
For example, legislators and policymakers in the EU are now intensely focused on regulating manipulative influence (Faraoni 2023), which raises open questions about identifying and evaluating this peculiar type of social influence.
So, in this blog series entitled ‘Unpacking Manipulation in the Digital Age,’ I explore the growing significance of identifying and addressing problematic forms of manipulation in our “digital life-world” (Susskind 2018).
I begin by pointing out three significant developments that account for the growing relevance of manipulative influence: the proliferation of social influence, informationally empowered influence, and AI-mediated influence.
The Proliferation of Social Influence
The internet and social media have democratised access to mass influence, granting unprecedented numbers of people and organisations the power to influence others at scale. While this accessibility has positive aspects (consider the role of social media in the Arab Spring, cf. Pew Research Center’s Journalism Project 2012), it also amplifies the potential for problematic types of influence, including coercion, deception, and manipulation.
It also matters who gets access. The rising number of potentially problematic influencers, coupled with the lack of gatekeepers, exacerbates the issue. Peter the psychopath terrorising his companions in a remote village is one thing; Peter with thousands of followers on social media is quite another in terms of potentially problematic social influence.
So, as more potentially problematic influencers gain access to mass influence, the problem worsens, and we need to pay closer attention to the types of influence that proliferate in the digital world.
Informationally Empowered, Predictive Influence
In addition, access to informationally empowered influence has become a formidable force that “aggravates” issues with the proliferation of social influence (Klenk and Jongepier 2022b).
A crucial effect of the often-discussed ‘surveillance age’ (Zuboff 2019) is the possibility of predicting the effectiveness of different ways of influencing others using data from social media, mobile sensing, or psychological tests, which enables targeted and personalised influence (Matz et al. 2017). Though the measured effect of personalisation and targeting is, thus far, contested and, if anything, small, it is undoubtedly a cause of increased concern about digital influence.
For example, the US retailer Target used predictive data analysis to target a pregnant customer with advertising, learning about her pregnancy even before her father did (Hill 2012). As illustrated by this case, a modern-day Iago no longer needs to be acquainted with his target, Othello, to learn about and exploit his weak spots, which opens the door to more potent forms of influence even at a distance (Klenk and Jongepier 2022a).
While specific insights into human behaviour are available to only a few actors (data powerhouses like Google, Amazon, Microsoft, or Facebook), general insights into human decision-making and ‘psychological tricks’ like shrewd use of default options are available to a broad segment of those empowered by digital access to mass influence. Scientific insights into influence techniques like nudging, framing, and social heuristics, popularised in bestsellers like “Nudge,” “Hooked,” or “Influence” (Thaler and Sunstein 2009; Eyal 2016; Cialdini 2009), thus further contribute to the evolving landscape of social influence and its impact on public debate.
The newfound power of informationally empowered influence is often underappreciated and used without reflection. For instance, I was baffled to learn at a recent talk that user experience (UX) design students grow up with a deep appreciation of the goals and techniques set out in Eyal’s “Hooked”, which promises to teach readers how to build “habit-forming products.” As Brignull (2023) points out, the ‘hook’ cycle espoused in the book is just a reconceptualisation of the dopamine loop that medical researchers use to understand how addictions manifest themselves. We learn from science, alright, but we do it without much critical understanding of what we create.
In summary, the digital age not only grants many actors newfound access to mass influence but also equips them with informational, data-driven power to influence more effectively.
AI-Mediated Influence
Finally, note that informational empowerment and the proliferation of social influence still have humans at their core. Increasingly, however, artificial intelligence (AI) plays a pivotal role in mediating social influence, raising questions about the values embedded in the mediation process that shape the types of influence we encounter online (Hancock et al. 2020).
For example, recommender systems determine the reach and impact of social media posts. For all Peter the psychopath may know about swaying human opinion, ‘hooking’ his followers, and influencing them using social heuristics, the actual reach and impact of his carefully crafted post will ultimately depend on the workings of the social media platform’s recommender system that mediates his influence on others.
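To make the point about values embedded in mediation concrete, here is a minimal, purely illustrative Python sketch of a hypothetical feed-ranking function. The Post fields, the weights, and the rank_feed function are my own assumptions for illustration, not a description of any real platform’s recommender system; the only point is that the choice of weights is a value judgement that decides whose posts get amplified.

# Illustrative sketch only: a hypothetical, simplified feed-ranking function.
# Real recommender systems are far more complex; the weights below are
# assumptions chosen to show that ranking choices encode values.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    predicted_engagement: float  # estimated clicks/reactions, 0..1
    predicted_quality: float     # estimated accuracy/civility, 0..1

def rank_feed(posts, engagement_weight=0.9, quality_weight=0.1):
    """Order posts by a weighted score; the weights encode the platform's priorities."""
    score = lambda p: (engagement_weight * p.predicted_engagement
                       + quality_weight * p.predicted_quality)
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("Peter", predicted_engagement=0.95, predicted_quality=0.20),
    Post("Paula", predicted_engagement=0.40, predicted_quality=0.90),
]

# With engagement-heavy weights, Peter's provocative post is ranked first;
# shifting weight towards quality would reverse the ordering.
for post in rank_feed(posts):
    print(post.author)

Under these assumed weights, Peter’s carefully crafted post outranks more accurate content simply because the platform has chosen to prioritise predicted engagement, which is one concrete way values enter the mediation process.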
Mediating content or influence created by humans is one form of AI mediation that warrants closer attention. Another is the generation of influence by AI systems themselves. Generative AI revolutionises influence creation by removing the bottleneck of manually creating content.
On the one hand, this means that criminals, fraudsters, trolls, and propagandists will more easily and automatically generate influence that suits their nefarious schemes (Weidinger et al. 2023; Goldstein et al. 2023).
On the other hand, the rise of influence driven by generative AI applications means an increased risk of unwitting manipulation, which I will cover in more detail in the next post in this series (see also Klenk 2024 for discussion). In short, even if you have no malign intentions, the ‘rules’ by which generative AI tends to generate influence may not be aligned with the rules that we consider important for good influence.
Conclusion and Outlook
Problematic forms of social influence are on the rise. The digital age broadens access to mass influence, enables informational empowerment, and increasingly mediates social influence through AI.
In the next post in this series, I will show how these three factors increasingly combine and how they give rise to a new and hitherto neglected problem concerning the value of digital influence, namely its corrupting effect.
References
Brignull, H. (2023). Deceptive Patterns: Exposing the tricks tech companies use to control you. Harry Brignull.
Cialdini, R. B. (2009). Influence: Science and practice. Pymble, NSW, New York, NY: HarperCollins.
Eyal, N. (2016). Hooked: How to build habit-forming products. Norwick: Penguin Books.
Faraoni, S. (2023). Persuasive Technology and computational manipulation: hypernudging out of mental self-determination. Frontiers in Artificial Intelligence, 6, 1216340. doi:10.3389/frai.2023.1216340.
Goldstein, J. A., Sastry, G., Musser, M., DiResta, R., Gentzel, M., & Sedova, K. (2023). Generative Language Models and Automated Influence Operations: Emerging Threats and Potential Mitigations. arXiv:2301.04246.
Hancock, J. T., Naaman, M., & Levy, K. (2020). AI-Mediated Communication: Definition, Research Agenda, and Ethical Considerations. Journal of Computer-Mediated Communication, 25, 89–100. doi:10.1093/jcmc/zmz022.
Hill, K. (2012, February 16). How Target Figured Out A Teen Girl Was Pregnant Before Her Father Did. Forbes. https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/. Accessed 28 November 2023.
Klenk, M. (2024). Ethics of generative AI and manipulation: a design-oriented research agenda. Ethics and Information Technology, 26, 1–15. doi:10.1007/s10676-024-09745-x.
Klenk, M., & Jongepier, F. (2022a). Introduction and overview of chapters. In M. Klenk & F. Jongepier (Eds.), The Philosophy of Online Manipulation (pp. 1–12). New York, NY: Routledge.
Klenk, M., & Jongepier, F. (2022b). Manipulation Online: Charting the field. In M. Klenk & F. Jongepier (Eds.), The Philosophy of Online Manipulation (pp. 15–48). New York, NY: Routledge.
Leiter, B. (2022). The Epistemology of the Internet and the Regulation of Speech in America. Georgetown Journal of Law & Public Policy, 20, 903.
Matz, S. C., Kosinski, M., Nave, G., & Stillwell, D. J. (2017). Psychological targeting as an effective approach to digital mass persuasion. Proceedings of the National Academy of Sciences, 114, 12714–12719. doi:10.1073/pnas.1710966114.
Pew Research Center’s Journalism Project. (2012). The Role of Social Media in the Arab Uprisings. https://www.pewresearch.org/journalism/2012/11/28/role-social-media-arab-uprisings/. Accessed 25 January 2024.
Susskind, J. (2018). Future politics: Living together in a world transformed by tech. Oxford: Oxford University Press.
Thaler, R. H., & Sunstein, C. R. (2009). Nudge: Improving decisions about health, wealth and happiness. London: Penguin Books.
Weidinger, L., Rauh, M., Marchal, N., Manzini, A., Hendricks, L. A., Mateos-Garcia, J., et al. (2023). Sociotechnical Safety Evaluation of Generative AI Systems. https://arxiv.org/pdf/2310.11986.pdf.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. New York, NY: PublicAffairs.