
Open for Debate

Unpacking Manipulation for the Digital Age

18 March 2024

Public debate is shaped partly by human social influence, and we routinely distinguish different types of social influence, such as persuasion, coercion, and manipulation.

While persuasion and coercion are reasonably well understood, philosophers have only recently begun to study the nature and ethics of manipulation. More attention to manipulation is essential because new digital technologies amplify the role of manipulation in shaping public debate.

This series, entitled ‘Unpacking Manipulation in the Digital Age’, consists of six short essays that discuss (1) the proliferation of problematic forms of social influence in the digital age, (2) the new problem of unwitting and unintentional forms of problematic social influence that results from digital influence, (3) the need to distinguish between different types of social influence such as persuasion and manipulation, (4) the inadequacy – especially in the digital age – of the continuum model of influence that places manipulation somewhere between persuasion and coercion, (5) problems with two prominent accounts of manipulation, and (6) the indifference account of manipulation as a promising way to make sense of manipulation in the digital age.

Blog I:
‘Navigating the Digital Influence Landscape: Three Aggravating Factors’

In the internet age, which has been called the “epistemological crisis of the 21st century” (Leiter 2022), the prevalence of online manipulation has become a pressing concern.

For example, legislators and policymakers in the EU are now intensely focused on regulating manipulative influence (Faraoni 2023), which raises open questions about identifying and evaluating this peculiar type of social influence.

So, in this blog series entitled ‘Unpacking Manipulation in the Digital Age,’ I explore the growing significance of identifying and addressing problematic forms of manipulation in our “digital life-world” (Susskind 2018).

I begin by pointing out three significant developments that account for the growing relevance of manipulative influence: the proliferation of social influence, informationally empowered influence, and AI-mediated influence.

The Proliferation of Social Influence

The internet and social media have democratised access to mass influence, granting unprecedented numbers of people and organisations the power to influence others at scale. While this accessibility has positive aspects (consider the role of social media in the Arab Spring, cf. Pew Research Center’s Journalism Project 2012), it also amplifies the potential for problematic types of influence, including coercion, deception, and manipulation.

It also matters who gets access. The rising number of potentially problematic influencers, coupled with the lack of gatekeepers, exacerbates the issue. Peter the psychopath terrorising his companions in a remote village is one thing; Peter having thousands of followers on social media is quite another in terms of potentially problematic instances of social influence.

So, as more potentially problematic influencers gain access to mass influence, the problem worsens, and we need to pay closer attention to the types of influence that proliferate in the digital world.

Informationally Empowered, Predictive Influence

In addition, access to informationally empowered influence has become a formidable force that “aggravates” issues with the proliferation of social influence (Klenk and Jongepier 2022b).

A crucial effect of the often-discussed ‘surveillance age’ (Zuboff 2019) is that the effectiveness of different ways of influencing others can now be predicted using anything from social media data to mobile sensing to psychological tests, which enables targeted and personalised influence (Matz et al. 2017). Though the effect of informationally empowered influence through personalisation and targeting is, thus far, contested and, if anything, small, it is undoubtedly a cause of increased concern about digital influence.

For example, the retailer Target used predictive data analysis to target a pregnant customer with advertising, learning about her pregnancy even before her father did (Hill 2012). As illustrated by this case, a modern-day Iago no longer needs to be acquainted with his target, Othello, to learn about and exploit his weak spots, which opens the door to more potent forms of influence even at a distance (Klenk and Jongepier 2022a).

While specific insights into human behaviour are available to only a few actors (data powerhouses like Google, Amazon, Microsoft, or Facebook), general insights into human decision-making and ‘psychological tricks’ like shrewd use of default options are available to a broad segment of those empowered by digital access to mass influence. Scientific insights into influence techniques like nudging, framing, and social heuristics, popularised in bestsellers like “Nudge,” “Hooked,” or “Influence” (Thaler and Sunstein 2009; Eyal 2016; Cialdini 2009), thus further contribute to the evolving landscape of social influence and its impact on public debate.

The newfound power of informationally empowered influence is often underappreciated and used without reflection. For instance, I was baffled to learn at a recent talk that user experience (UX) design students grow up with a deep appreciation of the goals and techniques set out in Eyal’s “Hooked”, which promises to teach readers how to build “habit-forming products.” As Brignull (2023) points out, the ‘hook’ cycle espoused in the book is just a reconceptualisation of the dopamine loop that medical researchers use to understand how addictions manifest themselves. We learn from science, alright, but we do it without much critical understanding of what we create.

In summary, the digital age compounds the newfound access to mass influence by equipping many actors with informational, data-driven power to influence more effectively.

AI-Mediated Influence

Finally, note that informational empowerment and the proliferation of social influence still have humans at their core. Increasingly, however, artificial intelligence (AI) plays a pivotal role in mediating social influence, raising questions about the values embedded in the mediation process that shape the type of influences we encounter online (Hancock et al. 2020).

For example, recommender systems determine the reach and impact of social media posts. For all Peter the psychopath may know about swaying human opinion, ‘hooking’ his followers, and influencing them using social heuristics, the actual reach and impact of his carefully crafted post will ultimately depend on the workings of the social media platform’s recommender system that mediates his influence on others.

This form of mediating content or influences created by humans is one form of AI mediation that warrants closer attention. Another form is the generation of influence by AI systems themselves. Generative AI revolutionises influence creation, removing the bottleneck of creating content manually.

On the one hand, this means that criminals, fraudsters, trolls, and propagandists will more easily and automatically generate influence that suits their nefarious schemes (Weidinger et al. 2023; Goldstein et al. 2023).

On the other hand, the rise of influence driven by generative AI applications means an increased risk of unwitting manipulation, which I will cover in more detail in the next post in this series (see also Klenk 2024 for discussion). In short, even if you have no malign intentions, the ‘rules’ by which generative AI tends to generate influence may not be aligned with the rules that we consider important for good influence.

Conclusion and Outlook

Problematic forms of social influence are on the rise. The digital age expands access to mass influence, enables informational empowerment, and increasingly mediates social influence through AI.

In the next post in this series, I will show how these three factors are increasingly combined and how they lead to a new and hitherto neglected problem concerning the value of digital influence in terms of its corrupting effect.



Brignull, H. (2023). Deceptive Patterns: Exposing the tricks tech companies use to control you. Harry Brignull.

Cialdini, R. B. (2009). Influence: Science and practice. Pymble, NSW, New York, NY: HarperCollins.

Eyal, N. (2016). Hooked: How to build habit-forming products. London: Penguin Books.

Faraoni, S. (2023). Persuasive Technology and computational manipulation: hypernudging out of mental self-determination. Frontiers in Artificial Intelligence, 6, 1216340. doi:10.3389/frai.2023.1216340.

Goldstein, J. A., Sastry, G., Musser, M., DiResta, R., Gentzel, M., & Sedova, K. (2023). Generative Language Models and Automated Influence Operations: Emerging Threats and Potential Mitigations. arXiv:2301.04246.

Hancock, J. T., Naaman, M., & Levy, K. (2020). AI-Mediated Communication: Definition, Research Agenda, and Ethical Considerations. Journal of Computer-Mediated Communication, 25, 89–100. doi:10.1093/jcmc/zmz022.

Hill, K. (2012, February 16). How Target Figured Out A Teen Girl Was Pregnant Before Her Father Did. Forbes. Accessed 28 November 2023.

Klenk, M. (2024). Ethics of generative AI and manipulation: a design-oriented research agenda. Ethics and Information Technology, 26, 1–15. doi:10.1007/s10676-024-09745-x.

Klenk, M., & Jongepier, F. (2022a). Introduction and overview of chapters. In M. Klenk & F. Jongepier (Eds.), The Philosophy of Online Manipulation (pp. 1–12). New York, NY: Routledge.

Klenk, M., & Jongepier, F. (2022b). Manipulation Online: Charting the field. In M. Klenk & F. Jongepier (Eds.), The Philosophy of Online Manipulation (pp. 15–48). New York, NY: Routledge.

Leiter, B. (2022). The Epistemology of the Internet and the Regulation of Speech in America. Georgetown Journal of Law & Public Policy, 20, 903.

Matz, S. C., Kosinski, M., Nave, G., & Stillwell, D. J. (2017). Psychological targeting as an effective approach to digital mass persuasion. Proceedings of the National Academy of Sciences, 114, 12714–12719. doi:10.1073/pnas.1710966114.

Pew Research Center’s Journalism Project. (2012). The Role of Social Media in the Arab Uprisings. Accessed 25 January 2024.

Susskind, J. (2018). Future politics: Living together in a world transformed by tech. Oxford: Oxford University Press.

Thaler, R. H., & Sunstein, C. R. (2009). Nudge: Improving decisions about health, wealth and happiness. London: Penguin Books.

Weidinger, L., Rauh, M., Marchal, N., Manzini, A., Hendricks, L. A., Mateos-Garcia, J., et al. (2023). Sociotechnical Safety Evaluation of Generative AI Systems.

Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. New York, NY: PublicAffairs.

Image by Gerd Altmann from Pixabay