Unravelling the Complexity of Manipulation Theories
15 April 2024

In the previous four posts of this series on ‘Unpacking Manipulation in the Digital Age’ (Post 1; Post 2; Post 3; Post 4), I argued that we need to pay more attention to different types of social influence and that digital influence often falls into the grey area between persuasion and coercion.
This raises the question of whether, and why, such influences count as manipulation. In this post, I argue that two prevalent types of manipulation theory are available to help us understand manipulation in the age of new technologies, and that each comes with its own set of problems.
Intention-Focused Theories
Philosophical and legal scholars often emphasise intention in theories of manipulation (Sher 2011; Spencer 2020; Noggle 1996). Intentions, they argue, are crucial for distinguishing between purposeful influence and accidental outcomes. Robert Noggle’s theory of manipulation stands out: it holds that manipulators intend to make their victims commit a mistake (Noggle 2020).
However, this approach falls short in the realm of digital influence, where the intent behind a given influence is often elusive. One reason is that we are dealing with AI-mediated influence (see blog post 3). When a particular influence – like a blog post or an image posted on Facebook – is mediated by AI, intentions are either missing or hard to identify. And yet misleading or otherwise problematic online influences may strike us as manipulative even if they lack the typically nefarious intentions required by intention-focused theories.
Another reason is that some of the influences occupying the grey area between persuasion and coercion may be entirely bereft of intention. A/B tests are a good example. An A/B test is a user-experience research method in which two variants of a design (A and B) are randomly distributed to users, allowing the designer to test, for instance, whether variant A of a website does better at increasing sales. Brignull (2023) describes how A/B tests can be fully automated, so that the system both evaluates which design generates the most traffic and implements the ‘winning’ design on its own. An unsuspecting designer may simply intend to create a website design vaguely defined as ‘good’ and end up, via the automated process, deploying a design that relies on exploitative but effective dark patterns. Many will regard this as a manipulative form of influence, yet the process lacks the nefarious intentions that intention-based theories require.
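To make the point vivid, here is a minimal Python sketch of what such a fully automated pipeline could look like. All names, variants, and numbers are invented for illustration; the point is only that the loop optimises a metric and deploys whichever variant wins, without anyone inspecting why the winner converts better.

```python
import random

# Hypothetical variants: B happens to rely on a pressure-inducing dark pattern
# (a countdown timer), but the pipeline only ever "sees" conversion rates.
VARIANTS = {
    "A": {"description": "plain checkout page", "true_conversion_rate": 0.05},
    "B": {"description": "checkout page with countdown timer", "true_conversion_rate": 0.08},
}


def simulate_visit(variant: str) -> bool:
    """Simulate whether a randomly assigned visitor converts (e.g. buys something)."""
    return random.random() < VARIANTS[variant]["true_conversion_rate"]


def run_ab_test(n_visitors: int = 10_000) -> str:
    """Randomly split visitors across the variants and return the higher-converting one."""
    visits = {"A": 0, "B": 0}
    conversions = {"A": 0, "B": 0}
    for _ in range(n_visitors):
        variant = random.choice(["A", "B"])
        visits[variant] += 1
        if simulate_visit(variant):
            conversions[variant] += 1
    rates = {v: conversions[v] / visits[v] for v in VARIANTS}
    return max(rates, key=rates.get)


def deploy(variant: str) -> None:
    """Stand-in for automatically shipping the winning design to all users."""
    print(f"Deploying variant {variant}: {VARIANTS[variant]['description']}")


if __name__ == "__main__":
    winner = run_ab_test()
    # Nothing here encodes an intention to exploit users: the dark-pattern
    # variant is selected purely because it moves the conversion metric.
    deploy(winner)
```

Nothing in this loop expresses an intention to trick or pressure anyone; the exploitative variant wins simply because it performs better on the chosen metric.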
In addition to this general problem for intention-based theories, there are significant obstacles for specific interpretations of them. Perhaps the most influential interpretation is the covertness view of manipulation formulated by Daniel Susser, Beate Roessler, and Helen Nissenbaum (Susser et al. 2019a, 2019b). The problem is that it imposes an overly stringent necessary condition on manipulation, namely that it be covert, which, as others have pointed out, is often absent in clear cases of manipulation (Bongard-Blanchy et al. 2021; Noggle 2022; Klenk 2022).
Disjunctive Characteristics Theories
The digital ethics literature presents theories that treat manipulation as a disjunction of different characteristics. This ‘laundry list’ approach lists various factors associated with manipulation, such as undermining autonomy or causing harm, and suggests that manipulation is a type of influence that satisfies some or all of these criteria (Botes 2023; Ienca 2023). The challenge arises when trying to find common ground among these diverse features. Disjunctive views struggle to provide a clear understanding of manipulation and complicate the formulation of effective design or policy recommendations; they thus raise both theoretical and practical challenges.
The Practical Challenge for Disjunctive Views
Disjunctive views make it difficult to pinpoint specific measures against manipulation. For instance, if ‘covert influence’ is one of the disjuncts of a manipulation criterion, an effort to avoid manipulation by being transparent may conflict with the goal of preventing harm, creating a practical dilemma. This complexity hinders the application of concrete solutions in design and policy recommendations (Klenk 2023).
More generally, an account of manipulation should eventually allow us to derive concrete design recommendations and to implement regulatory measures against manipulation in practice. A long list of features that are merely indicative of manipulation may make the task of deriving consistent design requirements prohibitively complicated.
Therefore, unless we are sufficiently convinced that no more straightforward, unified criterion of manipulation exists (and we should not be convinced of that at the moment), we should avoid disjunctive views and keep looking for a unified criterion.
The Theoretical Challenge for Disjunctive Views
Disjunctive views also leave a theoretical lacuna. Unless we can find a commonality or unifying factor behind the different disjuncts allegedly related to manipulation, we fail to truly understand the phenomenon.
One aspect of this theoretical lacuna is that we lack an answer to the question of what, if anything, seemingly very different forms of manipulation have in common. After all, one instance of ‘manipulation’ may fit a subset of the disjuncts that another instance does not share. What then warrants classifying both influences as instances of the same phenomenon (Klenk 2024)?
Another aspect of the theoretical lacuna is moral. As Coons and Weber (2014) point out, there seems to be a fairly uniform moral response to manipulation. It appears to be a problematic form of influence, and various scholars are trying to pinpoint precisely why manipulation is supposed to be a moral problem (Klenk and Hancock 2019). On a disjunctive view, however, it is no longer clear that all forms of manipulation should receive the same basic moral response since, as discussed, different forms of manipulation may have nothing in common. When ‘manipulation’ is used as a meaningful category in policy and regulation, warranting bans and punishment, this becomes a real practical and legal issue, too.
Conclusion and Outlook
The problems with the two prominent approaches to understanding manipulation outlined in this post amount to a call for a better theory.
In the sixth and final post of this series, I’ll review a potential avenue for a more accurate and actionable theory of manipulation: the indifference account of manipulation.
Bongard-Blanchy, K., Rossi, A., Rivas, S., Doublet, S., Koenig, V., & Lenzini, G. (2021). “I am Definitely Manipulated, Even When I am Aware of it. It’s Ridiculous!” Dark Patterns from the End-User Perspective. In W. Ju, L. Oehlberg, S. Follmer, S. Fox, & S. Kuznetsov (Eds.), DIS ’21: Designing Interactive Systems Conference 2021 (pp. 763–776). New York, NY: Association for Computing Machinery. doi:10.1145/3461778.3462086.
Botes, M. (2023). Autonomy and the social dilemma of online manipulative behavior. AI and Ethics, 3, 315–323. doi:10.1007/s43681-022-00157-5.
Brignull, H. (2023). Deceptive Patterns: Exposing the tricks tech companies use to control you. Harry Brignull.
Coons, C., & Weber, M. (2014). Manipulation: Investigating the core concept and its moral status. In C. Coons & M. Weber (Eds.), Manipulation: Theory and practice (pp. 1–16). Oxford: Oxford University Press.
Ienca, M. (2023). On Artificial Intelligence and Manipulation. Topoi, 42, 833–842. doi:10.1007/s11245-023-09940-3.
Klenk, M. (2022). (Online) Manipulation: Sometimes Hidden, Always Careless. Review of Social Economy, 80, 85–105. doi:10.1080/00346764.2021.1894350.
Klenk, M. (2023). Algorithmic Transparency and Manipulation. Philosophy & Technology, 36, 1–20. doi:10.1007/s13347-023-00678-9.
Klenk, M. (2024). Ethics of generative AI and manipulation: a design-oriented research agenda. Ethics and Information Technology, 26, 1–15. doi:10.1007/s10676-024-09745-x.
Klenk, M., & Hancock, J. (2019). Autonomy and online manipulation. Internet Policy Review.
Noggle, R. (1996). Manipulative Actions: A Conceptual and Moral Analysis. American Philosophical Quarterly, 33(1), 43–55.
Noggle, R. (2020). Pressure, Trickery, and a unified account of manipulation. American Philosophical Quarterly, 57, 241–252. doi:10.2307/48574436.
Noggle, R. (2022). The Ethics of Manipulation. In E. N. Zalta (Ed.), Stanford Encyclopedia of Philosophy (Summer 2022 Edition).
Sher, S. (2011). A Framework for Assessing Immorally Manipulative Marketing Tactics. Journal of Business Ethics, 102, 97–118. doi:10.1007/s10551-011-0802-4.
Spencer, S. B. (2020). The Problem of Online Manipulation. University of Illinois Law Review, 2020, 959–1006. doi:10.2139/ssrn.3341653.
Susser, D., Roessler, B., & Nissenbaum, H. (2019a). Online Manipulation: Hidden Influences in A Digital World. Georgetown Law Technology Review, 4(1), 1–45.
Susser, D., Roessler, B., & Nissenbaum, H. (2019b). Technology, autonomy, and manipulation. Internet Policy Review, 8, 1–22. doi:10.14763/2019.2.1410.
Image by Robinraj Premchand from Pixabay.