Detangling the Architecture of Ethical Privacy Nudging

Digital technologies have a deep impact on almost all areas of our lives. In digital environments, individuals tend to share disproportionately more information than in offline settings. Digital nudging is a promising approach from behavioral economics to support users in privacy-sensitive decision-making. In this article, we discuss the key design components and requirements for creating ethical privacy nudges.

Privacy Nudging 4: Contrary to the economic theory of the homo economicus, individuals are not able to act rationally all the time. In digital environments in particular, individuals often decide irrationally and to their own disadvantage, especially when they cannot consider all the alternatives to a decision or the consequences are not clear.

Multiple studies provide evidence that digital environments generally lead to increased self-disclosure compared to direct face-to-face communication. This increased willingness for self-disclosure is attributed, among other things, to the fact that individuals feel a stronger sense of anonymity, social cues are weaker than in face-to-face situations, and the communication situation is perceived as more controllable. However, this is often a fallacy. In decisions where individuals tend to struggle, nudges can support them in aligning their behavior with their preferences. Nudging is a promising approach to enable and guide users to make privacy-sensitive decisions.

This is the fourth article in a mini-series on privacy nudging. The first is Privacy Nudging Can Stop Irrational Decision-Making. The second is Capturing the Complexity of Digital Privacy Nudging. The third is Ethical Design Nudging Supports Autonomy and Transparency.

When discussing the ethical design of digital nudges, legitimacy is a key consideration. Legitimacy means that dealings between different entities are fair. This matters because neither offline nor online environments offer a neutral way of presenting choices. Since choice architecture cannot be neutral, it should at least not be manipulative. If nudges are not legitimately designed, one might argue that they undermine an individual's autonomy, which must be avoided.

A prominent example is the framing of decisions to accept internet cookies. Framing can go in two directions: toward supporting the collection of user data, or toward protecting user data by framing the legally and ethically compliant alternative. Although data-protection-friendly defaults are set in accordance with legal requirements, more and more internet platforms frame the decision in such a way that it is easier for users to agree to additional cookies than to keep the actual default. Ethical nudge designs should therefore promote individuals' autonomy and the transparent disclosure of the nudges used and implemented in digital work systems.

In the following, we present a selection of design requirements, clustered into design components, that should be considered when designing ethical privacy nudges. These requirements are not exhaustive, but they can serve as a scaffold for the base frame of digital privacy nudges.

Illustration by Torben Jan Bare

It is important to ensure that the intended behavior resonates with the nudged person's preferences and values. This is crucial, as these are important aspects of the ethical justification and legitimacy of digital nudge designs. A nudge should be designed to effectively support a change in the user's behavior in a desirable direction. A wide range of technologies can also influence the effectiveness of nudges. Choice architects must decide to what extent, e.g., artificial intelligence (AI) or big data can improve digital nudges. For instance, AI can make digital nudges dynamic, resulting in so-called smart or hyper-nudges.

The New York Times Example

When looking at specific choice architectures, we can analyse how consent for tracker settings is presented. In this example, we examine the website of the daily newspaper The New York Times. Here, the consent notice is placed at the bottom of the screen. It does not entirely block interaction with the website and is relatively smoothly integrated into the user interface. Users can still access the website, and the consent notice does not dramatically slow down work processes. In contrast, some other websites implement notices that cover the full screen and must be clicked away before moving on.

The consent notice in this example provides information about the specific purpose of the data collection. The information is presented in a relatively short text for better readability and to focus on the main points. The design is in black and white, ensuring an ergonomic and simplified appearance. Additional sources such as the cookie and privacy policies are linked to offer further information and can be accessed directly by clicking the icon. However, some information could be presented more clearly, for instance, which third parties are involved.

On the right side, two buttons allow easy acceptance or rejection of the tracker settings; both buttons are highlighted equally. The consent notice also presents an additional option besides the accept/reject buttons that simply closes the window. The decision is not forced, and one could argue that the user's freedom of choice is thereby preserved. Visual cues highlight where the user should click. However, it would be better if the privacy-sensitive choice were indicated by, e.g., color coding or privacy icons to make the notice more privacy-friendly. Privacy icons are one way to create additional transparency and help ensure that data subjects take note of the most important information about the processing of their data. An example of privacy icons can be found here.

When interacting with the consent notice, cognitive overload should be avoided, making it possible for users to weigh their decision and align their behavior accordingly. A relatively simple design focusing on relevant information ensures that users are not stressed or overwhelmed, enabling their informational self-determination.
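The design properties discussed above can be made concrete as a small configuration check. The following TypeScript sketch is purely illustrative: the types and the `checkEthicalDesign` function are hypothetical names, not part of any real consent-management API, and the checks cover only three of the requirements mentioned (non-blocking placement, equally prominent accept/reject buttons, linked policies).

```typescript
// Hypothetical model of a consent-notice configuration and a check
// against some of the ethical-design requirements discussed above.

interface ConsentButton {
  label: string;
  prominence: "primary" | "secondary"; // visual weight of the button
}

interface ConsentNotice {
  placement: "bottom-banner" | "full-screen-modal";
  blocksInteraction: boolean; // does the notice block the rest of the page?
  accept: ConsentButton;
  reject: ConsentButton;
  policyLinks: string[]; // e.g. cookie and privacy policies
}

// Returns a list of design violations; an empty list means the
// configuration passes these (non-exhaustive) checks.
function checkEthicalDesign(notice: ConsentNotice): string[] {
  const issues: string[] = [];
  if (notice.blocksInteraction) {
    issues.push("notice blocks interaction with the page");
  }
  if (notice.accept.prominence !== notice.reject.prominence) {
    issues.push("accept and reject buttons are not equally prominent");
  }
  if (notice.policyLinks.length === 0) {
    issues.push("no link to further privacy information");
  }
  return issues;
}

// A configuration resembling the example discussed: bottom placement,
// non-blocking, equally weighted buttons, linked policies.
const notice: ConsentNotice = {
  placement: "bottom-banner",
  blocksInteraction: false,
  accept: { label: "Accept all", prominence: "primary" },
  reject: { label: "Reject all", prominence: "primary" },
  policyLinks: ["/cookie-policy", "/privacy-policy"],
};

console.log(checkEthicalDesign(notice)); // []
```

Encoding the requirements as data in this way lets a choice architect review, test, and document a consent design before it ships, rather than auditing it only visually.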

Parts of this post were first published in:

  • Barev, T., Janson, A., Leimeister, J.M. (2020). Designing Effective Privacy Nudges in Digital Environments: A Design Science Research Approach. In: International Conference on Design Science Research in Information Systems and Technology (DESRIST), Kristiansand, Norway.
  • Barev, T.J., Schöbel, S., Janson, A., Leimeister, J.M. (2021). DELEN – A Process Model for the Systematic Development of Legitimate Digital Nudges. In: Chandra Kruse, L., Seidel, S., Hausvik, G.I. (eds) The Next Wave of Sociotechnical Design. DESRIST 2021. Lecture Notes in Computer Science, vol 12807. Springer, Cham.

Read about our contributors here