
Capturing the Complexity of Digital Privacy Nudging

Privacy Nudging 2. Digital privacy nudges can be used to mitigate individual privacy risks and to foster informational self-determination. This matters because, in decisions about personal data disclosure, individuals often act against their own intentions. However, the design of digital nudges is complex, and many different aspects need to be considered; otherwise, nudges can backfire and lead to unintended behavior. To provide systematic guidance for creating digital privacy nudges, we present specific steps that can serve as a scaffold.

Originally, the concept of nudging was meant to make an individual’s life safer, easier and more beneficial. In decisions where individuals tend to choose an alternative that runs against their preferences, nudges should support them in aligning their behavior with their intention. Nudging is defined as a liberty-preserving approach that intends to “alter people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives”. Government policy makers, organisations and companies have increasingly adopted insights from behavioral economics and are using nudges to address a wide range of behavioral issues. It must be noted that the design of digital nudges is complex, and many different aspects need to be considered. Nudges that were originally designed with good intentions can, when poorly designed, backfire and lead to unintended or conflicting behavior. Nudges may also jeopardise other significant goals, for example when a nudge designed to reduce pollution ends up increasing another factor such as the energy costs for the most disadvantaged members of society.

Furthermore, despite their wide use, many nudges embody questionable designs. This is especially critical in decisions about personal data disclosure, where individuals often act against their intentions, a phenomenon described as the privacy paradox. When designing digital nudges, many argue that the designs should be transparent and visible to users, as otherwise critics may conclude that the nudges are manipulative.

Individuals want to feel that they are masters of their own actions.

Torben Jan Barev

Given the debates about the manipulative effects of some digital nudges, and the suggestion that nudges would be less effective if individuals were aware of their presence, the concept of reactance is important to consider. Reactance refers to a motivational state of arousal that occurs when individuals feel that their freedom is threatened, leading to an attempt to restore that freedom. Poorly designed nudge elements can trigger reactance and thus have the opposite effect. Individuals want to feel that they are masters of their own actions. It is therefore important to find the sweet spot of a transparent design that provides the right information and a supporting stimulus without manipulating the individual.

This is the second article in a mini-series on privacy nudging. The first one can be read here.

To provide systematic guidance for designing digital privacy nudges, there are specific considerations to make. The general process of creating digital nudges starts with a problem identification phase and an objective setting phase. In these phases, choice architects should identify and focus on the specific behavior they want to change. In this article, I focus on the design process, as this is the key part of creating digital nudges; however, the implementation of the designed nudge components should also be aligned and carefully considered. The following steps for designing privacy nudges are not exhaustive, and they might even be arranged in a different order, but they can serve as a scaffold for the base frame of digital privacy nudges.
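
As a rough illustration only, the overall process can be thought of as an ordered sequence of phases. The minimal TypeScript sketch below mirrors the steps discussed in this article; the phase names and the structure are illustrative assumptions, not a prescribed implementation.

```typescript
// Minimal sketch of the nudge design process as an ordered list of phases.
// Phase names are illustrative and follow the steps discussed in this article.
type Phase =
  | "identify problem"
  | "set objective"
  | "assess context"
  | "assess stakeholders"
  | "set normative boundaries"
  | "target recipients"
  | "adapt to technology"
  | "implement and evaluate";

const designProcess: Phase[] = [
  "identify problem",
  "set objective",
  "assess context",
  "assess stakeholders",
  "set normative boundaries",
  "target recipients",
  "adapt to technology",
  "implement and evaluate",
];

// Walk through the phases in order; each phase would produce artifacts
// (e.g., a user journey map, a stakeholder analysis) that feed the next one.
designProcess.forEach((phase, i) => console.log(`${i + 1}. ${phase}`));
```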

Photo: Pixabay

Assessing the Context

First, choice architects should map out the individual’s user journey. This helps determine at which points individuals make decisions and identify the optimal timing for interventions. At the same time, the processes in which no intervention should be implemented need to be identified, so as not to interfere with them. In doing so, the specific choice architecture should be analyzed to identify the relevant contextual factors that shape the individual’s decision-making. For instance, the decision context differs between privacy-related decisions and health-related decisions.
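
As a minimal sketch, a user journey can be modeled as a sequence of touchpoints, each flagged as a decision point or not, so that candidate intervention timings can be read off directly. The data structure and the example touchpoints below are hypothetical, assumed only for illustration.

```typescript
// Hypothetical model of a user journey: each touchpoint notes whether a
// privacy-relevant decision is made there and whether an intervention
// would interfere with another ongoing process.
interface Touchpoint {
  name: string;
  privacyDecision: boolean;         // does the user decide about data disclosure here?
  interferesWithOtherTask: boolean; // would a nudge disrupt another process?
}

const journey: Touchpoint[] = [
  { name: "open app", privacyDecision: false, interferesWithOtherTask: false },
  { name: "create account", privacyDecision: true, interferesWithOtherTask: false },
  { name: "grant location access", privacyDecision: true, interferesWithOtherTask: false },
  { name: "complete checkout", privacyDecision: true, interferesWithOtherTask: true },
];

// Candidate timings for a nudge: decision points that do not interfere
// with other processes.
const candidateTimings = journey.filter(
  (t) => t.privacyDecision && !t.interferesWithOtherTask
);
console.log(candidateTimings.map((t) => t.name)); // ["create account", "grant location access"]
```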

Assessing the Stakeholders

Considering the affected stakeholders is crucial. Nudges can trigger behavior that is in line with a specific stakeholder’s interests, for instance a specific company’s, but not necessarily with the individual’s preferences; such nudges would not be fair. To avoid this, nudge designers should start from the individual’s perspective, then consider all affected stakeholders’ interests, weigh them fairly, and only then design the nudge. It is rarely possible to satisfy every stakeholder equally, and often there has to be a trade-off. Therefore, we argue that nudge architects should adopt a pro-social and pro-environmental, goal-oriented justification of nudging. Pro-social means that nudge designers take the users’ interests into consideration. Pro-environmental means that the interests of all other affected stakeholders are weighted fairly as well, for example the society in which the individual interacts, the government setting the regulations, or the platform provider on whose system the nudge is to be implemented. If the platform provider’s interests are neglected, for instance, it can be very hard to implement the nudge at all, since the provider supplies the information system in which the nudge runs. Thus, these interests should be considered as well; otherwise the nudge would lack its foundation.
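
One way to make this weighing explicit, sketched here only as an illustration, is to list the affected stakeholders with their interests, an alignment rating, and a weight, starting from the individual’s perspective. The stakeholders, ratings, and weights below are hypothetical placeholders, not empirical values.

```typescript
// Hypothetical stakeholder analysis: each interest is rated from the
// individual's perspective first, then other stakeholders are weighed in.
interface StakeholderInterest {
  stakeholder: "individual" | "company" | "society" | "government" | "platform provider";
  interest: string;
  alignmentWithNudge: number; // -1 (conflicts) .. +1 (supports), assumed ratings
  weight: number;             // relative importance, assumed
}

const interests: StakeholderInterest[] = [
  { stakeholder: "individual", interest: "informational self-determination", alignmentWithNudge: 1.0, weight: 0.4 },
  { stakeholder: "company", interest: "data-driven personalization", alignmentWithNudge: -0.3, weight: 0.2 },
  { stakeholder: "society", interest: "trust in digital services", alignmentWithNudge: 0.8, weight: 0.2 },
  { stakeholder: "platform provider", interest: "feasible integration", alignmentWithNudge: 0.5, weight: 0.2 },
];

// A simple weighted score makes the trade-off visible; a negative total
// would signal that the design mainly serves interests against the user.
const score = interests.reduce((s, i) => s + i.alignmentWithNudge * i.weight, 0);
console.log(score.toFixed(2)); // 0.60 with the example ratings above
```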

Setting Normative Boundaries

To craft privacy nudge designs, choice architects should consider and assemble the ethical, legal, individual and societal standards that frame the digital nudge. Mapping out these factors is an integral part of designing good-practice digital privacy nudges. It is important to highlight that these standards may differ with the choice architecture and are highly context-dependent. For instance, different legal boundaries apply to companies and to governments. Nudge designers should therefore identify and set normative boundaries for the context they have assessed.
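
As an illustrative sketch, the normative boundaries can be kept as an explicit, context-dependent checklist that a design has to pass before implementation. The four categories follow this article; the individual checks are assumptions and would need to be filled in for the concrete context.

```typescript
// Hypothetical checklist of normative boundaries; which checks apply is
// highly context-dependent (e.g., company vs. government setting).
interface BoundaryCheck {
  category: "ethical" | "legal" | "individual" | "societal";
  description: string;
  satisfied: boolean;
}

const boundaries: BoundaryCheck[] = [
  { category: "legal", description: "complies with applicable data protection law", satisfied: true },
  { category: "ethical", description: "design is transparent and visible to the user", satisfied: true },
  { category: "individual", description: "preserves freedom of choice (no option removed)", satisfied: true },
  { category: "societal", description: "does not shift costs onto disadvantaged groups", satisfied: false },
];

// The design should only proceed when every boundary check is satisfied.
const unresolved = boundaries.filter((b) => !b.satisfied);
console.log(unresolved.length === 0 ? "boundaries satisfied" : unresolved);
```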

Targeting the Recipients

Many nudges show decreased effectiveness because they do not sufficiently target the right individuals. It is therefore important to understand the user’s cognitive and affective processes, as well as which of the user’s heuristics are accessible. There is evidence that individuals with a stronger belief in the internal logic of a given heuristic are more likely to invoke that heuristic when presented with a cue than individuals with a weaker belief. Hence, to achieve higher nudge effectiveness, users should be segmented and targeted accordingly, as some individuals can be addressed by digital nudges more precisely than others. The target group should be sufficiently narrow, since ineffective nudges can slow down work processes or provoke negative reactions. It is also important to ensure that the intended behavior resonates with the nudged person’s preferences and values; this is crucial, as these are important aspects of ethical justification.
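
A minimal sketch of such targeting, under assumed attributes: segment users by how strongly they believe in the targeted heuristic and whether the intended behavior matches their preferences, and only nudge the segment above a threshold. The attribute names, the threshold, and the example profiles are hypothetical.

```typescript
// Hypothetical user segmentation: only users with a sufficiently strong
// belief in the targeted heuristic, and whose preferences match the
// intended behavior, receive the nudge.
interface UserProfile {
  id: string;
  heuristicBeliefStrength: number; // 0..1, assumed to be estimated elsewhere
  valuesPrivacy: boolean;          // does the intended behavior match preferences?
}

const BELIEF_THRESHOLD = 0.6; // assumed cut-off

function shouldNudge(user: UserProfile): boolean {
  return user.valuesPrivacy && user.heuristicBeliefStrength >= BELIEF_THRESHOLD;
}

const users: UserProfile[] = [
  { id: "a", heuristicBeliefStrength: 0.8, valuesPrivacy: true },
  { id: "b", heuristicBeliefStrength: 0.4, valuesPrivacy: true },
  { id: "c", heuristicBeliefStrength: 0.9, valuesPrivacy: false },
];

console.log(users.filter(shouldNudge).map((u) => u.id)); // ["a"]
```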

Adapting to Technology

Choice architects should build on the special characteristics of the technology used and consider how the individual interacts with it. Nudges should be designed differently depending on whether the technology can provide visual, auditory, or haptic feedback. For example, a stop signal can be transmitted to the individual by a red button (visual), an alarm sound (auditory) or a shaking impulse (haptic). Furthermore, a nudge may look different on a stationary device than on a mobile device.
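
For illustration, the same stop signal can be mapped to whatever feedback channel the device actually supports. The capability flags below are assumptions for the sketch, not a real device API; an actual implementation would query the platform for screen, audio, and vibration support.

```typescript
// Hypothetical device capabilities; a real implementation would query the
// platform instead of using hard-coded flags.
interface DeviceCapabilities {
  visual: boolean;
  auditory: boolean;
  haptic: boolean;
}

// Pick the feedback channel for a "stop" signal based on what the device
// can render: red button (visual), alarm sound (auditory), or vibration (haptic).
function stopSignal(device: DeviceCapabilities): string {
  if (device.visual) return "show red stop button";
  if (device.auditory) return "play alarm sound";
  if (device.haptic) return "trigger vibration impulse";
  return "no suitable feedback channel";
}

console.log(stopSignal({ visual: false, auditory: true, haptic: true })); // "play alarm sound"
```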

Parts of this post were first published in:

  • Barev, T.J., Schöbel, S., Janson, A., Leimeister, J.M. (2021). DELEN – A Process Model for the Systematic Development of Legitimate Digital Nudges. In: Chandra Kruse, L., Seidel, S., Hausvik, G.I. (eds) The Next Wave of Sociotechnical Design. DESRIST 2021. Lecture Notes in Computer Science, vol 12807. Springer, Cham.
  • Barev, T.J., Janson, A., Leimeister, J.M. (2020). Designing Effective Privacy Nudges in Digital Environments: A Design Science Research Approach. In: International Conference on Design Science Research in Information Systems and Technology (DESRIST), Kristiansand, Norway.

Top photo: Tim Mossholder, Unsplash