Informational self-determination can be challenging in digital environments. Insights from behavioral science can help individuals align their behavior with their intentions. In this article, we introduce the concept of boosting: boosts are interventions in digital systems that equip users with the competencies to make better choices.
There is no neutral way of presenting choices, however much one might wish for one. Every way a choice is presented influences how the decision-maker chooses.
Choice architects organize the context in which individuals make decisions. When organizing the choice environment, Thaler and Sunstein argue, choice architectures should be designed to make life safer, easier, and of greater benefit. Choice architects should therefore design interventions thoughtfully and with foresight. Often, small and apparently insignificant details have a major impact on people’s behavior. A good rule of thumb is to assume that “everything matters.”
Tools for Choice Architects
When analyzing how to support users, we want to explore the tools available to choice architects.
One approach to influencing user decision-making is nudging. Nudging can help users on digital platforms avoid irrational decisions; especially in digital environments, users often decide to their own disadvantage. In previous articles, we introduced how nudging can support users in privacy-sensitive decision-making, presented the pitfalls of nudging and what an ethical nudge design could look like, and discussed specific considerations for systematically designing digital nudges.
Mini-series on Privacy Nudging:
- Privacy Nudging Can Stop Irrational Decision-Making
- Capturing the Complexity of Digital Privacy Nudging
- Ethical Design Nudging Supports Autonomy and Transparency
- Detangling the Architecture of Ethical Privacy Nudging
The Concept of Boosting and the Underlying Mechanisms
Another concept from behavioral science that choice architects can use is boosting. Boosts can be defined as interventions in digital systems that target users’ competencies in order to change decision behavior. The concepts of nudging and boosting are closely linked: while nudges are designed to target and change behavior directly by changing the choice environment, boosts equip people with the competencies to make better choices themselves. With boosting, an educational aspect comes into focus.
To design specific interventions, it is important to understand how users make their decisions. Given the overflow of information in everyday life, individuals rarely have enough time and information to fully evaluate all alternatives. Instead of performing a systematic decision-making process, they tend to deploy so-called heuristics (mental shortcuts). Heuristics are informal rules of thumb that reduce the complexity of decision-making and thus serve as shortcuts. Although heuristics are an efficient way to solve recurring problems, they can lead to systematic errors such as biases in information evaluation. For example, personal data is often disclosed carelessly because the risk of unwanted monitoring is mentally less tangible (the availability heuristic). Such false conclusions do not mean that individual behavior is unpredictable and irrational; rather, it is a systematic and thus predictable deviation from rational behavior.
Choice architects can analyze these heuristics to design specific interventions for digital systems and guide users’ behavior. Importantly, both mechanisms – nudging and boosting – can support users’ decision-making while preserving freedom of choice: an intervention should be easy to reject if the decision-maker wants to decide otherwise.
The following figure presents the underlying mechanisms of nudge and boost interventions. Both interventions – nudges and boosts – build on psychological and behavioral insights.

The Difference between Nudges and Boosts
When users are exposed to a boost, over time they can harness their new and increased competencies to make choices for themselves. Consequently, the effects should persist once the (successful) intervention is removed. In contrast, once a nudge element is removed, users’ behavior usually reverts to the pre-intervention state. It should be noted, however, that a nudge that is implemented in a system and changes users’ behavior repeatedly over a longer period can also produce behavioral routines. In this case, the desired behavior may persist even after choice architects remove the nudge from the system.

Nudge designers adapt the choice architecture and create environments that harness people’s cognitive or motivational deficiencies to guide behavior.
A typical example of a nudge is default settings: some options are pre-selected. Defaults are very effective because individuals often do not adapt settings to their needs, so the default option (the status quo) remains in place. By contrast, boost designers use educational elements to teach users competencies for their decision-making. Unlike nudging, boosting is based on the premise that people can navigate complex environments well and make largely rational decisions despite limited cognitive capacities. Boosting therefore focuses on improving competencies. An example of a boost is the provision of specific information or techniques that help the user in decision-making. In this case, the user’s changed cognitive and motivational competencies change the user’s decision behavior.
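The power of the default nudge can be illustrated with a minimal sketch. The settings names and structure below are hypothetical, not taken from any real platform; the point is that whatever the choice architect pre-selects becomes the effective choice for the many users who change nothing:

```python
from dataclasses import dataclass

# Hypothetical privacy settings for a sharing dialog. The choice
# architect picks the defaults; because most users never adjust them,
# the pre-selected (status quo) option effectively decides behavior.
@dataclass
class SharingSettings:
    share_location: bool = False      # privacy-friendly default
    share_usage_stats: bool = False   # privacy-friendly default
    visible_to_public: bool = False   # privacy-friendly default

def effective_settings(user_changes: dict) -> SharingSettings:
    """Apply the (usually empty) set of changes a user actually makes."""
    settings = SharingSettings()
    for name, value in user_changes.items():
        setattr(settings, name, value)
    return settings

# A typical user changes nothing, so the defaults stand.
print(effective_settings({}))
```

Flipping these defaults to pre-selected sharing would, by the same status quo mechanism, steer most users toward disclosure instead, which is why default design carries so much responsibility.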
Boosting and Selected Privacy Heuristics
Designing digital interventions to support users is especially important for complex decisions such as personal data disclosure. Multiple studies provide evidence that digital environments generally lead to increased self-disclosure compared to direct face-to-face communication: in digital environments, individuals tend to share disproportionately more information. This increased willingness for self-disclosure is attributed, among other things, to the fact that individuals feel a stronger sense of anonymity, social cues are weaker than in face-to-face situations, and the communication situation is perceived as more controllable. The latter, however, is often a fallacy. Because privacy-sensitive decision-making is a complex process, specific interventions can help users align their intentions with their behavior.
Boosting can make users aware of specific heuristics and how they affect decision-making. Boosts can help users challenge and reflect on their decision-making and the heuristics they use. With the newly acquired competencies, users can better weigh the costs and benefits of specific behavior, such as sharing personal data in digital environments.
In the following, we present a selection of privacy heuristics, assembled by Sundar et al. (2020), that are likely to be triggered by common cues in online interfaces. Understanding these heuristics can help users reflect on privacy-sensitive decision-making on digital platforms.

Informational Self-Determination and the Privacy Calculus
In digital work environments, individuals frequently share content in ways that are inconsistent with their own intentions, and they are often unable to manage their own privacy settings. Particularly in the context of information privacy, human decision-making is often imperfect, and the decisions made frequently do not correspond to the objectives pursued.
In privacy-related decisions, the so-called privacy calculus is relevant: before deciding, individuals rationally weigh potential benefits against potential risks. In numerous circumstances, users might trade personal data for time and money savings, self-enhancement, or pleasure. Depending on whether individuals attribute higher benefits or higher costs to a situation, they decide for or against a certain behavior, for instance sharing sensitive personal data in a digital environment. This perspective helps explain the so-called privacy paradox: although individuals state strong privacy concerns, they disclose personal information because, in the moment, they perceive more benefits than costs. Moreover, influencing factors such as time pressure or cognitive complexity make purely objective decisions impossible.
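The weighing described by the privacy calculus can be sketched in a few lines. The perceived benefit and risk values below are hypothetical illustrations, not empirical measurements; the sketch also shows how a heuristic such as availability can skew the calculus by making abstract risks feel smaller than they are:

```python
# Illustrative sketch of the privacy calculus: a user (implicitly)
# weighs perceived benefits against perceived risks before disclosing.
# All names and numbers below are hypothetical.

def privacy_calculus(benefits: dict, risks: dict) -> bool:
    """Disclose only if perceived benefits outweigh perceived risks."""
    total_benefit = sum(benefits.values())
    total_risk = sum(risks.values())
    return total_benefit > total_risk

benefits = {"convenience": 3.0, "personalization": 2.0}
# Availability heuristic: abstract risks such as unwanted monitoring
# feel less tangible, so users tend to under-weight them.
risks = {"unwanted_monitoring": 1.0, "data_resale": 0.5}

print(privacy_calculus(benefits, risks))
```

A boost would not change these values for the user; it would instead teach users to question whether their risk estimates are too low, so that the same calculus is run with better-calibrated inputs.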
In these situations, boosting offers conscientious choice architects a tool to support individuals. Correctly designed and implemented boosts can equip individuals with competencies to navigate complex decisions more easily and quickly and, most importantly, to align their behavior with their intentions. In this way, choice architects can contribute to a privacy-sensitive design of modern information systems and to the informational self-determination of their users – one of the key challenges in current and emerging digital environments.
Parts of this post were first published in:
- Barev, T., Janson, A., & Leimeister, J.M.: Designing Effective Privacy Nudges in Digital Environments: A Design Science Research Approach. International Conference on Design Science Research in Information Systems and Technology (DESRIST). – Kristiansand, Norway, 2020
- Barev, T. J. & Janson, A.: Towards an Integrative Understanding of Privacy Nudging – Systematic Review and Research Agenda. Annual Pre-ICIS Workshop on HCI Research in MIS (Pre-ICIS). Munich, Germany, 2019.
- Sunstein, C. R. (2015). Nudging and Choice Architecture: Ethical Considerations. Harvard John M. Olin Discussion Paper No. 809; Yale Journal on Regulation (forthcoming).
- Grüne-Yanoff, T., Marchionni, C., & Feufel, M. (2018). Toward a Framework for Selecting Behavioural Policies: How to Choose Between Boosts and Nudges. Economics and Philosophy, 34(2), 243-266.
- Hertwig, R., & Grüne-Yanoff, T. (2017). Nudging and Boosting: Steering or Empowering Good Decisions. Perspectives on Psychological Science, 12(6), 973–986.
- Sundar, S. S., Kim, J., Rosson, M. B., & Molina, M. D. (2020). Online Privacy Heuristics that Predict Information Disclosure. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20). Association for Computing Machinery, New York, NY, USA, 1–12.
Photo: Kai Gradert, Unsplash