From Consent to Data Control by Design

Research. The current model of consent is impractical and illusory, turning a tool that is supposed to empower individuals to make informed choices into one that submerges users in unread privacy notices they accept as the de facto price of online services. Privacy by Design (PbD) offers privacy protections by default, but it does not fully empower users to take control over their privacy. Data Control by Design (DCD) does: it gives individuals the ability to allow the collection and use of their data when the data are processed by trusted agents.

By Belli, L., Schwartz, M., & Louzada, L. (2017). Selling your Soul while Negotiating the Conditions: From Notice and Consent to Data Control by Design. Health and Technology. Springer Nature.

This article claims that the Notice and Consent (N&C) approach is not an effective means of protecting the privacy of personal data. On the contrary, if data were to be considered the individual’s soul, N&C could be seen as a license to freely exploit the individual’s personal data. For this reason, legislators and regulators around the world have been advocating for different and more efficient safeguards, notably through the implementation of the Privacy by Design (PbD) concept, which is predicated on the assumption that privacy cannot be assured solely by compliance with regulatory frameworks. In this sense, PbD affirms that privacy should become a key concern for developers and organisations alike, thus permeating new products and services as well as organisational modi operandi. Through this paper, we aim to uncover evidence of the inefficiency of the N&C approach, as well as the possibility of further enhancing PbD, in order to provide the individual with increased control over her personal data. The paper aims to take a step further, shifting the focus of the discussion from “take it or leave it” contracts to concrete solutions aimed at empowering individuals. As such, we put forth the Data Control by Design (DCD) concept, which we see as an essential complement to the N&C and PbD approaches advocated by data-protection regulators. The technical mechanisms that would enable DCD are currently available (for example, the User-Managed Access (UMA) 1.0.1 Core Protocol).
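
The mention of UMA is worth unpacking briefly. The sketch below, in Python, shows the general shape of a UMA-style flow in which access decisions are delegated to an authorization server governed by the resource owner’s policies, rather than being settled by a one-off consent click at the service itself. The endpoint URLs, JSON field names, and the function name are illustrative assumptions and do not reproduce the exact UMA 1.0.1 message formats.

```python
# Illustrative sketch of a UMA-style authorization flow; all URLs, field
# names, and response shapes are hypothetical placeholders, not the exact
# UMA 1.0.1 wire format.
import requests

RESOURCE_URL = "https://resource-server.example/health-records/42"  # assumed
TOKEN_URL = "https://authorization-server.example/uma/token"        # assumed


def access_with_data_control(requesting_party_claims: dict) -> requests.Response:
    # 1. The client tries to access the protected resource without a token.
    #    The resource server does not decide access itself; it refers the
    #    client to the authorization server chosen by the resource owner.
    first_try = requests.get(RESOURCE_URL)
    if first_try.status_code != 403:
        return first_try  # not protected, or access already granted

    # 2. The refusal carries a permission ticket describing what was asked
    #    for (field name assumed).
    ticket = first_try.json()["ticket"]

    # 3. The client presents the ticket, plus claims about who is asking and
    #    for what purpose, to the owner's authorization server. The owner's
    #    policies (which agents are trusted, for which purposes) are
    #    evaluated here -- this is where data control is enforced by design.
    token_response = requests.post(
        TOKEN_URL,
        json={"ticket": ticket, "claims": requesting_party_claims},
    )
    token_response.raise_for_status()
    rpt = token_response.json()["rpt"]  # requesting party token

    # 4. The client retries with the token; the resource server grants only
    #    the scopes the owner's policy allows.
    return requests.get(RESOURCE_URL, headers={"Authorization": f"Bearer {rpt}"})
```

The point relevant to DCD is step 3: because the requesting party must obtain a token from the owner’s chosen authorization server, the owner can allow specific, trusted agents to access the data for specific purposes, instead of facing the all-or-nothing choice described above.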

We therefore argue that data protection frameworks should foster the adoption of DCD mechanisms in conjunction with PbD approaches, and that privacy protections should be designed in a way that allows every individual to utilise interoperable DCD tools to efficiently manage the privacy of her personal data. Initially, we provide a brief overview of the “traditional” N&C mechanism used to protect individuals’ data privacy online and emphasise the failures of such a mechanism. Notably, we stress that, in the online environment, individuals are presented with complex and legalistic privacy notices to which they can either consent, in order to enjoy a given service, or refuse, thereby forfeiting the option to use the desired service. This all-or-nothing scenario highlights that the current model of consent is impractical and illusory, turning a tool that is supposed to empower individuals to make informed choices into one that submerges users in unread privacy notices they accept as the de facto price of online services.

Subsequently, we briefly analyse the concept of PbD, underscoring that it may be considered an advancement in the evolution of privacy protection compared to the inefficient N&C approach. To cope with this inefficiency, PbD proposes a proactive approach that embeds the protection of privacy into technologies, procedures, and architectures. This evolutionary step is particularly meaningful, for it represents a shift from a legal approach to privacy to a “design thinking” approach that translates legal concepts into the technical architecture of the Internet environment, and of ICT environments in general, as well as into the organisational architecture of the various entities operating in such environments.

PbD usefully requires designers and operators to fashion procedures, products, and services with the privacy of their users in mind, ideally offering privacy protections by default. However, PbD still does not fully empower users to take control over their privacy because, as in the N&C context, individuals are only given the option to choose whether or not their data will be collected and processed, rather than being able to exert full control over how their data will be used and by whom.

Indeed, both N&C and PbD schemes fail to consider the nuances that exist between strong data privacy protection and no data privacy protection, as well as between blanket data collection and no data collection. In particular, N&C and PbD do not seem to consider the possibility that individuals might be interested in allowing the collection and use of their data when this is necessary for specific purposes and the data are processed by trusted agents.

To facilitate a further step forward in the direction of data privacy and user empowerment, we propose the concept of DCD and examine the emergence of DCD models and initiatives. As such, we hope to shift the focus of the debate to the possibility of empowering individuals to effectively control their personal data, rather than constraining them to trade data for digital services.

Lastly, we discuss the specificities of health and genetic data and the role that DCD may play in this context, stressing that the sensitivity of genetic and health data requires special scrutiny from regulators and developers alike. In conclusion, we argue that concrete solutions allowing for DCD already exist and that policy makers should join efforts with other stakeholders to foster the concrete adoption of the DCD approach.

Read the research in full or on Springer’s website