Data protection impact assessments (DPIAs), also referred to as privacy impact assessments (PIAs), are compulsory for high-risk data processing under the new EU General Data Protection Regulation (GDPR). But ensuring an organization's or business's data protection compliance now and in the future is the bare minimum. In a world of data networks that increasingly shape and define everything from our identities and economies to our politics, we also need to consider the broader ethical and social implications of data processing.
Why an ethical impact analysis in addition to the PIA?
Here’s an example from Denmark. Since 2015, digital wellbeing tests have been performed in Danish schools. Children are asked about everything from bullying and loneliness to stomach aches. Recently it emerged that although the tests were presented as anonymous, they were not. Data was stored together with the children’s social security numbers, correlated with other test data and even used in case management by some municipalities (see the analysis of this case, in Danish only, by DataEthics’ Catrine Byrne Søndergaard here). The privacy and data protection implications of a system like this are evident, and a proper PIA or DPIA would probably have caught many of the core legal issues that have now caused the Ministry of Education to pause the tests.
But is compliance with the basic provisions of data protection legislation really the only issue here? No. There is an evolution at play that we cannot ignore. A digital test is not just a different kind of test tool; it is social engineering. Data systems like this are increasingly integrated into our social realities. They are part of the very social structure of everything from our local communities to our economies to our politics. The data from these digital tests are, for example, analysed and used by schools and teachers in their work with the wellbeing of the children. An ethical impact assessment would take the analysis one step further than mere legal compliance. It would consider factors such as community influence, social risks and the distribution of responsibilities. How does a data system like this influence the role of the teacher, the school, the municipality, the parent and the child? Which decisions can a teacher make based on these data sets? Which decisions can a school make? How are the tests perceived? Are their conclusions combined with experienced input, or are data analytics and derived conclusions perceived as objective truth? Which conclusions are derived by data analytics? Based on which criteria? With which consequences for the local community? And so on.
Ethics is a value choice
These are just a few of the questions that a social and ethical impact analysis would highlight. But we might want to take this even further. What is ethics? Ethics is culture: a set of “shared meanings” that are produced informally within a given society and represented formally (in our regional and national laws, for example). Ethics is not a given. Our choice of ethics is a value choice. With our choice of ethics we choose to focus on specific risks, we prioritise interests, and we distribute roles and responsibilities. Interestingly, the choice to combine the digital tests in the Danish schools with the children’s social security numbers was based, among other things, on recommendations from experts who saw the method’s value for scientific analysis. This was indeed a prioritisation of one specific value. But was it a conscious one? Was it balanced against other values and interests? Those of the individual or the community?
The first step in a data ethics impact analysis would be to make transparent the ethical rationale behind the choices we make: Which risks are prioritised? Whose interests are served? Whose responsibility is it?
Some more inspiration:
Here we collect some tools for inspiration, each based on its own rationality and ethical value system. The list will be updated.
Basic data ethics questions
These are the minimum questions we should ask when conducting a basic data ethics impact analysis:
Data Ethics Principles from DataEthics.EU
We have developed a set of data ethics principles and guidelines that may help you integrate data ethics into your data processing activities. In this folder we present the principles, a detailed questionnaire and a FAQ on data ethics. Download here (PDF).
The Data Ethics Canvas from Open Data Institute
The Data Ethics Canvas is designed to help identify potential ethical issues associated with a data project or activity. It promotes understanding and debate around the foundation, intention and potential impact of any piece of work, and helps identify the steps needed to act ethically.
Eticas Framework
The Eticas framework has four pillars that provide entry points for asking the right questions and coming up with suitable solutions.
The Belmont Principles
- Respect for persons: protecting the autonomy of all people, treating them with courtesy and respect, and allowing for informed consent. Researchers must be truthful and conduct no deception;
- Beneficence: the philosophy of “do no harm” while maximizing benefits for the research project and minimizing risks to the research subjects; and
- Justice: ensuring that reasonable, non-exploitative and well-considered procedures are administered fairly and equally, with a fair distribution of costs and benefits to potential research participants.
Beauchamp and Childress’ Principles
- Autonomy: the right of an individual to make his or her own choice.
- Beneficence: the principle of acting with the best interest of the other in mind.
- Non-maleficence: the principle that “above all, do no harm,” as stated in the Hippocratic Oath.
- Justice: a concept that emphasizes fairness and equality among individuals.
Book
Processes are transformed in a digital environment. In an ethics assessment of a digital system it is therefore also important to understand the specific characteristics of new digital technologies. This book provides some very insightful articles on the practical ethics of technologies: The Ethics of Technology: Methods and Approaches (edited by Sven Ove Hansson), Rowman & Littlefield, 2017.