Analysis. In late April 2019, Denmark was blessed with the formation of a Data Ethics Board. Many of us were perhaps a bit dazzled by this and may have lowered our guard. At the very same time, however, the same government that was pushing the data ethics agenda and giving toast speeches on data ethics introduced new legislation providing for the use of a mass-surveillance profiling tool on the unemployed, without the theme being highlighted or any specific wording pointing to it during the hearing process.
On 30 April 2019, members of the Danish Parliament voted yes to a “Law on active employment efforts” (Lov om en aktiv beskæftigelsesindsats L.209 – ‘the Law’). The Law basically confirmed a political agreement from 2014, which may explain why all parliamentary parties were caught off guard: without any debate or questions, the new legislation introduced a ‘mass-surveillance and profiling tool’. This, however, is not an isolated incident in recent Danish political life, which is why the question of whether Denmark is becoming the Scandinavian China is now being asked.
Denmark is among the least corrupt countries in the world, with some of the highest levels of public trust in government and one of the strongest welfare programmes. This may help explain why various privacy-invasive initiatives can be introduced without general uproar. In recent years, Denmark has rapidly become one of the most digitalised countries in the world, and this growth seems to proceed with blatant disregard for that public trust, all in a rush to become a breeding ground for a social welfare surveillance state.
Profiling the Weakest
Section 8 of the Law gives the Minister of Employment sweeping powers to commission a so-called ‘nationwide digital clarification and dialogue tool’. The tool introduces the use of algorithms and machine learning to profile unemployed citizens. Based on aggregated personal information collected from various public digital registers, the aim is to identify the risk of long-term unemployment. Using algorithms and machine learning to support caseworkers helping the weakest citizens may, from a humanitarian perspective, be seen as a legitimate and fair purpose. However, even if this purpose is described in the legislation, it cannot be ruled out that the introduction of this semi-automated decision-making tool is also meant to cut government costs. What is surprising and striking is the complete absence of any mention of citizens’ fundamental rights, of the need for focus and discussion on the matter, or even of an attempt to recognise the impact that pooling registers within the public sector will have on private life. And what about the why? Why do this? Is there specific help the government wants to provide to those at risk of long-term unemployment?
Moreover, the tool aims to profile citizens at a time in their lives when their basis of living is at stake and they are reliant on public help, in order to place them in predetermined boxes without a clear and concise explanation. Even if the tool is only to be used to support the caseworker and the final decision is made by a human – which is why the processing is not deemed to be in breach of GDPR Article 22 – the use of machine learning may deprive the individual of fundamental rights.
The algorithm was devised as early as 2014 and developed by the Agency for Labour Market and Recruitment (STAR). It was tested by 16 local job centres over one and a half years (January 2016 to June 2017). According to a Danish report on the matter, half of the job centres attested that the algorithm did not provide any value to their decision making. Despite such bleak reports, which also included criticism from some of Denmark’s leading AI experts, STAR went on to introduce the legislation required to enable the use of the tool nationwide.
Hidden on Page 212
On a late Tuesday evening, Catrine Søndergaard Byrne of Dataethics.eu and Hanne Marie Motzfeldt, Associate Professor at the Centre for Information and Innovation Law at the University of Copenhagen, more or less stumbled across a small section on page 212 of the more than 1,000 pages of explanatory comments to the now approved legislation, describing the digital tool. They decided it urgently needed to be highlighted, and already on Wednesday Hanne Marie Motzfeldt was quoted in Politiken, one of the largest Danish daily newspapers, on this ”roll-out of a systematic mass-profiling of all unemployed based on a rather unsuccessful trial round”. As an avid proponent of the right to privacy, she raised a question: “Have the elected representatives really seen the results and protests so far, but nevertheless said deliberately and thoughtfully yes to this?”
Ayo Næsborg-Andersen, Associate Professor in Personal Data Law at the University of Southern Denmark, was shocked that MPs had approved the government’s “profiling tool” “without any public knowledge of exactly what data the algorithm uses, and how highly the different data are weighted in the overall assessment”, and asked if Denmark was copying China.
Furthermore, from a GDPR perspective, there is no clear definition of who will be the data controller in this situation – STAR, the job centres, or the Danish unemployment funds – which will further weaken the data protection and security of the system when processing special categories of personal data belonging to the weakest members of society.
Catrine Søndergaard Byrne turned to 3F, the second biggest unemployment fund, and in an article in its online magazine she questioned how such a tool could be introduced into legislation without a single flag being raised by any of the more than 25 associations, authorities, think tanks, NGOs, unions etc. involved in the hearing process, including 3F and the Danish Data Protection Agency – with the consequence that the use of such an invasive tool was not discussed at all during the hearing process.
With no known precedent, the Danish Data Protection Agency declared on Wednesday that it will reopen the hearing of the Law. From a very harsh letter to the Ministry of Employment, published on the agency’s website, it became clear that the Danish Data Protection Agency had asked to be pointed to the parts of the more than 1,000 pages of explanatory documents where questions of data protection were relevant. The part on profiling and the use of automated means, however, was not highlighted to the agency. Besides providing a good explanation of why the Danish DPA did not raise any concerns during the hearing process, this also points to significant neglect on the part of the Ministry of Employment.
Nationwide Surveillance Projects
This government surveillance is, however, not as exceptional as citizens of a democratic society would hope. One Danish municipality – Gladsaxe – has made itself infamous for its intention to identify children at risk of abuse and/or neglect by applying profiling algorithms and machine learning to pooled registers. The municipality wanted to create an algorithm that could assign scores to families with children in order to identify children at risk. The project, however, is now on hold.
The Minister for Children and Social Affairs, Mai Mercado, saw exciting opportunities and did not immediately understand the criticism; indeed, she was “a little annoyed that the debate turned into a surveillance discussion”. Even though the formal legislation enabling the municipality to deploy such a tool was not in place, the municipality began creating the algorithm, using machine learning on a number of real-life cases from the social department where child neglect or even abuse had been identified. The processing also included the creation of a so-called control group of families with no problems.
The realisation that this type of processing required a formal legal basis in order to comply with the GDPR, together with the increasing awareness among both local and national politicians of the fundamental problems with such measures, forced Gladsaxe to stop the pilot and the data processing already under way. The matter is described in an AlgorithmWatch report.
To add to the scandal, Gladsaxe municipality’s administrators at the same time exhibited a blatant neglect of GDPR compliance and were plagued by a high number of security breaches (41). A recent report by PwC estimates that this particular municipality needs between 10 and 17 full-time employees to assist a full-time DPO in order to bring its data protection and information security into compliance with the requirements for processing the personal data of more than 70,000 citizens. The municipality employed a part-time DPO.
An Algorithm To Fight Social Fraud
Another municipality – Horsens – is moving full steam ahead to develop an algorithm to fight social fraud. According to Mads Lund Torslev, manager of the Department of Development, IT and Digitalisation at Horsens municipality, the initiative is not so much about saving money for the local community as about a desire “to constantly try new technologies” and see where they might lead. He fails to see any GDPR challenges in training the algorithms on historical real-life data in which only one out of six reports proved to be fraudulent. Moreover, the development of the algorithm is being done by two students at Aarhus University, not by machine learning professionals with many years of experience and a deep understanding of data. The approach is questionable from both a legal and a data-ethical perspective.
Views, Biases and Prejudices
Leading Danish expert Thomas Hildebrandt, professor at the Department of Computer Science at the University of Copenhagen, is disturbed by the current digitalisation rush in the Danish public sector: “Data in itself does not provide clear answers, and interdisciplinary efforts and understanding of complex problems are required if algorithms are to assist in decision making”.
This is the kind of complex understanding of human nature that the overworked and understaffed municipal caseworkers are unable to exercise, due to constant pressure to save time and resources. And it is the kind of complex understanding that is now being delegated to interns and students. We readily condemn Silicon Valley and its engineers in their twenties who create mass-surveillance algorithms at Google, Facebook and Amazon. And then we turn around and entrust the development of similar welfare-surveillance algorithms to students in our own backyard, with no assurance of moral awareness.
STAR’s algorithm is said to employ impartial mathematical and statistical procedures for profiling. However, like any algorithm, the tool looks for patterns predetermined by the humans who designed it. It therefore cannot be impartial, as human views, biases and prejudices are always subjective.
The Danish Surveillance State
The Danish government seems to have a compelling belief in digital solutions as the key to the future welfare system. Politicians seem to focus heavily on digitalisation of the public sector no matter what, without identifying the impact such measures will have on our highly praised welfare system, or the potential social consequences. More surprisingly, it seems as if they have totally forgotten our fundamental administrative legal principles, which must also be complied with.
The latter is especially clear when the private sector assists the public sector in its digitalisation process. The private sector simply does not seem to have the skills and framework to operate within such a restrictive environment when introducing privately developed tools for use within the public sector.
The national association of municipalities (KL) published a report in February 2018 under the motto ‘welfare development through new technology’. The initiative, named ‘The municipalities’ technological leap’ – a 202-page analysis – covered “five technological themes”. Sadly, the authors of the report did not consider or describe any of the challenges, barriers and ethical dilemmas, or whether it is desirable to pursue the opportunities provided by the technology.
These are the dilemmas that Denmark as a society urgently needs to discuss, which is why the upcoming election to the Danish Parliament should also include discussions of the sustainable use of data.
How could this happen in a democratic society? How did we come to accept this status quo? We live in a high-trust society, and we have confidence in our politicians to make decisions that promote the welfare of our citizens. Normally, trust and transparency go hand in hand. But there is little transparency, and no ethics, in the way these decisions and law proposals are being made.
We need a fundamental debate on the limits of the democratic society in light of the implementation of digital tools that impact the fundamental rights of the individual. We need to identify the borders within which the public sector can operate.
When are mass-surveillance, profiling and the pooling of data for new purposes a proportionate and legitimate step in helping and assisting citizens?
Are mass-surveillance, pooling, and re-use of data proportionate, legitimate and fair when used to identify a child at risk of abuse and neglect? And is it ethical not to use every available means for such an important purpose?
From a data ethics point of view, human interests must always prevail over institutional interests. According to our principles, the human being must be at the centre and have the primary benefit of data processing.
Except for the Danish DPA, none of the associations etc. involved in the hearing process have issued statements. Only the think tank Justitia, also involved in the hearing process, has made a brief statement saying that the process of legislating on the use of such measures is worrying.
The Data Ethics Board Needs To Act
The Danish government has appointed a Data Ethics Board, whose purpose is to support responsible and sustainable data use in business and the public sector. Specifically, the Data Ethics Board must contribute to an open debate on the use of new digital solutions, data, artificial intelligence, etc., and the dilemmas that the new technological possibilities raise. It is puzzling that not even the Data Ethics Board has made a single comment on the new legislation. It is not very clear what role the Board has and how it relates to the government, other than that it is clearly not independent. Nevertheless, its establishment has brought to light these fundamental dilemmas, which require discussion and conclusions. Blind acceptance of technology must end; instead, the public sector must be imbued with appropriate respect for human rights.
Denmark Could Learn From Others
Every day, Denmark moves towards increased use of digital registers within the public sector and towards surveillance, profiling and automated control of citizens. It is clear that this movement is driven by a lack of knowledge and a strict focus on digitalisation as the solution to all problems and the provider of common goods to all Danes. This is rather different in other Scandinavian countries: both Sweden and Norway strive for openness in the digital interaction between the citizen and the administration, says Hanne Marie Motzfeldt to Version2.
Estonia is even further ahead, with an efficient, secure and transparent ecosystem called X-Road that saves time and money. This small Baltic nation has become a 21st-century citizen-centred state, and its service-oriented infrastructure has made it possible to build a safe e-services ecosystem. It has been named ‘the most advanced digital society in the world’ by Wired and has already inspired other countries to adopt its best practices. Finland, the Faroe Islands, Sweden and Iceland are all implementing Estonian e-solutions.
Danish politicians should stop perpetuating the fantasy of being the global digital front-runner and take the time to understand the principles, and the impact on citizens, of the rapid digitalisation of government administration. The effect of digital technologies is determined by the framework – legal, economic and ethical – in which these technologies are embedded. It is time for a human-centred government.