Ethics for the Digital Age

ANALYSIS: This January the European Data Protection Supervisor presented his new “Ethics Advisory Group”, a group of experts that will help him “reconsider the ethical dimension of the relationships between human rights, technology, markets and business models and their implications for the rights to privacy and data protection in the digital environment.” He is not the first European decision maker or thought leader to put forward ethics as a guiding principle for the digital age. Over the last year digital ethics, and in particular data ethics, has become the “talk of the town” in Europe. Based on the realisation that laws have not kept pace with the development of digital technologies, technologists, academics, policymakers and businesses are today revisiting cultural values and moral systems as they grope for a new ethical framework for the digital age.


Ethics of Technology

Throughout history, technological developments have at some point during their implementation into society forced us to revisit laws, but also, in particular, ethical value systems and limits. Time and again we are faced with the fact that technology is not neutral but contains ethical implications in its very design. In the early stages of its implementation, in the late 19th and early 20th century, the photograph was discussed as both art and reality. This discussion entered the courtrooms, and the legal rights over a photograph were determined. It was, however, not only legal rights that were defined, but a delineation of the very ethical implications of a technology (the camera, the photograph) that could reproduce the appearance of an individual with such accuracy. It was an examination of the particularly human consequences (distress and humiliation) of the capacities of this new technology: defining a right and a wrong and attempting to morally manage its implications for individuals.

What we are experiencing these years is a pace of technological development never seen before. Not only did the World Wide Web and the capacities of digital technologies develop over just a few decades, but the digital evolution expanded into practically every area of life and society over an even shorter period of time. It took only a few years after Tim Berners-Lee invented an open source information space interlinked by hyperlinks in 1989 before the first online businesses emerged and ordinary people started using internet services in the mid-1990s.

Evidently, laws have not kept pace with the countless ethical implications of today’s rapid technological development. Now we are questioning the ethics of automated systems designed to collect data on us en masse, algorithms designed to predict and profile us, technologies used to surveil and manipulate us, and not least business models profiting from the most private details of individuals’ lives. The only way we can do this is by revisiting our values and morals, the ethical foundations of our societies.

Privacy under attack

The right to privacy was originally defined in legal instruments such as the European Convention on Human Rights as a protection against state surveillance. But with the development of an online market, social sphere and economy, surveillance evolved. Today surveillance is “ordinary”, embedded in the interactions of everyday life and performed by both state and industry actors. One can argue that this evolution of surveillance was in many ways made possible by the legal grey zones left open to interpretation by fast-paced technological development, and exploited heavily not only by state actors but also by a data-driven industry.

“Laws are framed to include interpretations and exceptions that permit data collection beyond the norm, e.g. for purposes such as law enforcement, public safety and security,” Robin Wilton from the Internet Society writes in a paper on ethical data handling, and he continues: “A major challenge is to ensure that such carve-outs remain consistent with what is just and fair, particularly since data use practices tend to evolve much faster than the related laws and regulatory measures.”

But this check and balance of fairness in data handling clearly did not happen (as the Snowden revelations have illustrated). Similarly, business innovation in the digital age has evolved within the legal grey zones of privacy rights and data protection laws. Being innovative in the digital market means being innovative with user data. Innovation in a data-driven economy is data-inspired price setting, forecasting, market design, marketing, user design, business decisions and so on. Listen to this panel debate between digital media venture capitalists at Stanford from 2013: the start-ups that these venture capitalists reward with capital innovate in legal grey zones and push the legal limits of data protection laws and privacy rights. In digital business innovation, privacy is an obstacle, a point we will return to later.

In sum, our fundamental privacy rights have been under constant attack from all sides over the last couple of decades. Until very recently it has been a silent attack that has gone mostly unnoticed by average citizens, who have even participated actively in their own surveillance when engaging with digital services and businesses. Unnoticed because these practices rest on an opacity built into the design of the online services and products we use, and because the privacy and ethical implications of business and state practices lack, as Robin Wilton also argues, a clearly defined ethical problem to solve.

From Laws to Data Ethics

But the tide is turning. Due to a number of geopolitically critical events, such as the Snowden revelations of a global surveillance infrastructure, countless data leaks and hacks, not to mention consumers’ increasing sense of a lack of control over their digital identities, “online privacy” has now been transformed into one of the most intensely debated topics, with intricate power relations among interest groups and global key players. This is most evident in a number of pivotal legal judgements, such as the Right to be Forgotten ruling and the CJEU’s invalidations of the Data Retention Directive in 2014 and the US Safe Harbour agreement in 2015. Moreover, the European Data Protection Reform represented a key battlefield for the renegotiation of roles and power relations in the global information technologies community and the economic interests of the different entities: the institutions of the European Union, civil society organisations, the industry and third-country national interests.

All of these movements are exposing the limits of current laws in protecting the right to online privacy. We see a lack of remedies and enforcement, clashes of national laws, legal approaches and jurisdictions, and in general too many legal grey zones open to individual interest-based interpretations.

A New Digital Ethics

It is exactly at this point that we turn to a discussion about ethics, and in particular data ethics. These days Europe is groping for the words to define an ethics for the digital age. We have seen enough examples proving that in the technological era “not everything that is legal is ethical”.

This is not new. We turn to ethics in transitional phases, when the formal agreements in society do not keep up with the progress of society. We have turned to “data ethics” because it is our privacy rights that are under fire, and it is in particular the ethical implications of businesses’ data innovation that are exposed today.

When laws do not follow progress, we revisit our cultural value systems. Ethics is not neutral, and neither are laws and technology. Digital ethics is a moral management of the human implications of digital developments. With ethics we determine “the right” and “the wrong” with a view to shared cultural value systems and social agreements.

Evidently, in these days when European and US data protection laws and cultural approaches clash, Europe is revisiting a particularly European ethical value system based on “personal dignity”. European leaders have started “systematizing, defending, and recommending concepts of right and wrong conduct”. They have, as Julia Powles and Carissa Veliz recently put it, begun a fight to change in particular US technology companies’ “wrecking ball” ethics by developing the industry’s “moral compass”.

Business ethics become data ethics

The world’s leading information technology research and advisory company, Gartner Inc., has predicted that by 2018, 50 percent of business ethics violations will occur through improper use of big data analytics. Combined with the emergent focus on ethics, such numbers will add data ethics to the list of criteria that deem a company ethical or not, trustworthy or not, competitive at the level of corporate social responsibility or not: the companies that do that little bit more than merely comply with data protection laws, that have the highest level of transparency in data handling processes, collect minimal amounts of data, develop privacy-considerate organisational structures, privacy-by-design products and so on.

Where lies the answer?

These days we see stakeholders planting their flags in the data ethics debate. All will claim they have created the perfect solutions to the problems posed by the technological challenges to the individual’s privacy rights. And we have not seen the end of it. New laws will continue to be developed to manage the privacy (and ethical) implications of state and business conduct. Technologists will present one privacy-by-design solution after the other. We will see more and more new and old companies presenting their products and services as the solutions to the ethical dilemmas described in this article (we describe these in our book “The Data Ethical Company”, coming out at the end of this year).

Many of these will fail us as humans (and many have already failed us gravely). Some of these might support us or even empower us.

But none of them will develop the perfect solution to the ethical dilemmas we are facing today. And we should not look for perfect solutions. We need to see these efforts for what they are, and we need to acknowledge the context they are evolving in. They are experiments, and we are in an age of experimentation in which laws, technology and not least our limits as individuals are tested and negotiated on a daily basis. It is the sum of all the efforts in the name of “ethics”, “privacy” and “human dignity” that will pave the way to an ethical technological future.