
Why We Need to Make (Attempts at) De-Anonymisation Illegal

If we want to use personal data to help humans and societies thrive, we need to make de-anonymisation illegal and let independent third parties into the machine rooms to audit anonymisation processes. As it is today, we are becoming more and more afraid of using anonymised personal data – Avast is a good example. But if telcos and other private companies, statistics organisations, scientists and others can’t work with and capitalise on anonymised personal data, we might as well close down major parts of our digitalisation and work only with non-personal data such as weather and product data.

Searches, lookups of locations and GPS coordinates on Google Maps, visits to companies’ LinkedIn pages, to particular YouTube videos and to porn websites: according to a Motherboard investigation, those data were harvested through Avast’s anti-virus software and sold to other big companies as anonymised datasets. Users had explicitly opted in (at least for the past half year) – that is, they were asked whether that was okay and actively said yes. But critical media reports stating that multiple Avast users said they did not know Avast was selling their browsing data made Avast shut down its data collection.

In many ways, this reminds me of 2011, when Dutch TomTom was criticised for selling anonymised location data to the Dutch police, who used it to determine ideal locations for speed traps, and had to go out and apologise for it. Another example is telcos, which are desperately looking for new business models. Some are too afraid of even thinking of selling anonymised location data, while others, e.g. Telefonica, are taking the chance.

Most people don’t understand the difference between personal data and anonymised personal data sets. Even journalists seem to tell the public again and again that selling anonymised data is wrong. Yes, it can be wrong, but it all depends on who you sell to, how you inform the users affected and how well you encrypt and anonymise the data. It really needs to be done in a proper way, and if the seller sells the anonymised data for the right purposes or gives value back to the community, it is even better. But we need to be able to work with anonymised personal data, as it is the only way we can create value from our personal data without undermining privacy.
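To make “done in a proper way” a little more concrete, here is a minimal Python sketch of a k-anonymity check – one common, if imperfect, yardstick for judging a release. The records, column names and quasi-identifiers below are invented for illustration, not taken from any of the cases above.

```python
from collections import Counter

# Illustrative records: ZIP code, birth year and gender are classic
# quasi-identifiers -- individually harmless, identifying in combination.
records = [
    {"zip": "2100", "birth_year": 1984, "gender": "F", "diagnosis": "flu"},
    {"zip": "2100", "birth_year": 1984, "gender": "F", "diagnosis": "asthma"},
    {"zip": "2200", "birth_year": 1990, "gender": "M", "diagnosis": "flu"},
]

QUASI_IDENTIFIERS = ("zip", "birth_year", "gender")

def k_anonymity(rows):
    """Return the size of the smallest group of rows sharing the same
    quasi-identifier combination: the dataset's k value."""
    groups = Counter(tuple(row[q] for q in QUASI_IDENTIFIERS) for row in rows)
    return min(groups.values())

print(f"k = {k_anonymity(records)}")  # k = 1: one person is unique
```

A data set is k-anonymous if every combination of quasi-identifier values is shared by at least k people. A k of 1, as in the sketch, means at least one person is unique in the release and therefore potentially re-identifiable.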

National statistics organisations, for example, sit on a lot of personal data, which they anonymise before private companies or scientists can use it for services or science. Statistics Denmark generates revenue from this, but it also does a lot of work on anonymisation, privacy and security.

Two things in particular could help build trust around the use and sale of anonymised personal data sets:

  1. We must prohibit the de-anonymisation of anonymised data, as the German Data Ethics Commission suggested last year. Even attempting it should be illegal. Some experts and studies say that it is easy to de-anonymise anonymised data sets (see the linkage-attack sketch after this list), while others say that anonymisation can be done without too high a risk of re-identification.
  2. We should establish independent audits, so that independent experts can be sent out to companies claiming they anonymise personal data. They should be let into the machine room to certify that it is done properly. And what is properly? We will probably have to accept some level of risk, as we know that nothing is 100% safe – neither is crossing a street.
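To see why the experts in point 1 disagree, it helps to look at the classic linkage attack: joining an “anonymised” release against a public register on shared quasi-identifiers. The sketch below is a toy illustration – names, columns and records are all made up – but the mechanism is the one the studies debate.

```python
# Illustrative linkage attack: join an "anonymised" release with a
# public register on shared quasi-identifiers. All data is invented.

anonymised_release = [
    {"zip": "2100", "birth_year": 1984, "gender": "F", "diagnosis": "asthma"},
    {"zip": "2200", "birth_year": 1990, "gender": "M", "diagnosis": "flu"},
]

public_register = [  # e.g. a voter roll or scraped social-media profiles
    {"name": "Jane Doe", "zip": "2100", "birth_year": 1984, "gender": "F"},
    {"name": "John Smith", "zip": "2200", "birth_year": 1990, "gender": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birth_year", "gender")

def link(release, register):
    """Re-attach names to 'anonymised' rows whose quasi-identifier
    combination matches exactly one person in the public register."""
    index = {}
    for person in register:
        key = tuple(person[q] for q in QUASI_IDENTIFIERS)
        index.setdefault(key, []).append(person["name"])
    for row in release:
        names = index.get(tuple(row[q] for q in QUASI_IDENTIFIERS), [])
        if len(names) == 1:  # unique match -> re-identified
            yield names[0], row["diagnosis"]

for name, diagnosis in link(anonymised_release, public_register):
    print(f"{name}: {diagnosis}")
```

The attack needs nothing more sophisticated than a dictionary lookup; what makes anonymisation genuinely hard – and worth auditing – is making sure no unique quasi-identifier combination survives in the published data.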

When personal data sets are anonymised, it is legal to sell them; the GDPR no longer applies to them. If the buyer of anonymised data sets tries to de-anonymise them to enrich the personal data they already sit on, it will in many cases be illegal, as it amounts to processing and repurposing personal data without consent. There will be ways around this if the buyer can argue legitimate or public interest. But it is this kind of de-anonymisation that the German Data Ethics Commission believes should be illegal.