For decades we have entrusted the data industry with our personal information, and that trust has been violated by the very same industry. The companies that take privacy protection seriously and value a trusting relationship with their consumers will be the winners of tomorrow.
Trusting relationships are what make our world function. Every day, when we drop our children off at daycare, take the bus, or share highly confidential information with our doctor, we entrust other people with something of value to us. Trust allows us to cooperate. We make ourselves vulnerable, a risk-taking that is essential to interaction, in the belief that the trusted party will not violate our positive expectations.
Think for a moment about your digital travels. Think about the compromising or confidential Google searches you make, about the confidential conversations you have with your phone lying next to you, and about the meetings you hold online. Think about the documents you leave in ‘free’ cloud services, the pictures you share, how you like or comment on something on social media, and how you leave your location footprints everywhere. Then imagine a virtual spy standing at your side all along, taking notes and handing them to whoever bids the highest, whether banks, insurance companies, retailers, or even governments and intelligence services, so that they can profit from your most vulnerable information.
Now you have an image of how severe the breach of confidentiality and trust in the digital environment really is. Knowledge is power, and the data-driven industry has for too long exploited an imbalance of power in which one party holds all the information about the individual, making it capable of manipulating the individual to serve its own agenda.
Is it Learned Helplessness?
Having argued that the lack of privacy protection amounts to a breach of confidentiality and trust, one might ask why we have allowed this breach to happen. One answer is that laypeople have simply not been aware of the data industry’s practices. And although reports show that more people are becoming aware and concerned, we can still observe a lack of reaction. Theorists point to what they call ‘learned helplessness’. In short, learned helplessness describes how consumers have become accustomed to the harm of privacy invasion and have gradually learned that there is no alternative but to accept the terms of agreement and sacrifice privacy. As modern human beings (in most digitized countries) we depend on digital services, an inequality of power in which one agent holds something the other party relies on and can therefore set the terms. Having increasingly become accustomed to the invasiveness of, and dependency on, the services through which our information is shared among multiple parties, we have adjusted into resignation. The individual’s relation to digital service providers resembles mere dependency.
Exploitation is thus a condition the situation allows. The fact that the individual is either ignorant of the conditions (an inequality of information) or helpless due to the lack of alternatives (an inequality of negotiating power) creates a situation of mere reliance that the industry will safeguard the individual’s interests, which has not been the case so far. Because this unequal balance of power is unavoidable, it culminates in an environment where trust (a phenomenon between equal parties) by definition does not exist. So the vulnerability the individual accepts as an unavoidable fact when handing over all this valuable information about herself is not reciprocal, and the industry does not seem to safeguard this entrusted value with the individual’s interest as its goal.
A new standard
The new EU regulation, the GDPR, comes into effect in May 2018. It includes initiatives meant to build trust, such as transparency and clear communication, and it will be interesting to see whether companies can regain trust through more transparent data handling.
My analysis of the problem follows the lines of the book ‘DataEthics – The New Competitive Advantage’: the system is broken to a degree where companies must make an extra effort to regain trust, not just by complying with legislation but also by addressing the core of the problem, namely that entrusted information is distributed and utilized for purposes that only benefit the industry. The individual’s awakening to this exploitation will hopefully create a demand for privacy protection, and if so, the industry will be forced to change its behavior. For more than 20 years, exploitation of data has been possible, but as we have seen with other industries, a sustainable balance must be found, and I believe that companies taking privacy protection seriously and valuing the trusting relationship with their consumers will be the winners of a new standard of data utilization.
A seal of privacy certification
Specifically, there are alternative suggestions for how to do this. I have investigated how an ethical consideration must be the starting point from which companies make their decisions and strategies in relation to data utilization. The alternative I suggest for now is to prepare the ground for a trusting relationship by respecting the individual’s right to privacy and not sharing data with third parties, since such sharing is a breach of the expectation of confidentiality. What I investigated during my master’s degree is a legal contract determining compliance with the most ambitious standard of privacy protection, leading to a seal of certification that can be a lighthouse for individuals to navigate by, thus simplifying the complex process of understanding the terms of what you consent to when interacting with a company. Another alternative, which has to be investigated further, is the restructuring of our systems towards privacy by default at a technical level. The first is a feasible strategy for companies here and now; the second will hopefully set the standard of the future.
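To make the idea of privacy by default at a technical level a little more concrete, here is a minimal sketch in Python. The class and option names are hypothetical illustrations, not drawn from any standard or product: the point is simply that every data-sharing setting starts in its most restrictive state, and nothing is enabled without an explicit opt-in by the user.

```python
from dataclasses import dataclass

# Hypothetical sketch of privacy by default: a user-preferences record
# where every data-sharing option defaults to the most private setting.
@dataclass
class PrivacySettings:
    share_with_third_parties: bool = False  # off unless the user opts in
    allow_location_tracking: bool = False   # off unless the user opts in
    allow_profiling: bool = False           # off unless the user opts in

    def opt_in(self, option: str) -> None:
        """Enable a single option, and only on an explicit user action."""
        if not hasattr(self, option):
            raise ValueError(f"unknown option: {option}")
        setattr(self, option, True)

# A new user starts fully private; any sharing requires an explicit opt-in.
settings = PrivacySettings()
settings.opt_in("allow_location_tracking")
```

The design choice mirrors the argument above: instead of burying consent in terms of agreement, the system itself guarantees that the individual's information is not shared until she actively chooses otherwise.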
The use of data is not a bad thing. Data can help us solve some of the most complex and most mundane issues the world faces, helping us organize it more intelligently. But it must be done in a responsible way that does not compromise the individual’s privacy.
Anna Lykke Lundholm-Andersen is the founder of Bay&Lundholm Gruppen and has studied privacy protection and trust in a data-driven industry during her master’s degree programme at the IT University of Copenhagen.