When talking about data collection and profiling by tech companies, people tend to think: “It does not concern me, I have nothing to hide.” The issue, however, is not about protecting secrets. It is about protecting our free will and democracy. Such are the words of author and professor Shoshana Zuboff, who visited Denmark recently.
According to Zuboff, the problem has evolved over time. First, companies would sell predictions about our clickstream, such as where on a homepage to place an advertisement. Then they evolved the business model and began monitoring our mood and emotional state to predict when to expose us to advertisements. The latest development is the manipulation of our physical whereabouts. A game like Pokémon Go nudges and incentivises us to go to certain places, and with the spread of IoT we will only experience more of this kind of manipulation in the future.
Threat to our free will and democracy
With data on our behavioural patterns and access to our attention – through apps on a phone that is always with us – it is now possible for tech companies to influence not only our clickstream but also our feelings, opinions and physical behaviour. The data tech companies hold about their users can be used in political campaigns to target individuals and thereby influence elections. We saw this in the Cambridge Analytica scandal, where data about Facebook users was used in campaigns in the American presidential election and the Brexit referendum, both in 2016.
Profitable business model
Zuboff explains that selling predictions of human behaviour is a profitable business model. It allows new tech startups to become profitable quickly and to attract investors. Companies that do not use this business model are often left behind and struggle to get funding. The market therefore favours the creation and growth of companies whose business model is selling data about their users.
Why data privacy is not enough
Even the demand for data privacy and user ownership of data that we are experiencing right now will – according to Zuboff – not be enough to protect free will and democracy in the future.
She points out that giving people the right to own and govern their own data does not prevent the business model of selling predictions about human behaviour, since companies are still allowed to sell the data they own about us, whether acquired through collection or bought directly from us.
As long as this business model is legal, we will continue to see the sale of predictions about human futures, with the intention of manipulating individuals and populations, she says.
Law is the solution
The solution is to regulate the market with law, says Zuboff.
As she points out, it was once possible and legal to base a business on the work of slaves or children, but we evolved as a society and decided it was necessary to regulate the market for the sake of human protection. In the same way:
It is illegal to trade in human organs, and according to Zuboff it should also be illegal to trade in predictions of human futures.
So even with the focus we now have on data ownership, we must not forget the power of legislation as a means of protecting our democracy and the free will of individuals. The law must be updated to keep humans and our democracy protected, not only now but also for future generations.
Signe Agerskov is researching blockchain ethics at the European Blockchain Center.