Essay: The Inseminated Will

While isolation preserves us from the pandemic, it makes us especially vulnerable to the effects of the infodemic. In a context that has brought our offline lives to the online wires, we are all exposed to being constantly tested by algorithms, which are then able to steer the construction of our meaning-networks in the direction of the investors’ interests.

By Javier Lede

Big Tech companies are anticipated to be the great winners of the pandemic.1 Their victory lies not in the chance to reaffirm their economic power, but in the unprecedented opportunity to collect inconceivable volumes of data.

The particularity of the present context is that, for the first time in history, an analogue virus is pushing us massively into the digital field, creating an ideal breeding ground in which the pandemic and information technologies combine to produce the emerging disease of the 21st century: the infodemic.

In the words of the United Nations Department of Global Communications, the infodemic is the dissemination of disinformation, misinformation and rumours during a health emergency.2 In broad terms, it can be understood as the tipping point of the globalization of Fake News: for the first time, it becomes explicitly visible how info-fakes constitute a ubiquitous menace.3

In these circumstances, it is necessary to identify precisely the great dangers to which Western democracies are exposed, before an emerging power takes on unexpected forms. Only by identifying the real risks will we be able to avoid a dystopian fate for information societies.

The great trick

The main threat posed by information technologies lies not in the surveillance of individuals, but in the use of profiling and targeted communication techniques, which allow the most vulnerable users to be identified in order to influence the outcome of collective decisions.

This is nothing more than traditional marketing and propaganda methods taken to previously unimaginable limits, thanks to the capacity for gathering and processing large data sets enabled by current technologies.

This practice can be described as a work of reverse social engineering: the use of data mining techniques to analyse collective behavioural patterns and uncover correlations, in order to develop strategies for interceding in those interactions, in what can be defined as social programming.4
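To make the mechanism concrete, the correlation-mining step described above can be caricatured in a few lines of Python. Every user, feature and number below is invented purely for illustration; no real platform data or API is involved:

```python
import statistics

# Invented behavioural logs for ten hypothetical users:
# daily hours online, and shares of emotionally charged posts.
hours_online = [1.0, 2.5, 3.0, 4.5, 5.0, 6.5, 7.0, 8.0, 2.0, 6.0]
outrage_shares = [0, 1, 2, 4, 5, 7, 8, 9, 1, 6]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(hours_online, outrage_shares)

# Once a strong correlation is found, the "social programmer" can
# carve out a target segment: the users most exposed to the pattern.
targets = [i for i, h in enumerate(hours_online) if h > 5]
print(round(r, 2), targets)
```

The point of the sketch is only that, at scale, a purely mechanical pass over behavioural logs yields both a pattern and a ready-made list of whom to aim it at.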

Nevertheless, technology companies present themselves as the guardians of freedom of speech against government persecution. For example, in its transparency reports, Facebook claims: “We do not provide governments with ‘back doors’ to people’s information”.5

To preserve their business model, it is essential for these companies to cultivate an aversion to control and a sense of freedom, because this is what encourages their users to expose themselves without precautions or self-censorship, revealing as much data as possible.

In this act of illusionism, while large technology companies market themselves as the defenders of freedom, they provide mass manipulation devices that can interfere with the proper functioning of democratic systems.6

Locked in digital living rooms

The COVID-19 pandemic has provided a historically unprecedented scenario in which Western societies, immunocompromised, crouched into physical self-confinement at the same time as they entered a state of virtual hyper-connection. This especially unusual condition has proved highly beneficial for companies whose business model is based on data collection.

Just over a year ago, the social network’s CEO announced that “Facebook would change from being a digital town square to creating a type of digital living room, where people could expect their discussions to be intimate, ephemeral and secure from outsiders”.7 As if it were a prophecy, the pandemic finally tore us away from public spaces to lock us in private rooms, taking Zuckerberg’s digital proposal to an analogue level.

This situation, which blurs geographies and segments us into ideological territories, builds echo chambers where the repetition of opinions in a closed system discourages empathy and exchange with diverse perspectives, reinforcing extreme positions.8

We are thus atomized, more exposed than ever to the influence of those who, through profiling techniques, know which messages will be most effective at interceding in our desires,9 and who can inseminate and viralize designer fakes tailored to exploit the weaknesses of each targeted group.10

Encapsulated, we are unable to compare notes with what happens in the room next door, while a self-confirmation loop reaffirms our internal creeds. Circles of mutual revalidation weave a web of ideas that entangles us until it leaves us no choice but to believe what our surroundings believe.11

In this way, we are at the mercy of technocrats who have the power to create and process large data banks. While isolation preserves us from the pandemic, it makes us especially vulnerable to the effects of the infodemic.

Developing a vaccine

The paradox of our future lies in the fact that the dystopia of our information societies can only be fought with knowledge. The vaccine against misinformation and manipulation is composed of empathy, critical spirit and scientific rigor.

In this flood of data, these are the tools that serve to filter information from fakes, an increasingly demanding task in a context in which we are beginning to compete against algorithms with superhuman processing capabilities.12

On the qualitative side, we must also reflect on the value of real human contact, which is being supplanted by all kinds of artificial communication devices. Once again, the comforts that technology offers threaten to dazzle us and make us lose sight of the type and quality of the relationships we are building. The circumstances demand of us a renewed effort to stimulate those relational modes that revitalize the collective spirit, before the diagnosis becomes a lethal one.

Technologies are promoting transactional relationships, reduced to a mere exchange of benefits, usually disposable and valued in economic terms. In other words, all levels of human relations are falling into the logic of consumption.

We are getting used to turning on our webcams at an agreed time, for a brief online meeting, in the framework of a specific remote working or eLearning task. When the camera is turned off, the contact is over.

The richness of the occasional conversations that took place in university halls or company canteens is being suppressed. Precisely because of their unsupervised condition, these apparently peripheral spaces allowed new ideas to flourish, outside the repetition of the status quo, the kind of ideas that traditionally gave rise to revolts. The environment of critical thinking is being eradicated.

In a context that has brought our offline lives to the online wires, we are all exposed to being constantly tested by algorithms, which are then able to steer the construction of our meaning-networks in the direction of the investors’ interests.

It is vital here to point out a matter of probability and statistics. While it is difficult to predict the behaviour of an individual user, working on a large data set makes it possible to find correlations that foresee and drive the behaviour of the average. For this reason, our Achilles’ heel lies not in personal data, but in the metadata of our interactions as a collective.
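This statistical point can be sketched with a toy simulation: even when any one user's choice is close to a coin toss, the population average converges tightly on its true value. The propensity figure and user count below are invented for the sketch, not drawn from any real data set:

```python
import random

random.seed(42)

# Assume each invented user clicks an inseminated message with
# probability 0.6, independently of everyone else.
P_CLICK = 0.6
N_USERS = 100_000

clicks = [random.random() < P_CLICK for _ in range(N_USERS)]

# Individual level: the best constant guess ("will click") is still
# wrong roughly 40% of the time, barely better than chance.
observed_rate = sum(clicks) / N_USERS
individual_error = 1 - observed_rate

# Collective level: the observed rate lands within a fraction of a
# percentage point of the true propensity, so the average is foreseeable.
print(round(individual_error, 3), round(observed_rate, 3))
```

This is simply the law of large numbers at work: uncertainty about persons does not translate into uncertainty about populations, which is exactly what makes the collective, rather than the individual, the effective unit of manipulation.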

Archetypes resulting from profiling offer models that allow behavioural patterns to be inferred, even for individuals who never voluntarily signed up to the online platforms.13 As in quarantine, partial isolation is not effective; protection requires a joint effort.
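The idea that patterns can be imputed even to people outside the platforms, sometimes called a shadow profile, can be caricatured as follows. All names, traits and connections here are invented for illustration:

```python
from collections import Counter

# Known leanings of registered users (invented data).
profiles = {"ana": "A", "ben": "A", "carla": "B", "dan": "A", "eva": "A"}

# "noa" never signed up, but registered users uploaded their contact
# lists, so the platform still knows whom she is connected to.
contacts_of_noa = ["ana", "ben", "dan", "eva"]

# Shadow-profile imputation: assign the missing trait from the
# majority trait among her known contacts.
leanings = Counter(profiles[c] for c in contacts_of_noa)
inferred, _ = leanings.most_common(1)[0]
print(inferred)
```

Crude as it is, majority-vote imputation over a contact graph is enough to show why opting out individually does not remove one from the model: the archetype is built from everyone else's data.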

Many regulations focus on the protection of personal data, when in fact the power of manipulation techniques only emerges when they are applied on a massive scale. Future government measures should keep this as a central consideration.

It is a collective responsibility to keep an eye on the ethical use of our data, an indispensable task for tackling the challenges that humanity faces in the near future. By properly monitoring and identifying the risks, we are still in time to avoid inseminated wills.

Notes

1 Wakabayashi, D., Nicas, J., Lohr, S., & Isaac, M. (2020, 23.03.2020). Big Tech Could Emerge From Coronavirus Crisis Stronger Than Ever. The New York Times Company. Retrieved 06.05.2020 from https://www.nytimes.com/2020/03/23/technology/coronavirus-facebook-amazon-youtube.html
2 United Nations Department of Global Communications. (2020). UN tackles ‘infodemic’ of misinformation and cybercrime in COVID-19 crisis. United Nations. Retrieved 06.05.2020 from https://www.un.org/en/un-coronavirus-communications-team/un-tackling-‘infodemic’-misinformation-and-cybercrime-covid-19
3 Ramonet, I. (2020, 25.04.2020). Ante lo desconocido, la pandemia y el sistema-mundo. Le Monde Diplomatique.
4 Han, B. C. (2017). Psychopolitics: Neoliberalism and New Technologies of Power. Verso Books. 69, 79
5 Facebook Inc, & Sonderby, C. (2020). Our Continuing Commitment to Transparency. Retrieved 14.05.2020 from https://about.fb.com/news/2020/05/transparency-report/
6 Cadwalladr, C., & Graham-Harrison, E. (2018). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian. Retrieved 04.07.2019 from https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election
7 Isaac, M. (2019). Facebook’s Mark Zuckerberg Says He’ll Shift Focus to Users’ Privacy. The New York Times. Retrieved 06.05.2020 from https://www.nytimes.com/2019/03/06/technology/mark-zuckerberg-facebook-privacy.html
8 Grimes, D. R. (2017, 04.12.2017). Echo chambers are dangerous – we must try to break free of our online bubbles. The Guardian. Retrieved 07.05.2020 from https://www.theguardian.com/science/blog/2017/dec/04/echo-chambers-are-dangerous-we-must-try-to-break-free-of-our-online-bubbles
9 Rosenberg, M., Confessore, N., & Cadwalladr, C. (2018). How Trump Consultants Exploited the Facebook Data of Millions. The New York Times. Retrieved 06.07.2019 from https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html
10 Harari, Y. N. (2018, 18.09.2018). The myth of freedom. The Guardian. Retrieved 06.05.2020 from https://www.theguardian.com/books/2018/sep/14/yuval-noah-harari-the-new-threat-to-liberal-democracy
11 Harari, Y. N. (2016). Homo Deus: A brief history of tomorrow. Random House.
12 Harari, Y. N. (2019). Still time to stop rule by computer algorithms. Canadian Friends of the Hebrew University. Retrieved 07.08.2020 from https://www.cfhu.org/news/hus-yuval-noah-harari-still-time-to-stop-rule-by-computer-algorithms/
13 Privacy International. (2017). Data is power: Towards additional guidance on profiling and automated decision-making in the GDPR.

References

Cadwalladr, C., & Graham-Harrison, E. (2018). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian. Retrieved 04.07.2019 from https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election

Facebook Inc, & Sonderby, C. (2020). Our Continuing Commitment to Transparency. Retrieved 14.05.2020 from https://about.fb.com/news/2020/05/transparency-report/

Grimes, D. R. (2017, 04.12.2017). Echo chambers are dangerous – we must try to break free of our online bubbles. The Guardian. Retrieved 07.05.2020 from https://www.theguardian.com/science/blog/2017/dec/04/echo-chambers-are-dangerous-we-must-try-to-break-free-of-our-online-bubbles

Han, B. C. (2017). Psychopolitics: Neoliberalism and New Technologies of Power. Verso Books.

Harari, Y. N. (2016). Homo Deus: A brief history of tomorrow. Random House.

Harari, Y. N. (2018, 18.09.2018). The myth of freedom. The Guardian. Retrieved 06.05.2020 from https://www.theguardian.com/books/2018/sep/14/yuval-noah-harari-the-new-threat-to-liberal-democracy

Harari, Y. N. (2019). Still time to stop rule by computer algorithms. Canadian Friends of the Hebrew University. Retrieved 07.08.2020 from https://www.cfhu.org/news/hus-yuval-noah-harari-still-time-to-stop-rule-by-computer-algorithms/

Isaac, M. (2019). Facebook’s Mark Zuckerberg Says He’ll Shift Focus to Users’ Privacy. The New York Times. Retrieved 06.05.2020 from https://www.nytimes.com/2019/03/06/technology/mark-zuckerberg-facebook-privacy.html

Privacy International. (2017). Data is power: Towards additional guidance on profiling and automated decision-making in the GDPR.

Ramonet, I. (2020, 25.04.2020). Ante lo desconocido, la pandemia y el sistema-mundo. Le Monde Diplomatique.

Rosenberg, M., Confessore, N., & Cadwalladr, C. (2018). How Trump Consultants Exploited the Facebook Data of Millions. The New York Times. Retrieved 06.07.2019 from https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html

United Nations Department of Global Communications. (2020). UN tackles ‘infodemic’ of misinformation and cybercrime in COVID-19 crisis. United Nations. Retrieved 06.05.2020 from https://www.un.org/en/un-coronavirus-communications-team/un-tackling-‘infodemic’-misinformation-and-cybercrime-covid-19

Wakabayashi, D., Nicas, J., Lohr, S., & Isaac, M. (2020, 23.03.2020). Big Tech Could Emerge From Coronavirus Crisis Stronger Than Ever. The New York Times Company. Retrieved 06.05.2020 from https://www.nytimes.com/2020/03/23/technology/coronavirus-facebook-amazon-youtube.html

Javier Lede is a Master’s student at Neu-Ulm University of Applied Sciences, Information Management Department.