Guest Contributor Renée Ridgway
This open access article, published recently in Big Data & Society, draws on Brin and Page’s original 1998 paper to explain how Google developed its hegemony over search and laid the groundwork for contemporary surveillance capitalism.
‘Deleterious consequences’ was coined by computer scientist and theorist Phil Agre, who in 1998 expressed concern about the harmful effects of AI if programmers did not keep ‘one foot planted in the craft work of design and the other foot planted in the reflexive work of critique’.
In this article, I revisit Brin and Page’s coeval, seminal and only extant text on their search engine and the PageRank algorithm, ‘The Anatomy of a Large-Scale Hypertextual Web Search Engine’ (1998). I highlight and contextualise some of their original keywords (counting citations or backlinks, trusted user, advertising, personalization, usage data, smart algorithms) that already foreshadow what was yet to come at Google, in spite of their ‘don’t be evil’ motto. Although Google’s mission statement, ‘organising the world’s information and making it accessible and useful’, is well known, what is not well known is that Google’s ‘intentions’ were neither accidental nor arbitrary. Through certain ‘moments of contingency’, their decisions led to corporate ‘lock-ins’, to the promotion of their own services in search results, and to corporate acquisitions and takeovers that facilitated the ‘googlization of everything’ (Google Ads, Google Maps, Gmail, Google Earth, Google Docs, Google DeepMind, Android, Waymo, et al.).
Over the past 25 years, Google came to ‘shape the web’ not only through patents and the novel PageRank algorithm, which counted citations or backlinks to deliver search results, but also by reinventing digital advertising through secret auctions on keywords. Trusted users’ search queries and clicking on links increased traffic and the flow of capital, as well as contributing to the world’s largest ‘database of intentions’. As an omnipotent infrastructure that is intertwined with Big Data’s platformization, the article also explains what usage data is accumulated (all) and how it is shared, borrowed and stored beyond just personalization. This extraction and refinement of usage data becomes what Shoshana Zuboff deems ‘behavioural surplus’ and results in deleterious consequences: a ‘habit of automaticity’, which shapes the trusted user through ‘ubiquitous googling’ and Google’s smart algorithms, whilst simultaneously generating prediction products for surveillance capitalism. What would Google have become if Brin and Page in 1998 had applied a ‘critical technical practice’, combining reflexive critique and design decisions, instead of developing an advertisement company (87% of their revenue still comes from advertising as of writing) cum search engine, and not a search engine for research?
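For readers unfamiliar with the mechanics behind ‘counting citations or backlinks’, the core idea of PageRank, as described in Brin and Page’s 1998 paper, is that a page’s importance derives from the importance of the pages linking to it, computed iteratively. The following is a minimal illustrative sketch, not Google’s implementation; the toy graph and parameter values are invented for demonstration:

```python
# Minimal PageRank sketch via power iteration (illustrative only).
# 'links' maps each page to the list of pages it links out to.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iterations):
        # Base rank from the 'random surfer' jumping to any page.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each page passes its rank to the pages it links to.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A toy web of three pages: C, with backlinks from both A and B,
# ends up ranked higher than B, which A alone links to.
toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(toy_web)
```

The sketch conveys the point the article builds on: rank flows along links like citations, so well-linked pages rise to the top of search results regardless of any editorial judgement.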
This article is part of a special issue, State of Google Critique and Intervention, published by Big Data & Society as open access. Other articles can be found here.
Photo: Adobe Firefly (supposedly trained only on consented data) with the prompt: ‘user in front of a computer searching with google as it surveils them’