“On the internet, nobody knows you’re a dog” is a famous internet meme from the early 1990s. In the picture, a dog sits in front of a computer, potentially connecting with other individuals who don’t know it’s a dog on the other end of the screen. The meme reflects the cyber-utopian belief (represented in e.g. John Perry Barlow’s internet manifesto) that the internet would lead to equality: that on the internet everyone would be treated equally because everyone was anonymous. No one would be able to judge you – and everybody could basically be anyone they wanted to be.
It was the dream of ultimate freedom for every individual, and it’s a beautiful utopia. But as the sociologist Immanuel Wallerstein once wrote: “The last thing we really need is more utopian visions”.
We need to face up to reality. A reality that is very much about a handful of companies gaining ever more power, wealth – and data about the rest of us. With inequality as a tragic result.
Children Treated as Adults
But this cyber-utopia also stands in the way of something else: urgent protection of children in the digital environment. Because if everyone on the internet is treated equally, then children are de facto treated as adults.
We documented last year that this is in fact the case with the report Online Games Gamble with Children’s Data. It demonstrates that gaming companies – and other tech companies – state that their products are not for children, but fail to gatekeep their digital products. Thus, they keep monetizing children day after day in spite of the GDPR. Many European countries – including Denmark – simply still haven’t implemented the GDPR for children, and children are treated as adults.
And it’s about more than data. As an example, Bloomberg recently reported on growing concerns that new generations of immersive games are teaching children how to gamble online, raising the risk of addiction:
“Gambling regulators have placed strict age restrictions on betting for money, with a minimum of 18 to visit a casino or game online in Britain. Playing virtual poker where no money changes hands is treated more like a videogame and is not illegal in the UK or US, and none of the major companies behind the virtual casinos have broken any laws. Now activists and politicians are increasingly focusing on how to regulate gateways into gambling”.
Another example comes from Forbes, which recently documented that livestreams on TikTok are a popular place for men to lurk and for young girls to perform sexually suggestive acts, enticed by money and gifts.
Both examples concern digital products that fail to be safe by design. The evidence of children being harmed on the ordinary platforms where they spend most of their waking hours is simply overwhelming – and the harms are very often systemic.
Concrete Solutions Are on the Rise
More and more systemic solutions are popping up all across the world – solutions that are concrete rather than utopian. Three types of solutions are codes, standards and toolkits. Below I describe each of the three, with links for those who want to read more in detail.
The UK is implementing the GDPR for children via the Age Appropriate Design Code. It is meant to protect children through 15 standards – e.g. for safety by design – that apply to “information society services likely to be accessed by children”. In that way, companies can’t just write in their terms and conditions that their platforms are not for children and thereby avoid being held responsible. The Code is materializing in other countries too, e.g. in the US state of California. The Code is a very important step that really any European country could simply adopt – if they wanted to… And this data protection must be accompanied by other protections, like child-friendly design, in order to keep children safe. Which leads us to standards for age appropriate design.
The development of age appropriate standards is on the rise. In 2021, an IEEE Standard for an Age Appropriate Digital Services Framework was released, based on the 5Rights Principles for Children. The standard document, more than 50 pages long, describes the set of processes by which engineers and technologists can consider children’s rights throughout the stages of concept exploration and development. So, this is a form of safety by design with the aim of enabling “organizations to design and deliver systems with the rights and the needs of children in mind”.
The framework centers on the following key areas:
- Recognition that a user is a child
- Consideration of children’s capacity and upholding of their rights
- Offering terms appropriate for children
- Presenting information in an age appropriate way
- Offering a level of validation for service design decisions
Now a CEN-CENELEC working group* will build a European set of standards based on this IEEE standard. As the invitation puts it: “The original design of most digital technologies did not anticipate use of those technologies by children. This has led to an asymmetry of power between children and the technology they use since children do not have the life experience and developmental capacity of adults. Multiple stakeholders have been facing challenges to work out how to redress the balance in a way that puts the needs of children first but continues to promote innovation“.
I am part of this working group, and I hope it will lead to actual changes that benefit the protection of children online.
The last component I want to mention is the Child Online Safety Toolkit, launched on May 16 by a number of partners. According to the partners behind it, the Toolkit is a hands-on, comprehensive guide to making the online world free from harm for children: “It builds on existing international agreements and best practices, developed in consultation with international experts from a range of backgrounds. It has accessible worksheets and resources both online and in print to help you make child online safety a reality“.
The Child Online Safety Toolkit contains:
- 5 things every policymaker needs to know to enshrine child online safety into law and practice.
- 10 Policy Action Areas with detailed roadmaps and key practical steps needed to make child online safety a reality.
- A model policy that policymakers can adopt and adapt to create bespoke country policies.
- Downloadable worksheets to create a bespoke policy fit for practice.
In other words, the toolkit is very practical and hands-on – just what we need to build a digital world where children can thrive and are protected, simply because they are children.
*CEN is the European Committee for Standardization, an association that brings together the National Standardization Bodies of 34 European countries.
CENELEC is the European Committee for Electrotechnical Standardization, an association that brings together the National Electrotechnical Committees of 34 European countries.