Children need more protection than adults. We addressed this long ago with ethics, rules and legislation that together protect children more than adults. Just not online… A Green Paper shows what forward-looking countries are doing.
When it comes to online games, social media and the other places where children are active online, their data is still most often treated like an adult's.
Cases of data misuse, harmful and illegal content, digital violence, hidden marketing, self-harm, pornography and radicalisation on social media are some of the issues that have highlighted the need for a critical look at rapid digital developments. In particular, the safety and well-being of children and young people have come under the spotlight, most recently in the wake of leaked research reports in the US showing that Instagram, among other platforms, can have a detrimental impact on young people's self-esteem and mental health.
A new Green Paper gives concrete examples of what more can be done to develop technologies and a digital world that give children the protection they deserve and need. This cannot, and must not, be left to parents and children alone.
The Green Paper is a catalogue of online child protection initiatives in eight geographical areas. What is clear after writing it is that if Denmark insists on being a digital frontrunner, it is urgent to strengthen online safety for children as well. Digital education at the individual level is not enough; demands must be made at the systemic level.
Below, we present three examples from the Green Paper, which can be downloaded here in Danish.
Age Verification
If children are to be protected online, the service must know that the user is a child – but we must also ensure that this information does not work against their best interests (by giving the service unnecessary personal data). Age-verification mechanisms are becoming a requirement in Australia, the UK and France. And Germany already has legislation which means, for example, that self-harm content and online porn can only be shared in closed adult user groups. The way to ensure this is through age-verification mechanisms that have gone through a certification process. The EU-consent project is particularly interesting: its ambition is to protect children online through age verification while at the same time protecting their data.
Safety By Design
This means that – as for all sorts of other products – safety requirements must be built into the design. Safety standards for online games, social media and other online services might mean, for example, that children are not advised to 'befriend' unknown adults; that children are not presented with behavioural designs intended to make them keep going despite a need to shut down; and that there are no algorithms recommending pornography, self-harm, violence and other age-inappropriate content to children. Such standards would mean companies can't simply disclaim responsibility by asserting that their services aren't for children under 13 while nonetheless using children's data at will, as we described in the reports Online Games Are Gambling With Children's Data (2021) and The Child Data Violators (2016).
Australia's eSafety Commissioner works on standards for safety by design and has developed guides for both software developers and investors. Forthcoming Australian legislation imposes requirements in this area as well. In the UK, the Age Appropriate Design Code, an implementation of the GDPR, requires that if children use, or might use, a digital service, the company behind it must meet 15 standards. A set of formal standards for safety by design is also being developed; it will be described in the English version of the Green Paper, to be published in April.
Beeban Kidron, founder of 5Rights, writes: “a child is a child until they reach maturity – not until they reach for their smartphone”. According to the Convention on the Rights of the Child, that is until the age of 18. And a series of bills introduced in the US in 2021 all aim to protect not only younger children, but teenagers in particular, from data collection, manipulation and harmful content.
In April, the Green Paper will be published in English with a handful of additional initiatives. The Green Paper was written by Mie Oehlenschläger and published in cooperation with Digital Responsibility.