When the new EU data protection regulation (GDPR) comes into force in 2018, businesses must be ready to implement new requirements regarding the data of their users. This includes responsibilities in relation to their younger users. With, for example, a new age limit of 16 years (which may be lowered to 13 years in individual member states), the GDPR opens up a discussion in the EU and beyond regarding clarification of concepts, challenges and practical models for implementation in relation to children.
In October the LSE Media Policy Project hosted a round-table meeting to discuss the impact of the General Data Protection Regulation on children. Although primarily focused on a UK context, the key points from the workshop discussion among industry, academic, consumer rights, children’s advocacy and government representatives reflect more general debates regarding the status of children and youth in the GDPR. In general, many of the participants hoped that the GDPR might encourage debate to clarify and articulate the nature of children’s internet rights.
Some positions from the workshop:
(hand-picked and categorised by this author; please read the full report here)
- The age limit: Will it divert attention from long-term solutions and the responsibility to support online literacy and education?
“If efforts focus too much on producing an age limit, there will not be enough emphasis on the long-term solution and responsibility to provide digital and online literacy, training, education, etc. at all levels (commercial and government). The age limit discussion should consider the position of children, not just industry; not just in terms of top-down regulation, but also horizontally, in terms of societal benefit.” (position from academia)
“…this is about industry keeping children out (or at least giving the perception that they have tried) rather than making them digitally literate. Whilst it’s a commercially savvy decision, it’s not in children’s interests.” (position from academia)
- Age verification: Will it actually prevent harm? How can this be achieved without compiling personal data?
“…this is a poor mechanism for harm prevention. This was raised in the 1990s when the Children’s Online Privacy Protection Act was passed in the USA. COPPA’s goal was to get companies marketing to children to develop fair practices for marketing towards children. This opportunity should be used to develop fair marketing and big data collection practices for teens. Companies’ practices regarding advertising should be limited. Commercialisation and the internet are intertwined. Fresh dialogue is needed between all stakeholders to ensure a balance between participation and protection.” (position from children’s advocacy)
“…this could lead to collecting even more invasive data on children in order to confirm their age. However, most major online platforms have mechanisms in place, for example parental reports, that prevent under-13s using those platforms by removing their accounts…” (position from industry)
“…age verification is the key question within Europe. It could turn out that age verification is privacy enhancing. Some privacy options can restrict industry’s ability to gain personal information, by having age verified by a third party.” (position from children’s advocacy)
- Analysis and developing an understanding of big data, profiling and tracking: Parents and children don’t know about it or understand it.
“Any research on this topic should include analysis of big data and digital advertising, of which children are often the key target market. This includes cross-device tracking, highly sophisticated geolocation tracking, neuromarketing and identity management to try to shape perceptions and behaviours. Facebook was identified as a key operator of these techniques in relation to children. Future policy should afford teens greater control over these processes.” (position from children’s advocacy)
“Given the contemporary nature of the digital media system, it is no longer possible for an individual to understand the processes behind the online advertising experience, e.g. programmatic advertising (auctioning for predictive analytics).” (position from children’s advocacy)
- New business models that do not track: How can an online business survive without tracking?
“…some participants questioned whether, from a consumer rights perspective, it was not possible to develop platforms that don’t depend on profiling and advertising for their funding. However, other participants noted that subscription models for platforms do exist, although these are regulated in a different way: for example, if a subscription-funded platform provides original content as part of its service, it will be regulated like a television network. From an industry perspective, this generates non-trivial additional cost. Such models also deny the option of not being profiled to those who cannot afford a subscription, which is unfair. The internet as a whole has not been able to solve the problem of generating revenue for content creators.”
- Moving away from the tick-box culture: The GDPR puts too much responsibility for decision-making on the users. Users should be able to access non-tracking services, and companies should provide them. But the GDPR is not explicit enough about what can and cannot be done with data.
“The GDPR in its current form puts the onus for making decisions about the use and appropriateness of individual information society services on parents and children, when, arguably, industry should offer services that are free of monitoring practices. The GDPR could be more explicit about what industry can and cannot do with personal data, which would resolve some of these issues.” (position from academia)
“Academia accepted that industry had different priorities in this regard, but endorsed a shift away from the commercialisation of information society services, particularly advertising. Routinized use of data is not even being talked about, and many are not even aware of the conversation. A move away from a ‘tick-box culture’ would not directly benefit industry, but would empower civil society. Many large internet companies signed up to the European Commission’s CEO Coalition to make the internet a better place for children in 2011, but not much substantial has happened since then.”