Services Aimed at Children and Teens Need Data Ethics

A 15-year-old boy went viral after making and sharing a 15-second music video on the hugely popular online service, lip-syncing and gesturing with what appeared to be his dying grandfather in the background. 'What's up with this kid?' was the main response in the online comments, articles and YouTube videos that followed the boy's online endeavour. But another question we should ask is: what's up with these online services targeting youth? Where is their data ethical responsibility?

Children and young people are an attractive market for digital services, as they set trends and spend money. But the youngest of them also pose an existential challenge to these services, because in many countries businesses can't legally collect and process data on children under 13 without their parents' consent. This is particularly unfortunate for the 'free' services, whose main business model is to process and track personal data for marketing purposes. Not only is it complicated and expensive to obtain parental consent, but the extra trouble of having to show specific consideration towards children in marketing (also a legal requirement in many countries) demands even greater efforts.

An Internet Built on Human Empowerment
The Internet has brought about endless opportunities, bringing people together and sparking new forms of creativity and literacy – including online social networking and user-generated content. But these opportunities can only be realised if human empowerment, rights and specific needs are built into the very processes of online innovation. Online services must be designed with embedded ethics and a social awareness of the different types of citizen groups whose lives they support and influence. And they need to adapt and respond with data ethics as a compass while evolving.

An Age Limit Doesn’t Solve the Data Ethical Issues
Gary Vaynerchuk, the chief executive of VaynerMedia, an advertising agency that focuses on social media and has helped clients produce campaigns for, among others, the platform, says in a New York Times article: "This is no question the youngest social network we've ever seen," and continues: "you're talking about first, second, third grade." Nevertheless, the service still insists that its users are over 13, and underlines this with a sentence in its privacy policy: "We do not knowingly collect information from children under 13 and we do not want it." With this sentence and a one-pager of parental guidance on their site, they seem to consider their data ethical responsibility towards their younger users settled.

But children do use the service, as the New York Times article underlines, and it is very doubtful that the company does not know this. With 100 million users worldwide, most of them of a younger age, the company has taken its responsibility particularly lightly. Children who use the service have accepted a long, incomprehensible privacy policy that says they are not children. Full stop. Which at the same time implies that they do not need any special treatment: they are fully aware of their actions and can take full responsibility for the consequences.
The mere insistence on an age limit is of course not a sufficient solution, because a child or young person's maturity is not defined by a number; it's deeply personal, social and cultural. The fact that different countries set different age limits for children's participation in various societal functions is evidence of this. There is a very good reason why we have internationally agreed upon specific provisions concerning the empowerment and protection of children. As stated in the UN Declaration of the Rights of the Child: "the child, by reason of his physical and mental immaturity, needs special safeguards and care, including appropriate legal protection…".

One Step Further Than Basic Legal Compliance
In the US there is the Children's Online Privacy Protection Act (COPPA), which prohibits the collection and processing of data on children under 13 without parental consent. In Europe we have the new General Data Protection Regulation, which comes into effect in 2018 and sets a 16-year age limit (with the possibility for each member state to lower it to 13). But clearly an age limit alone is not sufficient to create a healthy online environment in which children and young people are empowered as well as protected.

With size and evolution comes responsibility, and if children and young people use a service at large, whether you want it or not, there are legal but, importantly, also ethical requirements that must be met. The service must be designed with privacy by default and for children and young people, with features that speak their language. Parents of the youngest users need to be involved; they must be given the option to provide informed consent (or not), which includes awareness and educational efforts. In addition, an online service with young users needs to reflect on the data 'culture', 'mindset' and education it inspires. When you create the framework for a children's universe, you also carry an indirect responsibility for the culture that this framework helps to define. What are children being encouraged to share through the service? Do they understand the consequences? Did the 15-year-old boy posting a music video with his dying grandfather online truly understand the consequences of this action? The dissemination, the amplification, the response, the effect on future opportunities and identity? Most adults don't even grasp that. So how can we ask it of a young person?

Companies could benefit from taking their data ethical responsibilities to children seriously, designing their services and business models data ethically and thus differentiating themselves positively from competitors that currently do not seem to understand the role they are playing in the lives of children and young people. Of course, they cannot do it all on their own. It's a shared responsibility, where companies, states and even the users themselves work towards a sustainable and ethical framework for our data-saturated environment.