News. The Chinese app TikTok (formerly Musical.ly) has been fined under the American privacy law COPPA for disturbing practices towards children. Is YouTube next?
The US Federal Trade Commission (FTC) alleges that TikTok for years directed its service at children under the age of 13 and collected these children's personal information, according to the International Association of Privacy Professionals (IAPP), which also reports that the practice has been stopped and that TikTok
“will pay $5.7 million in civil damages and must delete the personal information in its accounts or take steps to verify the age of users and delete the personal information of users under the age of 13 for which the service does not obtain parental consent or who fail to verify their age”.
Moreover, strict reporting, record-keeping, and compliance-monitoring measures are also included in the order.
The commissioners state that the agency “uncovered disturbing practices” that “reflected the company’s willingness to pursue growth even at the expense of endangering children.”
Regulation via COPPA
The American COPPA Rule (the Children’s Online Privacy Protection Act) imposes certain requirements on operators of websites or online services directed to children under 13 years of age, as well as on operators of other websites or online services that have actual knowledge that they are collecting personal information online from a child under 13.
The FTC found that TikTok violated COPPA because it allegedly failed to obtain verifiable parental consent prior to the collection, use, and disclosure of children’s personal information. The company also allegedly failed to delete personal information at parents’ request and retained children’s personal information for longer than reasonably necessary.
Is YouTube next?
This is not the first time the FTC has acted on the illegal collection of children’s data. A year ago, a coalition of over 20 child advocacy, consumer, and privacy groups filed a complaint asking the FTC to investigate and sanction YouTube for violating federal children’s privacy law. The complaint argues that YouTube is violating COPPA because it does not obtain parental consent before collecting data from children under the age of 13. YouTube, however, maintains that the site is aimed at users 13 and older, as its terms of service state.
The case is not settled yet, and the complaint points to several examples of how YouTube is clearly targeting younger children, including its hosting of cartoon videos, nursery rhymes, and toy ads. Some of the most popular channels on YouTube are also those aimed at young kids, like ChuChu TV Nursery Rhymes & Kids Songs, which has 15.9 million subscribers and over 10 billion channel views, and LittleBabyBum, which has 14.6 million subscribers and over 14 billion channel views.
The question is whether YouTube is next in line after TikTok, or whether the motive for regulating the Chinese app more aggressively is a matter of American protectionism rather than the protection of children’s welfare.
About TikTok:
- Launched in 2014 as a video social networking app.
- In 2018 the app was one of the most downloaded apps in the world.
- Users can follow and message each other and create videos, synchronizing them with music and audio clips from the service’s online music library or with audio files stored on users’ smartphones.
- A significant percentage of the users are children under 13, and numerous press articles between 2016 and 2018 highlighted the app’s popularity among tweens and younger children.
- The app includes child-appealing elements, such as song folders titled “Disney” and “school” available for lip-synching.
- In July 2017, the application began requesting age information from new users and prevented individuals who indicated they were under 13 years of age from creating an account. Users who created an account prior to July 2017 were not requested to verify their age.
- The service received thousands of complaints and requests for deletion from parents whose children had created accounts on Musical.ly without their knowledge. The service closed the children’s accounts but did not delete the users’ videos or profile information.