TikTok Receives Record Fine Over Children's Data

Image caption The app lets users make short videos and set them to music, before sharing with followers

Short-form video sharing app TikTok has been handed the largest ever fine in a US case involving children's data privacy.

The company has agreed to pay $5.7m (£4.3m) and implement new measures to handle users who say they are under 13.

The Federal Trade Commission (FTC) said the Musical.ly app, which was later acquired and incorporated into TikTok, knowingly hosted content published by underage users.

It has ordered TikTok to delete the data.

Additionally, as of Wednesday, TikTok users in the US will be required to verify their age when they open the app.

However, like many social networks, age verification is implemented on a trust basis – a person signing up simply has to lie about their date of birth in order to get around the check.

“We care deeply about the safety and privacy of our users,” the firm said. “This is an ongoing commitment, and we are continuing to expand and evolve our protective measures in support of this.”

Despite this, TikTok said it would not be asking existing users in other countries, including the UK, to verify their age, as the settlement only applied to the US.

One of the most downloaded apps of 2018, TikTok has an estimated user base of 1 billion worldwide.

‘Large percentage’ of underage users

But the FTC was concerned about how old some of those users were. Its report said the Musical.ly app had 65 million users in the US, a "large percentage" of whom were underage.

TikTok’s parent company, China-based ByteDance, acquired Musical.ly in 2017, and incorporated it into TikTok, discontinuing the original Musical.ly app. The apps allowed members to create short videos, set to music, to share with other users.

“For the first three years [of its existence], Musical.ly didn’t ask for the user’s age,” the FTC’s statement read.

“Since July 2017, the company has asked about age and prevents people who say they’re under 13 from creating accounts. But Musical.ly didn’t go back and request age information for people who already had accounts.”

The FTC noted media reports suggesting adults on Musical.ly had contacted children who were obviously under 13 because “a look at users’ profiles reveals that many of them gave their date of birth or grade in school”.

Image caption TikTok users in the US will be required to verify their age when they open the app

According to the regulator's complaint, Musical.ly was contacted by more than 300 concerned parents in just a two-week period in September 2016. While the profiles of the children involved were subsequently deactivated, the content those children had posted was not deleted.

The FTC said TikTok would be fined because of what it saw as Musical.ly's failure to adhere to the basic principles of the Children's Online Privacy Protection Act, known as Coppa.

Obligations under Coppa include being upfront about how children's data is collected and used, as well as providing a mechanism to inform parents that their child is using the service and to obtain their consent.

The company was also said not to have responded adequately to parents' requests to delete data, and to have held onto that data for longer than was reasonable.

TikTok would not share estimates on how many underage users had been, or still were, on the platform.

‘Concerns arise’

TikTok’s settlement does not constitute an admission of guilt, but the BBC understands the firm does not plan to contest any of the FTC’s allegations. The process of deleting the data in question has begun, but the firm could not give an estimate of how long it would take.

To comply with regulations in future, TikTok said it was launching an “experience” for under-13 users that would strip out much of the functionality of the main app.

“While we’ve always seen TikTok as a place for everyone, we understand the concerns that arise around younger users,” the company said.

“In working with the FTC and in conjunction with today’s agreement, we’ve now implemented changes to accommodate younger US users in a limited, separate app experience that introduces additional safety and privacy protections designed specifically for this audience.”

That app experience will disable just about everything TikTok offers, including the ability to "share their videos on TikTok, comment on others' videos, message with users, or maintain a profile or followers".

TikTok told the BBC it did not plan to provide the under-13 experience to users outside of the US, and instead would continue to limit use to those 13 and above.

‘A bit complicated’

On social media, panicked TikTok users reported being locked out of their accounts after making mistakes when entering their date of birth.

Users responded by saying the process did not work properly, or that they did not have the required verification.

“I’m sorry but this is ridiculous, I don’t have a government ID and I’m 14,” wrote one user on Twitter.

The firm admitted it was “a bit complicated”.

___

Follow Dave Lee on Twitter @DaveLeeBBC

Do you have more information about this or any other technology story? You can reach Dave directly and securely through encrypted messaging app Signal on: +1 (628) 400-7370
