On Tuesday, the UK’s data regulator announced that it had imposed a fine of £12.7 million ($15.9 million) on TikTok for allowing approximately 1.4 million children under the age of 13 to use its social media platform in breach of its own policies.
The Information Commissioner’s Office stated that the Chinese-owned company had violated UK law by failing to obtain the consent of parents or guardians for the use of their children’s data, despite the fact that these users were too young to create accounts.
TikTok nevertheless welcomed the ICO’s decision to slash the fine from the £27 million the regulator had previously warned it might impose.
TikTok contested the ICO’s conclusion. The fine comes on top of a series of bans on the platform’s use on official devices by Western governments, which cite concerns that Beijing may be able to access user data.
“We will continue to review the decision and are considering next steps,” the company said in a statement.
“We invest heavily to help keep under 13s off the platform and our 40,000-strong safety team works around the clock to help keep the platform safe for our community,” it said.
Although TikTok’s terms of service prohibit children under 13 from creating accounts, the ICO said the company had not carried out adequate checks to prevent this in the UK, with up to 1.4 million children affected in 2020.
“That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll,” Information Commissioner John Edwards said.
“There are laws in place to make sure our children are as safe in the digital world as they are in the physical world,” he said in a statement.
“TikTok did not abide by those laws.”