TikTok is fined £12.7million for data protection law breaches as watchdog says more than one million children under 13 were using the platform in 2020 despite its terms of use forbidding that

  • TikTok has been fined £12.7million for string of UK data protection law breaches
  • Data watchdog blasts TikTok after over a million under-13s found to use platform

TikTok has been fined £12.7million for a number of data protection law breaches, including failing to use children's personal data lawfully, the Information Commissioner's Office (ICO) said.

The ICO said more than one million children under 13 were using TikTok in 2020, despite its terms of use not allowing that.

It added that personal data belonging to those children was used without parental consent, and that the company did not do enough to check who was using the social media app or take sufficient action to remove the underage children who were.

TikTok had faced a fine of £27million, but the final total was reduced to £12.7million. 

Ryan Gracey, partner and data privacy expert at law firm Gordons, said: 'This fine may be well below what the ICO initially threatened, but it is still a significant penalty and one of the largest ever given by the ICO.'

In September last year, Irish regulators fined Instagram €405 million (£348 million) after the platform mishandled teenagers' personal information in violation of EU data privacy rules.

In July 2021, Luxembourg authorities fined Amazon a record €746 million (£654 million) after finding it failed to follow GDPR rules, although this was later partially suspended by the country's administrative court.

Information commissioner John Edwards said TikTok had failed to abide by laws to make sure children are as safe in the digital world 'as they are in the physical world'. 

He said: 'As a consequence, an estimated one million under-13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data. That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll.

WHAT ARE TIKTOK'S POLICIES AROUND CHILD USERS?

TikTok insists users must be at least 13 years of age, although it offers an 'experience' version for younger users with many of its usual features disabled. 

If someone tries to create an account below the age of 13, TikTok suspends their ability to create another account using a different date of birth.

According to Connect Safely's TikTok Parents' Guide:

Does TikTok have a minimum age? 

TikTok’s Terms of Service state that users must be at least 13 years old to sign up for an account and have full access to the platform, in compliance with the U.S.’s Children’s Online Privacy Protection Act. 

Some countries, including Ireland, impose different age restrictions so, if you’re outside the United States, check with your local laws. 

Is TikTok appropriate for children? 

Although TikTok is a youth-centric app, it is not uncommon to encounter videos on the platform that contain swearing and sexually suggestive content. While the app may be appropriate for most teens, it is not aimed at children under 13. 

The app, however, provides several ways for users and parents to restrict mature content that may not be appropriate for younger users. Additionally, TikTok has created a separate “experience” that allows users under 13 access only to curated, clean videos. 

Underage users cannot comment, search, or post their own videos. However, bypassing that restriction only requires entering a false birth date.


'TikTok should have known better. TikTok should have done better. Our £12.7million fine reflects the serious impact their failures may have had.' 

Children's data may have been used to track and profile them, potentially presenting them with harmful or inappropriate content, he added.

A TikTok spokesperson said the company disagreed with the ICO's decision but was pleased the fine had been reduced from the possible £27million set out by the ICO last year.

'We invest heavily to help keep under-13s off the platform and our 40,000-strong safety team works around the clock to help keep the platform safe for our community,' the spokesperson said.

'We will continue to review the decision and are considering next steps.' 

It is unclear what safeguards are in place to prevent underage users signing up to the service, but TikTok has published a four-page Parents' Guide, offering parents advice on how to manage their child's use of the platform. 

Commenting on the fine, data privacy expert Mr Gracey added: 'It's another acute reminder that technology companies must take steps to protect personal data, especially the data of children online.

'In particular, businesses need to be aware of the ICO's statutory code of practice known as the Children's Code which sets out a series of standards they expect businesses to follow when designing and building online services which may be used by children.

'The standards include using clear language in 'bite-size' chunks for children to tell them what they are doing with the user's personal data, being open about the risks and safeguards involved, and letting the user know what to do if they are unhappy.'

The ICO's fine follows moves by Western governments and institutions in recent weeks, including Britain, to bar usage of TikTok on official devices over security concerns. 

In 2019, US regulators hit the company with a $5.7million (£4.5million) fine for similar practices related to improper data collection from children under 13.

Earlier on Tuesday, Australia became the latest country to ban the Chinese-owned app from its federal government's devices.

Last month the UK Government said it would block TikTok from its devices and networks over safety concerns, with the Scottish Government following suit.

TikTok, owned by Chinese internet company ByteDance, has insisted it does not share data with China.

But Beijing's intelligence legislation requires firms to help the Communist Party when requested.

TikTok chief executive Shou Zi Chew also made a rare public appearance to be questioned by the US Congress over data security and user safety.

'Let me state this unequivocally, ByteDance is not an agent of China or any other country,' he said, as he made his case for why the popular app should not be banned, at the March hearing.

WHAT ARE THE DATA PROTECTION RULES AROUND CHILDREN?

Strict rules are in place to protect children when using online services, as explained by data privacy experts at law firm Gordons.

Who is classed as a ‘child’?

Anyone under 18, but for ‘consent’ to be valid under GDPR, individuals must be at least 13 years old. Parental or guardian consent must be given if the child is under 13 years old. Services do not need to be ‘directed’ at children to be caught by these stringent protections. For example, TikTok is aimed at online users generally but attracts a large number of teenage users.

Shouldn’t parents be responsible for their children?

Safeguarding a child’s personal data does not only fall to the parents. The UN Convention on the Rights of the Child states the best interest of the child must be a primary consideration in all actions concerning children.

What should businesses be doing to protect children’s personal data?

Businesses can:

  1. Use clear language in ‘bite-size’ chunks for children.
  2. Establish what age range individual users are likely to fall into, so businesses can tailor the safeguards accordingly.
  3. Configure the service’s default settings as private to protect everyone’s privacy even if businesses don’t expect children to use their services.
  4. Draft a Data Protection Impact Assessment ('DPIA') to help assess and mitigate the risks to children.
  5. Have policies to support and demonstrate compliance with data protection legislation.
  6. Ensure that anyone who provides their consent is at least 13 years old and keep and update records of consent received.
  7. Consider providing visual or audio prompts telling children to get help from a parent if they try to change the privacy settings.
