TikTok fined £12.7m for UK data protection law breaches
Violations include using data of children without parental consent, says information commissioner
TikTok has been fined £12.7m for multiple breaches of data protection law, including using the personal data of children aged under 13 without parental consent, Britain’s data watchdog has said.
The Chinese-owned video app had not done enough to check who was using the platform and remove underage children, the Information Commissioner’s Office (ICO) said on Tuesday.
The failure to enforce age limits led to “up to 1.4 million UK children” under the age of 13 using the platform as of 2020, the ICO estimated, even though the company’s own rules banned the practice. UK data protection law does not have a strict ban on children using the internet, but requires organisations that use the personal data of children to obtain consent from their parents or carers.
In a statement, the information commissioner, John Edwards, said: “There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws.
“As a consequence, an estimated 1 million under-13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data. That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll.”
“TikTok should have known better,” Edwards added. “TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”
The ICO’s investigation found that concern was raised internally but that TikTok did not respond “adequately”.
In a statement, a TikTok spokesperson said: “TikTok is a platform for users aged 13 and over. We invest heavily to help keep under-13s off the platform and our 40,000-strong safety team works around the clock to help keep the platform safe for our community.
“While we disagree with the ICO’s decision, which relates to May 2018 to July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”
TikTok emphasised that it had changed its practices since the period the ICO investigated. Now, in common with its social media peers, the site uses more signals than a user’s self-declared age when trying to determine how old they are, including training its moderators to identify underage accounts and providing tools for parents to request the deletion of their underage children’s accounts.
The accusations also pre-date the introduction of the ICO’s “age appropriate design code”, which specifies an even stricter set of rules that platforms are expected to follow when handling the personal data of children. That code also makes it more explicit that platforms cannot plead ignorance of younger users’ ages as a defence for failing to treat their personal data carefully.
In 2019, TikTok was hit with a $5.7m fine by the US Federal Trade Commission for similar practices involving the improper collection of data from children under 13, a record at the time. That year, the company committed to improving its practices and said it would begin keeping younger users in “age-appropriate TikTok environments”, where those under 13 would be pushed into a more passive role, able to watch videos but not post or comment on the platform.