TikTok was fined £12.7 million for violating UK data protection laws


TikTok has been fined £12.7 million for several breaches of data protection law, including using the personal data of children under the age of 13 without parental consent, according to the UK’s privacy watchdog.

The Information Commissioner’s Office (ICO) said on Tuesday that the Chinese-owned video app had not done enough to check who was using the platform and to remove underage users.

The failure to enforce age restrictions meant that “up to 1.4 million UK children” under the age of 13 were using the platform by 2020, according to the ICO, despite the company’s own rules prohibiting it. UK data protection law does not outright prohibit children from using the internet, but it does require organizations that process children’s personal data to obtain consent from their parents or carers.

The information commissioner, John Edwards, said: “There are laws in place to ensure our children are as safe in the digital world as they are in the physical world. TikTok did not follow these rules.

“As a result, an estimated one million under-13s were improperly granted access to the platform, with TikTok collecting and using their personal data. That means their information may have been used to track and profile them, potentially delivering harmful or inappropriate content on their very next scroll.”


“TikTok should have known better. TikTok should have done better,” Edwards added. “Our £12.7m fine reflects the serious consequences of their failures. They did not do enough to check who was using their platform or take adequate steps to remove the underage children who were using it.”

According to the ICO’s investigation, concerns about underage users were raised internally, but TikTok did not respond “adequately”.

“TikTok is a platform for users aged 13 and up,” a TikTok spokeswoman said in a statement. “We make significant investments to keep under-13s off the platform, and our 40,000-strong safety team works around the clock to keep the platform safe for our community.

“While we disagree with the ICO’s decision, which covers the period from May 2018 to July 2020, we are pleased that the fine announced today is less than half of what was proposed last year. We will continue to review the decision and consider our next steps.”

TikTok emphasized that its practices have improved since the period covered by the ICO’s investigation. In addition to training its moderators to identify underage accounts and giving parents the option to request the deletion of their underage children’s accounts, the platform, like its social media peers, now relies on more signals than a user’s self-declared age to assess how old they are.


The breaches also predate the introduction of the ICO’s “age-appropriate design code”, which sets out an even stricter set of rules that platforms must follow when handling children’s personal data. The code also makes clear that platforms cannot plead ignorance of younger users’ ages as an excuse for failing to protect their personal data.

The US Federal Trade Commission fined TikTok $5.7 million in 2019 for similar conduct: a then record-breaking penalty for unlawfully collecting data from children under the age of 13. That year, the company pledged to improve its policies, saying it would begin placing younger users in “age-appropriate TikTok environments”, where those under 13 would have a more passive experience, able to watch videos but not post or comment on the platform.