The UK's Information Commissioner's Office (ICO) has slapped TikTok with a £12.7 million fine for failing to keep users under the age of 13 from accessing its platform and “misusing children’s data.”
Although it has a policy that prohibits children under 13 from using its service, the social media platform failed to properly enforce it. According to ICO estimates, roughly 1.4 million children were using TikTok in the UK alone in 2020.
The risks of exposing young, impressionable minds to questionable, often unmoderated content are not to be overlooked, especially when paired with other hazards, such as the possibility of engaging with strangers or online profiling.
However, improper data management appears to be the main driver behind the ICO's decision. UK data protection law stipulates that companies handling personal data to provide their services must obtain the consent of a parent or guardian for users under 13 years of age.
“We all want children to be able to learn and experience the digital world, but with proper data privacy protections,” said Information Commissioner John Edwards. “Companies providing digital services have a legal duty to put those protections in place, but our provisional view is that TikTok fell short of meeting that requirement.”
An ICO investigation revealed that “TikTok breached UK data protection law between May 2018 and July 2020.” The UK data watchdog found the company in violation of the law on multiple counts.
The ICO's original proposal was a £27 million fine against TikTok for processing “special category data” such as political opinions, sexual orientation, health data, religious beliefs, and ethnic and racial origin. The regulator has since cut the fine by more than half.