Why is TikTok required to pay a fine of $368 million?
TikTok is facing a substantial fine of around $367 million (EUR 345 million) from the Data Protection Commission (DPC) of Ireland for mishandling children’s data. A 2021 investigation was initiated to evaluate TikTok’s adherence to the General Data Protection Regulation (GDPR) laws in Europe. The Irish regulatory body, overseeing the app in the European Union, uncovered multiple instances of GDPR breaches by TikTok. These infractions included defaulting children’s accounts to public visibility, allowing adults to enable direct messaging for users over 16, and insufficient consideration of risks for users under 13 on the platform.
The investigation revealed that users aged 13 to 17 were guided through a registration process that set their accounts to public by default. As a result, anyone could view and comment on the content posted to these accounts.
Moreover, the “family pairing” feature, which lets an adult manage a child’s account settings, lacked any verification that the adult was in fact the child’s parent or guardian. The investigation found that unverified adult profiles could be paired with children’s accounts, potentially enabling the exchange of direct messages.
Concerns were also raised about how effectively TikTok prevents children under 13 from accessing the platform. Although its age verification methods were deemed GDPR-compliant, the DPC found deficiencies in safeguarding the privacy of underage users.
The DPC also took issue with TikTok’s past practice of setting underage users’ accounts to public by default, allowing anyone to view their content, and noted that features like Duet and Stitch were switched on by default for users under 17. TikTok has a three-month window to bring its data processing into compliance with the decision.
Penalties imposed on TikTok in the past
In April, the UK data regulator imposed a fine of GBP 12.7 million on TikTok for improperly handling the data of 1.4 million children under the age of 13 who had used its platform without parental consent.
TikTok responded that it disagreed with the decision, particularly the size of the fine. The DPC’s criticisms primarily focused on features and settings that were in place three years ago, which TikTok says it had already changed well before the investigation commenced. For example, TikTok had already begun setting all accounts for users under the age of 16 to private by default.
Since 2021, TikTok has set both new and existing accounts for users aged 13 to 15 to private by default, meaning only people approved by the account holder can view their content. These changes were implemented in response to concerns raised during the investigation.