UK Girl, 12, Sues TikTok Over Claims of Breaching Her Privacy

By Robert Bateman

Social media app TikTok is facing a class-action lawsuit in the U.K., led by an unnamed 12-year-old girl who claims the app breached her privacy.

The High Court of England and Wales granted the plaintiff, known only as “SMO,” permission to proceed anonymously at a hearing last month.

Court documents revealed that the claim relies on the General Data Protection Regulation (GDPR), an EU law adopted into U.K. law before Brexit. The regulation, passed in 2016, restricts how apps and other online services use children’s personal information.

Under the GDPR, if online service providers want consent to process a child’s personal information, they must request it from a parent or guardian.

For the purposes of the GDPR’s consent rules, U.K. law interprets a “child” as a person under 13. Given that the claimant in the TikTok case is 12, the company may face questions about whether it has properly obtained parental consent.

TikTok did not immediately respond to a request for comment from Digital Privacy News.

$5.7M Settlement

In February 2019, TikTok’s parent company, ByteDance, settled with the U.S. Federal Trade Commission for $5.7 million following allegations that the app “failed to seek parental consent” before collecting children’s personal information.


TikTok last Wednesday announced new privacy protections for younger users. Profiles of users under 18 will now be “private” by default, and users under 16 will no longer receive comments from strangers.

“We are all digital citizens, and children represent the new digital generation,” Anastasia Karagianni, a law, gender and technology researcher based in Athens, Greece, told Digital Privacy News.

“All the relevant stakeholders should keep in mind children’s rights and needs, adopt them in the digital environment and protect them accordingly.”

‘Verifiable Consent’

Karagianni explained how the law applies when U.K. and EU online service providers seek children’s consent.

“They must get verifiable consent from a parent, caregiver or guardian unless the service they offer is a preventative or counseling service,” she said. “They must make reasonable efforts by using available technology to verify that the person giving consent actually holds parental responsibility for the child.

“If they target children,” Karagianni continued, “they must write clear and age-appropriate privacy notices for them, so that they understand what they’re consenting to. The law requires that they take appropriate measures to ensure that (children’s) data is safeguarded.”

However, Karagianni noted that some ambiguities existed in this area of EU law. She said that the section of the GDPR covering children’s consent was “controversial” and had “received the most criticism.”

She also pointed out that the law did not define how parents should provide consent and was unclear about children’s other data rights, such as the right to rectify or erase their personal information.

“Who is the rights holder of these data rights — a parent, a child, or both?” she asked. “And who can exercise them and at what time?”

‘Good News’

Marisa Marcos, a social media consultant in Rome specializing in children’s privacy, told Digital Privacy News that the case represented “good news” for children’s privacy.

“People are starting to be aware of the risks that some platforms pose for all of us, but especially children and younger kids — in this particular case, users under the age of 13.”

Parental consent matters, Marcos said, because children were “treated as adults when signing up to use the app” and were expected to understand terms that many adults do not.

Marcos also pointed to social media apps’ use of third-party cookies, which allows companies to profit from users’ personal information.


“It is good news that institutions and organizations are starting to see it as a problem that needs to be addressed,” she said. 

Marcos pointed to other recent investigations into TikTok’s privacy practices outside the U.K., including a probe launched by the Dutch data-protection authority last May and a $167,000 fine issued by a South Korean regulator last July.

“In my view,” she told Digital Privacy News, “we need a transnational approach to try to solve global issues like this.”

Robert Bateman is a writer in Brighton, U.K.
