TikTok has a ton of young users. Is it doing enough to protect them?

TikTok, the video-sharing app formerly known as Musical.ly and best known for its viral videos, has exploded in popularity over the last year, racking up more than one billion downloads. A significant portion of its user base is young, including many users under 13. While the app has built a massive audience of extremely active teens, it is starting to face significant scrutiny over its lax handling of young users, and that scrutiny could carry a heavy price if changes aren't made.

Regulators in the United Kingdom are launching an investigation into how TikTok has handled the data of younger users, including the type of information the app has collected from underage people. There's already a solid basis to believe that TikTok is, or at the very least was, in violation of laws designed to protect children's information. The United States' Federal Trade Commission has already slapped TikTok with a $5.7 million fine for collecting data from users under the age of 13 without parental permission, which is illegal under the Children's Online Privacy Protection Act (COPPA). According to the FTC, TikTok collected names, email addresses, and other personal information from users under the age of 13. The U.K., like the U.S., restricts apps and services from collecting this type of information from children without consent, and it goes further under the E.U.'s General Data Protection Regulation (GDPR), which requires companies to offer children services with stricter protections.

There's little preventing users under 13 from getting on TikTok. Seeing as two-thirds of the app's users are under 30, including 60 percent who are between 16 and 24, it seems extremely likely that kids under 13 have figured out they can simply lie about their birthday and gain access to the entirety of TikTok, an issue that plagues basically every app that relies on an age gate to restrict access. The FTC went so far as to accuse TikTok of being fully aware of this fact and choosing to ignore it. The app's makers responded to the agency's criticism by deleting all accounts belonging to users younger than 13, and they have since created a "passive" experience for younger users that allows them to view content but not create their own videos. That fix is only so effective, though: people can still create accounts using a fake birthday to appear older.

While that may solve some of TikTok's problems, specifically its apparent habit of collecting data that it shouldn't, it doesn't fully address one of the app's other biggest flaws, one the U.K. also plans to investigate. TikTok operates an open messaging system that allows users with public accounts to communicate directly with one another. Because TikTok accounts default to public, unless a user goes out of their way to make their account private, they can send and receive messages from basically anyone on the app. That has turned the app into what the National Society for the Prevention of Cruelty to Children (NSPCC) has called a "hunting ground" for abusers and child predators. A report earlier this year from the child advocacy group Barnardo's highlighted instances of child exploitation on the app, including cases in which kids as young as eight years old were targeted and contacted by adults. Those kinds of interactions happen with alarming frequency, per the NSPCC. The organization surveyed more than 40,000 schoolchildren who have used TikTok and other video apps and found that one in four had been contacted by, and communicated with, a total stranger. Five percent were asked, either in a message or a comment, to take their clothes off.

For TikTok to properly serve its audience, it needs to vastly improve its safety measures for young users. That doesn't just mean giving up on collecting their information; it means designing the messaging system, and the platform as a whole, with privacy in mind. That may seem antithetical to an app built on virality and sharing videos, but it's becoming increasingly clear that those protections are necessary.