Is TikTok a looming political disaster?

By Jake Pitre

It takes a moment to realize what you’re looking at: a grid showing multiple shots of a cat clapping to the tune of “Mr. Sandman,” assembled into a glorious dance routine with perfect comedic timing. From TikTok user @jade13tr, it’s the kind of thing that has always gone viral online — but it’s also highly specific to the platform that hosts it. With TikTok’s 15-second format, musical inclination, and plug-and-play visual effects, @jade13tr made the most of the platform’s tools to create what may be its most-seen video.

TikTok has 500 million monthly active users, at least 41% of them between the ages of 16 and 24. It’s the ultimate amalgamation of platforms like Vine, Instagram, and YouTube — a massively popular space for mostly young people to flex their creativity, make their friends laugh, and maybe even go viral. But even on an app with all that possibility, there’s one thing TikTok doesn’t want to touch. It really, really doesn’t want to get political.

Last month, the company announced that it would prohibit political ads in the lead-up to the 2020 election. The goal, per Blake Chandlee, TikTok’s vice president of global business solutions, is to preserve “the app’s light-hearted and irreverent feeling that makes it such a fun place to spend time.”

Sure. And yet, Chandlee and his company seem to be ignoring what is actually taking place on TikTok: how the platform came under fire last year for failing to properly moderate neo-Nazi propaganda, say, or Congress’s concerns over the company’s potential national security risks. TikTok’s bosses may think that political ads are an easy bogeyman, but it turns out they might have bigger problems to watch out for.

TikTok launched in 2017 after ByteDance, its parent company, merged with musical.ly, a predecessor that exclusively featured lip-syncing videos. TikTok combines the musical.ly model — short, silly video clips — with the creativity and humor of Vine, resulting in bite-size pieces of content that can span anything from the tale of a toilet paper roll finding its home to a teen girl helping define a new form of social media personality. While the app has successfully captured the ever-coveted young audience, it faces a new problem as a result of all that attention: The more TikTok insists that it’s an apolitical haven where users treasure creativity above all, the more susceptible it becomes to some of those users spreading misinformation while the company turns a blind eye.

“People tend to think of TikTok as a kind of superficial platform for goofing off. And it is that,” said Ioana Literat, an assistant professor at Columbia University who specializes in online cultures and civic participation. “But when young people hang out online, they do also talk about meaningful, serious topics, like politics, although they may talk about it in seemingly goofy ways, with memes and lip-syncing and dance challenges.”

Meme literacy is surprisingly crucial for recognizing how politics is discussed online — especially within emergent spaces, as viral formats like Distracted Boyfriend or Woman Yelling at a Cat suggest. What makes social media particularly ripe for manipulation is that each platform is designed with certain algorithmic structures that dictate how information spreads. These backend machinations can also influence how much time a user spends on the platform, meaning companies may have increasing control over a user’s life and habits. These hidden incentives can lead users in certain directions — and if weaponized by bad actors, can spread dangerous messages with reckless abandon.

YouTube, rather infamously, has been pinpointed as a place where users generally interested in politics can easily be guided into watching far-right content simply via the site’s algorithm. TikTok organizes its content mostly via hashtags, but has its own background algorithms churning along all the time, meaning that a similarly imperfect architecture is at play there too. As Twitter user @biasbe, a 30-something tech executive and self-professed “closet leftist” living in Temecula, California, tells Mic, “[TikTok] started to pepper in videos by MAGA dudes, and I think their algorithm interpreted me watching them as [being] interested in right-wing politics. I did one search for the word ‘socialism’ just to see if there [were] any leftists making vids, and about a day after I did that search my feed suddenly became about 50% leftists.”

I tried it myself. The hashtag #socialism has 5 million views on TikTok, and within it, you’ll find Trump-supporting young people doing lip dubs referring to Venezuelan socialism; others are trying to parse the differences between socialism and communism, or using High School Musical to understand the relationship between Vermont Sen. Bernie Sanders and New York Rep. Alexandria Ocasio-Cortez, two prominent progressive politicians who have embraced democratic socialism. The hashtag #trump has 55 million views, and #politics has 121 million.

Whatever TikTok might say about the purely positive and creative space it intends to provide, its users are obviously getting political — and sometimes, dangerously so. The Washington Post reported in September that TikTok, a Chinese-owned company, was censoring content about the Hong Kong protests. Later that month, The Guardian obtained leaked documents showing that the company was instructing moderators to censor anything related to Tiananmen Square, Tibetan independence, or the banned religious group Falun Gong. In response to The Guardian’s report, TikTok said at the time, “We take localized approaches, including local moderators, local content and moderation policies, local refinement of global policies.”

Taking a “localized approach” may sound sensitive and responsible, but in reality it enables TikTok to grant certain groups or governments the power to force censorship of topics they deem controversial. And while the app’s punchy little videos were mostly relegated to teen group chats or Twitter threads in its first year of existence, it’s clear that’s no longer the case. In recent weeks, teens promoting communism have proliferated on the app; meanwhile, the U.S. government has opened an investigation into TikTok as a potential threat to national security.

Literat and Neta Kligler-Vilenchik, a communications and journalism professor at the Hebrew University of Jerusalem, wrote an article for the journal New Media & Society about TikTok’s predecessor, musical.ly, and the ways young people used it after the 2016 election. The researchers looked at thousands of videos tagged with phrases like #makeamericagreatagain or #notmypresident and found that such hashtags were harnessed to subvert musical.ly’s innocent lip-syncing intentions and enable users on the app to “collectively express political stances in ways that are often surprising to adult audiences.”

Literat says this pattern has continued — and perhaps even intensified — on TikTok. “Political expression on TikTok happens in the language of young people,” Literat says, and that means they’re “lip-syncing political speeches, or using political hashtags to create connections or claim affiliations, or using popular culture resources in political ways.” Literat’s latest research has even found young people discussing immigration policies via Fortnite, the intensely popular battle royale game that made its way into teens’ hearts as well as rappers’ lyrics and NBA locker rooms. The lesson? Politics finds young people where they are, no matter what.

With influence like that, it might benefit TikTok to do a little more due diligence in policing its own platform. It's not a matter of tracking specific users, per se, but rather keeping an eye out for bad actors. Yet so far, the company has been lackadaisical on this front. HuffPost reported earlier this year that some (adult) far-right activists have moved to TikTok after being banned from other platforms. One such figure, the alt-right conspiracy theorist Paul Joseph Watson, was a popular YouTuber who posted a video called “The Cultural Significance of TikTok” and described Gen Z’s politically tinged meme usage as a direct response to millennial outrage culture. Tim “Baked Alaska” Gionet, another far-right personality, also found a home at TikTok after being banned from Twitter.

To be fair, users like Watson and Gionet have relatively small followings on the platform, so it may be too early to declare a right-wing takeover of TikTok. Still, last December, Motherboard issued a warning about the app’s “Nazi problem.” At the time, TikTok users were calling for violence against people of color and Jews as well as praising Charleston church killer Dylann Roof, Joseph Cox reported, citing videos he uncovered mostly by searching different hashtags. Cox noted that other platforms have similarly struggled with moderation, but he singled out TikTok as “doing a particularly bad job at moderating white supremacists on its platform.”

As Literat points out, you can find extreme examples like the ones Motherboard cites on any platform, and TikTok is just one of many struggling to take care of the problem. “Yes, there is plenty of concerning political content on TikTok,” she says, “but I also see it as a meaningful outlet for youth political expression and discussion.” There have already been examples of real political mobilization thanks to TikTok, most notably when Nevada 16-year-old Gillian Sullivan used the app to organize a strike in support of teachers receiving better wages.

These productive efforts are meaningful, but Literat notes that because the audience of TikTok is generally younger, “the stakes are higher.” Consider that ISIS is sharing videos on the app, for example, and it’s worth remembering that extreme political views can be normalized via repetition. Then consider how embedded TikTok is in many young digital lives; the average user spends 52 minutes per day on the app.

It's hard to meaningfully police the actions of 500 million users, let alone 500 million users who spend nearly an hour a day poking around the app. But the need isn't really for censorship so much as for rigorous moderation and clearly defined rules and regulations. Without giving Facebook too much credit, that company has hired more moderators in recent years (with mixed results), while most if not all platforms continue to struggle to implement constructive and efficient reporting features. On TikTok, for example, you counterintuitively have to tap the "Share" icon to find a way to report a video. Users could certainly benefit from a little more clarity on how exactly to flag content they deem problematic.

What's worse, though, is TikTok's avowed neutrality. That lack of a stance makes it even less clear what's allowed and what's not. Political ads are banned, but political discussion is (mostly) OK? TikTok boss Alex Zhu only muddied the waters further in a New York Times interview published Monday, when he suggested, in light of reports that Hong Kong protest videos had been deleted and other concerns about selective censorship, that politics is allowed as long as it aligns with TikTok's "creative and joyful experience" ... whatever that means.

As New Yorker writer Jia Tolentino recently pointed out, “A platform designed for viral communication will never naturally be politics-free.” And where Gen Z is concerned, TikTok is clearly a place for much more than just playing around with cat-centric non sequiturs.