Popular video app TikTok raises valid concerns—and it’s up to us to confront them

By: Kamrin Baker

Opening the TikTok app is akin to opening Pandora’s Box. 

Immediately, the user is brought to the “for you” page where any popular video could appear. The app is riddled with teenagers bopping to catchy pop songs, surfing the wave of viral trends (hello, e-boys) and sharing the details of their lives for the sake of the coveted red heart icon.

It also features Howie Mandel and other “boomers” hoping to cowabunga alongside the kids.

While the app is often derided for its “cringey” content (again, hello, e-boys), it is also reminiscent of the late Vine, a social platform that catered to haiku-length videos, memes and jokes that have been immortalized in youth culture.

Despite its varied content, TikTok is in the midst of a major social moment. The app has amassed more than 500 million monthly active users since June 2018, according to Business of Apps. TechCrunch calls ByteDance, TikTok’s Chinese parent company, the world’s highest-valued tech startup. Adobe Premiere Rush is the first third-party application that can publish directly to TikTok.

Money talks, and TikTok is filibustering the social media scene. 

Although it may win the popularity contest, the platform raises concerns that it is a hub for sexual predation and foreign censorship.

BuzzFeed News dove into the issue of men preying on young girls’ TikTok profiles, where inappropriate relationships were popping up like lights on a map. 

The Washington Post reported on a “culture clash where U.S. views about censorship were often overridden by Chinese bosses.” The company’s American branch said its operation does not censor political content or take instructions from China, but former employees said otherwise.

It should also be noted that despite its reporting, The Washington Post itself has a decent following on the app, posting satirical behind-the-scenes videos from the newsroom. Seeing trusted media publications use the services they write wary investigative pieces about should be unnerving—and yet it all continues to grow into one amorphous blob.

If TikTok seems inconsistent in monitoring its content and users seem inconsistent in their morality, that’s because it is—and we are. The average user still seems to operate under an “ignorance is bliss” mindset.

University of Nebraska at Omaha student (and TikTok user) Kyla Stauffer said she uses the app to laugh and unwind, finding videos that pique her interest and give off “Vine energy.” Stauffer said she was unaware of the app’s foreign ownership.

“I hadn’t heard much about foreign government involvement or anything about a Chinese parent company, but it does sound familiar,” Stauffer said. “I know that a lot of TikToks I have previously watched and saved will be taken down or people will repost old videos with a disclaimer like ‘TikTok will probably take this down again.’”

Despite widespread concerns about TikTok’s ownership and its apparent magnetism for predators, users still download, share and create. Most critics even have an account to lurk along the timeline for a cheap laugh (ahem, me).

“Nobody wants to be left behind, in case TikTok turns out to be ‘the next big thing,’” said UNO professor and social media expert Jeremy Lipschultz, Ph.D. “I also believe that a lot of users are hoping to become celebrities in the same way that it happened on Instagram for early adopters. Nobody wants to arrive last at the party. The question is: Is this the best party?”

So, should we attend the party? Should we be ashamed if we do? Should we be scared of the hosts?

“We should be cautious in trusting any social media site or digital company when it comes to protecting private data,” Lipschultz said.

In the end, it comes down to being our own watchdogs. We have the power to decide which social media we engage with. As Facebook, too, continues to face political backlash and user anxieties build, we, as a global community, need to decide what our limits are—and whether we will ever enforce them.