In a massively popular clip on the video-sharing app TikTok, police snipers atop a Minneapolis police station point their guns at Black Lives Matter protesters. Demonstrators fight tear gas, sirens blare and helicopters fly overhead. The footage is set to “This is America,” Donald Glover’s anthem about race and gun violence.
The video became one of the top clips on TikTok’s #BlackLivesMatter hashtag, and drew more than 43 million views. Its popularity shocked the video’s creator, a 33-year-old comedian named Kareem Rahma. “I’m not an activist,” Rahma said. “I joined TikTok to make funny videos, but when my hometown of Minneapolis turned into a war zone I needed to show people.”
The grim 30-second clip, shot last month during protests over the death of George Floyd, is a far cry from the goofy memes, upbeat lip-syncing and bedroom twerking clips that turned TikTok, owned by ByteDance Ltd., into a global sensation over the last few years. The app has been downloaded more than 2 billion times globally, according to research firm Sensor Tower. Its success is largely the result of its positioning as a fun, playful corner of the internet devoid of the controversies plaguing sites like Twitter Inc., Facebook Inc. and Reddit Inc.
TikTok’s decision to chart a different course was deliberate. Many American tech companies have preached a flavor of free speech absolutism since their founding, compromising only as difficult content moderation questions became public relations disasters. TikTok is different. It has made a practice of removing posts that didn’t fit its carefully crafted carefree image, or limiting their reach even when it left them up.
Some of the company’s early guidance on content moderation limited the reach of posts from overweight, queer or disabled users, according to documents obtained by The Guardian. Another early rule banned “highly controversial topics,” such as separatism or ethnic conflict. TikTok has said it has discontinued these practices.
TikTok has faced persistent allegations that its decisions on content align with the priorities of the Chinese government. It has reportedly targeted videos related to pro-democracy protests in Hong Kong, the mistreatment of Muslims in China’s Xinjiang region, and standoffs at the India-China border. Last year, a ByteDance spokesman told Bloomberg that TikTok didn’t remove videos from the Hong Kong protests for political reasons, saying they may have instead been taken down for violating guidelines around violent, graphic, shocking or sensational content.
That no longer seems to be as much of a consideration. The platform has erupted with images of nationwide protests, featuring videos that show tear gas, police with guns, racist material, and expletive-laced songs denouncing President Trump. Videos with the #BlackLivesMatter hashtag have surpassed 10 billion views. “For some people, their entire feeds are transformed with Black Lives Matter content on TikTok,” said Daniel Sinclair, an independent researcher who studies TikTok and social media. “It’s no different than Twitter.”
Sinclair notes that Chinese officials, state media and social media sites like Douyin, ByteDance’s Chinese version of TikTok, have all been amplifying the violent protests and heated discussions about race unfolding across America. But whether China’s moves to project an image of U.S. instability play into the decision-making at TikTok is hard to decipher, according to Sinclair.
Last October, U.S. Senators Tom Cotton and Chuck Schumer requested a government review of TikTok over national security concerns. Among the issues they cited were the app’s potential to host foreign influence campaigns and its alleged censorship of topics deemed politically sensitive to the Chinese Communist Party. That same month, TikTok banned all political, advocacy and issue ads from its platform, saying in a company blog post that such ads would undermine TikTok’s “positive environment.” TikTok has said the Chinese government has not asked the app to remove any posts.
As with other social media sites, TikTok faces questions about whether it is enforcing its own policies consistently. “As I see it, a lot of the popular Black Lives Matter posts actually do go against TikTok’s own stated community guidelines,” said Joseph Seering, a doctoral candidate at Carnegie Mellon University who studies content moderation. He points to prohibitions on things like hate speech and firearms. “My guess would be that TikTok has come to realize that removing some of those videos and records of the protests would just do more harm to the platform than it’s worth,” Seering said.
TikTok said the updated community guidelines it released in January provide exceptions for videos that are educational, historical, newsworthy, or otherwise aim to raise awareness about issues. The new guidelines also exempt displays of firearms when they are carried by a police officer.
“While much of the content our users create is light-hearted, more serious content also has a place on our platform,” said a company spokeswoman. She added that TikTok has increased the size of its local safety teams and consulted with outside experts about how to handle sensitive content issues. The site recently left up some posts related to Black Lives Matter that might have violated its previous guidelines, she said, “because its relevance and timeliness to our Black community was evident and made it newsworthy for our users.”
When protests broke out across the U.S. last month after George Floyd died while a Minneapolis police officer kneeled on his neck, some TikTok users accused the app of suppressing videos related to the Black Lives Matter movement. Many TikTok creators changed their profile photos to the Black power symbol in a virtual protest against what some alleged was a concerted effort to silence Black voices.
TikTok later issued an apology and said that posts with the hashtags #BlackLivesMatter and #GeorgeFloyd appeared to have zero views due to a technical glitch that affected around 200,000 hashtags. “Words can only go so far. I invite our community to hold us accountable,” said Kevin Mayer, TikTok’s new chief executive officer, in one of his first posts on the platform.
Sam Coleman Dancer, a 21-year-old student at Mississippi State University who joined TikTok in 2018, said he welcomed the growing discussion of race on the platform. “Over the last few weeks, the world has gotten the chance to see what’s actually happening to people that look like me and I’m glad people are recording the injustice and abuse and sharing it on TikTok,” he said.
TikTok’s embrace of Black Lives Matter content could also signal the maturation of a platform that must adapt its approach, rules and content moderation to a broader base of users, according to social media experts. They point to other companies like Snap Inc., which created an editorial team in 2015 to ensure it was curating breaking news stories responsibly.
“When a platform becomes as large in scale as TikTok has become, it’s pretty impossible to enforce a rule like, ‘We only allow fun on our site,’” said Kate Klonick, an assistant professor at St. John’s University School of Law who researches online speech and oversight. “Saying ‘Our platform isn’t for politics or serious issues’ is an incredibly naive idea.”
— With assistance by Yuan Gao