Blowing the Whistle on TikTok Content Moderation

22 Mar 2023

Today, a content moderator shares their careful analysis of how TikTok misleads workers and the public about its overseas data storage and individual user tracking. This is not about a specific country – “I do not endorse the campaign by the US government to pathologize China” – but about protecting people wherever they are.

[Image: a neon pink prism on a round blue mirror, with a blue and green gradient background]

They say they can't, but companies and governments can see right through you / Source

The Worker’s Perspective

by a TikTok content moderator

I’m from a part of the US where there’s not much organizing, or much to keep people there. But, through getting involved in elections as a teenager, I bumped into elder labor organizers and learned about solidarity – which I learned all over again when the pandemic began and people started helping each other with food and basic necessities.

Doing content moderation started off just like any number of other jobs: a way to get by. I found a job at Webhelp, a company that does content moderation for ByteDance, the parent company of TikTok. And I thought to myself, I’m going to be moderating – how bad is it going to be? I was also attracted by the slightly better pay, benefits, and breaks, and by the belief that I’d be making a difference in the world by taking down harmful content. But then I started to notice a fair number of things.

Working in software is incredibly strange. The content a person creates to express themselves through a screen may not be what it seems. The same goes for how companies represent their policies and practices to the public.

I was under the impression that although ByteDance was a Chinese company, it kept its American operations separate. I learned almost immediately that this was incorrect. From the beginning, the training system we used - elearning.kondou.cn - was entirely in Chinese and hosted on a Chinese domain. So, too, were other systems. TCS, a ByteDance browser used to moderate videos, and Lark (ByteDance’s equivalent of Microsoft Teams) are based in China and store their data there, contrary to what has been told to the public. Lark functions as a chat program almost identical to Microsoft Teams in layout and functionality, except that ByteDance can directly moderate every user and every chat conducted on it.

Many policies and trainings directly affect my content moderation role. I learned through my trainer and other members of management that all communications on new policies - of which there are approximately 120 - apparently come directly from ByteDance in China. According to those inside the company, ByteDance establishes these new policies and tells us how to moderate current world news and events. These policies cover things ranging from abortion controversies to the attack on Paul Pelosi to controversial TikTok trends like subway surfing.

Perhaps most disturbing for me, though, was discovering that ByteDance also has access to location data. The videos I was moderating displayed location data by region, so as I moderated a video I knew in what region it was taken. It was upsetting to think that if they could tell me the region where a video was taken, they might also be able to pinpoint the individual’s location more specifically, especially if there were identifying landmarks in the video. All of this was potentially available not just to me, but to Webhelp and ByteDance.

It might be my attention to detail, honed over thousands upon thousands of repetitive tasks each day, that makes me hyper-aware of discrepancies in the company where I work. What made me decide to blow the whistle was observing inconsistencies between the guidelines for how we moderate and the company’s narratives for the public. This is something the public deserves to know. If a job requires you to hide something unethical so that people can maintain their livelihoods, then that job just shouldn’t exist.

Nonetheless, I worry about what will happen to all the content moderators currently employed by Webhelp. There are a lot of people I know personally who have been down on their luck and took this job for the benefits they were promised. They deserve what was promised to them. They also deserve to work at a job they can be proud of – and one that does not ask them to engage in unethical behavior.

On principle, I am firmly against mass surveillance by software companies. The issues here are not unique to TikTok; they are prevalent across all mainstream social media companies - Twitter, Facebook, Snapchat, etc. I believe mass surveillance should come to an end, and that these sites need to undergo massive revisions to become nonhierarchical and decentralized.

Also, I want to clarify that I do not endorse the campaign by the US government to pathologize China. Their approach amounts to “It’s okay if we do mass surveillance, but not you,” which is fundamentally wrong. Mass surveillance and data collection by megacorporations and governments is highly unethical, regardless of who does it - if I were moderating for a different tech company, I would likely have become a whistleblower as well. It just so happened that I ended up moderating for TikTok.

I appreciate everyone I’ve talked with who supported my efforts at safe and effective whistleblowing, which I believe is part of a larger collective effort to protect our rights as workers and human beings.