TikTok gives preferential treatment to some influencers and celebrities, according to leaked internal company records. The documents indicate that TikTok used a two-tier moderation system to shield users with more than five million followers from the consequences of policy violations.
TikTok bent its own policies and rules to shield celebrities and influencers from the consequences of policy violations. That’s according to internal company records obtained by business magazine Forbes.
According to the report, TikTok developed a two-tier moderation system that selectively prioritizes certain content and protects users with more than five million followers from policy violations.
TikTok: Top influencers may disregard guidelines
Like all other social media platforms, TikTok has community guidelines. These ostensibly apply equally to all users and are meant to protect them from certain content.
However, internal audio recordings from a company meeting in September 2022 show that the platform flagged certain users with “creator labels” in its moderation system in order to give them preferential treatment. These labels, in turn, were reserved for “special users” only.
An employee also commented in the leaked memo: “We don’t want to treat these users like all other accounts. We’re a little more lenient, I would say.” In this way, the ByteDance subsidiary moderated content from celebrities and influencers with more than five million followers separately and more favorably.
Two-tier moderation system?
When asked if TikTok employed a two-tier moderation system in the process, company spokeswoman Jamie Favazza told Forbes, “TikTok is not more lenient in moderating accounts with more than 5 million followers.” She added, “We do not have moderation queues based on follower count.”
However, in the memo, one participant in the conversation says: “A famous person could post content and I could post content, and if both were inappropriate, the famous person would be able to stay on top.”
Evelyn Douek, a professor at Stanford Law School, acknowledged that it is understandable for platform operators to want to review high-profile accounts separately. But she warned that such a system could easily be abused.
That’s especially true, she said, if it’s run by people who have a financial incentive to remove or retain content: “A system designed to ensure that the rules are applied consistently is very different from a system designed to ensure that the rules are applied inconsistently, for example, when it comes to profit.”