The short-form content platform TikTok employs over 40,000 professionals who work tirelessly to keep the app safe. These moderators review content and apply the correct policies alongside automated moderation systems. However, their work has largely remained hidden from the public eye.
Recently, one of these moderators anonymously opened up to the Guardian about the job’s challenges, including reviewing videos from the seasonal and year-end Recaps trend, only for the platform to deny recognizing the trend at all.
TikTok claims it does not recognize the Recaps trend
The short-form video-sharing platform is known for its viral trends, and among the most familiar of them is the Recap. In this trend, a user selects several pictures from their gallery and compiles them into a quick video that revisits their memories. These videos typically pack 30 to 100 images celebrating a user’s season or year into a 15-to-60-second slideshow.
2023 recap! Tell us all the places you travelled this year! 👇🏾
📹:brittany_diego via Tiktok // #travelnoire #blacktravel #vacationdestinations pic.twitter.com/C9galzckIk— Travel Noire (@TravelNoire) December 21, 2023
Recaps were reportedly named among the most difficult and overwhelming videos to examine during moderator training. During this period, recruits are given all of the app’s policies and asked to apply the appropriate tags to previously moderated videos. These practice queues usually contain hundreds of videos and test the recruits’ skills within a limited time.
A moderator is given around 48 seconds to examine a 60-second recap video, including its description, the account bio, and its hashtags. Reviewing these videos is very draining, and around the end of the school year and the calendar year it significantly drags down moderators’ performance stats.
However, when the Guardian asked the platform to comment on the interview, it insisted it did not recognize the term “Recaps.” That is a surprising response, considering recap videos are among the app’s trends that consistently go viral at various times of the year.
Other policies for TikTok video moderators
Once moderators pass the training and probation period, they join the platform’s “live” queue, where they moderate users’ live streams. According to the anonymous source, these streams are where moderators find the worst content, yet their actions are limited to ending the stream and/or restricting the user from uploading or going live again.
This is really worriying. This is the feed of a (bot) 13-year-old tiktok account. This algorithm is made only for delivering polarizing content, this should not be allowed in any app, and needs to change immediately. pic.twitter.com/I5epABvbaf
— Lucio (e/acc) (@luussta) December 22, 2023
Other queues include verifying that account holders are above the age of 13 and moderating individual videos and comments. Moderators routinely encounter phishing, scams, explicit content, underage users’ GRWM clips, recaps, pirated content, extremist ideologies, dangerous activities, unlawful content, and privacy policy violations.
The most difficult part of the job is that moderators’ laptops reportedly lock after five minutes without input, meaning they must keep their screens continuously active or their status flips to “idle.” The same can happen if the internet goes down or someone forgets to switch their status from active to lunch/meeting.
A button letting TikTok's moderators flag video content in language they do not understand has been removed, making it extremely difficult to moderate Israel-Gaza content, a source says https://t.co/Zq6INNkt8g
— Blake Montgomery (@blakersdozen) December 20, 2023
These “idle” statuses are later investigated and factor into performance reviews, which can in turn affect bonuses and pay raises. Another difficulty is the alleged removal of the language button, which let moderators assess only videos in their own language; without it, they are asked to judge purely by what they see on their screens.
However, TikTok has disputed these claims, including the assertion that its systems shut down after five minutes of inactivity. The company also said that not all of the policies described in the Guardian article are accurate.