A TikTok moderator has filed a lawsuit against the social media giant, citing emotional distress.
According to a complaint filed against TikTok and its parent company, ByteDance Inc., the video-sharing platform's 10,000 content moderators are routinely exposed to child sexual abuse material, rapes, beheadings, and animal cruelty.
Candie Frazier, a content moderator, claimed the reality is even worse than that.
In her proposed class-action lawsuit against TikTok, she stated that she has watched videos depicting cannibalism, crushed heads, school shootings, suicides, and even a fatal fall from a building.
Furthermore, TikTok reportedly requires moderators to work 12-hour shifts with only a one-hour lunch break and two 15-minute breaks, compounding the problem.
According to the complaint, as reported by Bloomberg, the overwhelming volume of content means moderators are limited to 25 seconds per video and must view three to ten videos at the same time.
The Problem With TikTok
Content moderators bear the weight of the gruesome and horrific images that some people post on social media, sparing ordinary users that anguish.
One company that supplies content moderators to major tech firms even stated in a consent form that the job can trigger post-traumatic stress disorder (PTSD).
Social media companies, meanwhile, have been criticized by moderators and others for paying too little attention to these psychological risks and for failing to provide adequate mental health care to their employees, particularly moderators.
Along the same lines, a similar case was brought against Facebook in 2018.
TikTok Causing Mental Trauma
TikTok, in collaboration with other social media firms such as Facebook and YouTube, developed guidelines to help moderators cope with child abuse material and other horrific images.
According to those guidelines, companies should limit moderator shifts to four hours and provide psychological support.
According to the class-action lawsuit, however, TikTok allegedly failed to follow these guidelines, neither providing psychological support nor restricting shifts to four hours.
The complainant claimed she now suffers from post-traumatic stress disorder as a consequence of having to view so much distressing content.
According to Yahoo Finance, Frazier, who is seeking to represent other TikTok content screeners, is asking for compensation for psychological injuries as well as a court order compelling the company to establish a medical fund for moderators.
TikTok As a Company
This isn't the only class-action lawsuit TikTok has faced.
Since Western governments raised concerns about TikTok's ties to the Chinese government, ByteDance has been separating TikTok from the rest of its Chinese operations.
Furthermore, TikTok claims that all of its data is stored in the United States, with backup servers in Singapore, rather than in Beijing, where its parent company is based.
These actions are insufficient to assuage the concerns of US regulators, as reported by TechCrunch.
TikTok faced tough questions in its first-ever US congressional hearing and was repeatedly pressed to clarify its statements.
ByteDance has long been dubbed as an "app factory" for its proven approach of churning out apps and selling them through a comprehensive back end of shared resources, ranging from engineering to marketing support.
As a result, a number of popular apps have emerged, including Douyin and Toutiao in China, and TikTok for the rest of the world.
As for the lawsuit brought by its own moderator, TikTok has not yet commented on the matter.