A former TikTok content moderator has sued the Chinese Communist Party-linked social media app for psychological trauma she says she suffered from reviewing graphic videos on the platform.
Candie Frazier filed a lawsuit in a California court alleging that TikTok and its parent company, ByteDance, failed to provide sufficient safeguards to protect the mental health of moderators whose job required them to view traumatic footage, including acts of extreme and graphic violence.
The complaint, which proposes a class action suit and demands a jury trial, was filed in the California Central District Court on Dec. 23 and obtained by The Epoch Times. It accuses ByteDance Inc. and TikTok Inc. of failing to provide a safe workplace for content moderators, who spend long hours reviewing and moderating videos, including disturbingly graphic ones, to keep the social media platform "sanitized."
“While working at the direction of ByteDance and Tik Tok, Content Moderators—including Plaintiff Frazier—witness thousands of acts of extreme and graphic violence, including sexual assault, genocide, rape, and mutilation,” the complaint states.
Moderators like Frazier spend 12 hours a day reviewing and moderating content so that disturbing images don't reach TikTok users, the complaint says. It alleges that ByteDance and TikTok are aware of the psychological harm that viewing such content inflicts on moderators, yet failed to implement industry-recognized safety standards to protect them from psychological trauma.
The complaint lists a number of industry standards for protecting moderators, including counseling from professionals specializing in trauma intervention, limits on the time moderators are exposed to child sexual abuse imagery, and technical safeguards such as blurring the videos under review or muting their audio.
Frazier said in the filing that, as a result of "constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace," she developed significant psychological trauma, including anxiety, depression, and post-traumatic stress disorder.
She alleges that TikTok and ByteDance are aware of the impact that reviewing graphic content has on moderators but failed to build into their moderation tools safeguards such as reducing video resolution, changing the direction of the video, or superimposing a grid onto the footage, which could "mitigate some of the harm."
TikTok did not immediately respond to a request for comment on the complaint.
The complaint asks the court to order TikTok and ByteDance to implement safeguards and to establish a fund that would pay for the diagnosis and treatment of moderators' psychological trauma. It also seeks compensatory damages for Frazier and the class.