Combating the Dangers of PTSD in Content Moderation

If your work requires viewing horrific and disturbing media, as in journalism, human rights activism, or content moderation, then you need to be aware of the genuine dangers of secondary trauma. Workers who view an unending stream of violent and traumatic imagery put their mental health at risk daily. Fortunately, several strategies are available to mitigate these dangers.

Post-traumatic stress disorder (PTSD) is a condition of persistent emotional and mental distress, generally caused by experiencing a terrifying event and marked by sleep disturbance, flashbacks, and intrusive thoughts. While it was once thought to affect only those who experienced traumatic events firsthand, newer research suggests that merely viewing recorded images of such events can trigger the same reactions.

Content Moderation

Workers can manage some of the risk themselves. Examples include using a smaller image window when viewing disturbing content, taking comprehensive notes on the material so it does not need to be reviewed multiple times, and keeping “distraction files” of beautiful or appealing images (think pictures of cute animals or vacation destinations) to view during breaks. These viewer-based methods can help, but experts say the best support comes from the top.

First and foremost is understanding and accepting the importance of mental health as a concept. Many employers quickly accommodate physical health concerns but fall short when it comes to mental and emotional health. Managers of these workers must understand the nature and dangers of this kind of content consumption and design an overall workflow that supports mental health. Second, management must create a work environment tailored to this type of work, scheduling breaks and organizing a comfortable, aesthetically pleasing workspace to counteract these stresses. And finally, it is vital to screen employees’ mental health regularly. Don’t wait for someone on your staff to ask for counseling; taking the lead on mental health signals that employees are encouraged to seek out these services.

Organizations in journalism and human rights activism have been working to implement these kinds of steps. Content moderation, however, faces more complicated hurdles. Social media platforms often receive far greater volumes of potentially disturbing images, which requires many more moderators. And while many of these tech companies acknowledge the risks associated with content moderation, their typical response to a content-related controversy is simply a promise to hire more moderators.

Many corporations have begun integrating an image moderation API into their digital platforms to reduce the overall number of images that require human review, but this can’t solve everything. Additionally, because most of these companies use contracted moderators, there is always pressure to reduce overhead to remain competitive, and cutting overhead often cuts the safeguards for mental health. To make these necessary changes, the entire industry must acknowledge the danger and work together to make a difference.
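
As a rough illustration of what such an integration might look like, the sketch below routes each uploaded image through a moderation endpoint and sends only ambiguous cases to human reviewers. The endpoint URL, the violation_score field, and the thresholds are assumptions for illustration, not any particular vendor’s API.

```python
# Minimal sketch: pre-filter uploads with an image moderation API so that
# clearly safe and clearly violating images never reach a human moderator.
# The endpoint, response fields, and thresholds below are hypothetical.
import requests

MODERATION_ENDPOINT = "https://api.example-moderation.com/v1/analyze"  # hypothetical
API_KEY = "YOUR_API_KEY"
AUTO_REJECT_THRESHOLD = 0.9   # confident enough to remove without human review
AUTO_ACCEPT_THRESHOLD = 0.1   # confident enough to publish without human review


def route_image(image_url: str) -> str:
    """Return 'reject', 'accept', or 'human_review' for a single image."""
    response = requests.post(
        MODERATION_ENDPOINT,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"image_url": image_url},
        timeout=10,
    )
    response.raise_for_status()
    # Assumed response field: 0.0 (safe) to 1.0 (clearly violating).
    score = response.json()["violation_score"]

    if score >= AUTO_REJECT_THRESHOLD:
        return "reject"        # removed automatically; no moderator sees it
    if score <= AUTO_ACCEPT_THRESHOLD:
        return "accept"        # published automatically
    return "human_review"      # only the ambiguous middle band reaches a person


if __name__ == "__main__":
    print(route_image("https://example.com/uploads/photo.jpg"))
```

The design choice that matters for moderator wellbeing is the middle band: the narrower the range of scores sent to human review, the fewer disturbing images a person has to see, at the cost of more automated mistakes.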