Facebook Acknowledges That Content Moderators Are at Risk for PTSD
A major issue in the tech industry has been the mental health of content moderators. Content moderators sort through thousands of disturbing images and videos and determine whether they are appropriate for the web. This disturbing content can include bestiality, child abuse, hate speech, self-harm, and terrorism. Recently, Facebook has required its content moderators to sign a form explicitly acknowledging that their job may cause PTSD.
See "Facebook Acknowledges that Content Moderators are at risk for PTSD ", Madhumita Murgia, Financial Times, January 24, 2020