Facebook content is so graphic that it’s driving moderators insane
Posted by Josh Taylor / February 26, 2019

Facebook is one of the largest and most widely used websites on the planet. It also offers users a variety of ways to share media, such as posting stories, videos, or live streams.
In the early days of the Internet, observers thought the web would make the world a better place. If they were to hear about Facebook, they would probably guess that it brought people from around the world together, bridging cultural and geographical gaps and leading to a better, happier world.
If they spent twenty seconds doing Chloe’s job, they would run screaming from the room––and probably into a therapist’s office. Chloe is a content moderator––one of the many interviewed in this Verge report––and that’s exactly what she did. She began having panic attacks after watching a man get brutally murdered on video as part of her moderation work. Other moderators smoke weed during breaks, exhibit signs of acute or chronic anxiety, have sex on the job, or show other behaviors indicative of PTSD.
Chloe works for a company called Cognizant, and she––like all moderators––signed a non-disclosure agreement. Ostensibly, that NDA protects moderators from Facebook users who disagree with their decisions. In practice, it keeps the horror show that is Facebook media sharing hidden. The Verge report linked above exposes that horror show, and there will inevitably be fallout.