By Wendy Shore on August 08, 2019

Social Networking and the Hidden Costs of Content Moderation

Consider what it would be like to sit in a darkened room for eight hours a day reviewing thousands of pornographic, violent, and abusive images and videos. For human content moderators, this is just another day at the office. And the demand for good content reviewers is growing: according to a recent Pew Internet study, an estimated 74% of online adults use social media. As social networking sites have exploded in popularity over the past ten years, more business owners have begun exploring their options for effective content moderation. Content moderation filters out offensive user-generated content, protecting the social network’s brand, enforcing its terms and conditions, and minimizing legal risk. Some companies take an integrated approach, employing high-tech solutions that complement their human moderators; others rely solely on a team of content reviewers to pore over the bulk of the user-generated material.

Today’s content moderation landscape

Human content moderation is the preferred solution for many large social networking sites. Some would argue that people do a more effective job of flagging inappropriate images and videos than software programs. Unfortunately, there are two major flaws in this argument. The first is that it simply isn’t true. In the earliest days of the Internet, image recognition technology was limited: early programs simply recognized flesh tones in an image or video and automatically flagged it as obscene. But in recent years, image recognition technology has advanced considerably, and today’s moderation platforms use algorithms that allow businesses to filter out inappropriate content with nearly 100% accuracy. As the technology has matured, more companies have implemented automated solutions to help unburden their human reviewers. Not only do integrated machine solutions streamline the process for human reviewers, but they also reduce content moderation costs for businesses.
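To make the integrated approach concrete, here is a minimal sketch of how an automated classifier might be combined with a human review queue. The scoring thresholds, function name, and routing labels are illustrative assumptions, not any vendor’s actual API: the idea is simply that items the model is confident about are handled automatically, so only the ambiguous middle band ever reaches a person.

```python
def route_content(score, reject_above=0.95, approve_below=0.05):
    """Route an item based on a hypothetical automated obscenity score in [0, 1].

    High-confidence items are handled automatically; only the ambiguous
    middle band is escalated to a human reviewer, reducing the volume of
    disturbing material people must look at.
    """
    if score >= reject_above:
        return "auto-reject"
    if score <= approve_below:
        return "auto-approve"
    return "human-review"

# Example: of three scored items, only the uncertain one is escalated.
decisions = [route_content(s) for s in (0.99, 0.02, 0.60)]
# decisions == ["auto-reject", "auto-approve", "human-review"]
```

Tuning the two thresholds is the key design choice in a pipeline like this: widening the automatic bands lowers cost and reviewer exposure, while narrowing them sends more borderline cases to human judgment.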

The psychological costs to human content moderators

The second (and most troubling) issue with human moderation is that it comes at a considerable cost to the content reviewers themselves. This is a factor that social network managers don’t necessarily consider as they explore options for content moderation. Image and video reviewers charged with flagging and filtering inappropriate content on social networks are constantly exposed to some of the web’s most disturbing material. In many cases, these people spend the bulk of their work week looking at obscene and violent images and videos online. In a 2010 New York Times article, one content reviewer reported seeing “…photographs of graphic gang killings, animal abuse and twisted forms of pornography.” Another moderator said the images are “hard to walk away from.”

Prolonged exposure to this type of content can have serious effects on moderators’ health. A recent piece in the London Free Press takes a deeper look at how inappropriate images and videos have harmed content reviewers. The article, referenced on the blog of image scanning technology provider Image Analyzer, suggests that many of these people have suffered varying degrees of health and social problems. “They weren’t walking away unscathed,” says researcher and Western University professor Sarah Roberts. The human content moderators interviewed for the piece frequently reported problems such as drug and alcohol abuse, disturbing flashbacks, and isolation from friends and loved ones.

Closing thoughts

It goes without saying that user-generated content is the fundamental building block of any successful social networking site. But it also carries an inherent risk: there will always be people who choose not to abide by the rules of the community. In turn, social networking sites need to implement moderation systems to protect their users. Companies that choose to work solely with a team of human content reviewers may think they’re taking the best approach for their businesses, but that approach often comes at the expense of the moderators themselves. Blending an automated solution with a human team reduces exposure and can help minimize the risks to content moderators. A psychologist interviewed in the aforementioned New York Times piece summed it up best: “If you work with garbage, you will get dirty.”

By Jessica Ullrich, October 8, 2014