By Wendy Shore on August 08, 2019

Making the Case for Content Moderation

Originally published in Social Media Today, 25 September 2014, by Jess Ullrich

Your submission awaits moderation. Whether you manage an online community or have contributed to a website, you’re probably familiar with that phrase. Most websites that accept comments, images, and videos from users have a review mechanism in place to filter out content that violates their terms of service. Some site owners review submissions after they’ve gone live, relying in part on collective user intelligence to flag offensive content. Others place user-generated content in a moderation queue for manual review before it’s posted. Whichever approach you take, one thing is certain: effective content moderation is critical for protecting your community, your business, and your employees. Here are some of the reasons why it matters.

Your online community has high standards
Active members of your online community visit multiple sites on their daily tour of the internet. Each site has its own mechanisms for ensuring a positive user experience, and most place a premium on protecting community members. As a result, your visitors have developed high standards for their time online. Exposure to obscene, violent, or otherwise inappropriate content may be enough to drive away loyal customers for good and do real damage to your brand along the way. Being proactive about content moderation shows that you care about the safety of your online community.

Your business is at stake
If your website doesn’t have an effective mechanism for moderating content, your business could be at risk. This is especially true for social networks with high volumes of user-generated content. Explicit material can easily slip through the cracks if your site can’t keep up with the influx of submissions.

“Providers of UGC sites increasingly understand the need to protect their service against abuse,” says Crispin Pikes, CEO and founder of Image Analyzer, a UK-based image-scanning technology provider. He adds: “In some cases regulatory compliance compels them to have pornographic content removed.”

Your employees may be at risk
Many businesses employ a team of human reviewers to filter user-generated content. While this may seem like an effective solution, there’s an important caveat. If your employees are charged with reviewing hundreds of thousands of comments, photos, and videos daily, they will almost certainly be exposed to their share of harmful material. Studies have shown that prolonged exposure to violent, pornographic, and disturbing content can take a serious toll on reviewers: human content moderators are at higher risk of developing depression and anxiety, and they may also struggle with anger and with maintaining healthy relationships. For this reason, some businesses implement an automated solution to take some of the pressure off their human review team. Supporting your human reviewers with an automated program can help you protect your employees, streamline the moderation process, and significantly reduce your overall costs.

For businesses like social networks, which thrive on user engagement, effective content moderation can be the difference between success and failure. Explicit content that slips through the cracks can damage your online community and your brand, and it can leave your business vulnerable to legal threats. Whether you run an established social networking site or are just starting out, take content moderation seriously to protect your community, your business, and your employees.