
Content Management: A delicate balance | Image Analyzer

Written by Wendy Shore | Aug 8, 2019

It’s a story as old as social networks themselves. A social site begins to gain steam, and the influx of user-generated content becomes difficult for moderation teams to manage. Site owners who don’t already have strict policies in place are then faced with a choice. Should they modify their terms of service to protect their brand and their community from potentially harmful content, or leave things as they are? This delicate question, which is bound up with issues of online privacy and censorship, has been hotly debated in recent media reports.

In the past month, a handful of popular social networking sites have made headlines for modifying their policies related to adult content. At the end of February, Google announced that they would be banning sexually explicit content from Blogger. After receiving backlash from several long-time community members about the change, which would have required many users to delete past content, the Google team chose to rescind the ban a few days later. Instead, their focus has shifted to identifying and removing commercial pornography sites hosted within the Blogger platform.

Social sharing giant Reddit has also been in the news for its recent policy changes. The popular site has been a user-focused, user-managed community since its inception, but last year it came under fire when hackers created a subreddit to disseminate leaked photos of celebrities. In response to that incident, as well as other instances of privacy violations and pornography, the Reddit team is making some changes. They’ve cracked down on photo trading, a popular practice among some community members. Under the new Reddit policy, sexually explicit photos or videos that haven’t been posted with the prior consent of their subjects will be removed from the site.

The recent problems aren’t unique to Blogger and Reddit, either. Popular social applications Snapchat and WhatsApp have also been in the spotlight over inappropriate content. Sex workers have been using Snapchat to share explicit photos in exchange for payment, and WhatsApp was suspended by a judge in Brazil after the company failed to provide assistance in an alleged child pornography case.

Coming up with an effective solution to the problem of inappropriate user-generated content isn’t easy. On the one hand, site owners want to create a safe community for all users, including their youngest members. But on the other, there’s the issue of online censorship and the slippery slope of classifying what’s appropriate versus inappropriate. Where should site owners draw the line when it comes to content management? The answer remains unclear.

But one thing is certain: moderation is necessary for sites that rely on user-generated content to thrive. While approaches vary, nearly all social networks have systems in place for classifying content. Blogger users self-regulate, labeling their own content as sexually explicit, which automatically presents a warning page to anyone who visits their site. Reddit relies on volunteer moderators and a community review system to manage user submissions. Other sites employ a team of human moderators, or use some combination of human and machine moderation to screen content.
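To make that last, hybrid approach concrete, here’s a minimal sketch of how machine scoring might route content to automatic action or human review. Every name and threshold below is an illustrative assumption, not any particular platform’s implementation:

```python
from dataclasses import dataclass, field
from typing import Callable, List

# Illustrative thresholds -- a real system would tune these per policy category.
AUTO_REMOVE_THRESHOLD = 0.95   # classifier is confident the content violates policy
HUMAN_REVIEW_THRESHOLD = 0.60  # uncertain cases are routed to a human moderator

@dataclass
class Submission:
    submission_id: str
    media_url: str

@dataclass
class ReviewQueue:
    """Work queue that human moderators drain."""
    pending: List[Submission] = field(default_factory=list)

    def enqueue(self, item: Submission) -> None:
        self.pending.append(item)

def moderate(item: Submission,
             classifier: Callable[[Submission], float],
             queue: ReviewQueue) -> str:
    """Route one submission based on the machine classifier's confidence.

    The classifier returns the estimated probability (0.0 to 1.0) that the
    submission violates the site's content policy.
    """
    score = classifier(item)
    if score >= AUTO_REMOVE_THRESHOLD:
        return "removed"        # the machine acts alone on clear violations
    if score >= HUMAN_REVIEW_THRESHOLD:
        queue.enqueue(item)     # ambiguous content waits for a human decision
        return "pending_review"
    return "approved"           # low-risk content is published immediately

# Example run with a stand-in classifier; a real one would score the image itself.
queue = ReviewQueue()
post = Submission("abc123", "https://example.com/photo.jpg")
print(moderate(post, lambda s: 0.72, queue))  # -> pending_review
```

The design point is that the machine acts unilaterally only when it is highly confident; everything in the gray zone between the two thresholds is deferred to a human, which is roughly the trade-off a hybrid system is meant to strike.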

As social networks continue to grow and evolve, content moderation will remain an ever-present issue. Site owners will be forced to walk a fine line between effective community management and censorship.

By Jess Ullrich