By Wendy Shore on August 08, 2019

Managing the Masses: How One Major Social Network Keeps Things Safe for Its Members

Imagine for a moment that your website received billions of comments, photos, and video submissions daily. While you might initially be excited about this level of engagement from your community, most site owners would quickly become overwhelmed by managing the massive influx of user-generated content. The magnitude of submissions would make it much more challenging to protect your community members and your brand from potential threats. For large social networks, this scenario represents a daily reality. Here’s a deeper look at how one social network with 300 million users moderates its content to manage risk, and some important takeaways for your business as you develop your own content moderation strategy.

Content moderation: why it matters

While many promising social networking sites died on the vine after Facebook exploded in popularity, Tagged has grown and thrived. Launched in 2004, the unique network is focused on social discovery and has more than 300 million registered users across 220 countries. Tagged members can engage in real-time chat, play games, and send private messages to others. An estimated 100 million new connections are made through the network each month.

As Tagged began to scale, the challenge of managing the influx of user-generated content became very real. Community members were uploading hundreds of thousands of photos each day. The social network’s team was relying solely on a post-moderation strategy to filter out explicit images, meaning that inappropriate photos often went live on the site for a few minutes before they were flagged by human reviewers. User reports of explicit images were on the rise, and Tagged was faced with finding a more effective solution to protect its online community.

The majority of entrepreneurs place a premium on creating a positive and safe online experience for their audience. Reducing exposure to harmful content can help preserve your brand’s image and keep your customers happy.

Moving to integrated content moderation

Ultimately, the Tagged team decided to take a more integrated approach to content moderation, which would allow them to streamline the process of filtering out images that violated the site’s terms of service. Instead of relying solely on post-moderation by human reviewers, Tagged enlisted the help of Image Analyzer, a content moderation software that would allow them to identify and remove obscene images quickly and effectively. Tagged’s goal with the integration was to minimize members’ exposure to explicit content and reduce the strain on human moderators by 50%. Here are three tips that you can learn from their case study:

Customize your solutions to your community’s needs: Tagged worked with the Image Analyzer team to come up with a customized solution that would seamlessly integrate with and support their existing content moderation system. Potentially harmful images were identified in a matter of seconds and placed into a queue for moderation. Images that aligned with Tagged’s terms of service were instantly posted on the site. Within a few weeks of implementation, the social network saw a considerable reduction in the total number of reports of obscene content from its users. The queue of images flagged for human moderation was also significantly shorter.
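The flow described above — score each upload automatically, hold potentially harmful images in a human-review queue, and publish compliant ones immediately — can be sketched in a few lines. This is a hypothetical illustration only: the `score_image` function, the `REVIEW_THRESHOLD` value, and the pipeline class are invented for demonstration and do not reflect Image Analyzer’s actual API or Tagged’s implementation.

```python
from dataclasses import dataclass, field
from collections import deque

# Hypothetical threshold: scores at or above it go to human review.
REVIEW_THRESHOLD = 0.5

def score_image(image_bytes: bytes) -> float:
    """Stand-in for an automated scorer returning 0.0 (safe) to 1.0 (explicit).

    A real system would call a moderation service here; this toy version
    just checks for a marker substring so the flow can be demonstrated.
    """
    return 0.9 if b"explicit" in image_bytes else 0.1

@dataclass
class ModerationPipeline:
    published: list = field(default_factory=list)       # went live instantly
    review_queue: deque = field(default_factory=deque)  # awaiting a human

    def submit(self, image_id: str, image_bytes: bytes) -> str:
        score = score_image(image_bytes)
        if score >= REVIEW_THRESHOLD:
            # Potentially harmful: hold for human moderation instead of posting.
            self.review_queue.append(image_id)
            return "queued"
        # Appears compliant with the terms of service: publish immediately.
        self.published.append(image_id)
        return "published"
```

The key design point mirrored from the case study is that software triages rather than decides alone: clean content ships without delay, while only the flagged minority consumes human reviewer time.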

For businesses developing their own integrated content moderation strategies, there are a couple of takeaways. One is that it’s important to find a solution that works with your overall community guidelines; no two sites have exactly the same needs. A second is to think strategically about how moderation software can support your team and make it more effective, rather than immediately assuming you must replace all human moderation (or dismissing software because you believe in the power of human moderation).

Set clear moderation goals to improve the user experience: Effective content moderation is essential to the user experience. In today’s instant online experience, users are frustrated by long waits to see their content posted. By taking a more integrated approach to content moderation, Tagged was able to achieve its content moderation goals within a very short time frame. Implementing a two-pronged strategy allowed the network to meet user expectations of ease and speed, while also protecting the brand and its members.

Improve your business model: Supportive content moderation software also helps your business function more effectively on two levels. For Tagged, the integration not only minimized the risk of exposing active community members and moderators to harmful content; it also streamlined the content moderation process considerably, reducing the stress on the company’s employees.

For businesses that are experiencing growth and struggling to manage the influx of user-generated content, partnering with an outside resource can sometimes help ease the burden. Whether you have specific goals in mind like Tagged or you simply want to create a more effective process, consider exploring your integrated content moderation software options to come up with the best solution for your business.

By Jess Ulrich – Posted on October 28th 2014
