By Wendy Shore on June 30, 2021

IA In the News: Image Analyzer’s artificial intelligence is saving workers from PTSD

 

Cris Pikes, CEO of Image Analyzer

“When we consider that it’s been estimated that it would take someone 950 years to check all of the Snaps uploaded to Snapchat every 24 hours, it’s obvious that companies cannot moderate this volume of images using human power alone,” says Cris Pikes, CEO and co-founder of Image Analyzer.
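
As a rough sanity check on that figure, a back-of-envelope calculation reproduces it, assuming roughly five billion Snaps uploaded per day and about six seconds of human review per image; both numbers are illustrative assumptions, not figures from the article.

```python
# Back-of-envelope check of the "950 years per day of uploads" claim.
# Both inputs are illustrative assumptions, not figures from the article.
snaps_per_day = 5_000_000_000   # assumed daily upload volume
seconds_per_review = 6          # assumed human review time per image

total_review_seconds = snaps_per_day * seconds_per_review
seconds_per_year = 365 * 24 * 3600

person_years = total_review_seconds / seconds_per_year
print(f"{person_years:,.0f} person-years to review one day of uploads")
# ~951 person-years with these assumptions
```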

“Even massive social media firms like Facebook, which outsource content moderation, struggle to keep up with the growth in harmful, extremist and false content. It has reached the point where the individuals who work in moderation are starting to sue for burnout and even post-traumatic stress.”

“Organisations also face increasing pressure from draft legislation, like the UK’s Online Safety Bill and the EU’s Digital Services Act. These require firms to take down illegal content quickly, with large financial penalties for failure – up to 10 per cent of global annual turnover.”

However, there may be a solution, in the form of artificial intelligence. Pikes, whose company won the Best Emerging Technology in AI Award at this year’s AI & Machine Learning Awards, says: “Artificial intelligence can be used at the right point in a digital platform’s workstream to remove the majority of harmful content before it reaches the platform. This aids compliance with impending legislation and leaves only the more nuanced content for human moderators to review.”
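
As an illustration of the workflow Pikes describes, the sketch below scores each upload with an AI classifier before publication: clear-cut harmful content is blocked automatically, clearly safe content is published, and only the uncertain middle band is queued for human moderators. The classify_image function and both thresholds are hypothetical placeholders, not Image Analyzer’s actual API or recommended settings.

```python
# Minimal sketch of an AI pre-moderation step in an upload pipeline.
# classify_image() and the thresholds are hypothetical placeholders,
# not Image Analyzer's actual API or recommended settings.
from dataclasses import dataclass

BLOCK_THRESHOLD = 0.95   # assumed: auto-remove above this harm score
REVIEW_THRESHOLD = 0.50  # assumed: route to a human between the two

@dataclass
class ModerationResult:
    decision: str      # "blocked", "needs_human_review", or "published"
    harm_score: float

def classify_image(image_bytes: bytes) -> float:
    """Placeholder for an AI model returning a harm score in [0, 1]."""
    raise NotImplementedError("swap in a real moderation model or API call")

def moderate_upload(image_bytes: bytes) -> ModerationResult:
    score = classify_image(image_bytes)
    if score >= BLOCK_THRESHOLD:
        # Clear-cut harmful content never reaches the platform.
        return ModerationResult("blocked", score)
    if score >= REVIEW_THRESHOLD:
        # Only the nuanced middle band reaches human moderators.
        return ModerationResult("needs_human_review", score)
    return ModerationResult("published", score)
```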

 

Image Analyzer is a world leader in automated moderation of user-generated content across image, video and streaming media.
Our AI content moderation delivers unparalleled accuracy, producing near-zero false-positive results in milliseconds. If you have any questions about our content moderation or image moderation software, please get in touch today.
