How AI can help detect & filter offensive images and videos

Advances in Artificial Intelligence and Deep Learning have transformed the way computers understand images and videos. Over the last few years, innovative neural network architectures and high-end hardware have helped research teams achieve groundbreaking results in object detection and scene description. Those architectures have in turn been used to build generalist models that aim to recognize any object in any image.
Those breakthroughs are now being applied to specific use cases, one of which is content moderation. Sightengine, an AI company, is making its image and video moderation service available worldwide through a simple API. Built upon specialist neural networks, the API analyzes incoming images or videos and detects whether they contain offensive material such as nudity, adult content or suggestive scenes, just as human moderators would. Unlike the networks that companies such as Google, Facebook or Microsoft use for general object detection, these neural networks are specialists, designed and trained to excel at one specific task.
Historically, content moderation has mostly been in demand among dating and social networking websites. They relied either on staff who reviewed user-submitted content manually, or on their community, who had to flag and report content. But today, content moderation is no longer restricted to niche markets. As camera-equipped smartphones have become ubiquitous, and as the usage of social networks and self-expression tools has continued to rise, photo creation and sharing have exploded over the last few years.
It is estimated that more than 3 billion images are shared online every day, along with millions of hours of video. This is why more and more app owners, publishers and developers are looking for solutions to ensure their audiences and users are not exposed to unwanted content. Doing so is a moral as well as a legal imperative, and is key to building a product users trust and like.
Sightengine’s Image Moderation and Nudity Detection Technology is a ready-to-use SaaS offering, accessible via a simple and fast API.
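As a rough illustration of how such a moderation API might be integrated, here is a minimal Python sketch. The endpoint URL, the `models`, `api_user`, `api_secret` and `url` parameter names, and the shape of the returned scores are assumptions for the sake of the example; the provider's own documentation is authoritative. The decision rule (flag an image when any nudity score reaches a threshold) is likewise an illustrative choice, not a prescribed one.

```python
import json
import urllib.parse
import urllib.request

# Assumed endpoint for illustration; check the provider's docs for the real one.
API_URL = "https://api.sightengine.com/1.0/check.json"


def is_safe(scores, threshold=0.5):
    """Treat an image as safe when every moderation score is below the threshold.

    `scores` maps class names (e.g. "raw", "partial") to probabilities in [0, 1].
    """
    return all(score < threshold for score in scores.values())


def check_image(image_url, api_user, api_secret):
    """Submit an image URL for analysis and return the parsed JSON response.

    Parameter names here are assumptions; consult the API reference.
    """
    params = urllib.parse.urlencode({
        "models": "nudity",      # assumed model identifier
        "api_user": api_user,    # credentials issued by the provider
        "api_secret": api_secret,
        "url": image_url,
    })
    with urllib.request.urlopen(f"{API_URL}?{params}", timeout=10) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Example with made-up scores, as a real call requires API credentials.
    example_scores = {"raw": 0.02, "partial": 0.10}
    print(is_safe(example_scores))
```

In practice the thresholding step is where each product draws its own line: a dating app might reject anything suggestive, while a medical forum might only filter explicit content.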