Optimize Your Image Moderation Process With These Five Best Practices


If you run or moderate a social sharing site or app where users can upload their own images, you know how complex image moderation can be. We’ve compiled five best practices that will make life a lot easier for you and your moderation team:

1. Create robust internal moderation guidelines

While you’ll probably rely on AI to automatically approve and reject the bulk of submitted images, there will be images the algorithm misses, or that users report as inappropriate. In those cases, it’s crucial that your moderators are well trained and have the resources at their disposal to make what can sometimes be difficult decisions.

Remember the controversy surrounding Facebook earlier this year when they released their moderation guidelines to the public? Turns out, their guidelines were so convoluted and thorny that it was near-impossible to follow them with any consistency. (To be fair, Facebook faces unprecedented challenges when it comes to image moderation, including incredibly high volumes and billions of users from all around the world.) There’s a lesson to be learned here, though — internal guidelines should be clear and concise.

Consider — you probably don’t allow pornography on your platform, but how do you feel about bathing suits or lingerie? And what about drugs — where do you draw the line? Do you allow images of pills? Alcohol?
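
However you answer those questions, it helps to write the answers down in a form that both your moderators and your tooling can reference. Below is a minimal, hypothetical sketch in Python of what such a written-down policy might look like; the categories and actions are placeholders, not recommendations.

    # Hypothetical policy map for the grey areas above.
    # Categories and actions are placeholders, not recommendations.
    IMAGE_POLICY = {
        "explicit_nudity": "reject",        # pornography is never allowed
        "swimwear":        "approve",       # bathing suits are fine
        "lingerie":        "human_review",  # borderline: route to a moderator
        "illegal_drugs":   "reject",
        "pills":           "human_review",  # could be medication, could be drugs
        "alcohol":         "approve",
    }

    def decide(labels):
        """Return the strictest action across every label the classifier assigned."""
        priority = {"approve": 0, "human_review": 1, "reject": 2}
        actions = [IMAGE_POLICY.get(label, "human_review") for label in labels]
        return max(actions, key=priority.get, default="approve")

    print(decide(["swimwear", "alcohol"]))   # approve
    print(decide(["lingerie", "alcohol"]))   # human_review

Writing the policy down this way forces you to answer the grey-area questions explicitly, and it gives moderators a single source of truth when an edge case comes up.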

Moderation isn’t a perfect science; there will always be grey areas. That’s why it’s important that you also…

2. Consider context

When you’re deciding whether to approve or reject an image that falls into a grey area, remember to look at everything surrounding it. What was the user’s intent in posting the image? Are they trying to offend? Look at image tags, comments, and previous posts.
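
If your tooling allows it, surface that surrounding context in one place so the reviewer doesn’t have to hunt for it. The sketch below is a hypothetical example of bundling those signals together in Python; the field names and keyword list are assumptions, not a real schema.

    from dataclasses import dataclass, field

    @dataclass
    class ReviewContext:
        """Context worth showing a moderator next to a grey-area image.
        Field names are illustrative; use whatever your platform actually stores."""
        image_tags: list = field(default_factory=list)
        caption: str = ""
        recent_comments: list = field(default_factory=list)
        prior_violations: int = 0

    def looks_malicious(ctx: ReviewContext) -> bool:
        """Rough signal: repeat offenders or hostile wording suggest intent to offend."""
        hostile_words = {"hate", "disgusting"}   # placeholder keyword list
        text = " ".join([ctx.caption, *ctx.recent_comments, *ctx.image_tags]).lower()
        return ctx.prior_violations > 0 or any(word in text for word in hostile_words)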

While context matters, it’s also key that you remember to…

3. Be consistent when approving/rejecting images and sanctioning users

Your internal guidelines should ensure that you and your team make consistent, replicable moderation decisions. Consistency is so important because it signals to the community that 1) you’re serious about their health and safety, and 2) you’ve put real thought and attention into your guidelines.

A few suggestions for maintaining consistency:

  • Notify the community publicly if you ever change your moderation guidelines
  • Consider publishing your internal guidelines
  • Host moderator debates over challenging images and ask for as many viewpoints as possible; this will help avoid biased decision-making
  • When rejecting an image (even if the rejection is made automatically by the algorithm), send the user an automated warning message that links to your community guidelines (see the sketch after this list)
  • If a user complains about an image rejection or account sanction, take the time to investigate and fully explain why action was taken
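
For example, the automated warning mentioned in the list above might look something like this hypothetical sketch; the template wording and URL are placeholders. The point is that every rejection, whether made by a moderator or by the algorithm, triggers the same message with a link to your guidelines.

    GUIDELINES_URL = "https://example.com/community-guidelines"  # placeholder URL

    def rejection_notice(username, reason):
        """Standard warning sent for every rejected image, manual or automatic."""
        return (
            f"Hi {username}, your recent image was removed because it appears "
            f"to contain {reason}. Please review our community guidelines "
            f"before posting again: {GUIDELINES_URL}"
        )

    # Same call whether a moderator or the algorithm rejected the image:
    # send_message(user_id, rejection_notice("sam", "graphic violence"))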

Another effective way to ensure consistency is to…

4. Map out moderation workflows

Take the time to actually sketch out your moderation workflows on a whiteboard. By mapping out your workflows, you’ll notice any holes in your process.

 

[Image: Image Moderation Workflow for new users (example of a possible image moderation workflow)]

Here are just a few scenarios to consider:

  • What do you do when a user submits an image that breaks your guidelines? Do you notify them? Sanction their account? Do nothing and let them submit a new image?
  • Do you treat new users differently than returning users (see the example workflow above and the sketch after this list)?
  • How do you deal with images containing CSAM (child sexual abuse material; formally referred to as child pornography)?
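
To make the new-versus-returning-user question concrete, here is one hypothetical way the routing step of such a workflow could look in code. The thresholds and field names are assumptions for illustration, not a recommended policy.

    from collections import namedtuple

    # Minimal stand-in for whatever user record your platform keeps.
    User = namedtuple("User", ["account_age_days", "prior_violations"])

    def route_image(user, clean_scan):
        """Hypothetical routing step; clean_scan is True when the classifier
        confidently found nothing objectionable in the image."""
        if user.account_age_days < 7:      # brand-new accounts always get human review
            return "human_review"
        if user.prior_violations >= 3:     # repeat offenders are held for review
            return "human_review"
        if clean_scan:                     # established user plus a clean scan
            return "auto_approve"
        return "human_review"

    print(route_image(User(account_age_days=2, prior_violations=0), True))    # human_review
    print(route_image(User(account_age_days=400, prior_violations=0), True))  # auto_approve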

Coming across an image that contains illegal content can be deeply disturbing. That’s why you should…

5. Have a process to escalate illegal images

The heartbreaking reality of the internet is that it’s easier today for predators to share images than it has ever been. It’s hard to believe that your community members would ever upload CSAM, but it can happen, and you should be prepared.

If you have a Trust & Safety specialist, Compliance Officer, or legal counsel at your company, we recommend that you consult them for their best practices when dealing with illegal imagery. One option to consider is using Microsoft’s PhotoDNA, a free image scanning service that can automatically identify and escalate known child sexual abuse images to the authorities.
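
Whatever scanning service you choose, the escalation path itself should be simple and rehearsed before you ever need it. The sketch below is a hypothetical outline of that path in Python; the helper functions are placeholders for your own tooling, and none of it reflects PhotoDNA’s actual API.

    def quarantine(image_id):
        """Placeholder: pull the image from public view immediately."""
        print(f"quarantined {image_id}")

    def notify_escalation_contact(image_id):
        """Placeholder: alert your Trust & Safety specialist, Compliance Officer,
        or legal counsel."""
        print(f"escalation contact notified about {image_id}")

    def file_external_report(image_id):
        """Placeholder: report the image to the authorities through whatever
        channel your scanning service or jurisdiction requires."""
        print(f"external report filed for {image_id}")

    def handle_suspected_csam(image_id, hash_match_found):
        """Hypothetical escalation path. hash_match_found would come from a
        scanning service such as PhotoDNA; this is not its real API."""
        if not hash_match_found:
            return "continue_normal_moderation"
        quarantine(image_id)
        notify_escalation_contact(image_id)
        file_external_report(image_id)
        return "escalated"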

You may never find illegal content on your platform, but having an escalation process will ensure that you’re prepared for the worst-case scenario.

On a related note, make sure you’ve also created a wellness plan for your moderators. We’ll be discussing individual wellness plans — and other best practices — in more depth in our Image Moderation 101 webinar on August 22nd. Register today to save your seat for this short, 20-minute chat.

 

Photo by Leah Kelley from Pexels
