CEO Chris Priebe founded Two Hat Security in 2012 with a big goal: to protect people of all ages from online bullying. Over the last six years, we’ve had the opportunity to help some of the world’s largest online games, virtual worlds, and messaging apps grow healthy, engaged communities on their platforms.
Organizations like The Cybersmile Foundation provide crucial services, including educational resources and 24-hour global support, to victims of cyberbullying and online abuse.
But what about the platforms themselves? What can online games and social networks do to prevent cyberbullying from happening in the first place? And how can community managers play their part?
In honour of #StopCyberbullyingDay 2018, an event we officially support, today we are sharing our top three techniques that community managers can implement to stop cyberbullying and abuse in their communities.
1. Share community guidelines.
Clear community standards are the building blocks of a healthy community. Sure, they won’t automatically prevent users from engaging in toxic or disruptive behaviour, but they go a long way in setting language and behaviour expectations up front.
Post guidelines where every community member can see them. For a forum, pin a “Forum Rules, Read Before Posting” post at the top of the page. For comment sections, include a link or popup next to the comment box. Online games can even embed code of conduct reminders within their reporting feature. Include consequences — what can users expect to happen if policies are broken?
Don’t just include what not to do — include tips for what you would like your users to do, as well. Want the community to encourage and support each other? Tell them!
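If your platform is custom-built, one practical pattern is to keep the guidelines in a single structured source and render them on every surface at once: the pinned post, the comment box, the report dialog. Here’s a minimal Python sketch of that idea; the rules, consequences, and function names are invented for illustration:

```python
# Hypothetical guidelines store: one source of truth, rendered everywhere.
GUIDELINES = [
    {"rule": "Encourage and support other members.", "type": "do"},
    {"rule": "No personal attacks or harassment.", "type": "dont",
     "consequence": "24-hour mute on first offence, ban on repeat offences"},
]

def pinned_post() -> str:
    """Render the 'Forum Rules, Read Before Posting' sticky."""
    lines = ["Forum Rules, Read Before Posting"]
    for g in GUIDELINES:
        lines.append(f"- {g['rule']}")
        if "consequence" in g:
            lines.append(f"  If broken: {g['consequence']}")
    return "\n".join(lines)

def report_dialog_reminder() -> str:
    """Short code-of-conduct reminder embedded in the reporting feature."""
    donts = " ".join(g["rule"] for g in GUIDELINES if g["type"] == "dont")
    return f"Reminder: {donts}"
```

Because every surface reads from the same source, updating a rule once updates it everywhere, and the consequences stay consistent across the forum, the comment box, and the report flow.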
2. Use proactive moderation.
Once community standards are clearly communicated, community managers need a method to filter, escalate, and review abusive content.
Often, that means choosing the right moderation software. Proactive moderation means filtering cyberbullying and abuse before it ever reaches the community. Most community managers rely on either a simple profanity filter or a dedicated content moderation tool. Profanity filters use a strict blacklist/whitelist to detect harassment, but they’re not sophisticated or accurate enough to understand context or nuance, and some only work for English.
Instead, find a content moderation tool that can accurately identify cyberbullying, remove it in real time, and ultimately prevent users from experiencing abuse.
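To make the limitation concrete, here’s a toy version of the substring blacklist behind many basic profanity filters; the word list and messages are invented:

```python
# Illustrative only: an exact substring blacklist, the approach behind
# many basic profanity filters. It has no sense of context or intent.
BLACKLIST = {"loser", "trash"}

def blacklist_filter(message: str) -> bool:
    """Return True if the message would be blocked."""
    lowered = message.lower()
    return any(word in lowered for word in BLACKLIST)

print(blacklist_filter("you are a l0ser"))            # False: "0" for "o" slips past
print(blacklist_filter("gg! no trash talk, please"))  # True: blocks a friendly request
```

A context-aware tool has to handle both cases: catch the disguised insult and let the friendly message through.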
Of course, platforms should still always have a reporting system. But proactive moderation means that users only have to report questionable, “grey-area” content or false positives, instead of truly damaging content like extreme bullying and hate speech.
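In practice, that routing can be as simple as a confidence threshold: near-certain abuse is filtered in real time, grey-area content is held for human review, and everything else posts immediately. The sketch below assumes a classifier that returns an abuse score between 0 and 1; the classify() stub, phrases, and thresholds are stand-ins, not any vendor’s actual product:

```python
from collections import deque

BLOCK_THRESHOLD = 0.9   # near-certain abuse: filtered before anyone sees it
REVIEW_THRESHOLD = 0.5  # grey area: held for a human moderator

review_queue: deque = deque()

def classify(message: str) -> float:
    """Stand-in scorer; a real tool would use a context-aware model."""
    hostile = {"nobody likes you": 0.95, "you played terribly": 0.6}
    lowered = message.lower()
    return max((s for phrase, s in hostile.items() if phrase in lowered),
               default=0.0)

def handle_message(user: str, message: str) -> str:
    score = classify(message)
    if score >= BLOCK_THRESHOLD:
        return "blocked"          # removed in real time, no report needed
    if score >= REVIEW_THRESHOLD:
        review_queue.append((user, message, score))
        return "held for review"  # humans (and user reports) judge the grey area
    return "published"
```

Notice what lands in the review queue: only the ambiguous middle band, which is exactly the content user reports are good at catching.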
3. Reward positive users.
A positive user experience leads to increased engagement, loyalty, and profits.
Part of a good experience involves supporting the community’s code of conduct. Sanctioning users who post abusive comments or attack other community members is an essential technique in proactive moderation.
But with so much attention paid to disruptive behaviour, positive community members can start to feel like their voices aren’t heard.
That’s why we encourage community managers to reinforce positive behaviour by rewarding power users.
Emotional rewards add a lot of value, cost nothing, and take very little time. Forum moderators can upvote posts that embody community standards. Community managers can comment publicly on encouraging or supportive posts. Mods and community managers can even send private messages to users who contribute to community health and well-being.
Social rewards like granting access to exclusive content and achievement badges work, too. Never underestimate the power of popularity and peer recognition when it comes to encouraging healthy behaviour!
When choosing a content moderation tool to aid in proactive moderation, look for software that measures user reputation based on behaviour. This added technology takes the guesswork and manual review out of identifying positive users.
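As a rough illustration, behaviour-based reputation can be as simple as a running score per user that rises with positive contributions, falls with filtered or reported content, and decays over time so recent behaviour counts most. The event weights, decay factor, and reward threshold below are invented for the example:

```python
from dataclasses import dataclass

# Hypothetical weights: positive contributions raise reputation,
# filtered or confirmed-abusive content lowers it.
EVENT_WEIGHTS = {
    "post_filtered": -5.0,
    "report_upheld": -10.0,
    "post_praised_by_mod": 2.0,
    "helpful_reply": 1.0,
}

@dataclass
class Member:
    name: str
    reputation: float = 0.0

    def record(self, event: str) -> None:
        self.reputation += EVENT_WEIGHTS[event]

    def decay(self, factor: float = 0.99) -> None:
        """Run nightly so old behaviour matters less than recent behaviour."""
        self.reputation *= factor

def reward_candidates(members: list[Member],
                      threshold: float = 20.0) -> list[Member]:
    """Surface consistently positive users without manual review."""
    return [m for m in members if m.reputation >= threshold]
```

With a score like this in place, finding users to thank, upvote, or badge becomes a query instead of a manual hunt through post histories.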
The official #StopCyberbullyingDay takes place once a year, on the third Friday in June. But for community managers, moderators, and anyone who works with online communities (including those of us at Two Hat Security), protecting users from bullying and harassment is a daily task. Today, start by choosing one of our three healthy community-building recommendations, and watch your community thrive.
After all, doesn’t everyone deserve to share online without fear of harassment or abuse?