Online communities have unlimited potential to be forces for positive change.
Too often we focus on the negative aspects of online communities. How many articles have been written about online toxicity and rampant trolling? It’s an important topic, and one we should never shy away from discussing, but for all the toxicity in the online world, there are countless acts of kindness and generosity that go overlooked.
There are a few steps that Community Managers can take to reinforce and reward positive behavior in their communities:
Promote and reinforce community guidelines. Before you can begin to champion positive behavior, make sure your expectations are clearly outlined in your code of conduct. It’s not enough to say that you don’t allow harassment; if you want to prevent abuse, you have to provide a clear definition of what abuse actually entails.
An often-cited study measured the effects of boundaries on children’s play. In one playground, students were provided with a vast play area, but no fences. They remained clustered around their teacher, unsure how far they could roam, uncertain of appropriate behavior. In another playground, children were given the same amount of space to play in, but with one key difference: a fence was placed around the perimeter. In the fenced playground, the children confidently spread out to the edges of the space, free to play and explore within the allotted area.
The conclusion? We need boundaries. Limitations provide us with a sense of security. If we know how far we can roam, we’ll stride right up to that fence.
Online communities are the playgrounds of the 21st century—even adult communities. Place fences around your playground, and watch your community thrive.
The flip side of building fences is that some people won’t just stride right up to the fence; they’ll kick it until it falls over. (Something tells us this metaphor is getting away from us…) When community members choose to ignore community guidelines and engage in dangerous behavior like harassment, abuse, and threats, it’s imperative that you take action. Taking action doesn’t have to be draconian; there are innovative techniques that go beyond simply banning users.
Some communities have experimented with displaying warning messages to users who are about to post harmful content. Riot Games has conducted fascinating research on this topic. They found that positive in-game messaging reduced offensive language by 62%.
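To make the idea concrete, here’s a minimal sketch of a pre-post warning flow. Everything in it is illustrative: `classify_risk` is a hypothetical stand-in for whatever moderation model or word filter your platform actually uses, and the threshold is invented.

```python
# Illustrative sketch of a pre-post warning check (not a real API).
# classify_risk is a hypothetical placeholder for your platform's
# actual moderation model or word filter.

def classify_risk(message: str) -> float:
    """Hypothetical classifier returning a risk score from 0.0 to 1.0."""
    flagged_terms = {"idiot", "trash", "loser"}  # toy example list
    words = set(message.lower().split())
    return 1.0 if words & flagged_terms else 0.0

WARNING = (
    "This message may violate our community guidelines. "
    "Are you sure you want to post it?"
)

def handle_post(message: str, confirmed: bool = False) -> str:
    """Warn on risky content; only post once the user confirms."""
    if classify_risk(message) > 0.8 and not confirmed:
        return WARNING  # give the user a chance to reconsider
    return "posted"
```

The key design choice is the pause for reflection: the user isn’t blocked outright, just asked to reconsider, which is the mechanism Riot’s research suggests can reduce offensive language.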
For users who repeatedly publish dangerous content, an escalated ban system can be useful. On their first offense, send them a warning message. On their second, mute them. On their third, temporarily ban their account, and so on.
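A minimal sketch of that ladder, under the assumption that you track offenses per user; the rung names and counts here are placeholders to tune for your own community:

```python
# Illustrative escalation ladder: warn -> mute -> temp ban -> perma ban.
# Penalties and thresholds are placeholders, not a prescription.

ESCALATION_LADDER = [
    "warn",       # 1st offense: warning message
    "mute",       # 2nd offense: mute in chat
    "temp_ban",   # 3rd offense: temporary account ban
    "perma_ban",  # 4th offense and beyond: permanent ban
]

offense_counts: dict[str, int] = {}

def next_action(user_id: str) -> str:
    """Record an offense and return the penalty for this rung."""
    count = offense_counts.get(user_id, 0) + 1
    offense_counts[user_id] = count
    # Clamp to the last rung once the ladder is exhausted.
    return ESCALATION_LADDER[min(count, len(ESCALATION_LADDER)) - 1]
```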
Every community has to design a moderation flow that works best for them.
Harness the power of user reputation and behavior-based triggers. These techniques rely on features that are unique to Community Sift, but the underlying ideas are valuable for any community.
Toxic users tend to leave signatures behind. They may have their good days, but most days are bad, and consistently so. On the whole, these users tend to use the same language and indulge in the same antisocial behavior from one session to the next.
The same goes for positive users. They might have a bad day now and then; maybe they drop the stray F-bomb. But all in all, most sessions are positive, healthy, and in line with your community guidelines.
What if you could easily identify your most negative and most positive users in real time? And what if you could measure their behavior over time, instead of over a single play session? With Community Sift, all players start out neutral, since we haven’t identified their consistent behavior yet. Over time, the more they post low-risk content, the more “trusted” they become. Trusted users are subject to a less restrictive content filter, allowing them greater expressiveness and freedom. Untrusted users are given a more restrictive content filter, limiting their ability to manipulate the system.
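Here’s a rough sketch of how a trust-based filter like that might work. To be clear, the scores, thresholds, and filter levels below are invented for illustration; they are not Community Sift’s actual model.

```python
# Illustrative trust-score model: low-risk posts slowly build trust,
# high-risk posts erode it faster. All numbers are made up.

class UserTrust:
    def __init__(self) -> None:
        self.score = 0.0  # every user starts out neutral

    def record_message(self, risk: float) -> None:
        """Update trust based on the risk score of one message."""
        self.score += 0.1 if risk < 0.2 else -0.5
        self.score = max(-5.0, min(5.0, self.score))  # keep score bounded

    @property
    def filter_level(self) -> str:
        """Map accumulated trust to a content-filter strictness."""
        if self.score >= 3.0:
            return "relaxed"   # trusted: less restrictive filter
        if self.score <= -3.0:
            return "strict"    # untrusted: more restrictive filter
        return "standard"      # neutral: default filter
```

Note the asymmetry: trust is earned slowly and lost quickly, which makes the system harder to manipulate with a short burst of good behavior.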
You can choose to notify users when their chat permissions have been opened up or restricted, signaling to your most positive members that their behavior will be rewarded.
Publicly celebrate positive users. Community managers and moderators should go out of their way to highlight users who exhibit positive behavior. For a forum or comments section, that could mean upvoting or commenting on their posts. In a game with chat, that could look like publicly thanking positive users, or even providing in-game rewards like items or currency to players who follow the guidelines.
We believe that everyone should be free to share without fear of harassment or abuse. We think that most people tend to agree. But there’s more to stopping online threats than just identifying the most dangerous content and taking action on the most negative users. We have to recognize and reward positive users as well.
Originally published on Quora