Building healthy and safe digital spaces begins with healthy community managers and moderators. We need to help community managers stay mindful and take care of their mental health, as they are often exposed to some of the worst of the internet on a daily basis.
Occupational burnout is an all-too-common result, one that we, as an industry, must highlight and work to change. Identifying job stress and giving employees the flexibility to prioritize their wellbeing improves our communities.
We suggest that companies encourage community managers to follow these 5 tips to prioritize their wellness and resilience:
1 – Create a wellness plan
Community managers are often repeatedly exposed to the worst online behaviors and are left feeling emotionally drained at the end of the workday. A wellness plan helps them manage their stress and mentally recharge. This actionable set of activities helps community managers to take wellness breaks throughout the day and to create a buffer between work and their personal lives. Whether it’s taking a walk outside, listening to music, meditating, talking to family or friends, a wellness plan can help community managers decompress before transitioning to the next moment of their day.
2 – Leverage AI Plus
Community managers monitor for hate speech, graphic images, and other types of high-risk content. Prolonged exposure to traumatic content can severely impact an individual’s mental health and wellbeing. As a result of consistent exposure to traumatic content, content moderators can develop symptoms of PTSD, including insomnia, nightmares, anxiety, and auditory hallucinations.
By proactively leveraging technology to filter content, reducing the exposure to human moderators, our partners have reduced the workload of their community managers by as much as 88%*. This gives community managers more time to focus on other aspects of their job and protects their wellbeing by minimizing the amount of time they’re exposed to high-risk content.
3 – Be mindful of the types of content you’re moderating for
Rotating the types of content each team member monitors can help alleviate the negative impact that constant exposure to a single focus area may cause. Threats of harm and self-harm, racism, sexism, predatory behavior, and child grooming are just a few of the types of content community managers monitor for and are exposed to daily.
4 – Focus on the positive
Most chat, images, and videos in online communities are aligned with the intended experiences of those products. In our experience, about 85% of user-generated content across different verticals is what we classify as low-risk, very positive behavior. Think community members discussing matters pertinent to the community, sharing their hobbies and passions, or posting pictures of their pets. Focusing on the positive side of your community will help you keep this reality in mind and remember why you do what you do every day.
One way to focus on the positive aspects of your community is to spend time in your product, observing how community members engage and express their creativity and passion. Make a point of doing this at least once a week with the intent of focusing on the positive side of the community. Similarly, if you leverage a classification and filtering system like Community Sift, you should dedicate time to reading chat logs that are positive. After either of these activities, write down and reflect on 3 to 5 things that were meaningful to you.
5 – Remember you’re making an impact
Monitoring an endless stream of high-risk content can make community managers feel like their work isn’t making an impact. That couldn’t be further from the truth. Their work directly contributes to the health and safety of social and online play communities. When community managers identify a self-harm threat or protect children from predators, they are making an immediate impact on the life of that individual. Beyond monitoring content, community managers help ensure that users have a positive and happy experience when engaging with the platform.
According to a 2020 survey conducted by the Anti-Defamation League, 81 percent of U.S. adults aged 18-45 who played online multiplayer games experienced some form of harassment. Approximately 22% of those community members went on to quit an online platform because of the harassment they experienced. Harassment is an issue actively driving community members away from engaging with their favorite platforms. By helping to create a safe and healthy space, community managers are creating an environment where individuals can make friends, feel like they belong to a community, and have overall positive social interactions without the fear of harassment – while also helping drive the success of the community and overall acquisition and retention metrics. A true win-win.
Help protect the well-being of your community managers. Request a demo today to see how Two Hat’s content moderation platform can reduce your community manager’s workload and exposure to harmful content.
* Two Hat Customer analysis, 2020