Adding Chat to Your Online Platform? First Ask Yourself These 4 Critical Questions

Want to retain users and lower the cost of acquisition on your platform? In 2018, social features including chat, private messaging, usernames, and user profiles are all must-haves in an overstuffed market where user retention is critical to long-term success. Nothing draws a crowd like a crowd — and a crowd of happy, loyal, and welcoming users will always bring in more happy, loyal, and welcoming users.

But there will always be risks involved when adding social features to your platform. A small percentage of users will post unwanted content like hate speech, NSFW images, or abusive language, all of which can cause serious damage to your brand’s reputation.

So while social features are must-haves in 2018, understanding and mitigating the risks inherent in adding those features is equally important.

If you’re just getting started with chat moderation (and even if you’ve been doing it for a while), here are four key questions to ask.

1. How much risk is my platform/brand willing to accept?

Every brand is different. Your community’s demographics will usually be a major factor in determining your risk tolerance.

For instance, communities with users under 13 in the US have to be COPPA compliant, so preventing users from sharing PII (personally identifiable information) is essential. Edtech platforms have to mitigate risk by ensuring that they’re CIPA and FERPA compliant.

With legal ramifications to consider, platforms designed for young people will always be far more risk-averse than brands marketed towards more mature audiences.

However, many established brands, even those that cater to adult audiences, will likely be less tolerant of risk than smaller or newer organizations.

Consider your brand’s tone and history. Review your corporate guidelines to understand what your brand stands for. This is a great opportunity to define exactly what kind of an online community you want to create.

2. What kind of content is most dangerous to my platform/brand?

Try this exercise: Imagine that one item (say, a forum post or profile pic) containing pornography was posted on your platform. How would it affect the brand? How would your audience react to seeing pornography on your platform? How would your executive team respond? What would happen if the media/press found out?

Same with PII — for a brand associated with children or teens, this could be monumental. (And if it happens on a platform aimed at users under 13 in the US, a COPPA violation can lead to potentially millions of dollars in fines.)

What about hate speech? Sexual harassment? What is your platform/brand’s definition of abuse or harassment? The better you can define these terms in relation to your brand, the better you will understand what kind of content you need to moderate.

3. How will I communicate my expectations to the community?

Don’t expect your users to automatically know what is and isn’t acceptable on your platform. Post your community guidelines where users can see them. Make sure users have to agree to your guidelines before they can post.

In a recent blog for CMX, Two Hat Director of Community Trust & Safety Carlos Figueiredo explores writing community guidelines you can stick to. In it, he provides an engaging framework for everything from creating effective guidelines from the ground up, to collaborating with your production team to create products that encourage healthy interactions.

4. What tools can I leverage to manage risk and enforce guidelines in my community?

We recommend taking a proactive instead of a reactive approach to managing risk. What does that mean for chat moderation? First, let’s look at the different kinds of chat moderation:

  • Live moderation: Moderators follow live chat in real time and take action as needed. High risk, very expensive, and not a scalable solution.
  • Pre-moderation: Moderators review, then approve or reject all content before it’s posted. Low risk, but slow, expensive, and not scalable.
  • Post-moderation: Moderators review, then approve or reject all content after it’s posted. High-risk option.
  • User reports: Moderators depend on users to report content, then review and approve or reject. High-risk option.

On top of these techniques, there are also different tools you can use to take a proactive approach, including in-house filters (read about the build-internally-vs-buy-externally debate) and content moderation solutions like Two Hat’s Community Sift (learn about the difference between a simple profanity filter and a content moderation tool).

So what’s the best option?

Regardless of your risk tolerance, always use a proactive filter. Content moderation solutions like Two Hat’s Community Sift can be tuned to match your risk profile. Younger communities can employ a more restrictive filter, and more mature communities can be more permissive. You can even filter just the topics that matter most. For example, mature communities can allow sexual content while still blocking hate speech.

By using a proactive filter, you’ve already applied the first layer of risk mitigation. After that, we recommend using a blend of all four kinds of moderation, based on your brand’s unique risk tolerance. Brands that are less concerned about risk can rely mostly on user reports, while more risk-averse platforms can pre- or post-moderate content they deem potentially risky but not risky enough to filter automatically.
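To make that blend concrete, here is a minimal sketch in Python of how a platform might layer a proactive topic filter over the other moderation modes. This is not Community Sift’s actual API; the policy fields, topic labels, and routing rules are hypothetical illustrations of the approach described above.

```python
# A hypothetical sketch of a blended moderation pipeline (not Community Sift's API).
# Topic labels, policy fields, and routing rules are illustrative assumptions only.

from dataclasses import dataclass, field
from enum import Enum


class Action(Enum):
    ALLOW = "allow"                # publish immediately; user reports are the safety net
    QUEUE_PRE = "queue_pre"        # hold for moderator review before publishing
    PUBLISH_POST = "publish_post"  # publish now, flag for moderator review later
    BLOCK = "block"                # proactive filter rejects it outright


@dataclass
class Policy:
    """A community's risk profile: which topics to block and how cautious to be."""
    blocked_topics: set = field(default_factory=set)  # always filtered out
    review_topics: set = field(default_factory=set)   # risky, but not auto-blocked
    pre_moderate: bool = True                          # risk-averse: review before posting


def moderate(message_topics: set, policy: Policy) -> Action:
    """Layer 1: proactive filter. Layer 2: route remaining risk by tolerance."""
    if message_topics & policy.blocked_topics:
        return Action.BLOCK
    if message_topics & policy.review_topics:
        return Action.QUEUE_PRE if policy.pre_moderate else Action.PUBLISH_POST
    return Action.ALLOW


# A community for younger users blocks more topics and pre-moderates the rest...
kids_policy = Policy(
    blocked_topics={"hate_speech", "sexual_content", "pii"},
    review_topics={"bullying"},
    pre_moderate=True,
)

# ...while a mature community can allow sexual content but still block hate speech.
mature_policy = Policy(
    blocked_topics={"hate_speech"},
    review_topics={"harassment"},
    pre_moderate=False,
)

print(moderate({"sexual_content"}, kids_policy))    # Action.BLOCK
print(moderate({"sexual_content"}, mature_policy))  # Action.ALLOW
print(moderate({"harassment"}, mature_policy))      # Action.PUBLISH_POST
```

The key design point is that the filter decision happens before content ever reaches the community; pre-moderation, post-moderation, and user reports then only have to handle whatever the filter deliberately lets through.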

Once you understand and can articulate your platform/brand’s risk tolerance, you can start to build Terms of Use and community guidelines around it. Display your expectations front and center, use proven tools and techniques to manage risk, and you’ll be well on your way to building a healthy, thriving, and engaged community of users — all without putting your brand’s reputation at risk.

Now, with your brand protected, you can focus on user retention and revenue growth.

Online Moderators: Ten Simple Steps to Decrease Your Stress

As a community manager or content moderator, you experience the dark side of the internet every day. Whether you are reviewing chat, social media, forum comments, or images, high-risk content can be draining — and you may not even realize the damage it’s doing.

Studies show that community teams on the front lines of chat, image, or video moderation are especially vulnerable to stress-related symptoms including depression, insomnia, vicarious trauma (also known as “compassion fatigue”), and even PTSD. It’s critical that you have the right tools and techniques at your disposal to support your mental health.

Therapist and wellness trainer Carol Brusca recently hosted a “Stress, Wellness, and Resilience” training session for Two Hat’s clients and partners. Here are her top 10 wellness tips for online moderators and community managers:

Talk to someone.

Having and using social supports is the number one indicator of resilience. Asking for help from someone who cares about you is a wonderful way to get through a difficult time.

Does your company health plan provide access to a mental health professional? Take advantage of it. There’s no shame in talking to a therapist. Sometimes, talking to a stranger can be even more effective than confiding in a loved one.

Learn to say no.

If we do not set boundaries with others, we can find ourselves feeling stressed out and overwhelmed. If you notice this might be a habit for you, try saying “no” once a day and see if you begin to feel better.

Of course, saying “no” at work isn’t always an option. But if you’re spending too much time reviewing high-risk content, talk to your manager. Ask if you can vary your tasks; instead of spending all of your workday reviewing user reports, break up the day with 15-minute gameplay breaks. Check out our blog post and case study about different moderation techniques you can use to avoid chat moderation burnout.

Go easy on yourself.

We are quick to criticize ourselves for what we have done wrong, but not as likely to give ourselves credit for what went right or for all the things we did well.

Remember that you work hard to ensure that your online community is healthy, happy, and safe. Pat yourself on the back for a job well done, and treat yourself to some self-care.

Remember, this too will pass.

There are very few situations or events in our lives that are forever. Try repeating this mantra during a stressful time: this struggle will pass. It will make getting through that time a little easier.

(Maybe just repeat it silently in your head. Your co-workers will thank you.)

Get plenty of sleep.

We need sleep to replenish and rejuvenate. Often when we are feeling stressed, we struggle with sleeping well. If this happens to you, make sure your bedroom is dark and cool; try some gentle music to help you get to sleep, or use an app that plays soothing sounds on a loop. If staying asleep is the problem, try having a notepad and pen by your bed to write down your worries as they come up.

Pro tip: Save the marathon 3:00 am Fortnite sessions for the weekend.

Have a hobby.

Having a hobby is a great distraction from the stressors of everyday life. If you can do something outside, all the better. For many people, being in nature automatically decreases stress.

Or, stick to video games. Playing Tetris has been shown to help people who have experienced trauma.

Drink tea.

A large dose of caffeine causes a short-term spike in blood pressure. It may also cause your hypothalamic-pituitary-adrenal axis to go into overdrive. Instead of coffee or energy drinks, try green tea.

We know that the smell of a freshly-brewed pot of coffee is like catnip to most moderators… but hear us out. Green tea has less than half the caffeine of coffee and contains healthy antioxidants, as well as theanine, an amino acid that has a calming effect on the nervous system.

Laugh it off.

Laughter releases endorphins that improve mood and decrease levels of the stress-causing hormones cortisol and adrenaline. It literally tricks your nervous system into making you happy. Try a comedy movie marathon or a laughter yoga class (this is a real thing!).

And hey, a 10-minute meme break never hurt anyone.

Exercise.

Getting plenty of exercise will decrease stress hormones and increase endorphins, leaving you feeling more energized and happier.

Ever had a 30-second, impromptu dance party at your desk?

No, really!

Often referred to as the “stress hormone,” cortisol is released in our bodies when we’re under pressure. Excess cortisol can cause you to feel stress, anxiety, and tension. Exercise brings your cortisol levels back down to normal, allowing you to relax and think straight again.

So crank up a classic, stand up… and get down.

Try the “3 Good things” exercise.

Each night, write down three good things that happened during the day. This practice makes you shift your perspective to more positive things in your life — which in turn can shift your mood from stressed to happy…

… even if the three good things are tacos for lunch, tacos for 2 pm snack, and tacos for 4 pm snack. Good things don’t have to be earth-shattering. Gratitude comes in all sizes.

So, whether you’re sipping a mug of green tea, talking to a professional, or shaking your groove thing in the name of science and wellness, never forget that a little self-care can go a long way.

At Two Hat, we empower gaming and social platforms to build healthy and engaged online communities with our content filter and automated moderation software Community Sift — and that can’t be done without healthy and engaged community teams.