How We Maintain a Chat Filter in a Changing World

Have you ever wondered how Two Hat maintains a dynamic and up-to-date chat filter in an ever-evolving world, in 20 languages?

In light of recent global events, including the COVID-19 pandemic and the #BlackLivesMatter movement, we wanted to provide insight into our internal process.

We spoke to Portuguese Language Specialist Richard Amante, a journalist with 15 years of international experience, about how he and the team stay on top of the latest news.

As Richard explains, the Language & Culture team researches trending topics and new linguistic expressions and adds them to the dictionary in multiple languages. It doesn’t end there.

After the team adds new expressions to the chat filter, they review and adjust how those expressions are used across different communities and in different contexts. That includes finding and fixing false positives (low-risk content incorrectly flagged as high-risk) and false negatives (you guessed it — high-risk content incorrectly flagged as low-risk), detecting new and unexpected linguistic patterns, and upgrading and downgrading the riskiness of a phrase based on cultural shifts.
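Conceptually, that tuning loop can be sketched as a phrase dictionary whose risk levels moderators adjust over time. The risk levels, phrases, and function names below are invented for illustration and do not reflect Two Hat's actual data model:

```python
# Illustrative sketch of a tunable phrase-risk dictionary.
# Risk levels and entries are invented for this example.

LOW, MEDIUM, HIGH = 1, 2, 3

risk_dictionary = {
    "new slang phrase": MEDIUM,  # added when a topic starts trending
}

def flag(message: str) -> int:
    """Return the highest risk level of any known phrase in the message."""
    text = message.lower()
    return max(
        (level for phrase, level in risk_dictionary.items() if phrase in text),
        default=LOW,
    )

def adjust(phrase: str, level: int) -> None:
    """Upgrade or downgrade a phrase after reviewing how communities use it."""
    risk_dictionary[phrase.lower()] = level

# A false positive is fixed by downgrading the phrase's risk;
# a false negative by upgrading it.
adjust("new slang phrase", LOW)  # reviewed: harmless in this community
```

In this toy model, a cultural shift that makes a phrase more (or less) harmful is a single `adjust` call; the real work is the human judgment behind it.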

Especially today, in a world where so many of us are stuck at home, the lines separating our “real life” from our “online life” are blurry, if they exist at all. Whether we’re forming a clan in a multiplayer game or commenting on a friend’s profile pic, we’re having conversations online about everything, from the latest news to pop culture.

You might be wondering — why does this matter? It’s simple. Our clients want to know what their players and users are talking about, and more importantly, they want to keep them safe from abuse, hate speech, and harassment — while still promoting healthy conversations.


Every online community has different standards of what a healthy conversation looks like. An edtech platform designed for 7-12-year-olds will likely have a different standard from a social network catering to millennials. Here is the great power of Two Hat’s Community Sift: While the Language & Culture team provides a baseline by adding new words and phrases to the dictionary, clients can augment those words and phrases in real-time, as needed.
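That baseline-plus-override layering can be sketched in a few lines. The phrases, risk labels, and function below are hypothetical, purely to show the idea of client settings taking precedence over a shared baseline:

```python
# Sketch of a shared baseline dictionary with per-client overrides.
# All phrases and risk labels here are invented for illustration.

BASELINE = {"scary phrase": "high", "mild tease": "low"}

def effective_risk(phrase: str, client_overrides: dict) -> str:
    """Client-specific settings win over the shared baseline."""
    return client_overrides.get(phrase, BASELINE.get(phrase, "unknown"))

# An edtech platform for 7-12-year-olds tightens the baseline...
edtech = {"mild tease": "high"}
# ...while a social network catering to adults might relax it.
adult_network = {"scary phrase": "low"}
```

The design choice this illustrates: the Language & Culture team maintains one baseline, while each client adjusts it to their own community standard without touching anyone else's settings.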

In this way, we work closely with our clients to maintain a living, breathing global chat filter in an ever-changing world.

Want to learn how Two Hat can help you maintain a safe and healthy online community? Request a demo today.


Tech Perspectives: Surpassing 100 billion online interactions in a month

In 2020, social platforms that wish to expand their product and scale their efforts are faced with a critical decision — how will they automate the crucial task of content moderation? As platforms grow from hundreds to thousands to millions of users, that means more usernames, more live chat, and more comments, all of which require some form of moderation. From app store requirements to legal compliance with global legislation, ensuring that all user-generated content is aligned with community guidelines is nothing short of an existential matter.

When it comes to making a technical choice for a content moderation platform, what I hear in consultations and demos can be distilled down to this: engineers want a solution that’s simple to integrate and maintain, and that can scale as their product scales. They are also looking for a solution that’s battle-tested and allows for easy troubleshooting — and that won’t keep them up at night with downtime issues!

“Processing 100 billion online interactions in one month is technically hard to achieve. That is not simply taking a message and passing it on to users, but doing deep textual analysis for over 3 million patterns of harmful things people can say online. It includes building user reputation and knowing if the word on the line above, mixed with this line, is also bad. Just trying to maintain user reputation for that many people is a very large technical challenge. And to do it all in 20 milliseconds per message is incredible.”

Chris Priebe, Two Hat’s CEO and Founder

Surpassing 100 Billion Online Interactions in a Month
I caught up with Laurence Brockman, Two Hat’s Vice President of Core Services, and Manisha Eleperuma, our Manager of Development Operations, just as we surpassed the mark of 100 billion human interactions processed in one month.

I asked them about what developers value in a content moderation platform, the benefits of an API-based service, and the technical challenges and joys of safeguarding hundreds of millions of users globally.

Carlos Figueiredo: Laurence, 100 billion online interactions processed in one month. Wow! Can you tell us about what that means to you and the team, and the journey to getting to that landmark?

“At the core, it’s meant we were able to keep people safe online and let our customers focus on their products and communities. We were there for each of our customers when they needed us most.”

Laurence Brockman: The hardest part for our team was the pace of getting to 100 billion. We tripled the volume in three months! When trying to scale and process that much data in such a short period, you can’t cut any corners. And you know what? I’m pleased to say that it’s been business as usual, even with this immense spike in volume. We took preventative measures along the way and focused on key areas to ensure we could scale. Don’t get me wrong, there were a few late nights and a week of crazy refactoring a system, but our team and our solution delivered. I’m very proud of the team and how they dug in, identified potential problem areas, and jumped right in. At 100 billion, minor problems can become major problems, and our priority is to ensure our system is ready to handle those volumes.

“What I find crazy is our system is now processing over 3 billion events every day! That’s six times the volume of Twitter.”
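The scale behind those quotes is easy to sanity-check with back-of-the-envelope arithmetic:

```python
# Back-of-the-envelope arithmetic for the volumes quoted above.

events_per_day = 3_000_000_000
events_per_second = events_per_day / 86_400  # 86,400 seconds in a day
# ≈ 34,722 events per second, sustained around the clock

events_per_month = events_per_day * 31  # ≈ 93 billion, in line with the
                                        # 100-billion-a-month milestone

# At 20 ms of processing per message, handling ~35,000 messages per
# second implies roughly 700 messages in flight at any given moment
# (34,722 msgs/s × 0.020 s ≈ 694) — i.e. heavy parallelism is required.
concurrent_messages = events_per_second * 0.020
```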

CF: Manisha, what are the biggest challenges and joys of running a service that safeguards hundreds of millions of users globally?

Manisha Eleperuma: I would start with the joys. I personally feel really proud to be part of making the internet a safer place. The positive effect that we can have on an individual’s life is immense. We could be stopping a kid from harming themselves, saving them from a predator, or stopping a friendly conversation from turning into a cold battle of hate speech. This is possible because of the safety net that our services provide to online communities. It is also very exciting to have some of the technology giants and leaders in the entertainment industry using our services to safeguard their communities.

It is not always easy to provide such top-notch service, and it definitely has its own challenges. As an Engineering group, we maintain a massive, complex system and keep it up and running with almost zero downtime. We are equipped with monitoring tools to check the system’s health, and engineers have to be vigilant for alerts triggered by these tools and promptly act upon any anomalies in the system, even during non-business hours. A few months ago, when the pandemic was starting to affect the world, the team could foresee an increase in transactions that could potentially start hitting our system.

“This allowed the team to get ahead of the curve and pre-scale some of the infrastructure components to be ready for the new wave so that when traffic increases, it hits smoothly without bringing down the systems.”

Another strenuous exercise that the team often goes through is to maintain the language quality of the system. Incorporating language-specific characteristics into the algorithms is challenging, but exciting to deal with. 

CF: Manisha, what are the benefits of using an API-based service? What do developers value the most in a content moderation platform?

ME: In our context, when Two Hat’s Community Sift performs as a classification tool for a customer, all transactions happen via customer APIs. Each customer API, based on the customer’s requirements, can access different components of our platform without much hassle. For example, certain customers rely on getting the player/user context, their reputation, etc. The APIs that they use to communicate with our services are easily configurable to fetch all that information from the internal context system, without extra implementation on the customer’s end.
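As a rough illustration, a server-to-server classification call might look something like the sketch below. The field names and response shape are invented for this example and are not Two Hat's actual API contract:

```python
# Hypothetical shape of a server-to-server classification call.
# Field names and values are invented for illustration; they are
# not Two Hat's actual API.
import json

request = {
    "message": "hello everyone!",
    "user_id": "player-123",
    "include_context": True,     # opt in to conversation context
    "include_reputation": True,  # opt in to user reputation
}

# A hypothetical response: the classification plus the optional
# extras the customer asked for, with no extra implementation
# needed on their end.
response = {
    "risk": "low",
    "context": {"previous_lines": 2},
    "reputation": {"trust_level": "established"},
}

payload = json.dumps(request)  # what would go over the wire
```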

This API approach has accelerated the integration process as well. We recently had a customer who integrated with our APIs and went live successfully within a 24-hour period.

Customers expect reliability and usability in moderation platforms. When a moderator goes through content in a Community Sift queue, we have equipped them with all the necessary data, including player/user information with the context of the conversation, history, and the reputation of the player, which eases decision-making. This is how we support their human moderation efforts. Further, we are happy to say that Two Hat has expanded the paradigm to another level of automated moderation, using AI models that make decisions on behalf of human moderators after learning from their consistent decisions, which lowers moderation costs for customers.

CF: Laurence, many of our clients prefer to use our services via a server to server communication, instead of self-hosting a moderation solution. Why is that? What are the benefits of using a service like ours?

LB: Just as any SaaS company will tell you, our systems are able to scale to meet the demand without our customers’ engineers having to worry about it. It also means that as we release new features and functions, our customers don’t have to worry about expensive upgrades or deployments. While all this growth was going on, we also delivered more than 40 new subversion detection capabilities into our core text-classification product.

Would you like to see our content moderation platform in action? Request a demo today.

How to Support Content Moderator Wellness in 2020

We started 2020 having no doubt about the importance of content moderation for social platforms. As Tarleton Gillespie, author of Custodians of the Internet, writes, “Content moderation is constitutional to the functioning of platforms, essential to what platforms are.” However, I believe that we have yet to fully appreciate the work of moderators, support them, and safeguard their wellbeing.

Take a step back to May 2019, when I spoke at and attended the Content Moderation in 2019 workshop hosted by IAPP. Gillespie was the keynote speaker, and he made a compelling case for redefining the moderator role: not as a custodian responsible for keeping the Internet clean, but as a guardian.

In that same workshop, I witnessed the inception of an idea to create a professional association that would defend the interests of moderators and Trust & Safety professionals. Fast forward a year, and enter The Trust & Safety Professional Association. The soon-to-launch organization will be focused on advancing the trust and safety profession through a shared community of practice. It’s encouraging to see this happening! Content moderators need support, especially now, during the COVID-19 pandemic, when we are spending more time online and moderation duties are becoming more challenging.

“Moderation is the most essential, yet underrated, aspect of social-digital experiences. You couldn’t have an event in Times Square without police officers, Comic-Con couldn’t exist without event staff, and you wouldn’t send your kid to camp without camp counselors. Why should digital experiences be anything else?”

Izzy Neis, Head of Digital at ModSquad (Why Community Moderation Matters)

A critical piece of support needed is the protection of moderator wellbeing. Here at Two Hat, a mission-driven company founded on the ideals of Trust and Safety, we believe in giving moderators the tools and the best practices to ensure that:

  1. The impact of high-risk content on their wellbeing is minimized;
  2. They can focus their time on purposeful moderation, intentionally using their human expertise instead of doing manual work that machines can take care of;
  3. Wellness and resilience are a priority.

Reduce Exposure to High-Risk Content

Protecting users from damaging user-generated content like gory images, dangerous/hateful speech, and chat that encourages suicide is a fundamental responsibility for online platforms. Equally important is the need to protect the moderators who ensure your online community remains a productive and thriving space.

Filters that identify and act on abusive behaviors are table stakes. There’s no need to expose moderators to high-risk content that AI can identify and proactively block without the need for human review. Furthermore, by proactively filtering you can immediately reduce the number of user-generated reports, drastically reducing the workload of moderators.
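The triage this paragraph describes can be sketched as a simple routing policy. The thresholds and score values below are illustrative, not a real classifier:

```python
# Sketch of proactive triage: high-confidence harmful content is
# blocked automatically, ambiguous content is routed to a human
# queue, and the rest passes through. Thresholds are illustrative.

def triage(risk_score: float) -> str:
    if risk_score >= 0.9:   # AI is confident: block, no human exposure
        return "block"
    if risk_score >= 0.5:   # ambiguous: route to a moderator
        return "review"
    return "allow"

# Five incoming messages with hypothetical risk scores:
scores = [0.95, 0.2, 0.6, 0.97, 0.1]
human_queue = [s for s in scores if triage(s) == "review"]
# Only one of five messages needs human eyes; the filter absorbs
# the clear-cut cases at both ends.
```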

In a large mobile game, user-generated reports decreased by 88% with the addition of a chat filter
From the e-book Content Moderation in Challenging Times

Purposeful Moderation

The work of a content moderator can feel like an endless battle against an ever-growing pile of content. I know this because I was a moderator once and I felt that challenge every day.

This is why it’s critical for a moderator to feel like they are doing something meaningful and purposeful. Knowing that the work you do every day has a tangible and positive impact on the community and in the world will help you connect and reconnect with why you’re doing that job in the first place. It’s very challenging to moderate and feel that connection without having the right tools, processes, and procedures at your disposal. One of the most important processes is to prioritize staff wellness and resilience.

Wellness & Resilience

May is Mental Health Awareness Month. This is an opportunity to raise awareness of what content moderators contend with from an emotional point of view. As Pascal Debroek, Head of Customer Service at Hatch Entertainment Ltd. said earlier this year:

“You have to understand that these can also be emotionally draining jobs. Most of these are pure customer-facing, and in a lot of the cases deal with either sensitive topics, aggravated end-users dealing with a situation that did not meet their expectations or even outright insults and threats.

Let’s not forget, most contacts with players stem from an emotional state; happy, sad, angry, you’ll encounter them all in these roles. And because it pays off to be empathetic in such roles, it also means your employees are more susceptible to the emotions that surround them.”

Today, I want to share a critical step you can take with your team to level up everyone’s wellness and resilience: build team and individual wellness plans. A wellness plan is an actionable set of activities that help you manage stress and recharge yourself.

For example, my own personal wellness plan might look something like this:

  1. Play drums for at least 15 minutes a day
  2. Take a short walk when it’s sunny outside
  3. Play video games with my family and friends
  4. Meditate (here’s a link to a 1-month free experience in the meditation app I use)
  5. Phone a friend or family member who knows my area of work and is OK with discussing challenging topics with me. (I had to use this one last year when I returned from a child protection conference and heard lots of stories that were heart-crushing. Hearing about such realities already takes a toll. Imagine those true heroes who review child abuse imagery and help protect kids globally.)
  6. Listen to some of my favorite songs

I hope this inspires you to build your own plan. You can go as specific as you want and as prescriptive as you need. Perhaps you want a set of actions you can take every day. Or maybe you prefer to have a pool of actions you pull from depending on the day. Also, as a moderation team, you can build team activities that help you cope with everyday stress. Playing games together might be a great way to do that. In an age of social distancing, you can play online games and also get creative with games you can play via video conferencing!

Back in 2018, we collaborated with therapist and wellness trainer Carol Brusca on a “Stress, Wellness, and Resilience” training session for Two Hat clients. She also shared her top 10 wellness tips for online moderators and community professionals. We just republished that piece, adapting a few points to the new social-distancing reality we are living in.


If you saw value in these tips and would like to know more about how technology can protect your moderation team’s wellbeing, we can help.

Request a demo today to see how Two Hat’s content moderation platform can reduce your workload and your exposure to harmful content.

Content Moderators: 10 Tips to Manage Stress During COVID-19

Back in 2018, we collaborated with therapist and wellness trainer Carol Brusca on a “Stress, Wellness, and Resilience” training session for Two Hat clients. She also shared her top 10 wellness tips for online moderators and community professionals.

Content moderation and player support are tough jobs on the best of days. Today, due to the COVID-19 pandemic and resulting lockdowns, most moderators and player support professionals are now working from home. Online platforms are experiencing exponential growth, which means that moderators are busier — and under more stress — than ever.

“It’s really important to explain to moderators and staff that we acknowledge that this is a difficult time, and this is why we [as leaders] are playing our part in terms of doing moderation… Moderators are doing a difficult job at the best of times and right now they’re working a lot of hours and it’s extremely important that we communicate with members of staff about how they’re feeling.” Vernon Jones, Head of Safety at MovieStarPlanet

Content Moderation in Challenging Times webinar

Below, we’ve updated Carol’s original tips for managing stress to reflect today’s new reality.


As a community manager or content moderator, you experience the dark side of the internet every day. Whether you are reviewing chat, social media, forum comments, or images, high-risk content can be draining — and you may not even realize the damage it’s doing.

Studies show that community teams on the front lines of chat, image, or video moderation are especially vulnerable to stress-related symptoms including depression, insomnia, vicarious trauma (also known as “compassion fatigue”), and even PTSD. Now, more than ever, it’s critical that you have the right tools and techniques at your disposal to support your mental health.

1. Talk to someone.
Having and using social supports is the number one indicator of resilience. Asking for help from someone who cares about you is a wonderful way to get through a difficult time.

Does your company’s health plan provide access to a mental health professional? Take advantage of it. There’s no shame in talking to a therapist. Sometimes, talking to a stranger can be even more effective than confiding in a loved one.

If you can’t see a therapist in person right now, there are virtual options available.

2. Learn to say no.
If we do not set boundaries with others we can find ourselves feeling stressed out and overwhelmed. If you notice this might be a habit for you, try saying “no” once a day and see if you begin to feel better.

Of course, saying “no” at work isn’t always an option. But if you’re spending too much time reviewing high-risk content, talk to your manager. Ask if you can vary your tasks; instead of spending all of your workday reviewing user reports, break up the day with 15-minute gameplay breaks. Check out our blog post and case study about different moderation techniques you can use to avoid chat moderation burnout.

Setting boundaries is essential when you work from home. You cannot be “on” 24/7. Again, work with your manager to set fair and reasonable expectations.

3. Go easy on yourself.
We are quick to criticize ourselves and what we have done wrong, but not as likely to give ourselves credit for what went right, or all the things we did well.

Remember that you work hard to ensure that your online community is healthy, happy, and safe. Pat yourself on the back for a job well done, and treat yourself to some self-care.

The last few months have been at times scary, confusing, and deeply uncomfortable. David Kessler, who co-wrote On Grief and Grieving: Finding the Meaning of Grief through the Five Stages of Loss with Elisabeth Kübler-Ross, says that the discomfort we are collectively feeling is actually grief. He says that it’s important that we acknowledge our grief and name it:

“We’re feeling a number of different griefs. We feel the world has changed, and it has. We know this is temporary, but it doesn’t feel that way, and we realize things will be different… The loss of normalcy; the fear of economic toll; the loss of connection. This is hitting us and we’re grieving. Collectively. We are not used to this kind of collective grief in the air.”

4. Remember, this too will pass.
There are very few situations or events in our lives that are forever. Try repeating this mantra during a stressful time: this struggle will pass. It will make getting through that time a little easier.

(Maybe just repeat it silently in your head. Your co-workers — aka, spouse and pets — will thank you.)

This is especially hard right now, when we’re stuck at home, trying to find a balance between our work life and home life. While we cannot know when there will be a “return to normal”, it’s still important to acknowledge that the added daily stress isn’t permanent.

David Kessler offers these words of wisdom: “This is a temporary state. It helps to say it… This is survivable. We will survive.”

5. Get plenty of sleep.
We need sleep to replenish and rejuvenate. Often when we are feeling stressed, we struggle with sleeping well. If this happens to you, make sure your bedroom is dark and cool; try some gentle music to help you get to sleep, or use an app that plays soothing sounds on a loop. If staying asleep is the problem, try having a notepad and pen by your bed to write down your worries as they come up.

Pro tip: Save the marathon 3:00 am Animal Crossing sessions for the weekend.

6. Have a hobby.
Having a hobby is a great distraction from the stressors of everyday life. If you can do something outside, all the better. For many people being in nature automatically decreases stress. Remember to wear a mask and practice good social distancing!

Or, stick to video games. Playing Tetris has been proven to help people who experience trauma.

7. Drink tea.
A large dose of caffeine causes a short-term spike in blood pressure. It may also cause your hypothalamic-pituitary-adrenal axis to go into overdrive. Instead of coffee or energy drinks, try green tea.

We know that the smell of a freshly-brewed pot of coffee is like catnip to most moderators… but hear us out. Green tea has less than half the caffeine of coffee and contains healthy antioxidants, as well as theanine, an amino acid that has a calming effect on the nervous system.

8. Laugh it off.
Laughter releases endorphins that improve mood and decrease levels of the stress-causing hormones cortisol and adrenaline. It literally tricks your nervous system into making you happy. Try a comedy movie marathon or a laughter yoga class (this is a real thing; hopefully there’s a virtual version now!).

And hey, a 10-minute meme break never hurt anyone.

9. Exercise.
Getting plenty of exercise will decrease stress hormones and increase endorphins, leaving you feeling more energized and happier.

Ever had a 30-second, impromptu dance party at your desk? (Zoom dance party, anyone?)

No, really!

Often referred to as the “stress hormone,” cortisol is released in our body when we’re under pressure. Excess cortisol can cause you to feel stress, anxiety, and tension. Exercise brings your cortisol levels back down to normal, allowing you to relax and think straight again.

So crank up a classic, stand up… and get down.

10. Try the “3 Good Things” exercise.
Each night, write down three good things that happened during the day. This practice makes you shift your perspective to more positive things in your life — which in turn can shift your mood from stressed to happy, even if the three good things are tacos for lunch, tacos for 2 pm snack, and tacos for 4 pm snack (thank you, SkipTheDishes!). Good things don’t have to be earth-shattering.

Gratitude comes in all sizes, especially now.

So, whether you’re sipping a mug of green tea, talking to a professional, or shaking your groove thing in the name of science and wellness, never forget that a little self-care can go a long way.


For tips on reducing your moderation workload during the pandemic, download our e-book Content Moderation in Challenging Times: Techniques to Moderate Chat & Manage Increased Volumes.

Content Moderation in Challenging Times

As much of the world’s population faces an extended period of staying home, people are spending more time on online platforms. What does this mean for online communities and those who manage them? Traffic volumes in popular games and social networks are spiking sharply.

Chat volumes are soaring during the COVID-19 pandemic.

But volume alone is not the problem:

  • How are the exponential increases in user chats impacting content moderation practices and business workflows?
  • What are the new trends related to COVID-19 and what are online communities experiencing at this time?
  • How can we as an industry provide safe and inclusive spaces for users during and after the crisis?

To help answer these questions, I recently chatted with Vernon Jones, Head of Safety at MovieStarPlanet, and Two Hat’s Amy Vezeau, Manager of Client Integration, who shared with me their views on the state of online communities and content moderation. We focused on three main topics:

  • The scope of the challenge, especially as it relates to a spike in chat volumes
  • How COVID-19 has affected content moderation practices and impacted teams, business, and users
  • Practical tips and actionable approaches to add to your content moderation strategy during this challenging time

I’m so excited to announce that we’ve gathered these insights in a brand-new e-book, Content Moderation in Challenging Times: Techniques to Moderate Chat & Manage Increased Volumes, that you can download today.

I’ve spoken to multiple organizations over the last two months that are looking for guidance, and this is a great piece of content that will help you navigate the changing landscape of content moderation during and after the pandemic. You can find it here.

How to Monitor COVID-19 Chat in Your Online Community

Back in 2017, I hosted a webinar called Preparing for Breaking News & Trending Topics. In it, I spoke about my time moderating large online communities at Disney Interactive, and the importance of staying on top of pop culture and culture-defining events both large and small.

In 2017, I spoke about the tragic events in Charlottesville as a cultural touchstone; an example of platform operators having to make difficult decisions about how to let their users process and discuss the attack. I shared a six-step protocol that Community Managers and Trust & Safety professionals can follow to ensure that their team is prepared to handle breaking news and trending topics.

While the COVID-19 pandemic may not be breaking news, it is an ever-evolving global event, and everyone is talking about it online, regardless of the platform. We’re seeing COVID-19 chat in mobile games, kids’ platforms, teens’ social networks, and MMOs.

With that in mind, I hope you find this six-step protocol to monitor COVID-19 chat on your platform valuable.

1. Compile vocabulary
The first step is to compile a list of words and phrases that you expect to see the community use. We’re going to use the term COVID-19 as a starting point. Obvious examples include:

  • alcohol wipe
  • border closing
  • confirmed case
  • corona
  • coronavirus
  • covid
  • covid19
  • epidemic
  • hand sanitizer
  • outbreak
  • pandemic
  • quarantine
  • social distancing
  • virus
  • WHO
  • world health organization
  • cdc
  • centers for disease control
  • infected

You’ll want to ensure that you’re watching for these words in your community – and in particular, how they’re being used. Is the community simply sharing their experiences with the pandemic, or are they harassing each other and potentially spreading misinformation?
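As a starting point, the compiled vocabulary can be turned into a simple word-boundary matcher so you can surface every mention for review. This is a sketch, not a production filter; a real system would also handle misspellings, leetspeak substitutions, and multi-line context:

```python
import re

# Sketch: compile the vocabulary list into one word-boundary pattern.
# This only flags exact terms so a human can review *how* they're used;
# it is not a complete filter.
VOCAB = [
    "alcohol wipe", "corona", "coronavirus", "covid", "covid19",
    "pandemic", "quarantine", "social distancing", "hand sanitizer",
]

# Sort longest-first so "coronavirus" wins over "corona" when both fit.
PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(t) for t in sorted(VOCAB, key=len, reverse=True)) + r")\b",
    re.IGNORECASE,
)

def topic_terms(message: str) -> list:
    """Return the tracked terms mentioned in a message."""
    return PATTERN.findall(message)
```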

2. Evaluate
The next step is to go beyond assumptions and review how your community is actually chatting.

Are they using words and phrases that you didn’t account for in your original list? Are there common misspellings? On the internet, language can change within a matter of hours. New compound terms including “covidvacay” and “coronacation” have come out of the pandemic, and this rapid evolution of language shows no signs of slowing down.

As you go through this process, it’s critical that you and your moderation team ask yourselves difficult questions, including:

  • Is quoting what could be construed as dangerous/hateful speech (kungflu, Chinese virus, wuflu, etc) acceptable for the purposes of discussing it?
  • When does humor cross the line?
  • How will you handle misinformation and the spread of fake and potentially dangerous news? Do you need to update your content moderation policies?

In a quick, five-minute sampling of a single hour of chat across a variety of online communities, we saw COVID-19 referenced in many different ways (spelling and grammatical errors included):

  • “what if the coronavirus is fake and its part of the placebo effect”
  • “so dont meme corona”
  • “my grandpa died of Corona rlvirus”
  • “and i have no shifts at work to pay them back cus of corona”
  • “it depends on if its a serious conversation. joking about corona has become an offense. which personally i find ridiculous. who are we without our jokes”
  • “well my mom is staying with me until the covid dies down so i can’t play games during the week until after 10pm -_-“
  • “the whole world got corona not just Italy”
  • “noone was expecting to get covid 19”

Another thing to consider is languages other than English. For example, in Dutch, disease names are commonly used for bullying. Our Dutch Language & Culture specialist was quick to notice Dutch community members using bullying phrases like “corona child”, “corona loser”, and “corona face”.

Pay special attention to permanent UGC like usernames. You may allow users to discuss COVID-19 in chat, but do you want them to create a display name like CovidVectorGuy2020? Probably not.

3. Adjust
Now that you know how users are chatting, it’s time to adjust your chat filter to account for these new words and phrases.

Before you make any changes, consider:

  • How often was an expression used? One time in 1 million lines of chat? 20 times?
  • If you adjust a rule, what’s the impact?
  • Have you inadvertently created chat rules that are too strict? For example, Corona is a brand of beer, and “corona” also refers to the circle of light around the sun or moon.

This is where using a sophisticated chat filter that recognizes context is critical.
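A crude way to picture context-awareness is to check a flagged word's neighbours before deciding how risky the line is. The neighbour word lists below are invented for illustration; a real contextual filter is far more sophisticated:

```python
# Sketch of a context check: the same word can be benign or targeted
# depending on its neighbours. Word lists are illustrative only.

BENIGN_NEIGHBOURS = {"beer", "cerveza", "sun", "moon", "eclipse", "crown"}
TARGETED_NEIGHBOURS = {"hope", "get", "give", "spread"}  # e.g. "I hope you get corona"

def classify_corona(message: str) -> str:
    words = set(message.lower().split())
    if "corona" not in words:
        return "no-match"
    if words & TARGETED_NEIGHBOURS:   # hostile framing
        return "high-risk"
    if words & BENIGN_NEIGHBOURS:     # beer, astronomy, etc.
        return "low-risk"
    return "review"                   # ambiguous: send to a human
```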

4. Validate
Now that you’ve adjusted your filter, monitor your changes to ensure that you’ve avoided creating false positives and false negatives.

For example, you don’t want a phrase like “Corona means crown in Spanish” to trigger an action, whereas you would likely want “I hope you get corona” to result in moderation action (or a false send; whatever works for your community).

Tools that give you a live view of community chat can be very helpful here.
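The validation step amounts to running known-benign and known-abusive phrases through the adjusted filter and checking both directions. The toy filter below is a stand-in invented for the example:

```python
# Sketch of a validation pass over an adjusted filter.

EXPECT_ALLOW = ["Corona means crown in Spanish"]   # should not trigger
EXPECT_ACTION = ["I hope you get corona"]          # should trigger

def toy_filter(message: str) -> bool:
    """Illustrative stand-in: flags 'corona' only alongside hostile wording."""
    text = message.lower()
    return "corona" in text and any(
        hostile in text for hostile in ("hope you get", "die of")
    )

def validate(filter_fn):
    """Return (false positives, false negatives) against the test phrases."""
    false_positives = [p for p in EXPECT_ALLOW if filter_fn(p)]
    false_negatives = [p for p in EXPECT_ACTION if not filter_fn(p)]
    return false_positives, false_negatives
```

Run against the toy filter, both lists come back empty; in practice you would grow these phrase lists from the real chat samples gathered in step 2.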

5. Analyze stats and trends
In the Two Hat content moderation platform, clients can run reports to view all chat within a specific time period, or to identify trends and common words.

Whatever reports are at your disposal, we recommend compiling a regular report of trends and word counts for all relevant stakeholders. Questions to consider include:

  • How is sentiment trending? Positive or negative?
  • After you’ve identified a new trending word or phrase, how often is it used? Is there an upward or downward trend?
  • How many warning messages, mutes, or suspensions did you have to issue daily, weekly, and/or monthly to users who are using the topic to harass others, target someone due to their nationality, or spread misinformation?
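For teams without a built-in reporting tool, a trend report can start as simply as counting tracked terms per week. This sketch assumes a hypothetical chat log of (date, message) pairs; the tracked terms and log format are illustrative:

```python
from collections import Counter
from datetime import date

# Hypothetical chat log: (date, message) pairs.
chat_log = [
    (date(2020, 3, 2), "anyone else bored in social distancing"),
    (date(2020, 3, 3), "social distancing is the worst"),
    (date(2020, 3, 10), "my social distancing setup"),
]
TRACKED_TERMS = ["social distancing", "corona"]

# Count occurrences of each tracked term per ISO week.
weekly = Counter()
for day, message in chat_log:
    week = day.isocalendar()[1]  # ISO week number
    for term in TRACKED_TERMS:
        if term in message.lower():
            weekly[(term, week)] += 1

for (term, week), count in sorted(weekly.items()):
    print(f"week {week}: '{term}' x{count}")
```

Comparing consecutive weeks in the resulting counts is enough to answer the “upward or downward trend?” question above.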

6. Review regularly
New trends will arise. The term “social distancing” is common now, but it wasn’t two months ago.

Just today, “China’s Chernobyl” began trending on Twitter. By the time this blog is published, there will be a new trending term that you should be aware of.

At times like this, staying on top of chat trends is critical. With chat volumes increasing as more people are in lockdown and spending more time online, it’s important to safeguard all users and ensure a positive and productive experience on your platform.

To that end, I’m currently offering free community consultations. We can use the time to discuss your content moderation approaches and policies and see if there are any opportunities to update and adapt them in this dynamic online landscape.

Request a consultation using the form below.

Upcoming Digital Panel at GamesBeat 2020: “Player Behavior: Your Secret Growth Tool”

The player/user behavior, content moderation, and Trust & Safety disciplines tend to be viewed as part of a cost center. Proactive moderation is usually tacked on to games and platforms after the product is launched, or as the result of a crisis/PR nightmare.

Content moderation is smart, it’s necessary, and even lawyers like it, but can it also be an opportunity to facilitate amazing player and viewer experiences in your platform and reach a bigger and more diverse audience?

As it turns out, a proactive approach to player behavior, moderation, and Trust & Safety can accomplish that, and more. This is especially important during the COVID-19 crisis, a time when gaming companies have experienced exponential growth in chat volumes. Some communities report up to 3,000% more chat in March than in February! A massive increase like that exposes companies to both risks and opportunities that our industry simply cannot afford to ignore.

Luckily, there are multiple opportunities to better safeguard your player and user base as well as foster an inclusive and healthy gaming environment. These range from proven techniques that have reduced disruptive behavior by 40% to the understanding that the social stickiness of moderated chat in games increases engagement and drives long-term value in online platforms.

The Opportunity to Chat
From Two Hat’s whitepaper “The Opportunity to Chat”

There are a lot of other techniques out there – and we are pleased to announce that we’ll be sharing some of them with the industry during the upcoming GamesBeat (virtual) summit!

On April 29th, 4:15 pm PST, join the GamesBeat Summit 2020: The Dawn of a New Generation stream to catch the panel “Player Behavior: Your Secret Growth Tool”, and hear from experts at the forefront of platform and game design as well as Trust & Safety and culturalization in games to learn more about:

  • What approaches can help companies harness this growth opportunity
  • How to get started right away and start making an impact on how you design and manage your communities

Hear more from our fellow panelists Kim Voll from Stray Bombay and Clara Siegel from Facebook in our GamesBeat Summit 2020 panel discussion “Player Behavior: Your Secret Growth Tool” — and make sure to download the accompanying checklist with new best practices and insights inspired by our panel discussion.

Sky: Children of the Light is the Kind of Game We Need Right Now

I play sky with my nephew, he is 6yrs old. We have a routine he gets home from school and the evenings I’m home from work we play for a while. He loves sky and he’s such a kind boy. Sky is such a beautiful place where he uses all his candles making friends.

I moved from my parents house 2 years ago now and I don’t get to see my siblings very often. However, my little sister and I always find some time to play to the game at the same time like that we can keep in touch

I’ve met so many awesome people on Sky, most of them live on the other side of the world. We keep track of each other’s timezones so we can play together. I have Sky friends who don’t speak the same language as me but that’s what the cute emotes are for!

Those are just a few of the tweets from Sky: Children of the Light fans, in answer to the question “Do you play #thatskygame with family, long-distance friends, or your significant other? 🥰 We are so inspired by these stories and would love to hear more from people who use Sky to connect with loved ones.”

Sky is a new social adventure game from thatgamecompany, the studio responsible for some of the most innovative and beloved games of the last 10+ years, including Journey, Flower, and Flow.

Featuring inventive gameplay based on compassion and community, gorgeous music, and a world populated with broken constellations, ancestor spirits, and Candles, Sky is the gaming experience we need right now. It’s a compelling example of a game designed with a singular purpose – connection and community.

The results speak for themselves – Sky has won multiple awards, including iPhone Game of the Year 2019, the Audience Award at the Game Developers Choice Awards 2020, and most recently the Pocket Gamer People’s Choice award. With over 10 million downloads already on iOS, Sky was just released for Android last week and will be available on the Nintendo Switch this summer.

We’ve paired up with Sky’s Community Developer Robert Hornbek to bring you a short webinar exploring how the studio designed its newest game with positive social features at the core, and how they maintain that healthy community spirit using intentional and purposeful moderation.

In this webinar, you’ll learn:

  • Why thatgamecompany is committed to player safety (5:00 – 6:32)
  • How Sky: Children of the Light leverages innovative social features for better user engagement and experience (12:38 – 19:08, 25:08 – 28:30)
  • Their moderation best practices built with Two Hat (19:09 – 24:38)
  • Three pieces of advice for teams launching a new social game (28:35 – 33:15)

About thatgamecompany

thatgamecompany is a game studio dedicated to creating timeless interactive entertainment that inspires human connection worldwide. Creator of the critically-acclaimed games Sky, Journey, Flower, and Flow.

Top Tips for Managing a Healthy and High-Performing Player Experience Team

Player experience and content moderation stories have dominated the gaming news cycle in the first half of 2020, and show no signs of slowing down. As industry and player interest in the previously oft-dismissed world of support agents and moderators grows stronger every day, we at Two Hat thought it was time to shine a light on the professionals who have been doing this work in the shadows for years.

We recently caught up with Pascal Debroek, a longtime player experience professional who has worked at some of the biggest mobile gaming startups that Helsinki has to offer, including Supercell, Next Games, and currently Hatch Entertainment Ltd. As Head of Customer Service at Hatch, he is responsible for providing a safe, fair, and frictionless environment for all players.

In this conversation, Pascal shares his experience running successful player support teams and provides invaluable advice for leaders in similar positions.

Two Hat: Let’s start with the obvious question. Why do player experience roles have a higher churn rate than other roles?

Pascal Debroek: Support functions such as player support, community management, and moderation are often not considered integral to product or game development, despite the obvious value they bring to customer retention and product development. CS departments are, more often than not, perceived as a cost center, a necessary money sink to appease consumers when they run into a problem and some form of consumer-facing communication is required.

When these functions or departments are not perceived to be part of the “core” of your studio, and the employees don’t feel empowered or get the right training and tools to do their job, they will feel unappreciated by their employer. Having user feedback dismissed and waved away – “Oh, that’s just another customer complaining about something not working” – certainly doesn’t help the situation. That’s basic Employee Experience knowledge, and not just for studios, but for any organization out there.

You have to understand that these can also be emotionally draining jobs. Most of them are purely customer-facing and in a lot of cases deal with sensitive topics, aggravated end-users facing a situation that did not meet their expectations, or even outright insults and threats. Let’s not forget, most contacts with players stem from an emotional state; happy, sad, angry, you’ll encounter them all in these roles. And because it pays off to be empathetic in such roles, your employees are also more susceptible to the emotions that surround them.

And that’s not taking into account the exposure to personal insults, bullying, threats of harm and self-harm, racism, sexism, predatory behaviour, and child grooming to name a few. While these (hopefully) don’t occur all the time, it takes perseverance to stomach them and see the good in things. But I can promise you, it does affect most at one point or another in their career. And that is why it is so important to focus on well-being for these roles. We all have a threshold on how much we can handle before the job starts requiring more than we can give back.

It’s a shame because it’s such an important role for any gaming studio and can be a really valuable stepping stone for those who may not want to make support their career. People who get their start in these support roles more often than not understand what the business is about: they understand basic game design, they understand production. The player support and moderation experience allows them to perform in a slightly different and more player-centric way in other roles because they already understand the perspective of the user.

TH: What are some things that leaders can do to keep their player experience staff happy, safe, and healthy?

PD: As a team leader, the first thing that is required is a safe environment built on trust, especially if you are in a place where you’re constantly dealing with other humans and, as I mentioned earlier, potentially emotionally-laden communication.

Your team needs to be able to trust each other, both in doing their job to the best of their abilities and in being able to support every single person on the team. Without that, your team will never feel safe, and that will have an impact on efficiency, well-being, and ultimately performance. Even more so: if even one team member feels the team lead doesn’t have their back, you’ll end up in a very dangerous situation that could escalate at any moment.

You need to create a team with people who have empathy and people skills; it will make their job a lot easier. I’m not suggesting you need to hire similar people – always go for compatible people who, in addition to a practical skill set, are also good at reading and dealing with emotions. Communication is not only about what is being said, but also about what is not.

Once I started involving the whole team in the recruitment process, I saw a surge in team member compatibility and with that an increase in trust and performance over time. We would go through several screenings, talks with me as the hiring manager, an interview with HR, followed by talks with two more senior people from the team. And finally, they would meet up with the rest of the team for a casual half-hour to one-hour chat. Just talking about things the team would want to know like, “Hey, what kind of movies do you like? How do you unwind? Any interest in sports? Are you into superheroes?” which can lead to the dreaded Marvel or DC argument [laughs].

It’s important to hire the right people because every team member that you’re adding to the mix will affect and even change your team culture.

Then when it comes down to team culture itself, you need to ensure that you have a fair and open culture and an understanding that you’re all in this together, for the same cause. People need to be very receptive to feedback and are expected to provide it. I believe you can be honest, to the point, and still be respectful and mindful. If you have that initial trust, it should be a lot easier to have those conversations, including the more difficult ones.

TH: You mentioned involving the team in the hiring process. How do you ensure that they continue to work together closely once they’re hired?

PD: There is also no reason why people on the team couldn’t have a one-on-one with each other. I’m not talking about HR-related topics here but professional development, mental support. It’s more like people asking for advice, another point of view, coaching on a particular topic. Or it can just as well be someone feels they need to lift some weight off their shoulders because of a personal situation.

And of course, there is the obvious “Hey, I have this kind of message. How would you deal with people who talk like this?” There are times when people will send a Slack message to each other and say, “Hey, do you need to talk? You want to grab a cup of coffee? Do we want to go for a short walk?” It’s ok to get frustrated or stuck at times, just as long as you realise it. In the end, everyone should know that they are among peers and they should assist each other. And as a supervisor or as a manager, you need to allow those kinds of things to happen.

Continuous learning, sharing experiences, having open, honest feedback, people being able to tell each other how they feel – that’s the most important part. If you’re not feeling good, then how are you going to be able to do your job? You need moral support from your supervisor, but also from your team members.

TH: Because player experience roles and responsibilities are so emotionally charged, do you find that you have to look at success metrics differently?

PD: Metrics are important, but they will never show you the full story. Personally I feel many companies oversimplify by trying to fully quantify performance in support and moderation functions. In a lot of cases, there are external and irrational or emotional factors that will affect your metrics. If you then use those metrics to judge the performance of an individual, now that’s not really fair or motivational, is it?

At a previous employer, we decided not to use KPIs to determine whether staff were performing well, but rather as a benchmark against industry standards that would allow the team to push themselves constantly. Of course, this doesn’t mean we were not paying close attention to the KPIs; yet by simply removing the “fear” aspect of employees not meeting certain artificial performance metrics, we created an environment where we would constantly challenge ourselves to work smarter and be proud as a team of our achievements.

The following is a perfect example of why the “traditional” take on support KPIs can be detrimental to a CS agent’s mental health, efficiency, and overall customer satisfaction: If you’re assigned a queue and you’re the one that is dealing with all the sensitive topics and negative feedback that comes in, it takes an emotional toll on you, and it becomes a lot harder to reply with each subsequent message. In order to create an understanding throughout the whole team, and to protect them, every team member would, in turn, be taking care of these more challenging topics and conversations.

Anyone dealing with this more sensitive or negative content was allowed to take more time. As long as a player received a timely and correct answer, they could take as long as they needed to reply, within sensible limits. The reasoning was that when you’re dealing with heavy topics in player support or player moderation, it can suck you emotionally dry very fast. If that means you’re only doing a quarter of the tickets in a whole day compared to a top performer, then no one is going to ask why you didn’t do more, because everyone understood the challenge. The worst thing that can happen when you are feeling emotionally drained is adding time pressure. Ultimately, the end result for the player is more important than adhering to artificial time constraints that don’t reflect the context of the issue.

TH: What’s the business benefit of investing in experienced player experience leaders?

PD: There are a few reasons why a company would want to invest in more experienced Player Experience leaders. All of these stem from the mere fact that there is no substitute for experience. Rarely do companies ask for the added business value of hiring an experienced backend developer or a more senior product lead. Yet when it comes down to customer-facing roles, many companies still seem to struggle with the answer to that question; as if it somehow would be any different than for other roles.

If you’re looking to set up a department that communicates directly with your players from all over the globe, in order to create actionable insights on your product development and provide a safe online environment for all while maintaining scalable cost-efficient operations and always needing to keep in line with the expectations of your audience, would you rather not invest in someone who has the experience?

In order to succeed in your CX endeavours, you hire the right people, so they can hire the right talent, train them, coach them, and empower them. They understand the expectations of the audience, what channels to use, and how to approach communication about particular topics. They know the tools out there and how to tweak them, and can create processes, analyze support metrics, and plan resources accordingly. And, often forgotten, these are the key people who need to influence decisions across departments and teams, walking the fine line between customer-centricity and profitability of the product or service. Those insights only come with experience.

However, it’s not just about investing in experienced people. It is also about the resources allocated to the team and the tools they are provided to do their job best. You can hire a top leader, but if they have to make do with an email-only contact center, you won’t get far. The most obvious answer is that more experienced leaders can boost retention metrics in the mid to long term when given access to the right resources. That in itself is a major advantage for studios, especially in the competitive f2p [free-to-play] mobile space, but it extends well beyond that.

What will our players thank us for? Image credit: Mark Pincus

There is this quote from Zynga that I feel more companies should ask themselves. On a wall in one of their offices is this one question: “What will our players thank us for?”

That is a very important thing to reflect on as a company because it all comes down to player expectations and surpassing them. You don’t create fantastic experiences without thought, without understanding your audience, nor without investing in the contact points your audience will reach out to. Because those interactions, the quality, the lack of friction and the efficiency, will leave a lasting impression.

As a follow-up question, I would suggest companies also start asking themselves what their brand will be remembered for later on. This is something companies like Apple, Amazon and Netflix understand. Within games, simply take a look at companies like Activision Blizzard or Supercell. It’s obvious they are not just making games, they are building experiences and are differentiating themselves as a brand. People these days download Supercell games not solely on the premise that it is a good game, but because they have come to expect good games from Supercell as a brand.

TH: Can you tell us about any initiatives you’ve done to boost company awareness of player experience teams?

PD: At Next Games, we created a management-endorsed shadowing initiative dubbed “Player Support Bootcamp”. You would sign up for three-hour sessions where you would be told how player support works, what tools we use, how we communicate, learn about our processes or what happens when we log a bug, how we do investigations, how we do moderation.

It was purely voluntary, and at the high point of the program, we had more than 62.5% of the company signing up for sessions. So we decided to gamify it: come to three sessions and you get a fantastic-looking degree, created by one of our very talented marketing artists. People started competing for spots in the program; we were fully booked for months. Degrees appeared hanging from walls or stood framed on desks as a badge of honour, while developers shared personal experiences with the Bootcamp over coffee in the kitchen.

We saw two huge changes come out of the program. A UX designer kicked off a big redesign of how users save their game progress, based on the feedback she saw from players during Bootcamp. After the redesign was implemented, we saw a decrease of more than 20% in tickets regarding lost accounts. That had a huge impact on our bottom line.

The other change happened when a senior client programmer who went through the program noticed that we were wasting time manually translating tickets, since we sometimes would get messages in languages we didn’t have native speakers for. We were copy-pasting the tickets into Google Translate, putting them back into the CRM, then replying using Google Translate. So in his spare time, he actually started programming a bot for us that would go through the CRM and automatically translate emails in advance, saving us time and money.

The buy-in from the management team was crucial to the success of the project. Our CEO was actually one of the early adopters and possibly the biggest proponent. Shortly after, the producers of the games took a real interest in the program and gently persuaded people to sign up. Especially for teams working on new projects, the Bootcamp was a source of inspiration. So it had a huge impact. And I’m sure there are still people talking about the program today.

TH: In your opinion, what does the future hold for player experience teams?

PD: Users nowadays have a better understanding of game mechanics and social dynamics, and they also have higher expectations. Seven or eight years ago, if there was a game on the app store, you were just comparing that game to the next best game. Nowadays, you compare that game to the last best experience you’ve ever had, which could include pretty much anything on the app store and beyond, from Amazon to Spotify. I’m expecting every game experience to be just as frictionless as Clash of Clans, but just as deep as Skyrim. Is that fair? Perhaps not, considering technical and other limitations, yet it is what is happening.

I’m not the only one who thinks companies need to invest more in the service aspects of their games. There’s an unstoppable mindset change happening in the retail and on-demand landscape; the service industry is getting disrupted. Games-as-a-Service is already a really big thing right now, and it shows no signs of slowing down. But the games industry will need to adapt fast to keep up with this evolution, which obviously doesn’t happen without a change in attitude towards support functions and towards the gaming audience.

The whole idea of simply hiring a cheap junior person to answer email messages is also going to disappear. The need for emotionally smart, educated, and experienced support and moderation personnel is going to skyrocket. The more technology advances, the more need there will be for people who can rely on experience and a higher understanding of what they’re doing, who understand the tools and the processes they’ll be using, but most of all, understand humans and human behaviour.

TH: That’s a great point to end this on. Thank you for sharing your insights, Pascal!

PD: My pleasure, Carlos! Thanks for speaking with me.

Join a Webinar: Taking Action on Offensive In-Game Chat

I’m excited to announce that Two Hat is co-hosting an upcoming webinar with the International Game Developers Association on Friday, February 21st, 2020.

The incredible Liza Wood (check out her bio below), our Director of Research and Data Science, will be joining me as we present Defining, Identifying and Actioning Offensive Chat: Approaches and Frameworks.

We will start by examining why defining, identifying and actioning offensive chat matters to game development, with tangible supporting stats. Later we will provide an overview of the Five Layers of Community Protection.

Here’s what you can expect to get out of it:

  • Compelling arguments for adding player safeguarding mechanisms to your game’s social features
  • Actionable takeaways for creating internal alignment on categories and terminology
  • Action plans for identifying and addressing disruptive behavior

We hope you will join us on February 21st at 3 pm PST on IGDA’s Twitch Channel. Mark your calendars!

To celebrate this collaboration with the IGDA, we’re offering exclusive early access to our brand new Content Moderation Best Practices PDF, containing practical applications that you can start leveraging today. Download in advance of the full release happening later this month by filling out the form below.

When you sign up, we will also send you an email reminder on the 20th so you don’t miss the webinar. See you there!

About Liza Wood

Liza brings a wealth of experience and remarkable work in the games industry. After 13 years in video game development, Liza joined Two Hat Security as Director of Research and Data Science in August 2019. There she leads a team of researchers who are helping customers and partners build safe and healthy online communities by removing negative interactions to make room for positive human connections. Prior to starting this new phase of her career, she was the Executive Producer of Disney’s Club Penguin Island, the successor to Club Penguin, where she saw the positive impact that online communities can have.

About the IGDA

We are encouraged by and fully believe in IGDA’s mission to support and empower game developers around the world. Having worked for a gaming organization and co-founded the Fair Play Alliance, I strongly believe in the power of games to create meaningful and life-changing experiences for billions of players collectively. And that starts with supporting the dedicated professionals who are committed to creating those experiences.