Content Moderation in Challenging Times

As much of the world’s population faces an extended period of staying home, people are spending more time on online platforms. What does this mean for online communities and those who manage them? Traffic volumes in popular games and social networks are spiking sharply.

Chat volumes are soaring during the COVID-19 pandemic.

But volume alone is not the problem:

  • How are the exponential increases in user chats impacting content moderation practices and business workflows?
  • What are the new trends related to COVID-19 and what are online communities experiencing at this time?
  • How can we as an industry provide safe and inclusive spaces for users during and after the crisis?

To help answer these questions, I recently chatted with Vernon Jones, Head of Safety at MovieStarPlanet, and Two Hat’s Amy Vezeau, Manager of Client Integration, who shared with me their views on the state of online communities and content moderation. We focused on three main topics:

  • The scope of the challenge, especially as it relates to a spike in chat volumes
  • How COVID-19 has affected content moderation practices and impacted teams, business, and users
  • Practical tips and actionable approaches to add to your content moderation strategy during this challenging time

I’m so excited to announce that we’ve gathered these insights in a brand-new e-book, Content Moderation in Challenging Times: Techniques to Moderate Chat & Manage Increased Volumes, that you can download today.

I’ve spoken to multiple organizations over the last two months that are looking for guidance, and this is a great piece of content that will help you navigate the changing landscape of content moderation during and after the pandemic. You can find it here.

How to Monitor COVID-19 Chat in Your Online Community

Back in 2017, I hosted a webinar called Preparing for Breaking News & Trending Topics. In it, I spoke about my time moderating large online communities at Disney Interactive, and the importance of staying on top of pop culture and culture-defining events both large and small.

In 2017, I spoke about the tragic events in Charlottesville as a cultural touchstone; an example of platform operators having to make difficult decisions about how to let their users process and discuss the attack. I shared a six-step protocol that Community Managers and Trust & Safety professionals can follow to ensure that their team is prepared to handle breaking news and trending topics.

While the COVID-19 pandemic may not be breaking news, it is an ever-evolving global event, and everyone is talking about it online, regardless of the platform. We’re seeing COVID-19 chat in mobile games, kids’ platforms, teens’ social networks, and MMOs.

With that in mind, I hope you find this six-step protocol to monitor COVID-19 chat on your platform valuable.

1. Compile vocabulary
The first step is to compile a list of words and phrases that you expect to see the community use. We’re going to use the term COVID-19 as a starting point. Obvious examples include:

  • alcohol wipe
  • border closing
  • confirmed case
  • corona
  • coronavirus
  • covid
  • covid19
  • epidemic
  • hand sanitizer
  • outbreak
  • pandemic
  • quarantine
  • social distancing
  • virus
  • WHO
  • world health organization
  • cdc
  • centers for disease control
  • infected

You’ll want to ensure that you’re watching for these words in your community – and in particular, how they’re being used. Is the community simply sharing their experiences with the pandemic, or are they harassing each other and potentially spreading misinformation?
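As a starting point, a list like the one above can be compiled into a single case-insensitive pattern with word boundaries, so short terms like “who” don’t fire inside words like “whole”. This is a minimal Python sketch, not a production filter, and the term list is illustrative:

```python
import re

# Illustrative starter vocabulary -- trim or extend for your community.
COVID_TERMS = [
    "alcohol wipe", "border closing", "confirmed case", "corona",
    "coronavirus", "covid", "covid19", "epidemic", "hand sanitizer",
    "outbreak", "pandemic", "quarantine", "social distancing", "virus",
    "who", "world health organization", "cdc",
    "centers for disease control", "infected",
]

# One case-insensitive pattern with word boundaries, so "who" does not
# match inside "whole". Boundaries also let "coronavirus" win over the
# shorter "corona" alternative when the longer term is present.
PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(t) for t in COVID_TERMS) + r")\b",
    re.IGNORECASE,
)

def find_terms(message: str) -> list[str]:
    """Return the vocabulary terms that appear in a chat message."""
    return [m.lower() for m in PATTERN.findall(message)]
```

Matching is only the first half of the job; as noted above, you still need a human or contextual layer to judge *how* each term is being used.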

2. Evaluate
The next step is to go beyond assumptions and review how your community is actually chatting.

Are they using words and phrases that you didn’t account for in your original list? Are there common misspellings? On the internet, language can change within a matter of hours. New compound terms like “covidvacay” and “coronacation” have come out of the pandemic, and this rapid evolution of language shows no signs of slowing down.

As you go through this process, it’s critical that you and your moderation team ask yourselves difficult questions, including:

  • Is quoting what could be construed as dangerous/hateful speech (kungflu, Chinese virus, wuflu, etc.) acceptable for the purposes of discussing it?
  • When does humor cross the line?
  • How will you handle misinformation and the spread of fake and potentially dangerous news? Do you need to update your content moderation policies?

In a quick five-minute sampling of a single hour of chat across a variety of online communities, we saw COVID-19 referenced in many different ways (spelling and grammatical errors included):

  • “what if the coronavirus is fake and its part of the placebo effect”
  • “so dont meme corona”
  • “my grandpa died of Corona rlvirus”
  • “and i have no shifts at work to pay them back cus of corona”
  • “it depends on if its a serious conversation. joking about corona has become an offense. which personally i find ridiculous. who are we without our jokes”
  • “well my mom is staying with me until the covid dies down so i can’t play games during the week until after 10pm -_-”
  • “the whole world got corona not just Italy”
  • “noone was expecting to get covid 19”

Another thing to consider is languages other than English. In Dutch, for example, disease names are commonly used as insults. Our Dutch Language & Culture specialist was quick to notice Dutch community members using bullying phrases like “corona child”, “corona loser”, and “corona face”.

Pay special attention to permanent UGC like usernames. You may allow users to discuss COVID-19 in chat, but do you want them to create a display name like CovidVectorGuy2020? Probably not.
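One lightweight way to run this evaluation is to count tokens that contain one of your known roots but aren’t on the list themselves; coinages like “coronacation” and “covidvacay” surface quickly this way. A hedged sketch, where the root set is illustrative:

```python
import re
from collections import Counter

# Illustrative known roots -- seed this with your step-1 vocabulary.
KNOWN = {"corona", "coronavirus", "covid", "covid19", "virus",
         "pandemic", "quarantine", "outbreak"}

def candidate_terms(messages, min_count=2):
    """Surface tokens that contain a known root but aren't on the list.
    New compounds like "coronacation" or "covidvacay" show up here for
    a moderator to review before they're added to the filter."""
    counts = Counter()
    for msg in messages:
        for token in re.findall(r"[a-z0-9]+", msg.lower()):
            if token not in KNOWN and any(root in token for root in KNOWN):
                counts[token] += 1
    return [(t, c) for t, c in counts.most_common() if c >= min_count]
```

The `min_count` threshold keeps one-off typos from flooding the review queue; tune it to your chat volume.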

3. Adjust
Now that you know how users are chatting, it’s time to adjust your chat filter to account for these new words and phrases.

Before you make any changes, consider:

  • How often was an expression used? One time in 1 million lines of chat? 20 times?
  • If you adjust a rule, what’s the impact?
  • Have you inadvertently created chat rules that are too strict? For example, “corona” is a kind of beer, and also refers to the circle of light around the sun or moon.

This is where using a sophisticated chat filter that recognizes context is critical.
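To illustrate what “recognizes context” can mean in its simplest form, here is a toy rule that treats “corona” differently depending on surrounding cues. The cue lists and action names are invented for illustration; a real context-aware filter is far more sophisticated than substring checks:

```python
# Invented cue lists -- "corona" alone is ambiguous (the beer, the ring
# of light around the sun or moon), so escalate only on targeting language.
HOSTILE_CUES = {"hope you get", "you have", "die of", "spreader"}
BENIGN_CUES = {"beer", "sun", "moon", "crown"}

def classify(message: str) -> str:
    """Map a chat message to a moderation action (illustrative labels)."""
    text = message.lower()
    if "corona" not in text:
        return "allow"
    if any(cue in text for cue in BENIGN_CUES):
        return "allow"          # likely the beer or the astronomical sense
    if any(cue in text for cue in HOSTILE_CUES):
        return "escalate"       # targeted use: route to a moderator
    return "monitor"            # topical chat: log it for trend reports
```

Note the middle ground: most COVID-19 chat is neither benign nor hostile, and routing it to a “monitor” bucket feeds the trend analysis in step 5.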

4. Validate
Now that you’ve adjusted your filter, monitor your changes to ensure that you’ve avoided creating false positives and false negatives.

For example, you don’t want a phrase like “Corona means crown in Spanish” to trigger an action, whereas you would likely want “I hope you get corona” to result in moderation action (or a false send; whatever works for your community).

Tools that give you a live view of community chat can be very helpful here.
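A small regression suite helps here: keep a set of (message, expected action) pairs drawn from real chat and re-run them after every rule change. This sketch assumes some filter function `classify` that maps a message to an action string; the cases and action labels are illustrative:

```python
# Illustrative regression cases: known false-positive and false-negative
# traps, each paired with the action the filter *should* take.
CASES = [
    ("Corona means crown in Spanish", "allow"),   # benign: must not trigger
    ("I hope you get corona", "escalate"),        # targeted: must trigger
]

def validate(classify, cases=CASES):
    """Return the cases where the filter disagrees with the expected
    action; an empty list means the rule change is safe to ship."""
    return [(msg, want, classify(msg))
            for msg, want in cases
            if classify(msg) != want]
```

Every time a moderator spots a new false positive or false negative in the live view, add it as a case so the mistake can’t silently recur.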

5. Analyze stats and trends
In the Two Hat content moderation platform, clients can run reports to view all chat within a specific time period, or to identify trends and common words.

Whatever reports are at your disposal, we recommend that you compile a regular report of trends and word count for all relevant stakeholders.


  • How is sentiment trending? Positive or negative?
  • After you’ve identified a new trending word or phrase, how often is it used? Is there an upward or downward trend?
  • How many warning messages, mutes, or suspensions did you have to issue daily, weekly, and/or monthly to users who are using the topic to harass others, target someone due to their nationality, or spread misinformation?
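If your platform doesn’t ship with reporting, even a basic per-day word count over chat logs is enough to see whether a tracked term is trending up or down. A minimal sketch, assuming logs arrive as (date, message) pairs:

```python
from collections import Counter, defaultdict

def trend_report(logs):
    """logs: iterable of (date, message) pairs. Returns a per-day
    Counter of tokens, the raw material for a stakeholder report."""
    per_day = defaultdict(Counter)
    for day, msg in logs:
        per_day[day].update(msg.lower().split())
    return per_day

def usage_of(per_day, term):
    """Daily counts for one tracked term, in date order, so you can
    see at a glance whether usage is rising or falling."""
    return {day: counts[term] for day, counts in sorted(per_day.items())}
```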

6. Review regularly
New trends will arise. The term “social distancing” is common now, but it wasn’t two months ago.

Just today, “China’s Chernobyl” began trending on Twitter. By the time this blog is published, there will be a new trending term that you should be aware of.

At times like this, staying on top of chat trends is critical. With increased volumes as more people are in lockdown and spend more time online, it’s important to safeguard all users and ensure a positive and productive experience on your platform.

To that end, I’m currently offering free community consultations. We can use the time to discuss your content moderation approaches and policies and see if there are any opportunities to update and adapt them in this dynamic online landscape.

Request a consultation using the form below.

Upcoming Digital Panel at GamesBeat 2020: “Player Behavior: Your Secret Growth Tool”

The player/user behavior, content moderation, and Trust & Safety disciplines tend to be viewed as part of a cost center. Proactive moderation is usually tacked on to games and platforms after the product is launched, or as the result of a crisis/PR nightmare.

Content moderation is smart, it’s necessary, and even lawyers like it, but can it also be an opportunity to facilitate amazing player and viewer experiences in your platform and reach a bigger and more diverse audience?

As it turns out, a proactive approach to player behavior, moderation, and Trust & Safety can accomplish that, and more. This is especially important during the COVID-19 crisis, when gaming companies have experienced exponential growth in chat volumes. Some communities report up to 3,000% more chat in March than in February! A massive increase like that exposes companies to both risks and opportunities that our industry simply cannot afford to ignore.

Luckily, there are multiple opportunities to better safeguard your player and userbase as well as foster an inclusive and healthy gaming environment. These range from proven techniques that have reduced disruptive behavior by 40% to understanding that the social stickiness of moderated chat in games increases engagement and drives long-term value in online platforms.

From Two Hat’s whitepaper “The Opportunity to Chat”

There are a lot of other techniques out there – and we are pleased to announce that we’ll be sharing some of them with the industry during the upcoming GamesBeat (virtual) summit!

On April 29th at 4:15 pm PST, join the GamesBeat Summit 2020: The Dawn of a New Generation stream to catch the panel “Player Behavior: Your Secret Growth Tool”. You’ll hear from experts at the forefront of platform and game design, Trust & Safety, and culturalization in games, and learn more about:

  • What approaches can help companies harness this growth opportunity
  • How to get started right away and start making an impact on how you design and manage your communities

Hear more from our fellow panelists Kim Voll from Stray Bombay and Clara Siegel from Facebook in our GamesBeat Summit 2020 panel discussion “Player Behavior: Your Secret Growth Tool” — and make sure to download the accompanying checklist with new best practices and insights inspired by our panel discussion.

Sky: Children of the Light is the Kind of Game We Need Right Now

I play sky with my nephew, he is 6yrs old. We have a routine he gets home from school and the evenings I’m home from work we play for a while. He loves sky and he’s such a kind boy. Sky is such a beautiful place where he uses all his candles making friends.

I moved from my parents house 2 years ago now and I don’t get to see my siblings very often. However, my little sister and I always find some time to play to the game at the same time like that we can keep in touch

I’ve met so many awesome people on Sky, most of them live on the other side of the world. We keep track of each other’s timezones so we can play together. I have Sky friends who don’t speak the same language as me but that’s what the cute emotes are for!

Those are just a few of the tweets from Sky: Children of the Light fans, in answer to the question “Do you play #thatskygame with family, long-distance friends, or your significant other? 🥰 We are so inspired by these stories and would love to hear more from people who use Sky to connect with loved ones.”

Sky is a new social adventure game from thatgamecompany, the studio responsible for some of the most innovative and beloved games of the last 10+ years, including Journey, Flower, and Flow.

Featuring inventive gameplay based on compassion and community, gorgeous music, and a world populated with broken constellations, ancestor spirits, and Candles, Sky is the gaming experience we need right now. It’s a compelling example of a game designed with a singular purpose – connection and community.

The results speak for themselves – Sky has won multiple awards, including iPhone Game of the Year 2019, the Audience Award at the Game Developers Choice Awards 2020, and most recently the Pocket Gamer People’s Choice award. With over 10 million downloads already on iOS, Sky was just released for Android last week and will be available on the Nintendo Switch this summer.

We’ve paired up with Sky’s Community Developer Robert Hornbek to bring you a short webinar exploring how the studio designed its newest game with positive social features at the core, and how they maintain that healthy community spirit using intentional and purposeful moderation.

In this webinar, you’ll learn:

  • Why thatgamecompany is committed to player safety (5:00 – 6:32)
  • How Sky: Children of the Light leverages innovative social features for better user engagement and experience (12:38 – 19:08, 25:08 – 28:30)
  • Their moderation best practices built with Two Hat (19:09 – 24:38)
  • Three pieces of advice for teams launching a new social game (28:35 – 33:15)

About thatgamecompany

thatgamecompany is a game studio dedicated to creating timeless interactive entertainment that inspires human connection worldwide. Creator of the critically-acclaimed games Sky, Journey, Flower, and Flow.

Top Tips for Managing a Healthy and High-Performing Player Experience Team

Player experience and content moderation stories have dominated the gaming news cycle in the first half of 2020, and show no signs of slowing down. As industry and player interest in the previously oft-dismissed world of support agents and moderators grows stronger every day, we at Two Hat thought it was time to shine a light on the professionals who have been doing this work in the shadows for years.

We recently caught up with Pascal Debroek, a longtime player experience professional who has worked at some of the biggest mobile gaming startups that Helsinki has to offer, including Supercell, Next Games, and currently Hatch Entertainment Ltd. As Head of Customer Service at Hatch, he is responsible for providing a safe, fair, and frictionless environment for all players.

In this conversation, Pascal shares his experience running successful player support teams and provides invaluable advice for leaders in similar positions.

Two Hat: Let’s start with the obvious question. Why do player experience roles have a higher churn rate than other roles?

Pascal Debroek: Support functions, such as player support, community management, and moderation roles, are often not considered integral to product or game development, despite the obvious value they bring to customer retention and product development. CS departments are, more often than not, perceived to be a cost center, a necessary money sink to appease consumers when they run into a problem and some form of consumer-facing communication is required.

When these functions or departments are not perceived to be part of the “core” of your studio, and the employees don’t feel empowered, nor get the right training and tools to do their job, that will make them feel unappreciated by their employer. Having user feedback dismissed and waved away – “Oh, that’s just another customer complaining about something not working” – certainly doesn’t aid the situation. That’s basic Employee Experience knowledge, and not just for studios, but for any organization out there.

You have to understand that these can also be emotionally draining jobs. Most of these roles are purely customer-facing, and in a lot of cases deal with sensitive topics, aggravated end-users facing a situation that did not meet their expectations, or even outright insults and threats. Let’s not forget, most contacts with players stem from an emotional state; happy, sad, angry, you’ll encounter them all in these roles. And because it pays off to be empathetic in such roles, it also means your employees are more susceptible to the emotions that surround them.

And that’s not taking into account the exposure to personal insults, bullying, threats of harm and self-harm, racism, sexism, predatory behaviour, and child grooming to name a few. While these (hopefully) don’t occur all the time, it takes perseverance to stomach them and see the good in things. But I can promise you, it does affect most at one point or another in their career. And that is why it is so important to focus on well-being for these roles. We all have a threshold on how much we can handle before the job starts requiring more than we can give back.

It’s a shame because it’s such an important role for any gaming studio and can be a really valuable stepping stone for those who may not want to make support their career. People who get their start in these support roles more often than not understand what the business is about, they understand basic game design, they understand production. The player support and moderation experience allows them to perform in a slightly different and more player-centric way in other roles because they already understand the perspective of the user.

TH: What are some things that leaders can do to keep their player experience staff happy, safe, and healthy?

PD: As a team leader, the first thing that is required is a safe environment built on trust. Especially if you are in a place where you’re constantly dealing with other humans and, as I mentioned earlier, potentially emotionally-laden communication.

Your team needs to be able to trust each other, both in doing their job to the best of their abilities and in being able to support every single person on that team. Without that, your team will never feel safe, and that will have an impact on efficiency, well-being, and ultimately performance. Even more so, if even one team member feels the team lead doesn’t have their back, you’ll end up in a very dangerous situation that could escalate at any moment.

You need to create a team with people who have empathy and people skills; it will make their job a lot easier. I’m not suggesting you need to hire similar people – always go for compatible people who, in addition to a practical skill set, are also good at reading and dealing with emotions. Communication is not only about what is being said, but also about what is not.

Once I started involving the whole team in the recruitment process, I saw a surge in team member compatibility and with that an increase in trust and performance over time. We would go through several screenings, talks with me as the hiring manager, an interview with HR, followed by talks with two more senior people from the team. And finally, they would meet up with the rest of the team for a casual half-hour to one-hour chat. Just talking about things the team would want to know like, “Hey, what kind of movies do you like? How do you unwind? Any interest in sports? Are you into superheroes?” which can lead to the dreaded Marvel or DC argument [laughs].

It’s important to hire the right people because every team member that you’re adding to the mix will affect and even change your team culture.

Then when it comes down to team culture itself, you need to ensure that you have a fair and open culture and an understanding that you’re all in this together, for the same cause. And people need to be very receptive to feedback and are expected to provide feedback. I believe you can be honest, to the point, and still be respectful and mindful. If you have that initial trust, it should be a lot easier to have those conversations, including the more difficult ones.

TH: You mentioned involving the team in the hiring process. How do you ensure that they continue to work together closely once they’re hired?

PD: There is also no reason why people on the team couldn’t have a one-on-one with each other. I’m not talking about HR-related topics here but professional development, mental support. It’s more like people asking for advice, another point of view, coaching on a particular topic. Or it can just as well be someone feels they need to lift some weight off their shoulders because of a personal situation.

And of course, there is the obvious “Hey, I have this kind of message. How would you deal with people who talk like this?” There are times when people will send a Slack message to each other and say, “Hey, do you need to talk? You want to grab a cup of coffee? Do we want to go for a short walk?” It’s ok to get frustrated or stuck at times, just as long as you realise it. In the end, everyone should know that they are among peers and they should assist each other. And as a supervisor or as a manager, you need to allow those kinds of things to happen.

Continuous learning, sharing experiences, having open, honest feedback, people being able to tell each other how they feel – that’s the most important part. If you’re not feeling good, then how are you going to be able to do your job? You need moral support from your supervisor, but also from your team members.

TH: Because player experience roles and responsibilities are so emotionally charged, do you find that you have to look at success metrics differently?

PD: Metrics are important, but they will never show you the full story. Personally I feel many companies oversimplify by trying to fully quantify performance in support and moderation functions. In a lot of cases, there are external and irrational or emotional factors that will affect your metrics. If you then use those metrics to judge the performance of an individual, now that’s not really fair or motivational, is it?

At a previous employer, we decided not to use KPIs as a determination of whether staff was performing well, but rather used it as a benchmark for industry standards and it would allow the team to push themselves constantly. Of course, this doesn’t mean we were not paying close attention to the KPIs, yet by simply removing the “fear” aspect of employees not meeting certain artificial performance metrics, we created an environment where we would constantly challenge ourselves to work smarter and be proud as a team of our achievements.

The following is a perfect example of why the “traditional” take on support KPIs can be detrimental to a CS agent’s mental health, efficiency, and overall customer satisfaction: If you’re assigned a queue and you’re the one that is dealing with all the sensitive topics and negative feedback that comes in, it takes an emotional toll on you, and it becomes a lot harder to reply with each subsequent message. In order to create an understanding throughout the whole team, and to protect them, every team member would, in turn, be taking care of these more challenging topics and conversations.

Anyone dealing with more sensitive or negative content and tickets was allowed to take more time. As long as a player received a timely and correct answer, they could take as long as they needed to reply, within sensible limits. The reasoning for this was that when you’re dealing with heavy topics in player support or player moderation, it can suck you emotionally dry very fast. If that means you’re only doing a quarter of the tickets a top performer does in a whole day, then no one is going to ask why you didn’t do more, because everyone understood the challenge. The worst thing that can happen when you are feeling emotionally drained is adding time pressure. Ultimately, the end result for the player or all players is more important than adhering to artificial time constraints that don’t reflect the context of the issue.

TH: What’s the business benefit of investing in experienced player experience leaders?

PD: There are a few reasons why a company would want to invest in more experienced Player Experience leaders. All of these stem from the mere fact that there is no substitute for experience. Rarely do companies ask for the added business value of hiring an experienced backend developer or a more senior product lead. Yet when it comes down to customer-facing roles, many companies still seem to struggle with the answer to that question; as if it somehow would be any different than for other roles.

If you’re looking to set up a department that communicates directly with your players from all over the globe, in order to create actionable insights on your product development and provide a safe online environment for all while maintaining scalable cost-efficient operations and always needing to keep in line with the expectations of your audience, would you rather not invest in someone who has the experience?

In order to succeed in your CX endeavours, you hire the right people, so they can hire the right talent, train them, coach them, empower them. They understand the expectations of the audience, what channels to use, how to approach communication about particular topics. They know the tools out there, how to tweak them, can create processes, analyze support metrics and plan resources accordingly. And often forgotten, these are the key people who need to influence decisions across departments and teams, walking the fine line between customer-centricity and profitability of the product or service.  And those insights only come with experience.

However, it’s not just about investing in experienced people. It is also about the resources allocated to the team and the tools they are provided to do their job best. You can hire a top leader, but if they have to make do with an email-only contact center, you won’t get far. The most obvious answer is that more experienced leaders can boost retention metrics in the mid to long term when given access to the right resources. That in itself is a major advantage for studios, especially in the competitive f2p [free-to-play] mobile space, but it extends well beyond that.

What will our players thank us for? Image credit: Mark Pincus

There is this quote from Zynga that I feel more companies should ask themselves. On a wall in one of their offices is this one question: “What will our players thank us for?”

That is a very important thing to reflect on as a company because it all comes down to player expectations and surpassing them. You don’t create fantastic experiences without thought, without understanding your audience, nor without investing in the contact points your audience will reach out to. Because those interactions, the quality, the lack of friction and the efficiency, will leave a lasting impression.

As a follow-up question, I would suggest companies also start asking themselves what their brand will be remembered for later on. This is something companies like Apple, Amazon and Netflix understand. Within games, simply take a look at companies like Activision Blizzard or Supercell. It’s obvious they are not just making games, they are building experiences and are differentiating themselves as a brand. People these days download Supercell games not solely on the premise that it is a good game, but because they have come to expect good games from Supercell as a brand.

TH: Can you tell us about any initiatives you’ve done to boost company awareness of player experience teams?

PD: At Next Games, we created a management-endorsed shadowing initiative dubbed “Player Support Bootcamp”. You would sign up for three-hour sessions where you would be told how player support works, what tools we use, how we communicate, learn about our processes or what happens when we log a bug, how we do investigations, how we do moderation.

It was purely voluntary, and at the high point of the program, we had more than 62.5% of the company signing up for sessions. So we decided to gamify it: come to three sessions and you get a fantastic-looking degree, created by one of our very talented marketing artists. People started competing for spots in the program; we were fully booked for months. Degrees appeared hanging from walls or stood framed on desks as a badge of honour, while developers shared personal experiences with the Bootcamp over coffee in the kitchen.

We saw two huge changes come out of the program. A UX designer kicked into gear a big feature redesign of how users save their game progress, based on the feedback she saw from players during Bootcamp. After the implementation of the redesign, we saw a decrease of more than 20% in tickets regarding lost accounts. Huge impact on our bottom line there.

The other change happened when a senior client programmer who went through the program noticed that we were wasting time trying to localize some of our questions, since we sometimes would get messages in languages we didn’t have native speakers for. We were copy/pasting the tickets into Google Translate, putting them back into the CRM, then replying using Google Translate. So in his spare time, he actually started programming a bot for us that would go through the CRM and automatically translate emails for us in advance, saving us time and money.

The buy-in from the management team was crucial to the success of the project. Our CEO was actually one of the early adopters and possibly the biggest proponent. We could see that shortly after the producers of the games actually got a really big interest in the program and gently persuaded people to sign up. Especially for those teams working on new projects, the Bootcamp was a source of inspiration. So it had a huge impact. And I’m sure that there are still people who are talking about the program today.

TH: In your opinion, what does the future hold for player experience teams?

PD: Users nowadays have a better understanding of game mechanics and social dynamics, and they also have higher expectations. Seven or eight years ago, if there was a game on the app store, you were just comparing that game to the next best game. Nowadays, you compare that game to the last best experience you’ve ever had, which could include pretty much anything on the app store and beyond, from Amazon to Spotify. I’m expecting every game experience to be just as frictionless as Clash of Clans, but just as deep as Skyrim. Is that fair? Perhaps not, considering technical and other limitations, yet it is what is happening.

I’m not the only one who thinks companies need to invest more in the service aspects of their games. There’s an unstoppable mindset change happening in the retail and on-demand landscape; the service industry is getting disrupted. Games-as-a-Service is already a really big thing right now and it shows no signs of slowing down. But the games industry will need to adapt fast to keep up with this evolution, which obviously doesn’t happen without a change in attitude towards support functions and towards the gaming audience.

The whole idea of simply hiring a junior person who can answer email messages for cheap, that’s also going to disappear. The need for emotionally smart, educated and experienced support and moderation personnel is going to skyrocket. The more technology advances, the more need there will be for people who can rely on experience and a higher understanding of what they’re doing, who understand the tools and the processes they’ll be using, but most of all, understand humans and human behaviour.

TH: That’s a great point to end this on. Thank you for sharing your insights, Pascal!

PD: My pleasure, Carlos! Thanks for speaking with me.

Join a Webinar: Taking Action on Offensive In-Game Chat

I’m excited to announce that Two Hat is co-hosting an upcoming webinar with the International Game Developers Association on Friday, February 21st, 2020.

The incredible Liza Wood (check out her bio below), our Director of Research and Data Science, will be joining me as we present Defining, Identifying and Actioning Offensive Chat: Approaches and Frameworks.

We will start by examining why defining, identifying and actioning offensive chat matters to game development, with tangible supporting stats. Later we will provide an overview of the Five Layers of Community Protection.

Here’s what you can expect to get out of it:

  • Compelling arguments for adding player safeguarding mechanisms to your game’s social features
  • Actionable takeaways for creating internal alignment on categories and terminology
  • Action plans for identifying and addressing disruptive behavior

We hope you will join us on February 21st at 3 pm PST on IGDA’s Twitch Channel. Mark your calendars!

To celebrate this collaboration with the IGDA, we’re offering exclusive early access to our brand new Content Moderation Best Practices PDF, containing practical applications that you can start leveraging today. Download in advance of the full release happening later this month by filling out the form below.

When you sign up, we will also send you an email reminder on the 20th so you don’t miss the webinar. See you there!

About Liza Wood

Liza brings a wealth of experience and a remarkable body of work in the games industry. After 13 years in video game development, Liza joined Two Hat Security as Director of Research and Data Science in August 2019. There she leads a team of researchers who are helping customers and partners build safe and healthy online communities by removing negative interactions to make room for positive human connections. Prior to starting this new phase of her career, she was the Executive Producer of Disney’s Club Penguin Island, the successor to Club Penguin, where she saw the positive impact that online communities can have.

About the IGDA

We are encouraged by and fully believe in IGDA’s mission to support and empower game developers around the world. Having worked for a gaming organization and co-founded the Fair Play Alliance, I strongly believe in the power of games to create meaningful and life-changing experiences for billions of players collectively. And that starts with supporting the dedicated professionals who are committed to creating those experiences.

Conversations About Gaming & Digital Civility With Laura Higgins from Roblox

In November, Laura Higgins, Director of Community Safety & Digital Civility at Roblox shared the fascinating results of a recent survey with International Bullying Prevention Association Conference attendees in Chicago. The results provide a refreshingly honest peek into the world of online gaming and family communication – and should serve as a wake-up call for family-based organizations.

Roblox conducted two separate surveys – one in the UK and one in the US. In the UK, they spoke to over 1,500 parents, and in the US they surveyed more than 3,500 parents and 580 teenagers, with different questions but some similar themes.

Two Hat Director of Community Trust & Safety Carlos Figueiredo was lucky enough to share the stage with Laura during the Keynote Gaming Panel at the same conference. During the panel and in conversations afterward, he and Laura spoke at length about the surprising survey results and how the industry needs to adopt a “Communication by Design” approach when talking to parents.

What follows is a condensed version of their conversations, where Laura shares her biggest takeaways, advice for organizations, and thoughts on the future of digital conversations.

Carlos Figueiredo: Some fascinating and surprising results came out of these surveys. What were your biggest takeaways?

Laura Higgins: In the UK survey, unsurprisingly, 89% of parents told us that they were worried about their kids playing games online. They cited concerns about addiction, strangers contacting their children, and the possibility that gaming might make it harder to form real-life friendships or social connections.

What was really interesting is that nearly the same number of parents said they could see the benefits of gaming, so that’s something we’re going to really unpack over the next year. They recognized improved cognitive skills, loved the cooperation and teamwork elements that gaming provides, and cited improved STEM skills. They also recognized that playing games can help kids in the future, as they will need digital skills as adults, which was really interesting for us to hear.

The big thing that came out of this, and that we really need to focus on, is that of the people who said they were worried about gaming, half told us their fears came from stories they saw in the media and on social media rather than from real-life experience. We know there’s a lot of negativity in the press, particularly around grooming and addiction/gambling. So I think we need to be mindful of the way we talk to parents: whilst we’re educating them about possible risks (and we know that there are risks), we should also be discussing how to raise resilient digital citizens and giving them the tools to manage those risks, rather than just giving them bad news. We’re trying to proactively work with media outlets by telling them, if you want to talk about the risks, that’s fine, but let’s share some advice in there as well – empower rather than instill even more fear.

CF: Did you see different results with the US survey?

LH: With the US research we were also able to reach 580 teens and compare the data from them and parents. Some of the most startling stuff for us was the disconnect between what parents think is really happening versus what kids think is happening.

For example, 91% of parents were convinced that their kids would come and talk to them if they were being bullied. But only 26% of kids said they would tell their parents if they had witnessed bullying. In fact, they would tell anyone else but their parents; they would report it to the platform, they would challenge the bully directly, or they would go to another adult instead of their parents.

The gap was echoed throughout the whole survey. We asked if parents talked to their kids about online safety and appropriate online behavior, and 93% of parents said that they were at least occasionally or regularly discussing this topic with their kids, while 60% of teens said that their parents never or rarely talked to them about appropriate online behavior. So, whatever it is that parents are saying — kids aren’t hearing it.

We need to make sure we’re reaching kids. It’s more than just sitting down and talking to them; it’s how it’s being received by kids as well.

CF: It seems like your surveys are uncovering some uncomfortable realities – and the things that the industry needs to focus on. We talk a lot about Safety by Design, but it seems like a focus we’re missing is Communication by Design.

LH: We were surprised with how honest parents were. Over half of UK parents, for example, are still not checking privacy and security settings that are built in. Part of my role at Roblox has been to review how accessible the advice is, how easy to understand it is, and it’s an ongoing process. We appreciate how busy parents are – they don’t have time to go looking for things.

We asked the US parents who rarely or never had conversations with their kids about appropriate online behavior why they didn’t feel those conversations were necessary, and we got some fascinating quotes back. Parents think they’re out of their depth; they think their kids know more than them. In some cases that may be true, but not usually – and digital parenting is still parenting.

We heard quotes like, “If my kid had a problem, they would tell me.” The research tells us that’s not true.

“If my child was having problems, I would know about it.” But if you’re not talking about it, how is that going to happen?

“I brought my kid up right.” Well, it’s not their behavior we always have to look at – it’s their vulnerabilities as well.

We need to talk more broadly than just how to use the settings, so I think there are many layers to these conversations for parents as well.

CF: What are some other things we can do as an industry to help parents?

LH: One is to give them the skills and easy, bite-sized tips: how to check safety settings, how to set privacy settings, how to report something in-game – practical things they can teach their kids as well.

There’s also a broader conversation that empowers parents to learn how to have conversations. At Roblox, we do lots of work around things like, how to say no to your kids, what is an appropriate amount of screen time for your child, how to manage in-game purchases, and setting boundaries and limits, all advice that parents are grateful for. But if we just had an advice section or FAQ on the website, they would never get to hear those messages.

It’s about amplifying the message, working with the media as much as possible, and having different outlets like the Facebook page we just launched. A parent sitting on the bus on the way to work can scroll through and find those little reminders, which is really helpful.

CF: Speaking of your new Facebook page, Roblox has been really innovative in reaching out to parents.

LH: We’re also taking it offline. We host visits to Roblox for kids, for instance, and we’ll be holding an online safety session for parents while the kids are off doing other activities – I’m helping to write that. We’re working with parents’ organizations as well, so they can get those messages out where people are.

Schools have a key place in all of these conversations. We know the quality of online safety conversations in schools is often poor – it’s still frequently a once-a-year assembly designed to scare kids silly rather than practical lessons delivered through the curriculum. Schools should be reminding kids of appropriate online behavior at all times and giving them digital literacy skills as well.

We’re doing webinars, we’re doing visits, and hopefully, gradually we’ll keep feeding them those messages.

CF: It’s encouraging that you’re so committed to this, trying to change culture. Not every platform is putting in this effort.

LH: I think we have to. I’ve been working in digital safeguarding for years, and I don’t think that we’ve hit that sweet spot yet. We haven’t effected enough change, and we need to move even faster.

Now, with all of these conversations about online harms papers and regulations – we’ve worked with partners in Australia and New Zealand, where they have the Harmful Digital Communications Act, but things still aren’t really changing. This is just a new approach: that drip-feed, that persistence that will hopefully effect change.

We’re very lucky at Roblox – our community is really lovely. By the way, 40% of Roblox users are female, which is rare in gaming. And they are very young and very supportive of each other. They are happy to learn at that age, and we can help to shape and mold them so they can take those attitudes and behaviors through their online digital lives as they grow up.

In the survey, we wanted the kids to tell us about the positive and negative experiences that they’ve had online. Actually, what most of them reflected wasn’t necessarily around things like bullying and harassment – they were actually saying that the things that made them feel really bad were when they did badly in a game and they were a bit tough on themselves. And they said they would walk away for 10 minutes, come back, and it was fine. And when people were positive to them in-game, they were thinking about it a few days later. So when we’re looking at how we manage bad behavior in our platform, it’s really important that we have rules, that we have appropriate sanctions in place, and that we can use the positive as an educational tool. I think we really need that balance.

CF: I love that framing. It’s a reminder that most players are having a good time and enjoying the game the way it was meant to be enjoyed. We all have bad days but nasty behavior is not the norm.

LH: It’s in everybody’s interest to make it a positive experience. We have a role to play in that but so do the kids themselves. They self-regulate, they call out bad behaviors, they are very supportive of each other.

We asked them why they play online games and 72% said, “Because it’s fun!”

That should be the starting point. Ultimately, it’s about play and how important that is for all of us.

CF: What is your best advice for gaming organizations, from reinforcing positive behavior to better communicating with parents?

LH: Great question. The first thing is to listen to your community. Their voice is really important – without our players and their families, we would not have Roblox. Gaming companies can sometimes make decisions that are good for the business rather than what the players want and what the community needs. Take their feedback, and act on it.

If you’re working with children, have a Duty of Care to make it as safe as possible. That’s a difficult one, because we know that small companies and startups might struggle financially. We’re working with trade bodies on the idea of Safety by Design – what are the bare minimums that must be met before we let anyone communicate on your platform? It doesn’t have to be all of the best equipment, tools, systems in place, but there are some standards that I think we should all have in place.

For example, if you have chat functions, you need to make sure that you’ve got the right filters in place. Make sure it is age-appropriate all the way through.

Ultimately, machine learning and AI are wonderful, but they can never replace humans in certain roles or situations. You need well-trained, good moderators. Moderators have one of the most important roles on gaming platforms, so making sure they’re really well supported is important. They have a tough job and deal with very upsetting things, so make sure they aren’t just trained to handle it, but that they have after-care as well.

If you are a family-based platform, make sure you reach out to parents. I met with a delegate who said it was the first time she’d heard a tech company talk about engaging with parents. I think if we all started doing that a little bit more, it would be better.

CF: You mentioned that in your 20 years in the digital civility industry, the needle has barely moved. Do you think that’s changing?

LH: I’m really hopeful for the future. I talked with journalists a few months ago who were slightly scoffing at my aspirations of digital civility. But if you’re coming from a starting point where you assume that games are bad, the players are bad, and the community is bad – you’re wrong. People are kind. People do have empathy. They want to see other people succeed.

For example, nearly all teens (96%) in our survey said they would likely help a friend they see being bullied online, and the majority of teens confirmed they get help from other players when they need it at least “sometimes,” with 41% saying they get peer help “often” or “always.” Those are all things we see all the time in gaming. And we have this opportunity to spread that out even more and build those really good positive online citizens.

This is much bigger than Roblox.

These kids are the future. The more that we can invest in them, the better.

We all need to enable those conversations, encourage those conversations, and equip parents with the right messages.


London Calling: A Week of Trust & Safety in the UK

Two weeks ago, the Two Hat team and I packed up our bags and flew to London for a jam-packed week of government meetings, media interviews, and two very special symposiums.

I’ve been traveling a lot recently – first to Australia in mid-September for the great eSafety19 conference, then London, and I’m off to Chicago next month for the International Bullying Prevention Association Conference – so I haven’t had much time to reflect. But now that the dust has settled on the UK visit (and I’m finally solidly back on Pacific Standard Time), I wanted to share a recap of the week as well as my biggest takeaways from the two symposiums I attended.

Talking Moderation

We were welcomed by several esteemed media companies and had the opportunity to be interviewed by journalists who had excellent and productive questions.

Haydn Taylor from GamesIndustry.Biz interviewed Two Hat CEO and founder Chris Priebe, myself, and Cris Pikes, CEO of our partner Image Analyzer, about moderating harmful online content, including live streams.

Rory Cellan-Jones from the BBC talked to us about the challenges of defining online harms (starts at 17:00).

Chris Priebe being interviewed about online harms

I’m looking forward to more interviews being released soon.

We also met with branches of government and other organizations to discuss upcoming legislation. We continue to be encouraged by their openness to different perspectives across industries.

Chris Priebe continues to champion transparency reports. He believes that making transparency reports truly transparent – that is, digitizing them and displaying them in app stores – has the greatest potential to significantly drive change in content moderation and online safety practices.

Transparency reports are the rising tide that lifts all boats: nobody will want to be the one site or app whose report doesn’t show commitment and progress towards a healthier online community. Sure, everyone wants more users – but in an age of transparency, you will have to do right by them if you expect them to join your platform and stick around.

Content Moderation Symposium – “Ushering in a new age of content moderation”

On Wednesday, October 2nd, Two Hat hosted our first-ever Content Moderation Symposium. Experts from academia, government, non-profits, and industry came together to talk about the biggest content moderation challenges of our time, from tackling complex issues like defining cyberbullying and child exploitation behaviors in online communities to unpacking why a content moderation strategy is business-critical going into 2020.

Alex Holmes, Deputy CEO of The Diana Award, opened the day with a powerful and emotional keynote about the effects of cyberbullying. For me, the highlight of his talk was a video he shared about the definition of “bullying” – it really drove home the importance of adopting nuanced definitions.

Next up were Dr. Maggie Brennan, a lecturer in clinical and forensic psychology at the University of Plymouth, and an academic advisor to Two Hat, and Zeineb Trabelsi, a third-year Ph.D. student in the Information System department at Laval University in Quebec, and an intern in the Natural Language Processing department at Two Hat.

Dr. Brennan and Zeineb have been working on academic frameworks for defining online child sexual victimization and cyberbullying behavior, respectively. They presented their proposed definitions, and our tables of six discussed them in detail. Discussion points included:

  • Are these definitions complete, and do they make sense?
  • What further information would we require to effectively use these definitions when moderating content?
  • How do we currently define child exploitation and cyberbullying in our organizations?

My key takeaway from the morning sessions? Defining online harms is not going to be easy. It’s a complicated and nuanced task because human behavior is complicated and nuanced. There are no easy answers – but these cross-industry and cross-cultural conversations are a step in the right direction. The biggest challenge will be taking the academic definitions of online child sexual victimization and cyberbullying behaviors and using them to label, moderate, and act on actual online conversations.

I’m looking forward to continuing those collaborations.

Our afternoon keynote was presented by industry veteran David Nixon, who talked about the exponential and unprecedented growth of online communities over the last 20 years, and the need for strong Codes of Conduct and the resources to operationalize good industry practices. This was followed by a panel discussion with industry experts and several Two Hat customers. I was happy to sit on the panel as well.

My key takeaway from David’s session and the panel discussion? If you design your product with safety at the core (Safety by Design), you’re setting yourself up for community success. If not, reforming your community can be an uphill battle. One of our newest customers, Peer Tutor, is implementing Safety by Design in really interesting ways, which CEO Wayne Harrison shared during the panel. You’ll learn more in an upcoming case study.

Man standing in front of a screen that says Transparency Reports

Finally, I presented our 5 Layers of Community Protection (more about that in the future – stay tuned!), and we discussed best practices for each layer of content moderation. The fifth layer of protection is Transparency Reports, which yielded the most challenging conversation. What will Transparency Reports look like? What information will be mandatory? How will we define success benchmarks? What data should we start to collect today? No one knows – but we looked at YouTube’s Transparency Report as an example and guidance on what may be legislated in the future.

My biggest takeaway from this session? Best practices exist – many of us are doing them right now. We just need to talk about them and share them with the industry at large. More on that in an upcoming blog post.

Fair Play Alliance’s First European Symposium

Being a co-founder of the Fair Play Alliance and seeing it grow from a conversation between a few friends to a global organization of over 130 companies and many more professionals has been incredible, to say the least. This was the first time the alliance held an event outside of North America. As a global organization, it was very important to us, and it was a tremendous success! The feedback has been overwhelmingly positive, and we are so happy to see that it provided lots of value to attendees.

Members of the Fair Play Alliance

It was a wonderful two-day event held on October 3rd and 4th, with excellent talks and workshops hosted for members of the FPA. Chris Priebe, a couple of industry friends and veteran Trust & Safety leaders, and I hosted one of the workshops. We’re all excited to take that work forward and see the results benefit the games industry!

What. A. Week.

As you can tell, it was a whirlwind week and I’m sure I’ve forgotten at least some of it! It was great to connect with old friends and make new friends. All told, my biggest takeaway from the week was this:

Everyone I met cares deeply about online safety, and about finding the smartest, most efficient ways to protect users from online harms while still allowing them the freedom to express themselves. At Two Hat, we believe in an online world where everyone is free to share without fear of harassment or abuse. I’ve heard similar sentiments echoed countless times from other Trust & Safety professionals, and I truly believe that if we continue to collaborate across industries, across governments, and across organizations, we can make that vision a reality.

So let’s keep talking.

I’m still offering free community audits for any organization that wants a second look at their moderation and Trust & Safety practices. Sign up for a free consultation using the form below!