How Do You Calculate the ROI of Proactive Moderation in Chat?

On Tuesday, October 30th, I’m excited to be talking to Steve Parkis, a senior tech and entertainment executive who drove amazing growth in key products at Disney and Zynga, about how chat has a positive effect on user retention and overall revenue. It would be great to have you join us — you can sign up here. Until then, I would like to get the conversation started here.

There is a fundamental understanding across online industries that encouraging prosocial, productive interactions and curbing anti-social, disruptive behavior in our online communities are essential.

The question I’ve been asking myself lately is this — do we have the numbers to prove that proactive moderation and other approaches are business-critical? In my experience, our industries (games, apps, social networks, etc.) lack the studies and numbers to prove that encouraging productive interactions and tackling negative ones has a measurable impact on user engagement, retention, and growth.

This is why I’m on a mission this quarter to create new resources, including a white paper, that will shed light on this matter, and hopefully help as many people as possible in their quest to articulate this connection.

First steps and big questions

We already know that chat and social features are good for business — we have lots of metrics around this — but the key info that we’re missing is the ROI of proactive moderation and other community measures. Here’s where I need your help, please:

  • How have you measured the success of filtering and other approaches to tackle disruptive behavior (think spam, fraud, hate speech, griefing, etc) as it relates to increased user retention and growth in your communities?
  • Have you measured the effects of implementing human and/or automated moderation in your platforms, be it related to usernames, user reports, live chat, forum comments, and more?
  • Why have you measured this?

I believe the way we are currently operating is self-sabotage. By not measuring and surfacing the business benefits of proactive moderation and other measures to tackle anti-social, disruptive behavior, our departments are usually seen as cost centers rather than key contributors to revenue.

I believe that our efforts are crucial to removing blockers to growth on our platforms and to fostering stronger user engagement and retention.
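
To make the ask concrete, here is a minimal sketch (in Python, with entirely made-up numbers) of the kind of back-of-the-envelope calculation I have in mind: weighing the revenue from users retained thanks to proactive moderation against the cost of running it. The churn, revenue, and cost figures below are hypothetical placeholders, not findings.

```python
# A minimal, illustrative ROI calculation for proactive moderation.
# Every number below is a hypothetical placeholder, not real data.

def moderation_roi(monthly_active_users: int,
                   baseline_churn: float,    # monthly churn without proactive moderation
                   moderated_churn: float,   # monthly churn with proactive moderation
                   revenue_per_user: float,  # average monthly revenue per retained user
                   moderation_cost: float) -> float:
    """Return ROI as (incremental revenue - cost) / cost."""
    users_retained = monthly_active_users * (baseline_churn - moderated_churn)
    incremental_revenue = users_retained * revenue_per_user
    return (incremental_revenue - moderation_cost) / moderation_cost

# Example with made-up numbers: 500k MAU, churn drops from 8% to 6.5%,
# $2.40 average monthly revenue per user, $15k/month spent on moderation.
print(f"ROI: {moderation_roi(500_000, 0.08, 0.065, 2.40, 15_000):.1%}")
```

A real study would, of course, need to isolate moderation’s effect on churn (for example via A/B tests or cohort comparisons) — which is exactly the kind of data I’m hoping you can share.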

Starting the conversation

I’ve talked to many of you and I’m convinced we feel the same way about this and see similar gaps. I invite you to email your comments and thoughts to carlos.figueiredo@twohat.com.

Your feedback will help inform my next article as well as my next steps. So what’s in it for you? First, I’ll give you a shoutout (if you want) in the next piece on this topic, and I’ll also give you exclusive access to the resources once they’re ready, with credit where it’s due. You will also have my deepest gratitude : ) You know you can also count on me for help with any of your projects!

To recap, I would love to hear how you and your company are measuring the return on investment of measures (human and/or technology driven) to curb negative, antisocial behavior on your platforms. How are you thinking about this, what are you tracking, and how are you analyzing the data?

Thanks in advance for your input. I look forward to reading it!

Originally posted on LinkedIn

3 Myths About “Toxic” Gaming Communities: What We Learned at LA Games Conference

Two months after the successful Fair Play Alliance summit at GDC 2018, the LA Games Conference hosted the panel “Fighting Toxicity in Gaming.” Two Fair Play Alliance members were on hand to discuss the role technology plays in addressing disruptive player behavior.

Moderated by Dewey Hammond (Vice President of Games Research at Magid Advisors), the panel featured J Goldberg (Head of Community at Daybreak Game Company), Kat Lo (online moderation researcher and PhD student at the University of California), and Carlos Figueiredo (Director of Community Trust & Safety at Two Hat Security). The panelists tackled a series of challenging questions about growing healthy gaming communities, opening with a frank discussion about misconceptions in the world of gaming.

We spoke with Carlos Figueiredo, co-founder of the Fair Play Alliance and digital citizenship and moderation expert. He shared the top three misconceptions about toxicity in video games — beginning with that tricky word, “toxic.”

Myth 1: How we describe toxicity doesn’t need to change

The gaming industry has been using words like “toxicity” and “trolling” for years now. They began as a necessity — catchy phrases like “toxic gamer culture” and “don’t feed the trolls” became a shorthand for player behavior that anyone in the community could reference.

Language, and our understanding of how the internet shapes behavior, has naturally evolved over time. Terms like “toxic players” and “internet trolls” may be ingrained in our culture, but they are no longer sufficient to describe variable, nuanced human behavior. Humans, in all our complexity, cannot be categorized with such broad labels.

In fact, a 2017 Stanford University study showed that, given the right set of circumstances (including the mood and context of a conversation), anyone can become a “troll.”

“When I say disruptive behavior, I’m referencing what we would normally refer to as toxicity, which is a very broad term,” Carlos says. “Disruptive behavior assumes different shapes. It’s not just language, although abuse and harassment are often the first things we think of.”

As its name suggests, disruptive behavior can also include cheating, griefing, and deliberately leaving a match early.

“Human behavior is complex,” says Carlos. “Think of it — we’re putting people from different cultures, with different expectations, together in games. They’ve never seen each other, and for fifteen minutes in a match, we’re expecting everything to go well. But what are we doing as an industry to facilitate healthy interactions?”

The first step toward fostering healthier online gaming communities? Challenge, define, and refine the words we use and their meaning.

From left to right: J Goldberg, Dewey Hammond, Kat Lo, Carlos Figueiredo. Image credit: Ferriz, V. (2018, May 8). LA Games Conference [Digital image]. Retrieved from https://www.facebook.com/DigitalMediaWire/

Myth 2: Anonymity is the problem

Countless articles have been written about the dangers of anonymous apps. And while it’s true that anonymity can be dangerous — the toxic (that word again!) side of online disinhibition — it can also be benign. As John Suler, Ph.D. writes in The Online Disinhibition Effect, “Sometimes people share very personal things about themselves [online]. They reveal secret emotions, fears, wishes [and] show unusual acts of kindness and generosity, sometimes going out of their way to help others.”

So what’s a bigger cause of disruptive player behavior, if not users hiding behind the mask of anonymity? “The lack of social consequences,” says Carlos.

“There are different social protocols when we are interacting face to face, and we know very well that our actions have tangible consequences,” he explains. “That’s not always the case online. We’re still figuring out the relatively new virtual spaces and how we are socializing within them.”

“Anonymity alone,” he continues, “is not the biggest driver of disruptive behavior.”

Kat Lo and Carlos Figueiredo. Image credit: Ferriz, V. (2018, May 8). LA Games Conference [Digital image]. Retrieved from https://www.facebook.com/DigitalMediaWire/

Myth 3: AI alone will be the savior of player behavior

Disruptive behavior won’t be solved by algorithms or humans alone. Instead, as Carlos says, “A machine/human symbiosis that leverages the best of both can make all the difference.”

Using AI, gaming communities can proactively filter and triage the obviously unhealthy text or images, leaving humans to review the in-between content that requires empathy and human understanding to make a decision.
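
To make that symbiosis concrete, here is a hypothetical sketch of confidence-based triage: the model automatically actions content it scores as clearly healthy or clearly unhealthy, and queues the ambiguous middle for human moderators. The thresholds, labels, and scores are illustrative assumptions, not a description of any specific product.

```python
# Hypothetical confidence-based triage: the model handles the obvious cases,
# humans review the ambiguous middle. All thresholds here are illustrative only.

from dataclasses import dataclass

@dataclass
class Classification:
    message: str
    risk_score: float  # model's estimated probability the message is unhealthy

def triage(item: Classification,
           approve_below: float = 0.10,
           remove_above: float = 0.95) -> str:
    """Route a message based on the model's risk score."""
    if item.risk_score >= remove_above:
        return "auto-remove"      # obviously unhealthy: filter proactively
    if item.risk_score <= approve_below:
        return "auto-approve"     # obviously fine: let it through
    return "human-review"         # the in-between: needs empathy and context

for msg in [Classification("gg, well played!", 0.02),
            Classification("you people don't belong here", 0.97),
            Classification("nice camping strat, real skilled", 0.55)]:
    print(triage(msg), "-", msg.message)
```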

He also advocates for having a highly skilled and well-trained team of moderators who are well-versed in the game, understand the community, and have access to the right tools to do their jobs.

Having humans review content periodically is imperative, Carlos says. “You can’t just use automation and expect it to run blind. Models have biases if you don’t adjust and have eyes on them.”

He adds, “It’s important to remember that teams will have biases as well, and will require a lot of conscious effort to consider diverse views and motivations, overcome limitations in their thinking, and apply checks and balances along the way.”

Carlos believes that the answer to unhealthy communities is probably a lot more human than we realize. As he says, making a difference in gaming communities comes down to “People who care about games, care about players, care about the community, and are prepared to make the hard decisions.”

“We won’t always get things right,” he continues, “or have all the answers, but we can work really hard towards better online communities when we truly care about them. The answers are as human as they are technological.”


Want to learn more about deterring disruptive player behavior? Sign up for the Two Hat Security newsletter and receive monthly community management tips and tricks, invites to exclusive workshops, moderation best practices, and more!



“Community Design is Video Game Design”: Insights from the Fair Play Alliance Summit at GDC 2018

If the lineup outside the first Fair Play Alliance (FPA) panel at GDC this March was any indication, the gaming industry is poised to make some major changes this year.

Following the rousing and packed keynote speech delivered by Riot Games’ Senior Technical Designer Kimberly Voll (a founding member of the Alliance), the “Player Behavior by Game Design” panel centered around the mechanics that drive player behavior.

Featuring devs and designers from industry heavyweights Epic Games, Supercell, Kabam, Blizzard, and Two Hat Security, the first FPA panel of the day addressed the ways gaming companies can engineer their products to empower healthy communication among users.

The room was full to capacity.

Not only that, members of the FPA counted anywhere from 100 to 200 additional people lined up outside the door, waiting to get in.

Throughout the day, the Fair Play Alliance Summit featured more panels and talks, including “Root Causes of Player Behavior,” a Developer Q&A, “Microtalks in Player Behavior,” and the closing talk “The Advocate’s Journey: Changing Culture by Changing Yourself,” presented by Geogrify’s Kate Edwards.

Two Hat Director of Community Trust & Safety Carlos Figueiredo is one of the founding members of the FPA and moderated “Player Behavior By Game Design.” He also attended the Community Management Summit on Tuesday, March 20th. Several panels — most notably “Mitigating Abuse Before it Happens” — closely mirrored the conversations in the FPA room the next day.

Carlos shared his three key insights from the day:

1. “Community design is game design.”

The concept of community design as video game design was truly the biggest insight of GDC. Player behavior — the good, the bad, and the ugly — doesn’t come out of nowhere. How a game is designed and engineered has a significant effect on player behavior and community interactions.

So, how can game designers engineer a product that encourages healthy interactions?

A few examples from panels throughout the day:

  • Engineering a healthy player experience from the very first moment they enter the game
  • Ensuring that players are fairly and equally paired in matchmaking
  • Sharing rewards equally among teammates
  • Turning off friendly fire (read about Epic Games’ decision to remove friendly fire from Fortnite)
  • Providing feedback to players who submit a report
  • Building intuitive and proactive systems to protect players

How we design and engineer game mechanics and set the stage for player interactions is a crucial foundation for player behavior. What sort of gaming communities are we trying to create? What is this game about, and what are we encouraging with the systems we are creating? It’s much better to consider this from the ground up, instead of treating it like an afterthought.  – Carlos Figueiredo, Director of Community Trust & Safety for Two Hat Security

2. Disruptive behavior > toxicity.

Traditionally, the word “toxicity” has been used by the industry to describe a wide range of negative behaviors. Over the years its meaning has become diluted and unclear. Instead, the Fair Play Alliance suggests using the term “disruptive behavior” — literally, any kind of behavior that disrupts the experience of another player.

Human behavior is complex. The way we act changes based on many circumstances. We have bad days. The word “toxicity” is fairly ambiguous and can lead to misunderstandings and misconceptions. Disruptive behavior speaks to the heart of the matter: an action that disrupts the very purpose of a game. – Carlos Figueiredo, Director of Community Trust & Safety for Two Hat Security

3. Focus on fostering & reinforcing healthy interactions.

When we discuss online behavior, the conversation almost always turns to negative behavior, instead of celebrating and encouraging the positive, healthy interactions that actually make up most of our online experiences. The Fair Play Alliance is keen on making games fun, and its members are passionate about supporting positive play — as opposed to just preventing negative interactions.

So the question is no longer, “How do we prevent disruptive behavior?” Instead, it’s time we ask, “How do we encourage players to engage in healthy, spirited competition?”

Games are fun. We want to encourage that enjoyment and focus on creating awesome experiences. – Carlos Figueiredo, Director of Community Trust & Safety for Two Hat Security

Engineering, game design, terminology, and a shift in focus — the gaming industry has a lot of work ahead of it if it wants to understand and discourage disruptive behavior. But the folks in the FPA are confident that the industry is ready to talk — and listen in return.


Sign up for Two Hat’s monthly emails to learn more about fostering positive user interactions in online communities! You’ll receive invites to exclusive webinars and workshops, community management and moderation tips, and product updates.

We will never share your information with third parties and you can unsubscribe at any time.