Two months after the successful Fair Play Alliance summit at GDC 2018, the LA Games Conference hosted the panel “Fighting Toxicity in Gaming.” Two Fair Play Alliance members were on hand to discuss the role technology plays in addressing disruptive player behavior.

Moderated by Dewey Hammond (Vice President of Games Research at Magid Advisors), the panel featured J Goldberg (Head of Community at Daybreak Game Company), Kat Lo (online moderation researcher and PhD student at the University of California), and Carlos Figueiredo (Director of Community Trust & Safety at Two Hat Security). The panelists tackled a series of challenging questions about growing healthy gaming communities, opening with a frank discussion about misconceptions in the world of gaming.

We spoke with Carlos Figueiredo, co-founder of the Fair Play Alliance and an expert in digital citizenship and moderation. He shared the top three misconceptions about toxicity in video games, beginning with that tricky word, “toxic.”

Myth 1: How we describe toxicity doesn’t need to change
The gaming industry has been using words like “toxicity” and “trolling” for years now. They began out of necessity: catchy phrases like “toxic gamer culture” and “don’t feed the trolls” became shorthand for player behavior that anyone in the community could reference.

Language, and our understanding of how the internet shapes behavior, has naturally evolved over time. Terms like “toxic players” and “internet trolls” may be ingrained in our culture, but they are no longer sufficient for describing variable, nuanced human behavior. Humans, in all our complexity, cannot be categorized with a few broad terms.

In fact, in a study released in 2017, researchers at Stanford University showed that, given the right set of circumstances (including the mood and context of a conversation), anyone can become a “troll.”

“When I say disruptive behavior, I’m referencing what we would normally refer to as toxicity, which is a very broad term,” Carlos says. “Disruptive behavior assumes different shapes. It’s not just language, although abuse and harassment are often the first things we think of.”

As its name suggests, disruptive behavior can also include cheating, griefing, and deliberately leaving a match early.

“Human behavior is complex,” says Carlos. “Think of it — we’re putting people from different cultures, with different expectations, together in games. They’ve never seen each other, and for fifteen minutes in a match, we’re expecting everything to go well. But what are we doing as an industry to facilitate healthy interactions?”

The first step in fostering healthier online gaming communities? Challenge, define, and refine the words we use and what they mean.

From left to right: J Goldberg, Dewey Hammond, Kat Lo, Carlos Figueiredo
Image credit: Ferriz, V. (2018, May 8). LA Games Conference [Digital image]. Retrieved from https://www.facebook.com/DigitalMediaWire/

Myth 2: Anonymity is the problem
Countless articles have been written about the dangers of anonymous apps. And while it’s true that anonymity can be dangerous — the toxic (that word again!) side of online disinhibition — it can also be benign. As John Suler, Ph.D. writes in The Online Disinhibition Effect, “Sometimes people share very personal things about themselves [online]. They reveal secret emotions, fears, wishes [and] show unusual acts of kindness and generosity, sometimes going out of their way to help others.”

So what’s a bigger cause of disruptive player behavior, if not users hiding behind the mask of anonymity? “The lack of social consequences,” says Carlos.

“There are different social protocols when we are interacting face to face, and we know very well that our actions have tangible consequences,” he explains. “That’s not always the case online. We’re still figuring out the relatively new virtual spaces and how we are socializing within them.”

“Anonymity alone,” he continues, “is not the biggest driver of disruptive behavior.”

Kat Lo and Carlos Figueiredo
Image credit: Ferriz, V. (2018, May 8). LA Games Conference [Digital image]. Retrieved from https://www.facebook.com/DigitalMediaWire/

Myth 3: AI alone will be the savior of player behavior
Disruptive behavior won’t be solved by algorithms or humans alone. Instead, as Carlos says, “A machine/human symbiosis that leverages the best of both can make all the difference.”

Using AI, gaming communities can proactively filter and triage the obviously unhealthy text and images, leaving humans to review the in-between content that requires empathy and context to judge.
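As a purely illustrative sketch (the thresholds and the toy score_message scorer below are hypothetical stand-ins, not any vendor’s actual system), that kind of triage might look like this:

```python
# Hypothetical AI/human triage: auto-action content the model is very sure
# about, and route the ambiguous middle to human moderators.
from dataclasses import dataclass

REMOVE_THRESHOLD = 0.95   # above this, auto-remove
APPROVE_THRESHOLD = 0.05  # below this, auto-approve

@dataclass
class Verdict:
    action: str   # "remove", "approve", or "human_review"
    score: float  # model's estimated probability the content is unhealthy

def score_message(text: str) -> float:
    """Stand-in for a trained classifier returning P(text is unhealthy)."""
    lowered = text.lower()
    if "uninstall the game" in lowered:  # toy signal standing in for clear abuse
        return 0.99
    if "noob" in lowered:                # ambiguous trash talk
        return 0.50
    return 0.01

def triage(text: str) -> Verdict:
    score = score_message(text)
    if score >= REMOVE_THRESHOLD:
        return Verdict("remove", score)    # obviously unhealthy: filter it
    if score <= APPROVE_THRESHOLD:
        return Verdict("approve", score)   # obviously fine: let it through
    return Verdict("human_review", score)  # in-between: a person decides

if __name__ == "__main__":
    for msg in ("gg, well played", "you absolute noob", "uninstall the game"):
        print(msg, "->", triage(msg).action)
```

The thresholds capture exactly the division of labor Carlos describes: the machine handles the unambiguous ends of the spectrum at scale, while people handle the cases that need judgment.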

He also advocates for a highly skilled, well-trained team of moderators who know the game, understand the community, and have access to the right tools to do their jobs.

Having humans review content periodically is imperative, Carlos says. “You can’t just use automation and expect it to run blind. Models have biases if you don’t adjust and have eyes on them.”

He adds, “It’s important to remember that teams will have biases as well, and will require a lot of conscious effort to consider diverse views and motivations, overcome limitations in their thinking, and apply checks and balances along the way.”
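One lightweight way to keep “eyes on” an automated system, sketched here purely as an illustration (the audit rate and function names are hypothetical), is to route a small random sample of its automatic decisions back to human auditors:

```python
# Hypothetical audit loop: a small random sample of automated decisions is
# re-reviewed by humans so model bias and drift don't go unnoticed.
import random

AUDIT_RATE = 0.02  # illustrative: spot-check ~2% of automated actions

audit_queue: list[tuple[str, str]] = []

def record_decision(text: str, action: str) -> None:
    """Log an automated decision; occasionally flag it for human audit."""
    if action != "human_review" and random.random() < AUDIT_RATE:
        audit_queue.append((text, action))  # a moderator double-checks later
```

Where the auditors disagree with the model, that is a signal to adjust it, which is the kind of check and balance Carlos describes.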

Carlos believes that the answer to unhealthy communities is probably a lot more human than we realize. As he says, making a difference in gaming communities comes down to “people who care about games, care about players, care about the community, and are prepared to make the hard decisions.”

“We won’t always get things right,” he continues, “or have all the answers, but we can work really hard towards better online communities when we truly care about them. The answers are as human as they are technological.”


