“Community Design is Video Game Design”: Insights from the Fair Play Alliance Summit at GDC 2018

If the lineup outside the first Fair Play Alliance (FPA) panel at GDC this March was any indication, the gaming industry is poised to make some major changes this year.

Following the rousing keynote speech delivered to a packed room by Riot Games’ Senior Technical Designer Kimberly Voll (a founding member of the Alliance), the “Player Behavior by Game Design” panel centered on the mechanics that drive player behavior.

Featuring devs and designers from industry heavyweights Epic Games, Supercell, Kabam, Blizzard, and Two Hat Security, the first FPA panel of the day addressed the ways gaming companies can engineer their products to empower healthy communication among users.

The room was full to capacity.

Not only that, members of the FPA counted anywhere from 100 to 200 additional people lined up outside the door, waiting to get in.

Throughout the day, the Fair Play Alliance Summit featured more panels and talks, including “Root Causes of Player Behavior,” a Developer Q&A, “Microtalks in Player Behavior,” and the closing talk “The Advocate’s Journey: Changing Culture by Changing Yourself,” presented by Geogrify’s Kate Edwards.

Two Hat Director of Community Trust & Safety Carlos Figueiredo is one of the founding members of the FPA and moderated “Player Behavior by Game Design.” He also attended the Community Management Summit on Tuesday, March 20th, where several panels — most notably “Mitigating Abuse Before it Happens” — closely mirrored the conversations in the FPA room the next day.

Carlos shared his three key insights from the day:

1. “Community design is game design.”
The concept of community design as video game design was truly the biggest insight of GDC. Player behavior — the good, the bad, and the ugly — doesn’t come out of nowhere. How a game is designed and engineered has a significant effect on player behavior and community interactions.

So, how can game designers engineer a product that encourages healthy interactions?

A few examples from panels throughout the day (a brief illustrative sketch follows the list):

  • Engineering a healthy player experience from the very first moment players enter the game
  • Ensuring that players are fairly and equally paired in matchmaking
  • Sharing rewards equally among teammates
  • Turning off friendly fire (read about Epic Games’ decision to remove friendly fire from Fortnite)
  • Providing feedback to players who submit a report
  • Building intuitive and proactive systems to protect players
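
To make the matchmaking and reward-sharing bullets a little more concrete, here is a minimal, purely illustrative sketch in Python. The player structure, the rating gap, and the even reward split are assumptions made for the sake of the example, not any studio’s actual matchmaking or economy code.

```python
# Hypothetical sketch: rating-based pairing and equal reward sharing.
# All names and thresholds here are illustrative assumptions.

from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class Player:
    name: str
    rating: int  # Elo-style skill rating


def make_fair_pairs(players: List[Player], max_gap: int = 100) -> List[Tuple[Player, Player]]:
    """Sort players by rating and pair neighbours whose rating gap is small enough."""
    ordered = sorted(players, key=lambda p: p.rating)
    pairs, i = [], 0
    while i + 1 < len(ordered):
        a, b = ordered[i], ordered[i + 1]
        if b.rating - a.rating <= max_gap:
            pairs.append((a, b))
            i += 2
        else:
            i += 1  # skip the outlier; a later, wider search can pick them up
    return pairs


def share_rewards(team: List[Player], total_reward: int) -> Dict[str, int]:
    """Split a match reward evenly so no teammate is incentivised to hog objectives."""
    per_player = total_reward // len(team)
    return {p.name: per_player for p in team}
```

In a real game these decisions live inside the matchmaker and the reward service, but even a toy version like this makes the design intent explicit before any moderation tooling is written.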

How one designs and engineers the game mechanics and sets the stage for player interactions is a crucial foundation for player behavior. What sort of gaming communities are we trying to create? What is this game about, and what are we encouraging with the systems we are creating? It’s much better to consider this from the ground up, instead of treating it like an afterthought. – Carlos Figueiredo, Director of Community Trust & Safety for Two Hat Security

2. Disruptive behavior > toxicity.
Traditionally, the word “toxicity” has been used by the industry to describe a wide range of negative behaviors. Over the years its meaning has become diluted and unclear. Instead, the Fair Play Alliance suggests using the term “disruptive behavior” — literally, any kind of behavior that disrupts the experience of another player.

Human behavior is complex. The way we act changes based on many circumstances. We have bad days. The word “toxicity” is fairly ambiguous and can lead to misunderstandings and misconceptions. Disruptive behavior speaks to the heart of the matter: an action that disrupts the very purpose of a game. – Carlos Figueiredo, Director of Community Trust & Safety for Two Hat Security

3. Focus on fostering & reinforcing healthy interactions.
When we discuss online behavior, the conversation almost always turns to negative behavior, instead of celebrating and encouraging the positive, healthy interactions that actually make up most of our online experiences. The Fair Play Alliance is keen on making games fun, and its members are passionate about supporting positive play — as opposed to just preventing negative interactions.

So the question is no longer, “How do we prevent disruptive behavior?” Instead, it’s time we ask, “How do we encourage players to engage in healthy, spirited competition?”

Games are fun. We want to encourage that enjoyment and focus on creating awesome experiences. – Carlos Figueiredo, Director of Community Trust & Safety for Two Hat Security

Engineering, game design, terminology, and a shift in focus — the gaming industry has a lot of work ahead of it if it wants to understand and discourage disruptive behavior. But the folks in the FPA are confident that the industry is ready to talk — and listen in return.



Top Three Reasons You Should Meet Us at Gamescom

Heading to Gamescom or devcom this year? It’s a huge conference, and you have endless sessions, speakers, exhibits, and meetings to choose from. Your time is precious — and limited. How do you decide where to go and who to talk to?

Here are three reasons we think you should meet with us while you’re in Cologne.

You need practical community-building tips.

Got trolls?

Our CEO and founder Chris Priebe is giving an awesome talk at devcom about the connection between trolls, community toxicity, and increased user churn. The struggle is real, and we’ve got the numbers to prove it.

Hope to build a thriving, engaged community in your game? Want to increase retention? Need to reduce your moderation workload so you can focus on fun stuff like shipping new features?

Chris has been in the online safety and security space for 20 years now and has learned a few lessons along the way. He’ll be sharing practical, time-and-industry-proven moderation strategies that actually work.

Check out Chris’s talk on Monday, August 21st, from 14:30 – 15:00.

You don’t want to get left behind in a changing industry.

This is the year the industry gets serious about user-generated content (UGC) moderation.

With recent Facebook Live incidents (remember this and this?), new hate speech legislation in Germany, and the latest online harassment numbers from the Pew Research Center, online behavior is a hot topic.

We’ve been studying online behavior for years now. We even sat down with Kimberly Voll and Ivan Davies of Riot Games recently to talk about the challenges facing the industry in 2017.

Oh, and we have a kinda crazy theory about how the internet ended up this way. All we’ll say is that it involves Maslow’s hierarchy of needs.

So, it’s encouraging to see that more and more companies are acknowledging the importance of smart, thoughtful, and intentional content moderation.

If you’re working on a game/social network/app in 2017, you have to consider how you’ll handle UGC (whether it’s chat, usernames, or images). Luckily, you don’t have to figure it out all by yourself.

Because…

You deserve success.

And we love this stuff.

Everyone says it, but it’s true: We really, really care about your success. And smart moderation is key to any social product’s success in a crowded and highly competitive market.

Increasing user retention, reducing moderation workload, keeping communities healthy — these are big deals to us. We’ve been fortunate enough to work with hugely successful companies like Roblox, Supercell, Kabam, and more, and we would love to share the lessons we’ve learned and best practices with you.

We’re sending three of our very best Two Hatters/Community Sifters to Germany. Sharon has a wicked sense of humor (and the biggest heart around), Mike has an encyclopedic knowledge of Bruce Springsteen lore, and Chris — well, he’s the brilliant, free-wheeling brain behind the entire operation.

So, if you’d like to meet up and chat at Gamescom, Sharon, Mike, and Chris will be in Cologne from Monday, August 21st to Friday, August 25th. Send us a message at hello@twohat.com, and one of them will be in touch.

Want more articles like this? Subscribe to our newsletter and never miss an update!



Is Online Behavior Changing (For the Better) in 2017?

This year, it seems like every second article you read is about online behavior. From Mark Zuckerberg’s manifesto to Twitter’s ongoing attempts to address abuse, toxicity is a hot topic.

However, forward-thinking companies like Riot Games have been (not so quietly) researching online toxicity for years now. And one of their biggest takeaways is that when it comes to online behavior, as a society we’re still in the discovery stages… and we have a long way to go.

Luckily, we have experts like Riot’s brilliant Senior Technical Designer Kimberly Voll to help guide us on the journey.

A long-time gamer with a background in computer science, artificial intelligence, and cognitive science (told you she was brilliant), Kim believes passionately in the influence of player experience on game design. She also happens to be an expert in player behavior and online communication.

We sat down with her recently to discuss the current state of online discourse, the psychology of player behavior, and how game designers can promote sportsmanship in their games.

You say you want a revolution

Two Hat: As an industry, it seems like 2017 is the year we start to talk about online behavior, honestly and with an eye to finding solutions.

Kim: We’re on the cusp of a pretty significant shift in how we think of online digital play. Step by step, it’s starting to mature into a real industry. We’re at that awkward teenage phase where all hell keeps breaking loose sometimes. The internet is the fastest-spreading technology that human beings have ever faced. You blink, it went global, and now suddenly everybody’s online.

“How do you teach your kids to behave online when we don’t even know how to behave online?”

It hasn’t been culturally appropriated yet. It’s here, we like it, and we’re using it. There’s not enough of us stepping back and looking at it critically.

The fanciest of etiquette!

TH: Is it something about the nature of the internet that makes us behave this way?

Kim: The way we normally handle etiquette is with actual social settings. When you go to a kid’s club, you use kid-friendly language. When you go to a nightclub, you use nightclub-friendly language. We solve for that pretty easily. Most of us are good at reading a room, knowing how to read our peers, knowing what’s okay to say at work versus elsewhere, knowing what it’s okay to say when you’re on the player behavior team and you’re exposed to all manner of language [laughs]. We’ve been doing this since we moved out of caves.

But we don’t have that on the internet. You can’t reliably look around and trust that space. And you find with kids that they go into all of the spaces trusting. Or they do what kids do and push the limits. Both are not great. We want kids to push the limits so they can learn the limits, but we don’t want them to build up these terrible habits that propagate these ways of talking.

On the internet, you don’t get the gesticulations, you don’t get the presence that is being in the room with another person. There are certain channels that right now are completely cut off. So right now we’re hyper-focusing on other channels — for a long time that’s just been chat. These limitations mean that you end up trying to amplify and bring out your humanity in different ways.

The nature of things

TH: As a gamer and a cognitive scientist, what is your take on toxic player behavior?

Kim: I think the first step is understanding the nature of the problem.

There are different ways to look at toxicity and unsportsmanlike behavior. We can’t paint it all with the same brush.

“Are there people who just want to watch the world burn? They’re out there, but in our experience, they’re really, really rare.”

Not everyone else is being a saint, but not everyone is the same.

MOBAs [Multiplayer Online Battle Arena games] are frustrating because they’re super intense. If something goes wrong you’re particularly susceptible to losing your temper. That creates a tinderbox that gives rise to other things. Couple that with bad habits and socio-norms that have developed on the internet, and have been honed somewhat for a gaming audience, and they’re just that — they’re norms. That doesn’t necessarily make them right or wrong, and it doesn’t mean that players like them. We find that players don’t like them, overwhelmingly. And they’re becoming incredibly vocal, saying “We don’t want this.”

But there’s a second vocal group that’s saying “Suck it up. It’s the internet, it’s the way we talk.” And the balance is somewhere in the middle.

It’s always a balancing act

TH: How can game designers decide what tactic they should use to promote better behavior in their game?

Kim: There is obviously a line, but it shifts a bit. Where that line falls will depend largely on your community, your content. It’s the same way the line shifts dramatically when you’re out with friends drinking, versus at home with the family playing card games with your kid cousins.

Band-aids help, but they’re not the full solution.

There has to be flexibility. The first thing to do is understand your community, and try to gain a broader perspective of the motivation and underlying things that drive these behaviors. And also understand that there is no “one size fits all” approach. As a producer of interactive content, you need to figure out where your comfort level is. Then draw that line, and stick by that line. It’s your game; you can set those standards.

There is understanding the community, understanding it within the context of your game, and then there’s the work that Community Sift does, which is shielding. I think that shielding remains ever-important. But there has to be balance. The shield is the band-aid, but if we only ever do that, we’re missing an opportunity to learn from what that band-aid is blocking.

There’s a nice tension there where we can begin to explore things.

You don’t need to fundamentally alter your core experience. But if you have that awareness, it forces you to ask questions like, “Do I want to have chat in this part of the game, or do I want to have voice chat immediately after a match, when tempers are the most heated?”

Change is good

TH: Do you have an example of a time when Riot made a change to gameplay based on player behavior?

Kim: Recently we added the ability to select your role before you go into the queue, with some exceptions. Before, you would pop into chat and the war would start to ensure you got the role you wanted, because there are some roles that people tend to like more. Whoever could type “mid” fastest ideally got the role, assuming people were even willing to accept precedence, which sometimes they weren’t. And if you lagged for any reason, you could miss your chance at your role.

We realized we were starting the game out on the wrong foot with these mini-wars. What was supposed to be a cooperative team game — one team vs another — now included this intra-team fighting because we started off with that kind of atmosphere.

Being able to choose your role gives players agency in a meaningful way, and removes these pre-game arguments. It’s not perfect, but it’s made the game significantly better.
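
As a rough illustration of the idea (and emphatically not Riot’s actual implementation), a role-select queue can be sketched as players declaring the roles they are willing to fill up front, with the matchmaker assigning roles from those declarations instead of leaving it to a typing race in chat. Every name, role list, and rule below is an assumption made for the example.

```python
# Purely illustrative sketch of role-select queueing; not Riot's implementation.
# Each queue entry declares the roles a player is willing to fill, and the
# matchmaker assigns roles greedily, scarcest role first.

from collections import defaultdict
from typing import Dict, List, Optional

ROLES = ["top", "jungle", "mid", "bot", "support"]


def build_team(queue: List[Dict]) -> Optional[Dict[str, str]]:
    """Queue entries look like {"player": "A", "roles": ["mid", "top"]}."""
    by_role = defaultdict(list)
    for entry in queue:
        for role in entry["roles"]:
            by_role[role].append(entry["player"])

    team: Dict[str, str] = {}
    used = set()
    # Fill the least-requested roles first so popular picks don't starve them.
    for role in sorted(ROLES, key=lambda r: len(by_role[r])):
        candidates = [p for p in by_role[role] if p not in used]
        if not candidates:
            return None  # not enough willing players yet; keep everyone in queue
        team[role] = candidates[0]
        used.add(candidates[0])
    return team
```

A production matchmaker would treat this as a proper assignment problem and weigh wait times and skill ratings, but even a greedy version like this removes any incentive to win an argument in pre-game chat.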

Trigger warnings, road rage, and language norms… oh my!

TH: What kinds of things trigger bad behavior?

Kim: There is a mix of things that trigger toxicity and unsportsmanlike behavior. Obviously, frustration is one. But let’s break that down: What do you want to do when you’re frustrated? You want to kick and scream. You want the world to know. And if somebody is there with you, you need them to know, even if they had nothing to do with it.

“Put yourself in a situation where you’re locked behind a keyboard, your frustration is bubbling over, and you’re quite likely alone in a room playing a game. How do you yell at the person on the other side of the screen? Well, you can use all caps, but that’s not very satisfying. So how do you get more volume into your words? You keep amping up what you’re saying. And what’s the top of that chain? Hate speech.”

It’s very similar to road rage. I remember my mom told me a story about some dude who was upset that she didn’t run a yellow light. He actually got out of the car and started pounding on her hood. And I bet he went home afterward, pulled into his driveway, greeted his kid, and was a normal person for the rest of the day.

You’re not an actual monster; you’re in a particular set of circumstances that have funneled you, through the keyboard, into typing things you might not otherwise type. So that’s one big bucket.

Sometimes, you Hulk out.

In the 70s and 80s, we used to say things like “You’re such a retard.” Now, we’re like “I can’t believe we used to say that.” There are certain phrases that were normal at the time. We had zero ill intent — it was just a way of saying “You’re a goofball.” That sort of normalcy that you get with language, no matter how severe, when you’re exposed to it regularly, becomes ingrained in you, and you carry that through your life and don’t even realize it.

We’ve sent people their chat logs, and I truly believe that when they look at them, they have no idea what the problem is. Other people see the problem, but they just think, “Suck it up.” But there is a third group of people who look at it and think, “This is the way everybody talks, I don’t understand.” They’re caught in a weird spot where they don’t know how to move forward. And that can trigger defensiveness.

The thought process is roughly “So, you’re asking me to change, but I don’t quite get it, I don’t want to change, because I’m me, and I like talking this way, and when I say things like this, my friends acknowledge me and laugh, and that’s my bonding mechanism so you can’t take that away from me.”

Typically, no one thinks all those things consciously. But they do get angry, and now we’ve lost all productive discourse.

There is a full spectrum here. It’s a big tapestry of really interesting things that are going on when people behave this way on the internet. All of that feeds into the question: how do we shield it?

“Shielding is great, but can we also give feedback in a way that increases the likelihood that people who are getting the feedback are receptive to it?”

Can we draw a line where behavior is so bad that the cost of the pain caused to people is far more than the time it would take to try to help this person?

Can we actually prevent them from getting into this state by understanding what’s triggering it, whether it’s the game, human nature, or current socio-norms?

Let’s talk about toxicity

TH: What can we do to ensure that these conversations continue?

Kim: I think we need to steer away from accusations. We’re all in this together; we’re all on the internet. There’s a certain level of individual responsibility in how we conduct ourselves online.

I’ve had these conversations when people are like “Yes, let’s clean up the internet, let’s do everything we have to do to make this happen.” And the flipside is people who say “Just suck it up. People are far too sensitive.”

And what I often find is that the first group are just naturally well-behaved online, while the second group is more likely to lose it. So when we have these conversations, what we don’t realize is that our perspective can unconsciously become an affront to who they are.

If we don’t take that into account in the conversation, then we end up inadvertently pointing fingers again.

We have to get to a point where we can talk about it without getting defensive.

Redefining our approach to player behavior

TH: Your empathetic approach is refreshing. Many of us have gotten into the habit of assuming the worst of people and being unwilling to see the other person’s perspective. And of course, that isn’t productive.

Kim: Despite our tendency to make flippant, sweeping comments, most people are not jerks. They’re a product of their own situation. And those journeys that have got each of us to where we are today are different, and they’re often dramatically different. And when we put people on the internet, we’ve got a mix of folks for whom the only thing connecting them is this game, and they come into the game with a bunch of bad experiences, or just generally feeling like “Everyone else is going to let me down.”

Then somebody makes an innocent mistake, or not even a mistake — maybe they took a direction you didn’t expect — and that just reinforces their worldview. “See, everyone is an idiot!”

When expectations aren’t met it leads to a lot of frustration, and players head into games with a lot of expectations.

I believe very viscerally that we have to listen before we try to aggressively push things out. But also we have to realize that the folks we are trying to understand may not be ready to talk. So we may have to go to them. And that applies to a lot of human tragedy, from racism to sexism.

We come in wagging our fingers, and our natural human defense is “Walls up, defenses up — this is the only way I will solve the cognitive dissonance that is you telling me that I should change who I am. Because I am who I am, and I don’t want to change who I am. Because who else would I be?” And that’s scary.

TH: It sounds like we need to take a step back and show a bit of grace. Like we said before, the conversation is finally starting to happen, so let’s give people time to adjust.

Kim: Think about the average company. You’re trying to make a buck to put food on the table and maybe make a few great games. That doesn’t leave a lot of room to do a lot of extra stuff. You may want to, but you may also think, “I have no idea what to do, and I tried a few things and it didn’t work, so what now? What do I do, stop making games?”

“At Riot, we’re lucky to have had the success that we’ve had to make it possible to fund these efforts, and that’s why we want to share. Let’s talk, let’s share. I never thought I’d have this job in my life. We’re very lucky to fund our team and try to make a difference in a little corner of the internet.”

It’s harder for games that have been out for a long time. Because it’s harder to shift normative behavior and break those habits. But we’re trying.

 

Want to know more about Kim? Follow @zanytomato on Twitter.

Want more articles like this? Subscribe to our newsletter and never miss an update!
