London Calling: A Week of Trust & Safety in the UK

Two weeks ago, the Two Hat team and I packed up our bags and flew to London for a jam-packed week of government meetings, media interviews, and two very special symposiums.

I’ve been traveling a lot recently – first to Australia in mid-September for the great eSafety19 conference, then London, and I’m off to Chicago next month for the International Bullying Prevention Association Conference – so I haven’t had much time to reflect. But now that the dust has settled on the UK visit (and I’m finally solidly back on Pacific Standard Time), I wanted to share a recap of the week as well as my biggest takeaways from the two symposiums I attended.

Talking Moderation

We were welcomed by several esteemed media companies and had the opportunity to be interviewed by journalists who asked excellent, productive questions.

Haydn Taylor from GamesIndustry.Biz interviewed Two Hat CEO and founder Chris Priebe, myself, and Cris Pikes, CEO of our partner Image Analyzer, about moderating harmful online content, including live streams.

Rory Cellan-Jones from the BBC talked to us about the challenges of defining online harms (starts at 17:00).

Chris Priebe being interviewed about online harms

I’m looking forward to more interviews being released soon.

We also met with branches of government and other organizations to discuss upcoming legislation. We continue to be encouraged by their openness to different perspectives across industries.

Chris Priebe continues to champion transparency reports. He believes that making transparency reports truly transparent – i.e., digitizing them and displaying them in app stores – has the greatest potential to drive significant change in content moderation and online safety practices.

Transparency reports are the rising tide that will lift all boats: nobody will want to be the one site or app whose report doesn’t show commitment and progress toward a healthier online community. Sure, everyone wants more users – but in an age of transparency, you will have to do right by them if you expect them to join your platform and stick around.

Content Moderation Symposium – “Ushering in a new age of content moderation”

On Wednesday, October 2nd, Two Hat hosted our first-ever Content Moderation Symposium. Experts from academia, government, non-profits, and industry came together to talk about the biggest content moderation challenges of our time, from tackling complex issues like defining cyberbullying and child exploitation behaviors in online communities to unpacking why a content moderation strategy is business-critical going into 2020.

Alex Holmes, Deputy CEO of The Diana Award, opened the day with a powerful and emotional keynote about the effects of cyberbullying. For me, the highlight of his talk was this video he shared about the definition of “bullying” – it really drove home the importance of adopting nuanced definitions.

Next up were Dr. Maggie Brennan, a lecturer in clinical and forensic psychology at the University of Plymouth and an academic advisor to Two Hat, and Zeineb Trabelsi, a third-year Ph.D. student in the Information Systems department at Laval University in Quebec and an intern in the Natural Language Processing department at Two Hat.

Dr. Brennan and Zeineb have been working on academic frameworks for defining online child sexual victimization and cyberbullying behavior, respectively. They presented their proposed definitions, and our tables of six discussed them in detail. Discussion points included:

  • Are these definitions complete, and do they make sense?
  • What further information would we require to effectively use these definitions when moderating content?
  • How do we currently define child exploitation and cyberbullying in our organizations?

My key takeaway from the morning sessions? Defining online harms is not going to be easy. It’s a complicated and nuanced task because human behavior is complicated and nuanced. There are no easy answers – but these cross-industry and cross-cultural conversations are a step in the right direction. The biggest challenge will be taking the academic definitions of online child sexual victimization and cyberbullying behaviors and using them to label, moderate, and act on actual online conversations.

I’m looking forward to continuing those collaborations.

Our afternoon keynote was presented by industry veteran David Nixon, who talked about the exponential and unprecedented growth of online communities over the last 20 years, and the need for strong Codes of Conduct and the resources to operationalize good industry practices. This was followed by a panel discussion with industry experts and several Two Hat customers. I was happy to sit on the panel as well.

My key takeaway from David’s session and the panel discussion? If you design your product with safety at the core (Safety by Design), you’re setting yourself up for community success. If not, reforming your community can be an uphill battle. One of our newest customers, Peer Tutor, is implementing Safety by Design in really interesting ways, which CEO Wayne Harrison shared during the panel. You’ll learn more in an upcoming case study.

Man standing in front of a screen that says Transparency Reports

Finally, I presented our 5 Layers of Community Protection (more about that in the future – stay tuned!), and we discussed best practices for each layer of content moderation. The fifth layer of protection is Transparency Reports, which yielded the most challenging conversation. What will Transparency Reports look like? What information will be mandatory? How will we define success benchmarks? What data should we start to collect today? No one knows yet – but we looked at YouTube’s Transparency Report as an example of what may be legislated in the future.
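On that last question, here is a rough sketch of the kind of moderation stats a platform could begin logging today, loosely inspired by the categories YouTube’s public report surfaces (flags received, source of detection, actions taken, appeals). To be clear, this is purely illustrative: every field name below is a hypothetical placeholder, not a mandated standard.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ModerationPeriodStats:
    """Hypothetical stats a future transparency report might ask for."""
    period_start: date
    period_end: date
    items_reported_by_users: int = 0     # reactive: user flags received
    items_flagged_proactively: int = 0   # proactive: caught first by filters/AI
    items_actioned: int = 0              # removals, mutes, suspensions, etc.
    items_reinstated_on_appeal: int = 0  # actions reversed after appeal
    median_review_hours: float = 0.0     # time from flag to final decision

    @property
    def proactive_rate(self) -> float:
        """Share of flagged content caught by automation before any user report."""
        total = self.items_reported_by_users + self.items_flagged_proactively
        return self.items_flagged_proactively / total if total else 0.0
```

Even if future legislation ends up asking for different numbers, collecting counts like these now beats reconstructing them from raw logs later.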

My biggest takeaway from this session? Best practices exist – many of us are doing them right now. We just need to talk about them and share them with the industry at large. More on that in an upcoming blog post.

Fair Play Alliance’s First European Symposium

Being a co-founder of the Fair Play Alliance and seeing it grow from a conversation between a few friends into a global organization of over 130 companies and many more professionals has been incredible, to say the least. This was the first time the alliance held an event outside of North America. For a global organization, that was an important milestone, and the event was a tremendous success! The feedback has been overwhelmingly positive, and we are so happy to see that it provided lots of value to attendees.

Members of the Fair Play Alliance

It was a wonderful two-day event held over October 3rd and 4th, with excellent talks and workshops for FPA members. Chris Priebe, a couple of industry friends and veteran Trust & Safety leaders, and I hosted one of the workshops. We’re all excited to take that work forward and see the results benefit the games industry!

What. A. Week.

As you can tell, it was a whirlwind week and I’m sure I’ve forgotten at least some of it! It was great to connect with old friends and make new friends. All told, my biggest takeaway from the week was this:

Everyone I met cares deeply about online safety, and about finding the smartest, most efficient ways to protect users from online harms while still allowing them the freedom to express themselves. At Two Hat, we believe in an online world where everyone is free to share without fear of harassment or abuse. I’ve heard similar sentiments echoed countless times from other Trust & Safety professionals, and I truly believe that if we continue to collaborate across industries, across governments, and across organizations, we can make that vision a reality.

So let’s keep talking.

I’m still offering free community audits for any organization that wants a second look at their moderation and Trust & Safety practices. Sign up for a free consultation using the form below!



3 Myths About “Toxic” Gaming Communities: What We Learned at LA Games Conference

Two months after the successful Fair Play Alliance summit at GDC 2018, the LA Games Conference hosted the panel “Fighting Toxicity in Gaming.” Two Fair Play Alliance members were on hand to discuss the role technology plays in addressing disruptive player behavior.

Moderated by Dewey Hammond (Vice President of Games Research at Magid Advisors) and featuring J Goldberg (Head of Community at Daybreak Game Company), Kat Lo (online moderation researcher and PhD student at the University of California), and Carlos Figueiredo (Director of Community Trust & Safety at Two Hat Security), the panel tackled a series of challenging questions about growing healthy gaming communities. It opened with a frank discussion about misconceptions in the world of gaming.

We spoke with Carlos Figueiredo, co-founder of the Fair Play Alliance and digital citizenship and moderation expert. He shared the top three misconceptions about toxicity in video games — beginning with that tricky word, “toxic.”

Myth 1: How we describe toxicity doesn’t need to change
The gaming industry has been using words like “toxicity” and “trolling” for years now. They began as a necessity — catchy phrases like “toxic gamer culture” and “don’t feed the trolls” became a shorthand for player behavior that anyone in the community could reference.

Language, and our understanding of how the internet shapes behavior, has naturally evolved over time. Terminology like “toxic players” and “internet trolls” may be ingrained in our culture, but it is no longer sufficient for describing variable, nuanced human behavior. Humans, in all our complexity, cannot be categorized using broad terms.

In fact, in a study released in 2017, Stanford University showed that, given the right set of circumstances (including the mood and context of a conversation), anyone can become a “troll.”

“When I say disruptive behavior, I’m referencing what we would normally refer to as toxicity, which is a very broad term,” Carlos says. “Disruptive behavior assumes different shapes. It’s not just language, although abuse and harassment are often the first things we think of.”

As its name suggests, disruptive behavior can also include cheating, griefing, and deliberately leaving a match early.

“Human behavior is complex,” says Carlos. “Think of it — we’re putting people from different cultures, with different expectations, together in games. They’ve never seen each other, and for fifteen minutes in a match, we’re expecting everything to go well. But what are we doing as an industry to facilitate healthy interactions?”

The first step in fostering healthier online gaming communities? Challenge, define, and refine the words we use and their meaning.

From left to right: J Goldberg, Dewey Hammond, Kat Lo, Carlos Figueiredo
Image credit: Ferriz, V. (2018, May 8). LA Games Conference [Digital image]. Retrieved from https://www.facebook.com/DigitalMediaWire/

Myth 2: Anonymity is the problem
Countless articles have been written about the dangers of anonymous apps. And while it’s true that anonymity can be dangerous — the toxic (that word again!) side of online disinhibition — it can also be benign. As John Suler, Ph.D. writes in The Online Disinhibition Effect, “Sometimes people share very personal things about themselves [online]. They reveal secret emotions, fears, wishes [and] show unusual acts of kindness and generosity, sometimes going out of their way to help others.”

So what’s a bigger cause of disruptive player behavior, if not users hiding behind the mask of anonymity? “The lack of social consequences,” says Carlos.

“There are different social protocols when we are interacting face to face, and we know very well that our actions have tangible consequences,” he explains. “That’s not always the case online. We’re still figuring out the relatively new virtual spaces and how we are socializing within them.”

“Anonymity alone,” he continues, “is not the biggest driver of disruptive behavior.”

Kat Lo and Carlos Figueiredo
Image credit: Ferriz, V. (2018, May 8). LA Games Conference [Digital image]. Retrieved from https://www.facebook.com/DigitalMediaWire/

Myth 3: AI alone will be the savior of player behavior
Disruptive behavior won’t be solved by algorithms or humans alone. Instead, as Carlos says, “A machine/human symbiosis that leverages the best of both can make all the difference.”

Using AI, gaming communities can proactively filter and triage the obviously unhealthy text or images, leaving humans to review the in-between content that requires empathy and human understanding to make a decision.
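As a concrete (and deliberately simplified) illustration of that triage idea, here is a minimal sketch. It assumes an upstream model has already produced a risk score between 0 and 1 for each message; the thresholds, names, and audit step are all hypothetical, not Two Hat’s actual pipeline.

```python
import random

# Illustrative thresholds; in practice these are tuned per community and policy.
AUTO_REMOVE = 0.95  # near-certain violations: filter automatically
AUTO_ALLOW = 0.05   # near-certain clean content: let it through
AUDIT_RATE = 0.02   # spot-check a small slice of automated decisions

def triage(message: str, risk_score: float,
           human_queue: list, audit_queue: list) -> str:
    """Route one message: automate the obvious, escalate the ambiguous."""
    if risk_score >= AUTO_REMOVE:
        decision = "removed"
    elif risk_score <= AUTO_ALLOW:
        decision = "allowed"
    else:
        # The in-between content that needs empathy and context goes to people.
        human_queue.append(message)
        return "escalated"

    # Keep eyes on the model: sample automated calls for human audit,
    # so bias and drift get caught instead of running blind.
    if random.random() < AUDIT_RATE:
        audit_queue.append((message, decision))
    return decision
```

That audit queue matters as much as the thresholds, as Carlos explains below.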

He also advocates for having a highly skilled and well-trained team of moderators who are well-versed in the game, understand the community, and have access to the right tools to do their jobs.

Having humans review content periodically is imperative, Carlos says. “You can’t just use automation and expect it to run blind. Models have biases if you don’t adjust and have eyes on them.”

He adds, “It’s important to remember that teams will have biases as well, and will require a lot of conscious effort to consider diverse views and motivations, overcome limitations in their thinking, and apply checks and balances along the way.”

Carlos believes that the answer to unhealthy communities is probably a lot more human than we realize. As he says, making a difference in gaming communities comes down to “People who care about games, care about players, care about the community, and are prepared to make the hard decisions.”

“We won’t always get things right,” he continues, “or have all the answers, but we can work really hard towards better online communities when we truly care about them. The answers are as human as they are technological.”



“Community Design is Video Game Design”: Insights from the Fair Play Alliance Summit at GDC 2018

If the lineup outside the first Fair Play Alliance (FPA) panel at GDC this March was any indication, the gaming industry is poised to make some major changes this year.

Following the rousing and packed keynote delivered by Riot Games’ Senior Technical Designer Kimberly Voll (a founding member of the Alliance), the “Player Behavior by Game Design” panel centered on the mechanics that drive player behavior.

Featuring devs and designers from industry heavyweights Epic Games, Supercell, Kabam, Blizzard, and Two Hat Security, the first FPA panel of the day addressed the ways gaming companies can engineer their products to empower healthy communication among users.

The room was full to capacity.

Not only that, members of the FPA counted anywhere from 100 to 200 additional people lined up outside the door, waiting to get in.

Throughout the day, the Fair Play Alliance Summit featured more panels and talks, including “Root Causes of Player Behavior,” a Developer Q&A, “Microtalks in Player Behavior,” and the closing talk “The Advocate’s Journey: Changing Culture by Changing Yourself,” presented by Geogrify’s Kate Edwards.

Two Hat Director of Community Trust & Safety Carlos Figueiredo is one of the founding members of the FPA and moderated “Player Behavior By Game Design.” He also attended the Community Management Summit on Tuesday, March 20th. Several panels — most notably “Mitigating Abuse Before it Happens” — closely mirrored the conversations in the FPA room the next day.

Carlos shared his three key insights from the day:

1. “Community design is game design.”
The concept of community design as video game design was truly the biggest insight of GDC. Player behavior — the good, the bad, and the ugly — doesn’t come out of nowhere. How a game is designed and engineered has a significant effect on player behavior and community interactions.

So, how can game designers engineer a product that encourages healthy interactions?

A few examples from panels throughout the day:

  • Engineering a healthy player experience from the very first moment they enter the game
  • Ensuring that players are fairly and equally paired in matchmaking (see the pairing sketch after this list)
  • Sharing rewards equally among teammates
  • Turning off friendly fire (read about Epic Games’ decision to remove friendly fire from Fortnite)
  • Providing feedback to players who submit a report
  • Building intuitive and proactive systems to protect players

How one designs and engineers the game mechanics and sets the stage for player interactions is a crucial foundation for player behavior. What sort of gaming communities are we trying to create? What is this game about, and what are we encouraging with the systems we are creating? It’s much better to consider this from the ground up, instead of treating it like an afterthought. – Carlos Figueiredo, Director of Community Trust & Safety for Two Hat Security

2. Disruptive behavior > toxicity.
Traditionally, the word “toxicity” has been used by the industry to describe a wide range of negative behaviors. Over the years its meaning has become diluted and unclear. Instead, the Fair Play Alliance suggests using the term “disruptive behavior” — literally, any kind of behavior that disrupts the experience of another player.

Human behavior is complex. The way we act changes based on many circumstances. We have bad days. The word “toxicity” is fairly ambiguous and can lead to misunderstandings and misconceptions. Disruptive behavior speaks to the heart of the matter: an action that disrupts the very purpose of a game. – Carlos Figueiredo, Director of Community Trust & Safety for Two Hat Security

3. Focus on fostering & reinforcing healthy interactions.
When we discuss online behavior, the conversation almost always turns to negative behavior, instead of celebrating and encouraging the positive, healthy interactions that actually make up most of our online experiences. The Fair Play Alliance is keen on making games fun, and its members are passionate about supporting positive play — as opposed to just preventing negative interactions.

So the question is no longer, “How do we prevent disruptive behavior?” Instead, it’s time we ask, “How do we encourage players to engage in healthy, spirited competition?”

Games are fun. We want to encourage that enjoyment and focus on creating awesome experiences. – Carlos Figueiredo, Director of Community Trust & Safety for Two Hat Security

Engineering, game design, terminology, and a shift in focus — the gaming industry has a lot of work ahead of it if it wants to understand and discourage disruptive behavior. But the folks in the FPA are confident that the industry is ready to talk — and listen in return.



Introducing The Fair Play Alliance

Today, we are thrilled to announce our involvement with the Fair Play Alliance (FPA), a cross-industry initiative spanning over 30 gaming companies whose mission is to foster fair play in online games, raise awareness of player-behaviour-related issues, and share research and best practices that drive lasting change. As founding members of the alliance, we are eager to collaborate with a wide range of industry experts to foster and empower healthy online communities.

Check out the official press release below for more information about the coalition.

SAN FRANCISCO, CA – Representatives of over 30 different gaming companies will meet during the 2018 Game Developers Conference (GDC) in San Francisco to discuss best practices in cultivating online gaming experiences free of harassment or abuse.

The Fair Play Alliance (FPA) is a coalition for developers that supports open collaboration, research, and best practices for encouraging healthy gaming communities and fair play. Key objectives include collaboration on initiatives aimed at improving online behavior in games and creating an atmosphere free of abuse and discrimination.

The Fair Play Summit, which takes place on Wednesday, March 21, will feature experts who have been working to understand and address disruptive behaviour in games, speaking on the state of the industry, what developers need to know, and practical methods to create constructive avenues for fair play and collaboration online.

Want to attend? Media and expo pass holders can see the keynote in Room 3020, West Hall, from 9:30 to 10:30 am, and all following sessions in Room 306, South Hall, from 11 am to 6 pm.

Press attendees
Attendance is free to published members of the press – please contact info@fairplayalliance.org for further information.

For more information on the event, the Fair Play Alliance, or for interview requests:

info@fairplayalliance.org
www.fairplayalliance.org

Fair Play Alliance membership

  • Blizzard Entertainment, Inc.
  • CCP Games
  • Corillian
  • Discord Inc.
  • Epic Games, Inc.
  • Flaregames
  • Huuuge Games
  • Kabam
  • Ker-Chunk Games
  • Mixer
  • Owlchemy Labs
  • Playrix
  • Radial Games
  • Riot Games
  • Roblox Corporation
  • Rovio Entertainment Corp.
  • Space Ape Games
  • Spirit AI, Ltd.
  • Supercell
  • Two Hat
  • Twitch Interactive
  • Unity Technologies
  • Xbox
  • + additional silent partners


Want more articles like this? Subscribe to our newsletter and never miss a blog!
