Top Tips for Managing a Healthy and High-Performing Player Experience Team

Player experience and content moderation stories have dominated the gaming news cycle in the first half of 2020, and show no signs of slowing down. As industry and player interest
in the previously oft-dismissed world of support agents and moderators grows stronger every day, we at Two Hat thought it was time to shine a light on the professionals who have been doing this work in the shadows for years.

We recently caught up with Pascal Debroek, a longtime player experience professional who has worked at some of the biggest mobile gaming startups that Helsinki has to offer, including Supercell, Next Games, and currently Hatch Entertainment Ltd. As Head of Customer Service at Hatch, he is responsible for providing a safe, fair, and frictionless environment for all players.

In this conversation, Pascal shares his experience running successful player support teams and provides invaluable advice for leaders in similar positions.


Let’s start with the obvious question. Why do player experience roles have a higher churn rate than other roles?

Support functions such as player support, community management, and moderation are often not considered integral to product or game development, despite the obvious value they bring to customer retention and product development. CS departments are, more often than not, perceived as a cost center, a necessary money-sink to appease consumers when they run into a problem and some form of consumer-facing communication is required.

When these functions or departments are not perceived to be part of the “core” of your studio, and the employees don’t feel empowered or get the right training and tools to do their job, they will feel unappreciated by their employer. Having user feedback dismissed and waved away – “Oh, that’s just another customer complaining about something not working” – certainly doesn’t help the situation. That’s basic Employee Experience knowledge, and not just for studios, but for any organization out there.

You have to understand that these can also be emotionally draining jobs. Most of these roles are purely customer-facing and in a lot of cases deal with sensitive topics, aggravated end-users facing a situation that did not meet their expectations, or even outright insults and threats. Let’s not forget, most contacts with players stem from an emotional state: happy, sad, angry, you’ll encounter them all in these roles. And because it pays off to be empathetic in such roles, your employees are also more susceptible to the emotions that surround them.

And that’s not taking into account the exposure to personal insults, bullying, threats of harm and self-harm, racism, sexism, predatory behaviour, and child grooming, to name a few. While these (hopefully) don’t occur all the time, it takes perseverance to stomach them and see the good in things. But I can promise you, it does affect most people at one point or another in their career. And that is why it is so important to focus on well-being for these roles. We all have a threshold for how much we can handle before the job starts requiring more than we can give back.

It’s a shame because it’s such an important role for any gaming studio and can be a really valuable stepping stone for those who may not want to make support their career. People who get their start in these support roles more often than not understand what the business is about; they understand basic game design, they understand production. The player support and moderation experience allows them to perform in a slightly different and more player-centric way in other roles because they already understand the perspective of the user.


What are some things that leaders can do to keep their player experience staff happy, safe, and healthy?

As a team leader, the first thing that is required is a safe environment built on trust, especially if you are in a place where you’re constantly dealing with other humans and, as I mentioned earlier, potentially emotionally laden communication.

Your team needs to be able to trust each other, both in doing their jobs to the best of their abilities and in being able to support every single person on that team. Without that, your team will never feel safe, and that will have an impact on efficiency, well-being, and ultimately performance. Even more so, if even one team member feels the team lead doesn’t have their back, you’ll end up in a very dangerous situation that could escalate at any moment.

You need to build a team of people who have empathy and people skills; it will make their job a lot easier. I’m not going to suggest you need to hire similar people – always go for compatible people who, in addition to a practical skill set, are also good at reading and dealing with emotions. Communication is not only about what is being said, but also about what is not.

Once I started involving the whole team in the recruitment process, I saw a surge in team member compatibility and with that an increase in trust and performance over time. We would go through several screenings, talks with me as the hiring manager, an interview with HR, followed by talks with two more senior people from the team. And finally, they would meet up with the rest of the team for a casual half-hour to one-hour chat. Just talking about things the team would want to know like, “Hey, what kind of movies do you
like? How do you unwind? Any interest in sports? Are you into superheroes?” Which can lead to the dreaded Marvel or DC argument [laughs].

It’s important to hire the right people because every team member that you’re adding to the mix will affect and even change your team culture.

Then when it comes down to team culture itself, you need to ensure that you have a fair and open culture and an understanding that you’re all in this together, for the same cause. People need to be very receptive to feedback and are expected to provide it. I believe you can be honest, to the point, and still be respectful and mindful. If you have that initial trust, it should be a lot easier to have those conversations, including the more difficult ones.


You mentioned involving the team in the hiring process. How do you ensure that they continue to work together closely once they’re hired?

There is also no reason why people on the team couldn’t have a one-on-one with each other. I’m not talking about HR-related topics here but professional development and mental support. It’s more like people asking for advice, another point of view, or coaching on a particular topic. Or it can just as well be that someone feels they need to lift some weight off their shoulders because of a personal situation.

And of course, there is the obvious “Hey, I have this kind of message. How would you deal with people who talk like this?” There are times when people will send a Slack message to each other and say, “Hey, do you need to talk? You want to grab a cup of coffee? Do we want to go for a short walk?” It’s ok to get frustrated or stuck at times, just as long as you realise it. In the end, everyone should know that they are among peers and they should assist each other. And as a supervisor or as a manager, you need to allow those kinds of things to happen.

Continuous learning and sharing experiences, having open, honest feedback, people being able to tell each other how they feel – that’s the most important part. If you’re not feeling good, then how are you going to be able to do your job? You need moral support from your supervisor, but also from your team members.


Because player experience roles and responsibilities are so emotionally charged, do you find that you have to look at success metrics differently?

Metrics are important, but they will never show you the full story. Personally I feel many companies oversimplify by trying to fully quantify performance in support and moderation functions. In a lot of cases, there are external and irrational or emotional factors that will affect your metrics. If you then use those metrics to judge the performance of an individual, now that’s not really fair or motivational, is it?

At a previous employer, we decided not to use KPIs to determine whether staff were performing well, but rather used them as a benchmark against industry standards, which allowed the team to push themselves constantly. Of course, this doesn’t mean we were not paying close attention to the KPIs, yet by simply removing the “fear” aspect of employees not meeting certain artificial performance metrics, we created an environment where we would constantly challenge ourselves to work smarter and be proud as a team of our achievements.

The following is a perfect example of why the “traditional” take on support KPIs can be detrimental to a CS agent’s mental health, efficiency, and overall customer satisfaction: If you’re assigned a queue and you’re the one that is dealing with all the sensitive topics and negative feedback that comes in, it takes an emotional toll on you, and it becomes a lot harder to reply with each subsequent message. In order to create an understanding throughout the whole team, and to protect them, every team member would, in turn, be taking care of these more challenging topics and conversations.

Anyone dealing with this more sensitive or negative content was allowed to take more time. As long as a player received a timely and correct answer, they could take as long as they needed to reply, within sensible limits. The reasoning for this was that when you’re dealing with heavy topics in player support or player moderation, it can suck you emotionally dry very fast. If that means you’re only doing a quarter of the tickets in a whole day compared to a top performer, then no one is going to ask why you didn’t do more, because everyone understood the challenge. The worst thing that can happen when you are feeling emotionally drained is adding time pressure. Ultimately, the end result for the player, or all players, is more important than adhering to artificial time constraints that don’t reflect the context of the issue.


What’s the business benefit of investing in experienced player experience leaders?

There are a few reasons why a company would want to invest in more experienced Player Experience leaders. All of these stem from the mere fact that there is no substitute for experience. Rarely do companies ask about the added business value of hiring an experienced backend developer or a more senior product lead. Yet when it comes down to customer-facing roles, many companies still seem to struggle with the answer to that question, as if it would somehow be any different than for other roles.

If you’re looking to set up a department that communicates directly with your players from all over the globe, creates actionable insights for your product development, and provides a safe online environment for all, while maintaining scalable, cost-efficient operations and always keeping in line with the expectations of your audience, would you rather not invest in someone who has the experience?

In order to succeed in your CX endeavours, you hire the right people, so they can hire the right talent, train them, coach them, and empower them. They understand the expectations of the audience, which channels to use, and how to approach communication about particular topics. They know the tools out there and how to tweak them, and they can create processes, analyze support metrics, and plan resources accordingly. And, often forgotten, these are the key people who need to influence decisions across departments and teams, walking the fine line between customer-centricity and the profitability of the product or service. And those insights only come with experience.

However, it’s not just about investing in experienced people. It is also about the resources allocated to the team and the tools they are provided to do their job best. You can hire a top leader, but if they have to make do with an email-only contact center, you won’t get far. The most obvious answer is that more experienced leaders can boost retention metrics in the mid to long term when given access to the right resources. That in itself is a major advantage for studios, especially in the competitive f2p [free-to-play] mobile space, but it extends well beyond that.

Image credit:
https://medium.com/@markpinc/what-will-our-players-thank-us-for-ee61408a5a34

There is this question from Zynga that I feel more companies should ask themselves. It’s written on a wall in one of their offices: “What will our players thank us for?”

That is a very important thing to reflect on as a company because it all comes down to player expectations and surpassing them. You don’t create fantastic experiences without thought, without understanding your audience, or without investing in the contact points your audience will reach out to. Because those interactions – their quality, their lack of friction, their efficiency – will leave a lasting impression.

As a follow-up question, I would suggest companies also start asking themselves what their brand will be remembered for later on. This is something companies like Apple, Amazon and Netflix understand. Within games, simply take a look at companies like Activision Blizzard or Supercell. It’s obvious they are not just making games, they are building experiences and are differentiating themselves as a brand. People these days download Supercell games not solely on the premise that it is a good game, but because they have come to expect good games from Supercell as a brand.


Can you tell us about any initiatives you’ve done to boost company awareness of player experience teams?

At Next Games, we created a management-endorsed shadowing initiative dubbed “Player Support Bootcamp”. You would sign up for three-hour sessions where you would learn how player support works, what tools we use, how we communicate, what our processes are, what happens when we log a bug, how we do investigations, and how we do moderation.

It was purely voluntary, and at the high point of the program, we had more than 62.5% of the company signing up for sessions. So we decided to gamify it: come to three sessions and you get a fantastic-looking degree, created by one of our very talented marketing artists. People started competing for spots in the program; we were fully booked for months. Degrees appeared hanging on walls or stood framed on desks as a badge of honour, while developers shared personal experiences with the Bootcamp over coffee in the kitchen.

We saw two huge changes come out of the program. A UX designer kicked into gear a big feature redesign of how users save their game progress, based on the feedback she saw from players during Bootcamp. After the implementation of the redesign, we saw a decrease of more than 20% in tickets regarding lost accounts. Huge impact on our bottom line there.

The other change happened when a senior client programmer who went through the program noticed that we were wasting time trying to localize some of our questions, since we sometimes would get messages in languages we didn’t have native speakers for. We were copy/pasting the tickets into Google Translate, putting the results back into the CRM, then replying using Google Translate. So in his spare time, he actually started programming a bot for us that would go through the CRM and automatically translate emails for us in advance, saving us time and money.
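A workflow like the one Pascal describes can be sketched in a few lines of Python. This is purely an illustration of the idea, not the actual Next Games bot: `crm_client` and `translate_text` are hypothetical stand-ins for a real CRM SDK and a machine-translation service.

```python
# Illustrative sketch only -- not the actual bot built at Next Games.
# `crm_client` and `translate_text` are hypothetical stand-ins for a real
# CRM SDK and a machine-translation API.

AGENT_LANGUAGES = frozenset({"en", "fi", "sv"})  # languages agents can read natively

def translate_text(text: str, target_lang: str = "en") -> str:
    """Placeholder for a call to whichever translation provider you use."""
    raise NotImplementedError("Wire this up to your translation API.")

def pre_translate_new_tickets(crm_client) -> None:
    """Attach an English translation to tickets written in languages no agent
    speaks, so nobody has to copy/paste into a translator by hand."""
    for ticket in crm_client.fetch_untranslated_tickets():
        if ticket.detected_language in AGENT_LANGUAGES:
            continue  # an agent can answer this one directly
        translation = translate_text(ticket.body, target_lang="en")
        crm_client.add_internal_note(
            ticket.id,
            f"[auto-translation from {ticket.detected_language}]\n{translation}",
        )
```

Running something like this on a schedule means the translation is already waiting as an internal note by the time an agent opens the ticket, which is where the time savings come from.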

The buy-in from the management team was crucial to the success of the project. Our CEO was actually one of the early adopters and possibly the biggest proponent. Shortly after, we could see that the producers of the games took a really big interest in the program and gently persuaded people to sign up. Especially for teams working on new projects, the Bootcamp was a source of inspiration. So it had a huge impact. And I’m sure that there are still people talking about the program today.


In your opinion, what does the future hold for player experience teams?

Users nowadays have a better understanding of game mechanics and social dynamics, and they also have higher expectations. Seven to eight years ago, if there was a game on the app store, you were just comparing that game to the next best game. Nowadays, you compare that game to the last best experience you’ve ever had, which could include pretty much anything on the app store and beyond, from Amazon to Spotify. As a player, I expect every game experience to be just as frictionless as Clash of Clans, but just as deep as Skyrim. Is that fair? Perhaps not, considering technical and other limitations, yet it is what is happening.

I’m not the only one who thinks companies need to invest more in the service aspects of their games. There’s an unstoppable mindset change happening in the retail and on-demand landscape; the service industry is being disrupted. Games-as-a-Service is already a really big thing right now, and it shows no signs of slowing down. But the games industry will need to adapt fast to keep up with this evolution, which obviously doesn’t happen without a change in attitude towards support functions and towards the gaming audience.

The whole idea of simply hiring a junior person who can answer email messages for cheap, that’s also going to disappear. The need for emotionally smart, educated and experienced support and moderation personnel is going to skyrocket. The more technology advances, the more need there will be for people who can rely on experience and a higher understanding of what they’re doing, who understand the tools and the processes they’ll be using, but most of all, understand humans and human behaviour.


 

Are you a Player Experience Manager? How do you work with your team to ensure they’re happy, emotionally healthy, and high-performing while doing a tough job?

Send an email to hello@twohat.com with your best tips and advice and we’ll share them (and credit you, of course) in an upcoming post!

You can also share on our social channels – LinkedIn, Twitter, and Facebook.

Join a Webinar: Taking Action on Offensive In-Game Chat

I’m excited to announce that Two Hat is co-hosting an upcoming webinar with the International Game Developers Association on Friday, February 21st, 2020.

The incredible Liza Wood (check out her bio below), our Director of Research and Data Science, will be joining me as we present Defining, Identifying and Actioning Offensive Chat: Approaches and Frameworks.

We will start by examining why defining, identifying and actioning offensive chat matters to game development, with tangible supporting stats. Later we will provide an overview of the Five Layers of Community Protection.

Here’s what you can expect to get out of it:

  • Compelling arguments for adding player safeguarding mechanisms to your game’s social features
  • Actionable takeaways for creating internal alignment on categories and terminology
  • Action plans for identifying and addressing disruptive behavior

We hope you will join us on February 21st at 3 pm PST on IGDA’s Twitch Channel. Mark your calendars!

To celebrate this collaboration with the IGDA, we’re offering exclusive early access to our brand new Content Moderation Best Practices PDF, containing practical applications that you can start leveraging today. Download in advance of the full release happening later this month by filling out the form below.

When you sign up, we will also send you an email reminder on the 20th so you don’t miss the webinar. See you there!



About Liza Wood

Liza brings a wealth of experience and remarkable work in the games industry. After 13 years in video game development, Liza joined Two Hat Security as Director of Research and Data Science in August 2019. There she leads a team of researchers who are helping customers and partners build safe and healthy online communities by removing negative interactions to make room for positive human connections. Prior to starting this new phase of her career, she was the Executive Producer of Disney’s Club Penguin Island, the successor to Club Penguin, where she saw the positive impact that online communities can have.

About the IGDA

We are encouraged by and fully believe in IGDA’s mission to support and empower game developers around the world. Having worked for a gaming organization and co-founded the Fair Play Alliance, I strongly believe in the power of games to create meaningful and life-changing experiences for billions of players collectively. And that starts with supporting the dedicated professionals who are committed to creating those experiences.

Conversations About Gaming & Digital Civility With Laura Higgins from Roblox

In November, Laura Higgins, Director of Community Safety & Digital Civility at Roblox shared the fascinating results of a recent survey with International Bullying Prevention Association Conference attendees in Chicago. The results provide a refreshingly honest peek into the world of online gaming and family communication – and should serve as a wake-up call for family-based organizations.

Roblox conducted two separate surveys – in the UK and the US. In the UK, they spoke to over 1500 parents, and in the US they surveyed more than 3500 parents and 580 teenagers, with different questions but some similar themes.

Two Hat Director of Community Trust & Safety Carlos Figueiredo was lucky enough to share the stage with Laura during the Keynote Gaming Panel at the same conference. During the panel and in conversations afterward, he and Laura spoke at length about the surprising survey results and how the industry needs to adopt a “Communication by Design” approach when talking to parents.

What follows is a condensed version of their conversations, where Laura shares her biggest takeaways, advice for organizations, and thoughts on the future of digital conversations.

Carlos Figueiredo: Some fascinating and surprising results came out of these surveys. What were your biggest takeaways?

Laura Higgins: In the UK survey, unsurprisingly, 89% of parents told us that they were worried about their kids playing games online. They cited concerns about addiction, strangers contacting their children, and gaming leading to difficulties forming real-life friendships or social interactions.

What was really interesting is that nearly the same number of parents said they could see the benefits of gaming, so that’s something we’re going to really unpack over the next year. They recognized improved cognitive skills, they loved the cooperation and teamwork elements that gaming provided, and the improved STEM skills. They recognized that playing games can help kids in the future, as they will need digital skills as adults, which was really interesting for us to hear about.

The big thing that came out of this that we really need to focus on is that, of those people who said they were worried about gaming, half of them told us that their fears were coming from stories they saw on media and social media, instead of real-life experience. We know there’s a lot of negativity in the press, particularly around grooming and addiction/gambling, so I think we need to be mindful of the way we talk to parents so that whilst we’re educating them about possible risks (and we know that there are risks), we’re also discussing how to raise resilient digital citizens and are giving them the tools to manage risks rather than just giving them bad news. We’re trying to proactively work with media outlets by telling them, if you want to talk about the risks, that’s fine, but let’s share some advice in there as well, empower rather than instill even more fear.

Did you see different results with the US survey?

With the US research we were also able to reach 580 teens and compare the data from them and parents. Some of the most startling stuff for us was the disconnect between what parents think is really happening versus what kids think is happening. For example, 91% of parents were convinced that their kids would come and talk to them if they were being bullied. But only 26% of kids said they would tell their parents if they had witnessed bullying. In fact, they would tell anyone else but their parents; they would report it to the platform, they would challenge the bully directly, or they would go to another adult instead of their parents.

The gap was echoed throughout the whole survey. We asked if parents talked to their kids about online safety and appropriate online behavior, and 93% of parents said that they were at least occasionally or regularly discussing this topic with their kids, while 60% of teens said that their parents never or rarely talked to them about appropriate online behavior. So, whatever it is that parents are saying — kids aren’t hearing it. We need to make sure we’re reaching kids. It’s more than just sitting down and talking to them; it’s how it’s being received by kids as well.

It seems like your surveys are uncovering some uncomfortable realities – and the things that the industry needs to focus on. We talk a lot about Safety by Design, but it seems like a focus we’re missing is Communication by Design.

We were surprised with how honest parents were. Over half of UK parents, for example, are still not checking privacy and security settings that are built in. Part of my role at Roblox has been to review how accessible the advice is, how easy to understand it is, and it’s an ongoing process. We appreciate how busy parents are – they don’t have time to go looking for things.

We asked US parents who rarely or never had conversations with their kids about appropriate behavior online, why they didn’t feel like they were necessary, and we got some fascinating quotes back. Parents think they’re out of their depth, they think that their kids know more than them. In some cases that may be true, but not really – digital parenting is still parenting.

We heard quotes like, “If my kid had a problem, they would tell me.” The research tells us that’s not true.

“If my child was having problems, I would know about it.” But if you’re not talking about it, how is that going to happen?

“I brought my kid up right.” Well, it’s not their behavior we always have to look at – it’s their vulnerabilities as well.

We need to talk more broadly than just how to use the settings, so I think there are many layers to these conversations for parents as well.

What are some other things we can do as an industry to help parents?

One is, give them the skills and easy, bite-sized tips: here’s how you check your safety settings, here’s how you set privacy settings, here’s how you report something in-game, practical things they can teach their kids as well.

There’s also a broader conversation that empowers parents to learn how to have conversations. At Roblox, we do lots of work around things like, how to say no to your kids, what is an appropriate amount of screen time for your child, how to manage in-game purchases, and setting boundaries and limits, all advice that parents are grateful for. But if we just had an advice section or FAQ on the website, they would never get to hear those messages.

It’s about amplifying the message, working with the media as much as possible, and having some different outlets like our Facebook page that we just launched. A parent sitting on the bus on the way to work, scrolling through and finding those little reminders – that’s really helpful.

Speaking of your new Facebook page, Roblox has been really innovative in reaching out to parents.

We’re also taking it offline. We have visits to Roblox, for instance, with kids. We’ll be holding an online safety session for parents while kids are off doing other activities. So I’m helping to write that. And working with parents in organizations as well, so they can still get those messages out where people are.

Schools have a key place in all of these conversations. We know that the quality of online safety conversations in schools is poor; it’s often still a once-a-year assembly of “we’re going to scare you silly” that doesn’t actually talk about practical stuff, rather than delivering these lessons through the curriculum. They should be reminding kids of appropriate online behavior at all times and giving them those digital literacy skills as well.

We’re doing webinars, we’re doing visits, and hopefully, gradually we’ll keep feeding them those messages.

It’s encouraging that you’re so committed to this, trying to change culture. Not every platform is putting in this effort.

I think we have to. I’ve been working in digital safeguarding for years, and I don’t think that we’ve hit that sweet spot yet. We haven’t effected enough change, and we need to move even faster.

Now, with all of these conversations about online harms papers and regulations – we’ve worked with partners in Australia and New Zealand, where they have the Harmful Digital Communications Act, but it’s still not really changing. This is just a new approach – that drip-feed, that persistence that hopefully will effect change.

We’re very lucky at Roblox – our community is really lovely. By the way, 40% of Roblox users are female, which is rare in gaming. And they are very young and very supportive of each other. They are happy to learn at that age. And we can help to shape them and mold them, and they can take those attitudes and behaviors through their online digital life as they grow up.

In the survey, we wanted the kids to tell us about the positive and negative experiences that they’ve had online. Actually, what most of them reflected wasn’t necessarily around things like bullying and harassment – they were actually saying that the things that made them feel really bad were when they did badly in a game and they were a bit tough on themselves. And they said they would walk away for 10 minutes, come back, and it was fine. And when people were positive to them in-game, they were thinking about it a few days later. So when we’re looking at how we manage bad behavior in our platform, it’s really important that we have rules, that we have appropriate sanctions in place, and that we can use the positive as an educational tool. I think we really need that balance.

I love that framing. It’s a reminder that most players are having a good time and enjoying the game the way it was meant to be enjoyed. We all have bad days but nasty behavior is not the norm.

It’s in everybody’s interest to make it a positive experience. We have a role to play in that but so do the kids themselves. They self-regulate, they call out bad behaviors, they are very supportive of each other.

We asked them why they play online games and 72% said, “Because it’s fun!”

That should be the starting point. Ultimately, it’s about play and how important that is for all of us.

What is your best advice for gaming organizations, from reinforcing positive behavior to better communicating with parents?

Great question. The first thing is to listen to your community. Their voice is really important. Without our players and their families, we would not have Roblox. Gaming companies can sometimes make decisions that are good for the business rather than what the players want and what the community needs. Take their feedback, and act on it.

If you’re working with children, have a Duty of Care to make it as safe as possible. That’s a difficult one, because we know that small companies and startups might struggle financially. We’re working with trade bodies on the idea of Safety by Design – what are the bare minimums that must be met before we let anyone communicate on your platform? It doesn’t have to be all of the best equipment, tools, systems in place, but there are some standards that I think we should all have in place.

For example, if you have chat functions, you need to make sure that you’ve got the right filters in place. Make sure it is age-appropriate all the way through.

Ultimately, machine learning and AI are wonderful, but they can never replace humans in certain roles or situations. You need well-trained, good moderators. Moderators have one of the most important roles on gaming platforms, so making sure they’re really well supported is important. They have a tough job. They are dealing with very upsetting things that might happen, so make sure that they aren’t just trained to deal with it, but that they have after-care as well.

If you are a family-based platform, make sure you reach out to parents. I met with a delegate and she said it was the first time she’d heard a tech company talk about engaging with parents. I think if we could all start doing that a little bit more, it would be better.

You mentioned that in your 20 years in the digital civility industry, the needle has barely moved. Do you think that’s changing?

I’m really hopeful for the future. I had talked with journalists a few months ago who were slightly scoffing at my aspirations of digital civility. If you’re coming from a starting point where you just assume that games are bad and the players are bad and the community is bad – you’re wrong. People are kind. People do have empathy. They want to see other people succeed. For example, nearly all teens (96%) in our survey said they would likely help a friend they see being bullied online, and the majority of teens confirmed they get help from other players when they need it at least “sometimes,” with 41% saying they get peer help “often” or “always.” Those are all things we see all the time in gaming. And we have this opportunity to spread that out even more and build those really good positive online citizens. This is much bigger than Roblox. These kids are the future. The more that we can invest in them, the better.

We all need to enable those conversations, encourage those conversations, and equip parents with the right messages.

***

Further reading:

Roblox Parents Page
Survey Says Parents and Teens Don’t Discuss Appropriate Online Behavior
Five Things Parents Should Know About Content Moderation in Video Games

London Calling: A Week of Trust & Safety in the UK

Two weeks ago, the Two Hat team and I packed up our bags and flew to London for a jam-packed week of government meetings, media interviews, and two very special symposiums.

I’ve been traveling a lot recently – first to Australia in mid-September for the great eSafety19 conference, then London, and I’m off to Chicago next month for the International Bullying Prevention Association Conference – so I haven’t had much time to reflect. But now that the dust has settled on the UK visit (and I’m finally solidly back on Pacific Standard Time), I wanted to share a recap of the week as well as my biggest takeaways from the two symposiums I attended.

Talking Moderation 

We were welcomed by several esteemed media companies and had the opportunity to be interviewed by journalists who had excellent and productive questions.

Haydn Taylor from GamesIndustry.Biz interviewed Two Hat CEO and founder Chris Priebe, myself, and Cris Pikes, CEO of our partner Image Analyzer, about moderating harmful online content, including live streams.

Rory Cellan-Jones from the BBC talked to us about the challenges of defining online harms (starts at 17:00).

Chris Priebe being interviewed about online harms

I’m looking forward to more interviews being released soon.

We also met with branches of government and other organizations to discuss upcoming legislation. We continue to be encouraged by their openness to different perspectives across industries.

Chris Priebe continues to champion his angle regarding transparency reports. He believes that making transparency reports truly transparent – i.e., digitizing and displaying them in app stores – has the greatest potential to significantly drive change in content moderation and online safety practices.

Transparency reports are the rising tide that will float all boats as nobody will want to be that one site or app with a report that doesn’t show commitment and progress towards a healthier online community. Sure, everyone wants more users – but in an age of transparency, you will have to do right by them if you expect them to join your platform and stick around.

Content Moderation Symposium – “Ushering in a new age of content moderation”

On Wednesday, October 2nd, Two Hat hosted our first-ever Content Moderation Symposium. Experts from academia, government, non-profits, and industry came together to talk about the biggest content moderation challenges of our time, from tackling complex issues like defining cyberbullying and child exploitation behaviors in online communities to unpacking why a content moderation strategy is business-critical going into 2020.

Meeting room with a banner that says Content Moderation Symposium

Alex Holmes, Deputy CEO of The Diana Award opened the day with a powerful and emotional keynote about the effects of cyberbullying. For me, the highlight of his talk was this video he shared about the definition of “bullying” – it really drove home the importance of adopting nuanced definitions.

Next up were Dr. Maggie Brennan, a lecturer in clinical and forensic psychology at the University of Plymouth, and an academic advisor to Two Hat, and Zeineb Trabelsi, a third-year Ph.D. student in the Information System department at Laval University in Quebec, and an intern in the Natural Language Processing department at Two Hat.

Dr. Brennan and Zeineb have been working on academic frameworks for defining online child sexual victimization and cyberbullying behavior, respectively. They presented their proposed definitions, and our tables of six discussed them in detail. Discussion points included:

  • Are these definitions complete and do they make sense?
  • What further information would we require to effectively use these definitions when moderating content?
  • How do we currently define child exploitation and cyberbullying in our organizations?

My key takeaway from the morning sessions? Defining online harms is not going to be easy. It’s a complicated and nuanced task because human behavior is complicated and nuanced. There are no easy answers – but these cross-industry and cross-cultural conversations are a step in the right direction. The biggest challenge will be taking the academic definitions of online child sexual victimization and cyberbullying behaviors and using them to label, moderate, and act on actual online conversations.

I’m looking forward to continuing those collaborations.

Our afternoon keynote was presented by industry veteran David Nixon, who talked about the exponential and unprecedented growth of online communities over the last 20 years, and the need for strong Codes of Conduct and the resources to operationalize good industry practices. This was followed by a panel discussion with industry experts and several Two Hat customers. I was happy to sit on the panel as well.

My key takeaway from David’s session and the panel discussion? If you design your product with safety at the core (Safety by Design), you’re setting yourself up for community success. If not, reforming your community can be an uphill battle. One of our newest customers, Peer Tutor, is implementing Safety by Design in really interesting ways, which CEO Wayne Harrison shared during the panel. You’ll learn more in an upcoming case study.

Man standing in front of a screen that says Transparency Reports
Presenting the fifth layer of community protection

Finally, I presented our 5 Layers of Community Protection (more about that in the future – stay tuned!), and we discussed best practices for each layer of content moderation. The fifth layer of protection is Transparency Reports, which yielded the most challenging conversation. What will Transparency Reports look like? What information will be mandatory? How will we define success benchmarks? What data should we start to collect today? No one knows – but we looked at YouTube’s Transparency Report as an example and guidance on what may be legislated in the future.

My biggest takeaway from this session? Best practices exist – many of us are doing them right now. We just need to talk about them and share them with the industry at large. More on that in an upcoming blog post.

Fair Play Alliance’s First European Symposium

Being a co-founder of the Fair Play Alliance and seeing it grow from a conversation between a few friends to a global organization of over 130 companies and many more professionals has been incredible, to say the least. This was the first time the alliance held an event outside of North America. As a global organization, it was very important to us, and it was a tremendous success! The feedback has been overwhelmingly positive, and we are so happy to see that it provided lots of value to attendees.

Members of the Fair Play Alliance
Fair Play Alliance friends

It was a wonderful two-day event held over October 3rd and 4th, with excellent talks and workshops hosted for members of the FPA. Chris Priebe, a couple of industry friends and veteran Trust & Safety leaders, and I hosted one of the workshops. We’re all excited to take that work forward and see the results that will come out of it and benefit the games industry!

What. A. Week.

As you can tell, it was a whirlwind week and I’m sure I’ve forgotten at least some of it! It was great to connect with old friends and make new friends. All told, my biggest takeaway from the week was this:

Everyone I met cares deeply about online safety, and about finding the smartest, most efficient ways to protect users from online harms while still allowing them the freedom to express themselves. At Two Hat, we believe in an online world where everyone is free to share without fear of harassment or abuse. I’ve heard similar sentiments echoed countless times from other Trust & Safety professionals, and I truly believe that if we continue to collaborate across industries, across governments, and across organizations, we can make that vision a reality.

So let’s keep talking.

I’m still offering free community audits for any organization that wants a second look at their moderation and Trust & Safety practices. Sign up on our Community Audit page.

YOLO: Life in the Fast Lane Drives a Vision for Safety

In its first 48 hours, YOLO acquired 1 million users, a plague of cyberbullying, a scalable content moderation solution, and a new vision for the future.

YOLO might never have lived at all if not for a weekend experiment. Gregoire Henrion and his cofounders weren’t really interested in an anonymity app; they were just curious what they could build over an idle couple of days. But when YOLO hit the App Store, it found instant traction and caught a ride on a viral loop via Snapchat.

“We had a million users in two days,” says Henrion. Unfortunately, the anonymous nature of the app was also providing a platform for cyberbullying, which spread like wildfire. “We hadn’t thought of it before, because we’d never dreamed of the scale. But even after one day, we knew it was a big issue.”

Desperate for a solution, Henrion reached out to peers in Paris’ apps ecosystem. “I spoke with a friend at Yubo who used Two Hat’s Community Sift and recommended it,” he says. “He connected me with Sharon, their account executive at Two Hat, who instantly went to work for YOLO.”

“Within a day, we went from having lots of bad behaviors, to being safe as could be.”

YOLO was at this time in the midst of a feeding frenzy of meetings, media and monetization that only the developers of such viral app sensations can truly understand. “We were doing funding calls and everything else – it was crazy – but Two Hat sorted out what we needed, and the implementation was completed within hours.”

YOLO initially went with very strong guidelines before easing settings based on user feedback. “We wanted to fix what was wrong, and we didn’t want to be associated with bad behaviors,” says Henrion. “By experimenting with policies and settings, we find we are able to deal with 95 to 99 percent of the issues.”

In YOLO’s configuration, inappropriate messages or comments simply do not get shared, but the offending party doesn’t know this. In the content moderation industry, this is known as a false send. But the bully just knows they’re not getting any attention back, which is often enough for them to stop and go away. “Now, we have the app tuned so that the filters are super-efficient,” says Henrion.
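The “false send” pattern can be expressed in a short sketch. This is a generic illustration of the concept, not YOLO’s or Community Sift’s actual implementation; `classify_message` and `log_for_review` are hypothetical stand-ins for whatever moderation service and review queue a platform uses.

```python
# Generic sketch of a "false send" (shadow-filtering) flow -- illustrative only.
# `classify_message` is a hypothetical stand-in for a moderation API verdict.

from dataclasses import dataclass

@dataclass
class Verdict:
    allowed: bool
    reason: str = ""

def classify_message(text: str) -> Verdict:
    """Placeholder: call your moderation service and map its response here."""
    raise NotImplementedError

def handle_incoming_message(sender, recipient, text: str) -> dict:
    verdict = classify_message(text)
    if verdict.allowed:
        recipient.deliver(text)    # normal path: the message is actually shared
    else:
        # False send: the message is silently dropped and logged for review,
        # but the sender still gets the same "sent" confirmation.
        log_for_review(sender, text, verdict.reason)
    return {"status": "sent"}      # identical response either way

def log_for_review(sender, text: str, reason: str) -> None:
    """Placeholder: queue the blocked message for moderator audit."""
    pass
```

The key design choice is that the API response never reveals whether the message was delivered, which is exactly what denies the bully the reaction they are looking for.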

“We have a lot of control now; our users are happy, and we are super happy.”

Moving forward, YOLO plans to apply what the company has learned about anonymity and social media to carve a new approach to online safety. “When we look at user behavior now, one of the secrets of YOLO being successful is that even if the user’s name is hidden, you still see the face of the user on their profile pic. This bit of exposure – you don’t know me but you can see my face – is very often enough to make users regulate their behaviors. That’s naturally ‘Safe by Design’ because it’s our normal behavior.”

“If you want to create value you have to make something secure. We’re not naive anymore. We know all the bad things that can happen in social.”

YOLO envisions their community and others as a place where Safety by Design has encouraged users to change behaviors — Why bully, harass or cajole in a community if life on mute is the only possible outcome?

“Anonymity alone is not a sustainable approach to managing communities,” says Henrion. “Two Hat’s Community Sift gives us tools to help shift user behavior, the security system to deal with those who cause trouble, and a solution we know scales quickly.”

 


We’re currently offering no-cost, no-obligation Community Audits for social networks that want an expert consultation on their community moderation practices.

Our Director of Community Trust & Safety will examine your community, locate areas of potential risk, and provide you with a personalized community analysis, including recommended best practices and tips to maximize user engagement.

Sign up today and we’ll be in touch.

Meet the Mayor in a Town of 20 Million Teens

Launched in 2016, Yubo is a social network of more than 20 million users from around the world. Yubo lets users meet new people and connect through live video streaming and chat. Developed and operated by Paris-based Twelve App SAS, the Yubo app is available for free on the App Store and Google Play.

Two Hat’s Community Sift platform powers content moderation for Yubo’s Live Titles, Comments, and Usernames, all in multiple languages. Use cases include detection and moderation of bullying, sexting, drugs/alcohol, fraud, racism, and grooming. Recently, Yubo’s COO, Marc-Antoine Durand, sat down with Two Hat to share his thoughts on building and operating a safe social platform for teens, and where future evolutions in content moderation may lead.

Talk about what it’s like to operate a community of young people from around the globe sharing 7 million comments every day on your platform.

It’s like running a city. You need to have rules and boundaries, and importantly you need to educate users about them, and you have to undertake prevention to keep things from getting out of hand in the first place. You’ll deal with all the bad things that exist elsewhere in society – drug dealing, fraud, prostitution, bullying and harassment, thoughts or attempts at suicide – and you will need a framework of policies and law enforcement to keep your city safe. It’s critical that these services are delivered in real-time.

Marc-Antoine Durand, COO of Yubo

The future safety of the digital world rests upon how willing we are to use behavioral insights to stop the bad from spoiling the good. If a Yubo moderator sees something happening that violates community guidelines or could put someone at risk, they send a warning message to the user. The message might say that their Live feed will be shut down in one minute, or it might warn the user they will be suspended from the app if they don’t change their behavior. We’re the only social video app to do this, and we do it because the best way for young people to learn is in the moment, through real-life experience.

Yubo’s role is to always find a balance between ensuring self-expression and freedom of speech while preventing harm. Teenagers are very keen to talk about themselves, are interested in others and want to share the issues that are on their minds such as relationships and sexuality. This is a normal part of growing up and development at this point in teenagers’ lives. But this needs to be done within a context that is healthy and free from pressure and coercion, for example, sharing intimate images. Finding a limit or balance between freedom and protection in each case is important to make sure the app is appealing to young people and offers them the space for expression but keeps them as safe as possible.

When Yubo first launched in 2016, content moderation was still quite a nascent industry. What were your solutions options at the time and how was your initial learning curve as a platform operator?

There weren’t many options available then. You could hire a local team of moderators to check comments and label them, but that’s expensive and hard to scale. There was no way our little team of four could manage all that and be proficient in Danish, English, French, Norwegian, Spanish, and Swedish all at the same time. So multi-language support was a must-have.

We created our own algorithms to detect images that broke Yubo’s community guidelines and acceptable use policies, but content moderation is a very specialized technical competency and a never-ending job, and there were only four of us; we simply couldn’t do all that was required to do this well. As a result, early on, we were targeted by the press as a ‘bad app.’ To win back trust and establish the app as safe and appropriate for young people, we had to start over. Our strategy was to show that we were working hard and fast to improve, and we set out to establish that a small company with the right safety strategy and tools can be just as good at content moderation as any large company – or better.

I applaud Yubo for extensively reworking its safety features to make its platform safer for teens. Altering its age restrictions, improving its real identity policy, setting clear policies around inappropriate content and cyberbullying, and giving users the ability to turn location data off demonstrates that Yubo is taking user safety seriously.

Julie Inman Grant, Australian eSafety Commissioner

What are some of the key content moderation issues on your platform and how do you engage users as part of the solution?

One of the issues every service has is fake user profiles. These are a particular problem in issues like grooming or bullying. To address this, we have created a partnership with a company called Yoti that allows users to certify their identity. So, when you’re talking to somebody, you can see that they have a badge signifying that their identity has been certified, indicating they are ‘who they say they are.’ It’s a voluntary process for users to participate in, but if we think a particular profile may be suspicious or unsafe, we can force the user to certify their identity, or they will be removed from the platform.
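That “voluntary badge, mandatory when suspicious” policy can be modelled roughly as follows. The sketch is illustrative only and is not Yubo’s actual logic; the profile attributes (`yoti_verified`, `flagged_as_suspicious`) are assumptions made for the example.

```python
# Illustrative sketch of a "verify or remove" policy for suspicious profiles.
# Not Yubo's actual logic; the profile attributes are assumed for the example.

def identity_policy_action(profile) -> str:
    """Decide what to do with a profile based on verification and risk signals."""
    if profile.yoti_verified:
        return "show_verified_badge"          # identity certified, display the badge
    if profile.flagged_as_suspicious:
        # Verification is voluntary by default, but becomes mandatory
        # once a profile looks unsafe; refusal leads to removal.
        return "require_verification_or_remove"
    return "no_action"
```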

Real-time intervention by Yubo moderators

The other issues we deal with are often related to the user’s live stream title, which is customizable, and the comments in real-time chats. Very soon after launching, we saw that users were creating sexualized and ‘attention-seeking’ live stream titles not just for fun, but as a strategy to attract more views, for example, with a title such as: “I’m going to flash at 50 views.” People are very good at finding ways to bypass the system by creating variations of words. We realized immediately that we needed a technology to detect and respond to that subversion.
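Catching those spelling variations typically starts with normalizing text before matching it against blocked terms. The snippet below is a deliberately simplified illustration of that idea, not the detection technology Yubo actually uses; the character map and example term are invented for the sketch.

```python
# Deliberately simplified illustration of normalizing obfuscated text before
# filter matching; real moderation systems go far beyond this.

import re
import unicodedata

LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                          "5": "s", "7": "t", "@": "a", "$": "s"})

def normalize(text: str) -> str:
    """Lowercase, strip accents, map common character substitutions,
    and collapse repeated letters so 'fl4aash' becomes 'flash'."""
    text = unicodedata.normalize("NFKD", text)
    text = "".join(ch for ch in text if not unicodedata.combining(ch))
    text = text.lower().translate(LEET_MAP)
    return re.sub(r"(.)\1+", r"\1", text)  # collapse runs of the same character

def matches_blocked_term(title: str, blocked_terms=("flash",)) -> bool:
    normalized = normalize(title)
    return any(normalize(term) in normalized for term in blocked_terms)

# matches_blocked_term("I'm going to fl4aash at 50 views")  -> True
```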

As to engaging users as part of our content moderation, it’s very important to give users who wish to participate in some way an opportunity to help and something they can do to help with the app. Users want and value this. When our users report bad or concerning behavior in the app, they give us a very precise reason and good context. They do this because they are very passionate about the service and want to keep it safe. Our job is to gather this feedback and data so that we may learn from it, but also to take action on what users tell us, and to reward those who help us. That’s how this big city functions.

Yubo was referenced as part of the United Kingdom’s Online Harms white paper and consultation — What’s your take on pending duty of care legislation in the UK and elsewhere, and are you concerned that a more restrictive regulatory environment may stifle technical innovation?

I think regulation is good as long as it’s thoughtful and agile enough to adjust to a constantly changing technical environment, and not simply a way to blame apps and social platforms for all the bad things happening in society, because that does not achieve anything. Perhaps most concerning is setting standards that only the Big Tech companies with thousands of moderators and technical infrastructure staff can realistically achieve, which prohibits and restricts smaller start-ups from being innovative and able to participate in the ecosystem. Certainly, people spend a lot of time on these platforms and they should not be unregulated, but the government can’t just set rules; they need to help companies get better at providing safer products and services.

It’s an ecosystem and everyone needs to work together to improve it and keep it as safe as possible, and this includes the wider public and users themselves. So much more is needed in the White Paper about media literacy and about managing offline problems escalating and being amplified online. Bullying and discrimination, for example, exist in society, and strategies are needed in schools, families, and communities to tackle these issues – just focusing online will not deter or prevent them.

In France, by comparison to the UK, we’re very far away from this ideal ecosystem. We’ve started to work on moderation, but really the French government just does whatever Facebook says. No matter where you are, the more regulations you have, the more difficult it will be to start and grow a company, so barriers to innovation and market entry will be higher. That’s just where things are today.

It’s in our DNA to take safety features as far as we can to protect our users.

— Marc-Antoine Durand, COO of Yubo

How do you see Yubo’s approach to content moderation evolving in the future?

We want to build a reputation system for users, the idea being to do what I call pre-moderation, or detecting unsafe users by their history. For that, we need to gather as much data as we can from our users’ live streams, titles, and comments. The plan is to create a method where users are rewarded for good behavior. That’s the future of the app: to reward the good stuff and, for the very small minority who are doing bad stuff, like inappropriate comments or pictures or titles, we’ll engage them and let them know it’s not ok and that they need to change their behavior if they want to stay. So, user reputation as a baseline for moderation. That’s where we are going.
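One way to picture “user reputation as a baseline for moderation” is a running per-user score that moderation events feed into, which then decides how much scrutiny new content gets. The sketch below is purely illustrative; the event types, weights, and thresholds are invented for the example and are not Yubo’s.

```python
# Purely illustrative sketch of a per-user reputation score feeding
# "pre-moderation" decisions. Event types, weights, and thresholds are invented.

from dataclasses import dataclass, field

EVENT_WEIGHTS = {
    "positive_report": +2,       # e.g. another user vouches for or praises them
    "clean_live_session": +1,    # a stream that triggered no moderation action
    "filtered_message": -1,      # a message blocked by the chat filter
    "moderator_warning": -5,
    "confirmed_violation": -15,
}

@dataclass
class UserReputation:
    score: int = 0
    history: list = field(default_factory=list)

    def record(self, event: str) -> None:
        self.score += EVENT_WEIGHTS.get(event, 0)
        self.history.append(event)

    def moderation_tier(self) -> str:
        """Map the running score to how much scrutiny this user's content gets."""
        if self.score <= -20:
            return "pre-moderate"   # hold new content for review before it's visible
        if self.score < 0:
            return "watchlist"      # prioritize this user in moderator queues
        return "standard"
```

In a scheme like this, good behaviour slowly earns users a lighter touch, while repeat offenders drift into stricter review, which is the reward-the-good, engage-the-bad balance described above.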

 


We’re currently offering no-cost, no-obligation Community Audits for social networks that want an expert consultation on their community moderation practices.

Our Director of Community Trust & Safety will examine your community, locate areas of potential risk, and provide you with a personalized community analysis, including recommended best practices and tips to maximize user engagement.

Sign up today and we’ll be in touch.