Snapchat, Kik and Facebook have all openly struggled with cyberbullying on their platforms, and while they may be some of the biggest and most ubiquitous, they are hardly alone.
Recently, I have read commentaries in response to the excellent New York Times series on child sexual abuse. One particular point that was raised inspired me to write this article: the claim that existing technologies are not sophisticated enough to stop predators online, and that artificial intelligence systems alone might provide a solution. In desperate times, when the horrid truth of online child sexual abuse (there’s no such thing as child pornography) and the staggering increase in images and videos being shared are crushing our collective spirits, it’s understandable that we will look for a silver bullet.
Chris Priebe, CEO of Two Hat Security, explores how business, government and users will be affected by Online Harms and Duty of Care legislation in the UK and beyond.
In November, Laura Higgins, Director of Community Safety & Digital Civility at Roblox shared the fascinating results of a recent survey with International Bullying Prevention Association Conference attendees in Chicago. The results provide a refreshingly honest peek into the world of online gaming and family communication – and should serve as a wake-up call for family-based organizations.
Roblox conducted two separate surveys – one in the UK and one in the US. In the UK, they spoke to over 1,500 parents; in the US, they surveyed more than 3,500 parents and 580 teenagers, with different questions but some similar themes.
Two Hat Director of Community Trust & Safety Carlos Figueiredo was lucky enough to share the stage with Laura during the Keynote Gaming Panel at the same conference. During the panel and in conversations afterward, he and Laura spoke at length about the surprising survey results and how the industry needs to adopt a “Communication by Design” approach when talking to parents.
What follows is a condensed version of their conversations, where Laura shares her biggest takeaways, advice for organizations, and thoughts on the future of digital conversations.
Carlos Figueiredo: Some fascinating and surprising results came out of these surveys. What were your biggest takeaways?
Laura Higgins: In the UK survey, unsurprisingly, 89% of parents told us that they were worried about their kids playing games online. They cited concerns about addiction, strangers contacting their children, and gaming making it harder to form real-life friendships or social connections.
What was really interesting is that nearly the same number of parents said they could see the benefits of gaming, so that’s something we’re going to really unpack over the next year. They recognize improved cognitive skills, they love the cooperation and teamwork elements that gaming provides, and the improved STEM skills. They recognize that playing games can help kids in the future, as they will need digital skills as adults, which was really interesting for us to hear.
The big thing that came out of this that we really need to focus on is that, of those people who said they were worried about gaming, half of them told us that their fears were coming from stories they saw on media and social media, instead of real-life experience. We know there’s a lot of negativity in the press, particularly around grooming and addiction/gambling, so I think we need to be mindful of the way we talk to parents so that whilst we’re educating them about possible risks (and we know that there are risks), we’re also discussing how to raise resilient digital citizens and are giving them the tools to manage risks rather than just giving them bad news. We’re trying to proactively work with media outlets by telling them, if you want to talk about the risks, that’s fine, but let’s share some advice in there as well, empower rather than instill even more fear.
Did you see different results with the US survey?
With the US research, we were also able to reach 580 teens and compare their responses with the parents’. Some of the most startling stuff for us was the disconnect between what parents think is happening versus what kids say is happening. For example, 91% of parents were convinced that their kids would come and talk to them if they were being bullied, but only 26% of kids said they would tell their parents if they had witnessed bullying. In fact, they would tell anyone but their parents: they would report it to the platform, challenge the bully directly, or go to another adult instead.
The gap was echoed throughout the whole survey. We asked if parents talked to their kids about online safety and appropriate online behavior, and 93% of parents said that they were at least occasionally or regularly discussing this topic with their kids, while 60% of teens said that their parents never or rarely talked to them about appropriate online behavior. So, whatever it is that parents are saying — kids aren’t hearing it. We need to make sure we’re reaching kids. It’s more than just sitting down and talking to them; it’s how it’s being received by kids as well.
It seems like your surveys are uncovering some uncomfortable realities – and the things that the industry needs to focus on. We talk a lot about Safety by Design, but it seems like a focus we’re missing is Communication by Design.
We were surprised with how honest parents were. Over half of UK parents, for example, are still not checking privacy and security settings that are built in. Part of my role at Roblox has been to review how accessible the advice is, how easy to understand it is, and it’s an ongoing process. We appreciate how busy parents are – they don’t have time to go looking for things.
We asked US parents who rarely or never had conversations with their kids about appropriate behavior online why they didn’t feel those conversations were necessary, and we got some fascinating quotes back. Parents think they’re out of their depth; they think that their kids know more than them. In some cases that may be true, but not really – digital parenting is still parenting.
We heard quotes like, “If my kid had a problem, they would tell me.” The research tells us that’s not true.
“If my child was having problems, I would know about it.” But if you’re not talking about it, how is that going to happen?
“I brought my kid up right.” Well, it’s not their behavior we always have to look at – it’s their vulnerabilities as well.
We need to talk more broadly than just how to use the settings, so I think there are many layers to these conversations for parents as well.
What are some other things we can do as an industry to help parents?
One is, give them the skills and easy, bite-sized tips: here’s how you check your safety settings, here’s how you set privacy settings, here’s how you report something in-game, practical things they can teach their kids as well.
There’s also a broader conversation that empowers parents to learn how to have conversations. At Roblox, we do lots of work around things like, how to say no to your kids, what is an appropriate amount of screen time for your child, how to manage in-game purchases, and setting boundaries and limits, all advice that parents are grateful for. But if we just had an advice section or FAQ on the website, they would never get to hear those messages.
It’s about amplifying the message, working with the media as much as possible, and having some different outlets like the Facebook page that we just launched. A parent sitting on the bus on the way to work can scroll through and find those little reminders, which is really helpful.
Speaking of your new Facebook page, Roblox has been really innovative in reaching out to parents.
We’re also taking it offline. For instance, we host visits to Roblox with kids, and we’ll be holding an online safety session for parents while the kids are off doing other activities. I’m helping to write that. We’re also working with parent organizations, so they can get those messages out where people are.
Schools have a key place in all of these conversations. We know that the quality of online safety conversations in schools is poor – it’s often still a once-a-year assembly designed to scare kids silly rather than practical lessons delivered through the curriculum. Schools should be reminding kids of appropriate online behavior at all times and giving them digital literacy skills as well.
We’re doing webinars, we’re doing visits, and hopefully, gradually we’ll keep feeding them those messages.
It’s encouraging that you’re so committed to this, trying to change culture. Not every platform is putting in this effort.
I think we have to. I’ve been working in digital safeguarding for years, and I don’t think that we’ve hit that sweet spot yet. We haven’t effected enough change, and we need to move even faster.
Now, with all of these conversations about online harms papers and regulations – we’ve worked with partners in Australia and New Zealand, where they have the Harmful Digital Communications Act, but things still aren’t really changing. This is just a new approach – that drip-feed, that persistence – that hopefully will effect change.
We’re very lucky at Roblox – our community is really lovely. By the way, 40% of Roblox users are female, which is rare in gaming. And they are very young and very supportive of each other. They are happy to learn at that age. We can help to shape and mold them, and they can carry those attitudes and behaviors through their digital lives as they grow up.
In the survey, we wanted the kids to tell us about the positive and negative experiences that they’ve had online. Actually, what most of them reflected wasn’t necessarily around things like bullying and harassment – they were actually saying that the things that made them feel really bad were when they did badly in a game and they were a bit tough on themselves. And they said they would walk away for 10 minutes, come back, and it was fine. And when people were positive to them in-game, they were thinking about it a few days later. So when we’re looking at how we manage bad behavior in our platform, it’s really important that we have rules, that we have appropriate sanctions in place, and that we can use the positive as an educational tool. I think we really need that balance.
I love that framing. It’s a reminder that most players are having a good time and enjoying the game the way it was meant to be enjoyed. We all have bad days but nasty behavior is not the norm.
It’s in everybody’s interest to make it a positive experience. We have a role to play in that but so do the kids themselves. They self-regulate, they call out bad behaviors, they are very supportive of each other.
We asked them why they play online games and 72% said, “Because it’s fun!”
That should be the starting point. Ultimately, it’s about play and how important that is for all of us.
What is your best advice for gaming organizations, from reinforcing positive behavior to better communicating with parents?
Great question. The first thing is to listen to your community. Their voice is really important. Without our players and their families, we would not have Roblox. Gaming companies can sometimes make decisions that are good for the business rather than what the players want and what the community needs. Take their feedback, and act on it.
If you’re working with children, you have a Duty of Care to make your platform as safe as possible. That’s a difficult one, because we know that small companies and startups might struggle financially. We’re working with trade bodies on the idea of Safety by Design – what are the bare minimums that must be met before we let anyone communicate on a platform? It doesn’t have to be all of the best equipment, tools, and systems, but there are some standards that I think we should all have in place.
For example, if you have chat functions, you need to make sure that you’ve got the right filters in place. Make sure it is age-appropriate all the way through.
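An age-appropriate chat filter like the one described above can be sketched in a few lines. This is a minimal illustration only – the rule names, tiers, and word lists here are hypothetical, not Roblox’s or Two Hat’s actual configuration:

```python
# Hypothetical rule tiers -- real filters use far richer rule sets.
BLOCKED_EVERYWHERE = {"swearword"}           # never allowed, any age
BLOCKED_FOR_UNDER_13 = {"address", "phone"}  # e.g. personal-info prompts

def allow_message(message: str, user_age: int) -> bool:
    """Return True if the message passes the age-appropriate filter."""
    words = set(message.lower().split())
    if words & BLOCKED_EVERYWHERE:
        return False
    if user_age < 13 and words & BLOCKED_FOR_UNDER_13:
        return False
    return True
```

The point of the tiered design is the “all the way through” part: the same message can be fine for an adult audience but blocked for younger users.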
Ultimately, machine learning and AI are wonderful, but they can never replace humans in certain roles or situations. You need well-trained, good moderators. Moderators have one of the most important roles on gaming platforms, so making sure they’re really well supported is important. They have a tough job. They are dealing with very upsetting things, so make sure that they aren’t just trained to deal with it, but that they have after-care as well.
If you are a family-based platform, make sure you reach out to parents. I met with a delegate who said it was the first time she’d heard a tech company talk about engaging with parents. I think if we could all start doing that a little bit more, it would be better.
You mentioned that in your 20 years in the digital civility industry, the needle has barely moved. Do you think that’s changing?
I’m really hopeful for the future. I had talked with journalists a few months ago who were slightly scoffing at my aspirations of digital civility. If you’re coming from a starting point where you just assume that games are bad and the players are bad and the community is bad – you’re wrong. People are kind. People do have empathy. They want to see other people succeed. For example, nearly all teens (96%) in our survey said they would likely help a friend they see being bullied online, and the majority of teens confirmed they get help from other players when they need it at least “sometimes,” with 41% saying they get peer help “often” or “always.” Those are all things we see all the time in gaming. And we have this opportunity to spread that out even more and build those really good positive online citizens. This is much bigger than Roblox. These kids are the future. The more that we can invest in them, the better.
We all need to enable those conversations, encourage those conversations, and equip parents with the right messages.
Two weeks ago, the Two Hat team and I packed up our bags and flew to London for a jam-packed week of government meetings, media interviews, and two very special symposiums.
I’ve been traveling a lot recently – first to Australia in mid-September for the great eSafety19 conference, then London, and I’m off to Chicago next month for the International Bullying Prevention Association Conference – so I haven’t had much time to reflect. But now that the dust has settled on the UK visit (and I’m finally solidly back on Pacific Standard Time), I wanted to share a recap of the week as well as my biggest takeaways from the two symposiums I attended.
We were welcomed by several esteemed media companies and had the opportunity to be interviewed by journalists who had excellent and productive questions.
Haydn Taylor from GamesIndustry.biz interviewed Two Hat CEO and founder Chris Priebe, myself, and Cris Pikes, CEO of our partner Image Analyzer, about moderating harmful online content, including live streams.
Rory Cellan-Jones from the BBC talked to us about the challenges of defining online harms (starts at 17:00).
I’m looking forward to more interviews being released soon.
We also met with branches of government and other organizations to discuss upcoming legislation. We continue to be encouraged by their openness to different perspectives across industries.
Chris Priebe continues to champion his angle regarding transparency reports. He believes that making transparency reports truly transparent – i.e., digitizing them and displaying them in app stores – has the greatest potential to significantly drive change in content moderation and online safety practices.
Transparency reports are the rising tide that lifts all boats: nobody will want to be the one site or app whose report doesn’t show commitment and progress toward a healthier online community. Sure, everyone wants more users – but in an age of transparency, you will have to do right by them if you expect them to join your platform and stick around.
Content Moderation Symposium – “Ushering in a new age of content moderation”
On Wednesday, October 2nd, Two Hat hosted our first-ever Content Moderation Symposium. Experts from academia, government, non-profits, and industry came together to talk about the biggest content moderation challenges of our time, from defining cyberbullying and child exploitation behaviors in online communities to unpacking why a content moderation strategy is business-critical going into 2020.
Alex Holmes, Deputy CEO of The Diana Award, opened the day with a powerful and emotional keynote about the effects of cyberbullying. For me, the highlight of his talk was this video he shared about the definition of “bullying” – it really drove home the importance of adopting nuanced definitions.
Next up were Dr. Maggie Brennan, a lecturer in clinical and forensic psychology at the University of Plymouth, and an academic advisor to Two Hat, and Zeineb Trabelsi, a third-year Ph.D. student in the Information System department at Laval University in Quebec, and an intern in the Natural Language Processing department at Two Hat.
Dr. Brennan and Zeineb have been working on academic frameworks for defining online child sexual victimization and cyberbullying behavior, respectively. They presented their proposed definitions, and our tables of six discussed them in detail. Discussion points included:
Are these definitions complete, and do they make sense?
What further information would we require to effectively use these definitions when moderating content?
How do we currently define child exploitation and cyberbullying in our organizations?
My key takeaway from the morning sessions? Defining online harms is not going to be easy. It’s a complicated and nuanced task because human behavior is complicated and nuanced. There are no easy answers – but these cross-industry and cross-cultural conversations are a step in the right direction. The biggest challenge will be taking the academic definitions of online child sexual victimization and cyberbullying behaviors and using them to label, moderate, and act on actual online conversations.
I’m looking forward to continuing those collaborations.
Our afternoon keynote was presented by industry veteran David Nixon, who talked about the exponential and unprecedented growth of online communities over the last 20 years, and the need for strong Codes of Conduct and the resources to operationalize good industry practices. This was followed by a panel discussion with industry experts and several Two Hat customers. I was happy to sit on the panel as well.
My key takeaway from David’s session and the panel discussion? If you design your product with safety at the core (Safety by Design), you’re setting yourself up for community success. If not, reforming your community can be an uphill battle. One of our newest customers Peer Tutor is implementing Safety by Design in really interesting ways, which CEO Wayne Harrison shared during the panel. You’ll learn more in an upcoming case study.
Finally, I presented our 5 Layers of Community Protection (more about that in the future – stay tuned!), and we discussed best practices for each layer of content moderation. The fifth layer of protection is Transparency Reports, which yielded the most challenging conversation. What will Transparency Reports look like? What information will be mandatory? How will we define success benchmarks? What data should we start to collect today? No one knows – but we looked at YouTube’s Transparency Report as an example and guidance on what may be legislated in the future.
My biggest takeaway from this session? Best practices exist – many of us are doing them right now. We just need to talk about them and share them with the industry at large. More on that in an upcoming blog post.
Fair Play Alliance’s First European Symposium
Being a co-founder of the Fair Play Alliance and seeing it grow from a conversation between a few friends to a global organization of over 130 companies and many more professionals has been incredible, to say the least. This was the first time the alliance held an event outside of North America. As a global organization, it was very important to us, and it was a tremendous success! The feedback has been overwhelmingly positive, and we are so happy to see that it provided lots of value to attendees.
It was a wonderful two-day event held over October 3rd and 4th, with excellent talks and workshops hosted for members of the FPA. Chris Priebe, a couple of industry friends and veteran Trust & Safety leaders, and I hosted one of the workshops. We’re all excited to take that work forward and see the results that come out of it and benefit the games industry!
What. A. Week.
As you can tell, it was a whirlwind week and I’m sure I’ve forgotten at least some of it! It was great to connect with old friends and make new friends. All told, my biggest takeaway from the week was this:
Everyone I met cares deeply about online safety, and about finding the smartest, most efficient ways to protect users from online harms while still allowing them the freedom to express themselves. At Two Hat, we believe in an online world where everyone is free to share without fear of harassment or abuse. I’ve heard similar sentiments echoed countless times from other Trust & Safety professionals, and I truly believe that if we continue to collaborate across industries, across governments, and across organizations, we can make that vision a reality.
So let’s keep talking.
I’m still offering free community audits for any organization that wants a second look at their moderation and Trust & Safety practices. Sign up on our Community Audit page.
In its first 48 hours, YOLO acquired 1 million users, a plague of cyberbullying, a scalable content moderation solution, and a new vision for the future.
YOLO might never have existed at all if not for a weekend experiment. Gregoire Henrion and his cofounders weren’t really interested in an anonymity app; they were just curious what they could build over an idle couple of days. But when YOLO hit the App Store, it found instant traction and caught a ride on a viral loop via Snapchat.
“We had a million users in two days,” says Henrion. Unfortunately, the anonymous nature of the app was also providing a platform for cyberbullying, which spread like wildfire. “We hadn’t thought of it before, because we’d never dreamed of the scale. But even after one day, we knew it was a big issue.”
Desperate for a solution, Henrion reached out to peers in Paris’ apps ecosystem. “I spoke with a friend at Yubo who used Two Hat’s Community Sift and recommended it,” he says. “He connected me with Sharon, their account executive at Two Hat, who instantly went to work for YOLO.”
“Within a day, we went from having lots of bad behaviors, to being safe as could be.”
YOLO was at this time in the midst of a feeding frenzy of meetings, media and monetization that only the developers of such viral app sensations can truly understand. “We were doing funding calls and everything else – it was crazy – but Two Hat sorted out what we needed, and the implementation was completed within hours.”
YOLO initially went with very strict guidelines before easing settings based on user feedback. “We wanted to fix what was wrong, and we didn’t want to be associated with bad behaviors,” says Henrion. “By experimenting with policies and settings, we find we are able to deal with 95 to 99 percent of the issues.”
In YOLO’s configuration, inappropriate messages or comments simply do not get shared, but the offending party doesn’t know this. In the content moderation industry, this is known as a false send. But the bully just knows they’re not getting any attention back, which is often enough for them to stop and go away. “Now, we have the app tuned so that the filters are super-efficient,” says Henrion.
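The “false send” pattern described above can be sketched as follows. This is an illustrative sketch under stated assumptions – the function name and return shape are hypothetical, not YOLO’s or Community Sift’s actual implementation:

```python
def handle_message(message: str, is_abusive) -> dict:
    """Accept the message from the sender's point of view either way,
    but only deliver it to recipients if it passes the filter."""
    if is_abusive(message):
        # Sender still sees a normal "sent" state; nothing is delivered.
        # The bully gets no reaction back, which is often enough
        # for them to stop and go away.
        return {"status": "sent", "delivered": False}
    return {"status": "sent", "delivered": True}
```

The design choice is that blocking silently, rather than showing an error, denies the sender the feedback loop (a reaction, or even a visible rejection) that motivates further abuse.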
“We have a lot of control now; our users are happy, and we are super happy.”
Moving forward, YOLO plans to apply what the company has learned about anonymity and social media to carve a new approach to online safety. “When we look at user behavior now, one of the secrets of YOLO being successful is that even if the user’s name is hidden, you still see the face of the user on their profile pic. This bit of exposure – you don’t know me but you can see my face – is very often enough to make users regulate their behaviors. That’s naturally ‘Safe by Design’ because it’s our normal behavior.”
“If you want to create value you have to make something secure. We’re not naive anymore. We know all the bad things that can happen in social.”
YOLO envisions their community and others as a place where Safety by Design has encouraged users to change behaviors — Why bully, harass or cajole in a community if life on mute is the only possible outcome?
“Anonymity alone is not a sustainable approach to managing communities,” says Henrion. “Two Hat’s Community Sift gives us tools to help shift user behavior, the security system to deal with those who cause trouble, and a solution we know scales quickly.”
We’re currently offering no-cost, no-obligation Community Audits for social networks that want an expert consultation on their community moderation practices.
Our Director of Community Trust & Safety will examine your community, locate areas of potential risk, and provide you with a personalized community analysis, including recommended best practices and tips to maximize user engagement.
Sign up today and we’ll be in touch.