3 Takeaways From The 16th International Bullying Prevention Conference

I recently had the privilege of speaking on the keynote gaming panel at the 16th Annual International Bullying Prevention Conference, an event themed Kindness & Compassion: Building Healthy Communities.

The International Bullying Prevention Association is a 501(c)(3) nonprofit organization, founded in 2003 when grassroots practitioners and researchers came together to convene the first conference in the US entirely focused on bullying prevention. They host an annual conference in Chicago where attendees can benefit from workshops, poster sessions, and TED-inspired sessions that deliver both hands-on solutions and theoretical, research-based presentations.

Below, I focus on the sessions and discussions I participated in regarding cyberbullying, and present a brief account of the takeaways I brought back to Canada and Two Hat.

1. User-centric approaches to online safety

A few people on the tech panels referred to the concept of “user-centric safety” — letting users set their own boundaries and comfort levels for online interactions. Catherine Teitelbaum, a renowned Global Trust & Safety executive who heads up Trust & Safety for Twitch, is a big champion of the idea and spoke about how the concept of “safety” varies from person to person. Offering customized control over the user experience, as Twitch does with AutoMod by empowering channel owners to set their own chat filtering standards, is the way of the future.

Online communities are diverse and unique, and often platforms contain many communities with different norms. The ability to tailor chat settings to those unique characteristics is critical.

Wouldn’t it be great if users could choose their safety settings and what they are comfortable with – the same way they can set their privacy settings on online platforms? What if a mother wants to enjoy an online platform with her child, but wants to ensure that they don’t see any sexual language? Perhaps a gamer just wants to relax and play a few rounds without the violent language that might be the norm in a mature game centered around combat. The more agency and flexibility we give to users and players online, the better we can cater to the different expectations we all have when we log in.
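As a thought experiment, here is a minimal sketch of what user-chosen safety settings might look like. Everything in it is hypothetical: the category names, the 0-to-3 severity scale, and the assumption that incoming messages arrive pre-labelled by a classifier. It is not any platform’s real API.

```python
from dataclasses import dataclass, field

# Hypothetical content categories; a real platform would define its own.
CATEGORIES = ("sexual", "violence", "profanity")

@dataclass
class SafetyPreferences:
    # Per-category tolerance: 0 hides everything, 3 allows everything.
    max_severity: dict = field(default_factory=lambda: {c: 3 for c in CATEGORIES})

def is_visible(message_labels: dict, prefs: SafetyPreferences) -> bool:
    """Show a message only if no category exceeds the user's chosen limit."""
    return all(
        severity <= prefs.max_severity.get(category, 3)
        for category, severity in message_labels.items()
    )

# The mother playing alongside her child dials sexual language to zero...
parent = SafetyPreferences(max_severity={"sexual": 0, "violence": 1, "profanity": 1})
# ...while the combat-game veteran leaves violent language untouched.
veteran = SafetyPreferences(max_severity={"sexual": 1, "violence": 3, "profanity": 3})

labels = {"violence": 3}  # e.g., trash talk scored as highly violent
print(is_visible(labels, parent))   # False: hidden from parent and child
print(is_visible(labels, veteran))  # True: shown to the mature-game player
```

The point is not the particular scale but the shape of the design: the platform classifies content once, and each user decides what their own experience looks like.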

2. Shared Responsibility, and the Importance of Diverse Voices

The concept of sharing and contributing to the greater good of online safety practices across tech industries also came up. Here at Two Hat, we believe that ushering in a new age of content moderation, and empowering an Internet that fulfills its true purpose of connecting human beings, is only possible through a shared responsibility approach, a theme that surfaced throughout the conference. It will take the efforts of everyone involved to truly change things for the better: academia, industry, government, and users.

In his 2018 book “Farsighted: How We Make the Decisions That Matter the Most”, Steven Johnson writes about how complex decisions require a comprehensive mapping of all the factors involved, and how those maps are informed by, and benefit enormously from, diverse perspectives. The best, farsighted decisions compile the voices of a variety of people. The intricate human interaction systems we are creating on the Internet require complex decision-making at both the inception and design stages. Yet right now those decisions are rarely informed by multi-disciplinary lenses. No wonder we are so shortsighted when it comes to anticipating issues with online behaviour and online harms.

A true, collaborative community of practice is needed. We need that rising tide that floats all boats, as my good friend Dr. Kim Voll says.

3. Empathy as an Antidote

Another good friend, Dr. Sameer Hinduja, was one of the speakers at the conference. Dr. Hinduja is a Professor in the School of Criminology and Criminal Justice at Florida Atlantic University and Co-Director of the Cyberbullying Research Center, recognized internationally for his groundbreaking work on cyberbullying and safe social media use. You will be hard-pressed to find someone more dedicated to the well-being of others.

He talked about how empathy can be used to prevent bullying, pulling from research and practical applications that have resulted in improved peer-to-peer relationships. He stressed the importance of practices that lead youth beyond the traditional approach of “being in someone else’s shoes” to feel empathy, to a point where they truly value others. This is so important, and it makes me wonder: How can we design human interaction systems online where we perceive each other as valuable individuals and are constantly reminded of our shared humanity? How do we create platforms that discourage solely transactional interactions? How do we bring offline social cues into the online experience? How can we design interaction proxies to reduce friction between users – and ultimately lead us to more positive and productive online spaces?

I don’t have all the answers – no one does. But I am encouraged by the work of people like Dr. Hinduja, the Trust and Safety team at Twitch, the incredible Digital Civility efforts of Roblox and my friend Laura Higgins, their Director of Community Safety & Digital Civility, and events like the International Bullying Prevention Conference.

Moving Forward

Cyberbullying is one of the many challenges facing online platforms today. And it’s not just cyberbullying: there is a wider umbrella of behaviours we need to better understand and define, including harassment, reputation tarnishing, doxxing, and more. We need to facilitate better digital interactions in general, through how we design online spaces and how we encourage positive and productive exchanges, understanding that it will take a wider perspective, informed by many lenses, to create online spaces that fulfill their true potential.

If you’re reading this, you’re likely in the industry, and you’re definitely a participant in online communities. So what can you do, today, to make a difference? How can industry better collaborate to advance online safety practices?



Three Techniques to Protect Users From Cyberbullying

CEO Chris Priebe founded Two Hat Security back in 2012 with a big goal: to protect people of all ages from online bullying. Over the last six years, we’ve been given the opportunity to help some of the largest online games, virtual worlds, and messaging apps in the world grow healthy, engaged communities on their platforms.

Organizations like The Cybersmile Foundation provide crucial services, including educational resources and 24-hour global support, to victims of cyberbullying and online abuse.

But what about the platforms themselves? What can online games and social networks do to prevent cyberbullying from happening in the first place? And how can community managers play their part?

In honour of #StopCyberbullyingDay 2018, an event we officially support, today we are sharing our top three techniques that community managers can implement to stop cyberbullying and abuse in their communities.

1. Share community guidelines.
Clear community standards are the building blocks of a healthy community. Sure, they won’t automatically prevent users from engaging in toxic or disruptive behaviour, but they go a long way toward setting language and behaviour expectations up front.

Post guidelines where every community member can see them. For a forum, pin a “Forum Rules, Read Before Posting” post at the top of the page. For comment sections, include a link or popup next to the comment box. Online games can even embed code of conduct reminders within their reporting feature. Include consequences — what can users expect to happen if policies are broken?

Don’t just include what not to do — include tips for what you would like your users to do, as well. Want the community to encourage and support each other? Tell them!

2. Use proactive moderation.
Once community standards are clearly communicated, community managers need a method to filter, escalate, and review abusive content.

Often, that involves choosing the right moderation software. Proactive moderation means filtering cyberbullying and abuse before they ever reach the community. Most community managers use either a simple profanity filter or a full content moderation tool. Profanity filters rely on a strict blacklist/whitelist to detect harassment, but they aren’t sophisticated or accurate enough to understand context or nuance, and some only work in English.
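To make that limitation concrete, here is a minimal sketch of the strict blacklist approach; the word list is invented for illustration.

```python
import re

# A toy blacklist; production lists run to many thousands of entries.
BLACKLIST = {"idiot", "loser", "kill"}

def naive_filter(message: str) -> bool:
    """Return True if the message should be blocked."""
    words = re.findall(r"[a-z0-9']+", message.lower())
    return any(word in BLACKLIST for word in words)

print(naive_filter("log off and die, loser"))      # True: caught
print(naive_filter("u r a l0ser"))                 # False: spelling evasion slips through
print(naive_filter("nice kill, great teamwork!"))  # True: false positive in a combat game
```

Word matching catches the exact strings it knows, but it cannot tell a compliment about gameplay from a threat, and a trivial misspelling walks straight past it.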

Instead, find a content moderation tool that can accurately identify cyberbullying, remove it in real time — and ultimately prevent users from experiencing abuse.

Of course, platforms should still always have a reporting system. But proactive moderation means that users only have to report questionable, “grey-area” content or false positives, instead of truly damaging content like extreme bullying and hate speech.
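Put together, a proactive flow might look like the sketch below, where classify() is a hypothetical stand-in for whatever moderation model or service a platform actually uses.

```python
from enum import IntEnum

class Severity(IntEnum):
    CLEAN = 0
    GREY_AREA = 1  # questionable: publish, but keep reportable
    SEVERE = 2     # clear bullying or hate speech

def classify(message: str) -> Severity:
    """Stand-in for a real moderation model or service."""
    text = message.lower()
    if "die loser" in text:
        return Severity.SEVERE
    if "noob" in text:
        return Severity.GREY_AREA
    return Severity.CLEAN

review_queue: list[str] = []

def submit(message: str) -> bool:
    """Proactive gate: severe abuse never reaches the community."""
    return classify(message) is not Severity.SEVERE

def report(message: str) -> None:
    """User reports now cover only grey-area content and false positives."""
    review_queue.append(message)

print(submit("log off n die loser"))  # False: filtered before anyone sees it
print(submit("gg, u noob"))           # True: published, still reportable
report("gg, u noob")                  # lands in the human review queue
```

The severe content is handled by the machine in real time, so human moderators can spend their attention where judgment is actually needed.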

3. Reward positive users.
A positive user experience leads to increased engagement, loyalty, and profits.

Part of a good experience involves supporting the community’s code of conduct. Sanctioning users who post abusive comments or attack other community members is an essential technique in proactive moderation.

But with so much attention paid to disruptive behaviour, positive community members can start to feel like their voices aren’t heard.

That’s why we encourage community managers to reinforce positive behaviour by rewarding power users.

Emotional rewards add a lot of value, cost nothing, and take very little time. Forum moderators can upvote posts that embody community standards. Community managers can comment publicly on encouraging or supportive posts. Mods and community managers can even send private messages to users who contribute to community health and well-being.

Social rewards like granting access to exclusive content and achievement badges work, too. Never underestimate the power of popularity and peer recognition when it comes to encouraging healthy behaviour!

When choosing a content moderation tool to aid in proactive moderation, look for software that measures user reputation based on behaviour. This added technology takes the guesswork and manual review out of identifying positive users.
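As a rough illustration of what behaviour-based reputation might look like under the hood, here is a minimal sketch; the event names, weights, and threshold are all invented.

```python
from collections import defaultdict

# Hypothetical event weights; a real system would tune these carefully.
EVENT_WEIGHTS = {
    "message_filtered": -5,  # posted content the filter removed
    "report_upheld": -10,    # a report against the user was confirmed
    "helpful_post": 3,       # praised or upvoted by the community
    "daily_activity": 1,     # steady, unremarkable participation
}

reputation: dict[str, int] = defaultdict(int)

def record_event(user: str, event: str) -> None:
    reputation[user] += EVENT_WEIGHTS.get(event, 0)

def is_reward_candidate(user: str, threshold: int = 20) -> bool:
    """Positive users surface automatically, with no manual trawling."""
    return reputation[user] >= threshold

for _ in range(7):
    record_event("alice", "helpful_post")
record_event("bob", "report_upheld")

print(is_reward_candidate("alice"))  # True: 21 points of positive behaviour
print(is_reward_candidate("bob"))    # False
```

However the scores are weighted, the principle stands: when the software keeps score, community managers can spend their time rewarding power users instead of hunting for them.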

#StopCyberbullyingDay 2018, organized by the Cybersmile Foundation.

The official #StopCyberbullyingDay takes place once a year, on the third Friday in June. But for community managers, moderators, and anyone who works with online communities (including those of us at Two Hat Security), protecting users from bullying and harassment is a daily task. Today, start by choosing one of our three healthy community-building recommendations — and watch your community thrive.

After all, doesn’t everyone deserve to share online without fear of harassment or abuse?



How Are You Celebrating Stop Cyberbullying Day?

Bullied: A Life in Two Stories

One. She wakes with a heaviness in her heart. It’s only Tuesday; still four school days left to go if she includes today. She glances at her phone, swipes to open the screen. Seventeen notifications. Texts, message threads, every app lit up with a new comment.

She ignores them all. She already knows what they say, anyway.

She gets dressed, carefully avoiding the mirror. Eats her breakfast in silence: soggy little rainbow circles, drenched in milk.

Her phone vibrates. She glances at the screen. It’s briefly lit with a message from a number she doesn’t recognize. u r faaaaaaat lil piggy, it says. She looks away, reads the back of the cereal box instead.

Breakfast finished, she shrugs into her backpack. Time to face the day. Time to leave.

Tucked away in her back pocket, her phone vibrates again.

Two. He double-clicks the bronze shield on his desktop. The game opens with a burst of heroic drums and horns. He enters his username and password, selects his favorite server, armors up for battle, and strides into the town square where several members of his clan wait. The square is crowded, teeming with barrel-chested warriors, tall mages draped in black cloaks, hideous pop-eyed goblins hopping from foot to foot.

He scans the usernames, looking for one in particular. Doesn’t see it. Feels his shoulders loosen and his back relax. He hadn’t realized how much tension he was holding inside, just looking for the name.

“who is ready to fight?” he types in the room chat.

A private message flashes in the lower right corner of his screen.

“hey faggot loser im baaaaaack”
“when u r goin to kill yrslf”
“log off n die loser”

His shoulders tighten again. It’s going to be a long session.

#StopCyberbullyingDay

Those are only two examples of online bullying. There are countless others.

In 2017, there are no safe spaces for the bullied. We are all connected, day and night. Kids can’t disengage. We can’t expect them to put their iPhones away, stop using social networks, and walk away from the internet.

Online communities are just as meaningful as offline communities. And for kids and teens, they can — and should — be spaces that encourage personal growth, curiosity, and discovery. But too often, the online space is riddled with obstacles that stop kids from reaching their true potential.

The internet grew up fast.

We’re only just starting to realize that we’ve created a culture of bullying and abuse. So it’s up to us to change the culture.

As adults, it’s our job to ensure that when kids and teens are online, they are safe. Safe to be themselves, safe to share who they really are, and safe from abuse.

Today we celebrate Stop Cyberbullying Day. Launched by the Cybersmile Foundation in 2012, it’s a dedicated day to campaign for a better internet — for a truly inclusive space where everyone is free to share without fear of harassment or abuse.

Here at Two Hat Security, we believe in a world free of online bullying, harassment, and child exploitation. Today’s message of solidarity and empathy is core to our vision of a better internet. No one can fix this problem on their own, which is why days like today are so important.

Let’s come together — as families, as companies, as co-workers, and as citizens of this new digital world — and take a stand against bullying. The Cybersmile Foundation has some great ideas on their site — like Tweeting something nice to a person you follow or coming up with a new anti-bullying slogan.

We’ll continue to find new ways to protect online communities around the world. And we’ll keep trying to change the culture and the industry, every day. We hope you’ll join us.