When Social Networks Put Online Safety First, We All Win

 “If we’re looking at the current zeitgeist, you have a consumer base that’s looking toward tech companies to showcase moral guidance.” — David Ryan Polgar

Users are fed up.

Tired of rampant harassment and abuse in social media, consumers have finally begun to demand safer online spaces that encourage and reward good digital citizenship. And they’re starting to hold social networks accountable for dangerous behavior on their platforms.

But what exactly are online safety and digital citizenship? And what can social networks do to make safety an industry standard?

We spoke with Trust & Safety experts David Ryan Polgar of Friendbase and Carlos Figueiredo of Two Hat Security to get their thoughts on changing attitudes in the industry — and the one thing that social networks can do today to inspire civility and respect on their platform.

Highlights & key quotes

On safety:

“Online safety… is very similar to driving. There are lots of dangers to getting on the road, but that doesn’t mean we don’t get on the road.” — David Ryan Polgar

“It’s important for us to consider not just safety, but what is a healthy online experience? It’s okay to have a certain amount of risk that will vary from community to community… We don’t want to focus just on the dangers and risks.” — Carlos Figueiredo

On “safety by design”:

“There are lots of examples where a company scaled up quickly and aggressively got millions of users, but they didn’t necessarily have the features in place to have a safe experience. We want safety, but we also want vibrancy, that happy mix  — what I call a ‘Goldilocks zone.’ And the danger is, once you get labeled as a place that allows for toxic behavior, it’s very difficult to alter that perception, even when you change some of the tools.” — DRP

“Whenever possible, safety needs to be a product and design consideration from the very beginning… by having this proactive approach, you can prevent a lot of issues.” — CF

On setting a positive tone in your product:

“I think the big thing is intuitive tools. That’s always been a big complaint for a lot of individuals. Once you have a problem online, is it intuitive to report it? And then, potentially more importantly, what’s the protocol after that’s been reported?” — DRP

“One thing that I would definitely recommend that people start doing is, if they don’t have an individual or a team in charge of community well-being or community safety, have somebody where at least a big chunk of time is dedicated to this – and a team, even better. Put that as a key priority of your product. Employ really solid people who understand your community.” — CF

Online safety & digital citizenship resources

David is a board member for the non-profit #ICANHELP, which holds the first annual #Digital4Good event next month at Twitter HQ. This highly-anticipated event brings together students, representatives from the tech industry, and teachers to discuss and celebrate positive tech and media use.

Learn more on the #ICANHELP website, and follow @icanhelp and #Digital4Good on Twitter. 

Don’t miss the live-streamed event on Monday, September 18th. Carlos will be moderating a panel with three very special guests (more info to come!). They’ll be talking about player behavior in online games.

Two Hat Security is hosting an exclusive webinar about community building on Wednesday, September 13th. In The Six Essential Pillars of Healthy Online Communities, Carlos shares the six secrets to creating a thriving, engaged, and loyal community in your social product. Whether you’re struggling to build a new community or need advice shaping an existing product, you don’t want to miss this. Save your seat today!

David is a prolific writer who thoughtfully examines the ethical consequences of emerging technology. Recent pieces include “Alexa, What’s the Future of Conversational Interface?” and “Has Human Communication Become Botified?” Follow @TechEthicist on Twitter for insights into online safety, digital citizenship, and the future of tech.

About the speakers

David Ryan Polgar

David Ryan Polgar has carved out a unique and pioneering career as a “Tech Ethicist.” With a background as an attorney and college professor, he transitioned in recent years to focus entirely on improving how children, teens, and adults utilize social media & tech. David is a tech writer (Big Think, Quartz, and IBM thinkLeaders), speaker (3-time TEDx, The School of The New York Times), and frequent tech commentator (SiriusXM, AP, Boston Globe, CNN.com, HuffPost). He has experience working with startups and social media companies (ASKfm), and co-founded the global Digital Citizenship Summit (held at Twitter HQ in 2016). Outside of writing and speaking, David currently serves in a Trust & Safety role for the teen virtual world Friendbase. He is also a board member for the non-profit #ICANHELP, which is planning the first #Digital4Good event at Twitter HQ on September 18th.

His forward-thinking approach to online safety and digital citizenship has been recognized by various organizations and outlets across the globe and was recently singled out online by the Obama Foundation.

Carlos Figueiredo

Carlos Figueiredo leads Two Hat Security’s Trust & Safety efforts, collaborating with clients and partners to challenge our views of healthy online communities.

Born and raised in Brazil, Carlos has lived in Canada for almost 11 years and has worked directly in online safety for the last 9, helping large digital communities stay healthy and engaged. From being a moderator himself to leading a multicultural department that was pivotal to the safety of global communities across different languages and cultures, Carlos has experienced the pains and joys of on-screen interactions.

He’s interested in tackling the biggest challenges of our connected times and thrives on collaborating and creating bridges in the industry.

About Two Hat Security

At Two Hat Security, we empower social and gaming platforms to build healthy, engaged online communities, all while protecting their brand and their users from high-risk content. Want to increase user retention, reduce moderation workload, and protect your brand?

Get in touch today to see how our chat filter and moderation software Community Sift can help you make online safety a priority in your product.

Want more articles like this? Subscribe to our newsletter and never miss an update!



A Four-Step Beginner’s Guide to COPPA Compliance

There are few things more rewarding in the digital space than creating a great kid’s product. The market is booming, and there are endless possibilities for new, innovative products. Kids (and parents) today are looking for unique, engaging, and above all safe apps and sites to play, share, and connect with other kids across the world.

But if you plan to market your product to kids under 13 in the US, by law you have to ensure that it follows the Children’s Online Privacy Protection Act (COPPA). The act regulates how you collect and store personal information of under-13 users in your product.

And it’s a big deal. Companies like Hasbro, Mattel, Viacom, and JumpStart Games were fined in 2016 for violating COPPA, and three new lawsuits were announced this month. If you saw the last season of Silicon Valley, you know (like Dinesh and the rest of the Pied Piper crew) that COPPA fines are hefty. Privacy regulations are a hot topic in 2018: GDPR takes effect in the EU on May 25th, and the ESRB has proposed modifications to its Safe Harbor program, with public comments due May 9th.

COPPA, like most regulations, can be confusing to navigate, especially if you’re new to the industry. That’s why we’ve put together this four-step guide to help you unpack the legalese.

(Register to watch our on-demand webinar The ROI of COPPA. In it, we explore the benefits of compliance, including increased engagement, retention, and profits.)

Let’s get the easy stuff out of the way first.

What is COPPA?

The Children’s Online Privacy Protection Act (COPPA) protects the online privacy of children under the age of 13 in the US. It’s a large and complex rule (you can read it in its entirety here), but there are a few key points you should know:

  • COPPA only applies to children in the US under the age of 13 (U13). If you are a US-based company, then you are expected to protect U13 users globally. However, if your company is based outside the US, you are only legally obligated under COPPA to protect American children on your platform — or risk a potentially crippling fine from the FTC or even a US State Attorney General.
  • Children outside the US will be subject to similar emerging regulations, like the GDPR and its minors’ consent requirements. Those laws might also apply to US companies operating in those countries.
  • COPPA only applies to children under the age of 13. Other international regulations use different cutoff ages. For example, under GDPR, children under 16 require parental consent by default, though member states may lower that threshold to as low as 13.
  • COPPA is designed to protect children’s online privacy and data security — it doesn’t prevent cyberbullying or profanity.
  • COPPA is not just about removing private information. It’s also about having parental consent to collect, use, disclose, track, or share private information.
  • COPPA also applies to third-party plugins and services you use. This is a very tricky situation and requires proper due diligence. See the class action lawsuits recently filed in this area as an example.

Please note: Two Hat Security is not a law office and cannot provide legal advice. While we provide technology that helps companies achieve compliance, you should still always seek independent legal counsel. We recommend reaching out to a Safe Harbor provider like PRIVO, kidSAFE, or the ESRB (more about these companies below).

In addition, the FTC has put together a Six-Step Compliance Plan that is a great reference and another helpful starting point.

What is PII (Personally Identifiable Information)?

PII is any information that can be used to identify a person in real life. This list is from the FTC’s page:

  • full name;
  • home or other physical address, including street name and city or town;
  • online contact information like an email address or other identifier that permits someone to contact a person directly — for example, an IM identifier, VoIP identifier, or video chat identifier;
  • screen name or user name where it functions as online contact information;
  • telephone number;
  • Social Security number;
  • a persistent identifier that can be used to recognize a user over time and across different sites, including a cookie number, an IP address, a processor or device serial number, or a unique device identifier;
  • a photo, video, or audio file containing a child’s image or voice;
  • geolocation information sufficient to identify a street name and city or town; or
  • other information about the child or parent that is collected from the child and is combined with one of these identifiers.

Personal information isn’t just a user’s real name. It also includes cookie numbers and geolocation.
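
To make these categories concrete, here is a toy pattern-based detector in TypeScript. Every pattern and name in it is illustrative, and as the manipulation examples in Step 3 show, a real filter needs normalization, context, and reputation layered on top of raw regexes.

```typescript
// Toy PII patterns, for illustration only. Real-world detection needs
// normalization, context, and reputation on top of raw regexes.
const PII_PATTERNS: Record<string, RegExp> = {
  email: /[\w.+-]+@[\w-]+\.[\w.]+/,
  usPhone: /\b\d{3}[\s.-]?\d{3}[\s.-]?\d{4}\b/,
  ssn: /\b\d{3}-\d{2}-\d{4}\b/,
  streetAddress: /\b\d+\s+\w+\s+(street|st|avenue|ave|road|rd)\b/i,
};

function findPII(message: string): string[] {
  return Object.entries(PII_PATTERNS)
    .filter(([, pattern]) => pattern.test(message))
    .map(([category]) => category);
}

findPII("Email me at joe@example.com"); // ["email"]
```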

Let’s get into the meat and potatoes — the four crucial steps to get started.

Step one is a big one.

Step 1: Choose your strategy

There are a few options here. Each one has its pros and cons.

Option 1: Don’t have under-13 users in your product

This is a common strategy used by large sites and apps like Facebook, YouTube, and Twitter. To be in this category, your product cannot be child-directed. So if you’ve created a product that is clearly designed for children, partner with kid-directed products, or market yourself as a place for youth, this option won’t work.

However, if you qualify as a general audience, you can reduce your risk by providing an age gate and blocking U13. Typically that means asking users to enter their birth date when they sign up. If you do this, you’ll also need a mechanism to block users from first selecting a child’s birthdate, then changing their answer.

As well, you will need a strategy to manage accounts that you discover are under 13 — think of kids who post pictures from their birthday party with the caption “I’m 10 now!” If that’s reported, you have to build a process to deal with it.
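
As a rough illustration, an age gate with a retry block might look like the sketch below. All names are hypothetical, and a real implementation would persist the blocklist server-side rather than in memory.

```typescript
// Minimal age-gate sketch. All names are illustrative; a real
// implementation would persist the blocklist server-side.
const COPPA_AGE = 13;

function ageFromBirthDate(birthDate: Date, now: Date = new Date()): number {
  const age = now.getFullYear() - birthDate.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > birthDate.getMonth() ||
    (now.getMonth() === birthDate.getMonth() &&
      now.getDate() >= birthDate.getDate());
  return hadBirthdayThisYear ? age : age - 1;
}

function handleSignup(
  birthDate: Date,
  deviceId: string,
  blockedDevices: Set<string>
): "allow" | "block" {
  // If this device already failed the gate, don't let the user
  // simply retry with an older birth date.
  if (blockedDevices.has(deviceId)) return "block";

  if (ageFromBirthDate(birthDate) < COPPA_AGE) {
    blockedDevices.add(deviceId);
    return "block";
  }
  return "allow";
}
```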

So, if your product is clearly designed for children, you will have to go with one of the following two options.

Option 2: Verifiable Parental Consent

Works if you need users to share personal information

Parental consent is a necessity if your site or app is a social network for under-13 users who share their real identity. As well, if your site or app is designed for video or photo sharing, getting a parent’s permission is a smart option. With the photo regulations in COPPA, it’s much smarter to err on the side of caution.

Parental consent is tricky. Approved methods include faxing or mailing in a signed consent form, speaking to a customer support rep, or making a payment of one dollar or more. The FTC recently approved two new options: knowledge-based challenge questions and driver’s license verification (which carries its own security risks!).

Option 3: Email plus

Works if your product doesn’t require that users share personal information

The third option is to add a parental permission step to the sign-up process. This first requires that you build an age gate. Then, if the user is under 13, you require that they enter their parent’s email address or another online identifier to request consent before gaining access to their account. After that, you send an email with proper disclosure notices and an activation link to their parents. Animal Jam, Club Penguin, and many Disney products use this strategy.
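
Sketched in TypeScript, that flow might look something like this. The helpers are hypothetical, and the consent email’s wording and disclosure contents should come from your legal counsel.

```typescript
// "Email plus" consent flow, sketched with hypothetical helpers.
declare function sendEmail(to: string, body: string): void; // placeholder

interface PendingAccount {
  childUserId: string;
  parentEmail: string;
  activated: boolean;
}

const pending = new Map<string, PendingAccount>(); // keyed by consent token

function requestParentalConsent(childUserId: string, parentEmail: string): void {
  const token = crypto.randomUUID(); // unguessable, single-use
  pending.set(token, { childUserId, parentEmail, activated: false });
  // The real email must carry your COPPA disclosure notices
  // alongside the activation link.
  sendEmail(parentEmail, `https://example.com/consent?token=${token}`);
}

function activateAccount(token: string): boolean {
  const record = pending.get(token);
  if (!record || record.activated) return false;
  record.activated = true; // child account stays locked until now
  return true;
}
```

The key properties are that the token is unguessable and single-use, and that the child’s account stays locked until a parent clicks the link.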

The downside to this option? It can be bad for business. Every time you add a step to the registration process, you lose more users. In some cases, you can lose up to 50% of all candidates during the email sign-up process.

But adding parental permission during the signup process is just the first step. You still have to ensure that kids can’t disclose their personal information once they’re logged in.

Here’s how COPPA defines “disclosure”:

(2) Making personal information collected by an operator from a child publicly available in identifiable form by any means, including but not limited to a public posting through the Internet, or through a personal home page or screen posted on a Web site or online service; a pen pal service; an electronic mail service; a message board; or a chat room.

To translate: You can’t let your under-13 users post “My name is Joe Rogers and I live at 2345 Markum street. Will you be my friend?” At that point, you have collected PII and broadcast it publicly. This also applies to profile pictures if they contain GPS information or a child’s face.

To prevent disclosure, you’ll need to combine the email plus option with a chat and/or image filter that checks every post for PII. More on that later.

Wait, there’s more!

Another factor to consider is what kind of third-party apps and services you are using. Google Analytics and others have a config setting that can turn off user-specific behavior tracking. If you’re building a mobile app, remember to scan all your plugins to ensure they are compliant.
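
For example, with Google’s gtag.js you can disable advertising personalization and anonymize IP addresses at configuration time. The fields below come from the gtag reference, but verify them against Google’s current documentation, since settings have changed across analytics.js, gtag.js, and GA4.

```typescript
// Hedged example for gtag.js; verify these fields against Google's
// current documentation before relying on them.
declare function gtag(...args: unknown[]): void;

gtag("config", "GA_MEASUREMENT_ID", {
  allow_ad_personalization_signals: false, // no ads personalization
  anonymize_ip: true, // truncate visitor IP addresses
});
```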

Which option do you choose?

Whatever decision you make will be based on your unique product.

Fortunately, there are three companies that can help you implement whichever strategy you choose.

Full disclosure: We work with all three companies below, which is why we feel comfortable recommending their services.

Recommended products

AgeCheq offers a developer-friendly service that handles nearly all of the parent/child and technical complexities of COPPA compliance for you. After you provide accurate answers to a focused privacy survey and drop a few lines of code into your website or app, COPPA parental notice and consent (and ongoing consent management) are handled with a single sign-on parental dashboard that supports multiple children and (of course) multiple publishers.

AgeCheq has put a lot of effort into streamlining the effort required by parents and children to enjoy your content, while still complying with COPPA. DragonVale World and WolfQuest are two very popular apps that use the AgeCheq service. If you expect EU users, AgeCheq also offers a universal service called ConsentCheq CDK that provides GDPR, ePrivacy, and COPPA consent with a single integration.

Kids Web Services (KWS) is a self-service platform that helps developers build COPPA- and GDPR-K-compliant digital products for kids while increasing engagement and conversion. Building awesome digital experiences for the under-13 market is hard. You build the fun, and KWS works to ensure everything you do is transparent and compliant by design.

PRIVO offers an FTC-approved kid and family customer identity and permission management platform. Their platform includes registration and account creation, identity proofing, sliding-scale parental consent, single sign-on, and an account management and permission portal.

From their site: “PRIVO helps you build trust with your users, partners, and regulators by helping you meet or exceed federal and international data regulations. PRIVO can act as your fractional Chief Privacy Officer.”

They are a COPPA Safe Harbor provider as well, so they can also review your compliance (more about Safe Harbor later).

Step 2: Have a clear privacy policy and notices

Like most people, we start to go cross-eyed when looking at legal documents. The FTC outlines the components of a COPPA-compliant privacy policy here.

One of the most important takeaways? Be clear and avoid legalese. AgeCheq and PRIVO can help with that.

Step 3: Filter or redact PII

Here is where chat filtering and automated moderation software like Community Sift becomes business-critical.

Whenever an under-13 user gives you text, images, or video, you need to ensure that it doesn’t contain PII (unless you’ve received verifiable parental consent).

Kids are persistent. And they’re very savvy — they know their way around simple blacklists and unsophisticated filters.

Some examples of kids sharing PII that we’ve seen:

“My add dress is one two 3 on homer street”

“Ping me at two 5 oh seven seven 55”

“250” (line 1)
“Seven” (line 2)
“Seven” (line 3)
“55” … (you get the idea)

Here’s the problem: Most filters confuse these intentional manipulations with innocent things like:

“9 8 7 6 5 4 3 2 1 go”

“My stats are 456.344.2222.222 with XP of 233”

“I have 3 cats 1 dog 4 hamsters and 67 goldfish”

You don’t want to prevent kids from talking about their pets or sharing their achievements. But how do you block PII without blocking all numbers?
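
A common first step is to normalize spelled-out digits into runs before judging them. The sketch below illustrates the general idea only; it is not Community Sift’s actual pipeline.

```typescript
// Sketch: normalize spelled-out digits so "two 5 oh seven seven 55"
// becomes the run "2507755", then apply a length heuristic. A toy
// illustration of the idea, not Community Sift's actual pipeline.
const DIGIT_WORDS: Record<string, string> = {
  zero: "0", oh: "0", one: "1", two: "2", three: "3",
  four: "4", five: "5", six: "6", seven: "7", eight: "8", nine: "9",
};

function extractDigitRuns(message: string): string[] {
  const runs: string[] = [];
  let current = "";
  for (const token of message.toLowerCase().split(/\s+/)) {
    const digits =
      DIGIT_WORDS[token] ?? (/^\d+$/.test(token) ? token : null);
    if (digits !== null) {
      current += digits; // extend the current run of digits
    } else if (current) {
      runs.push(current); // a non-digit token ends the run
      current = "";
    }
  }
  if (current) runs.push(current);
  return runs;
}

// Seven or more consecutive digits looks phone-like;
// "I have 3 cats 1 dog" produces only short runs and passes.
const phoneLike = extractDigitRuns("Ping me at two 5 oh seven seven 55")
  .some((run) => run.length >= 7); // true
```

Note that this toy heuristic would still flag “9 8 7 6 5 4 3 2 1 go”, which is exactly why pattern-matching alone isn’t enough. That’s where reputation comes in.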

Reputation matters

Community Sift is a reputation-based system, which is the best way to deal with endlessly complicated filter tricks. With 1.8 million language patterns and linguistic templates added to the system, we find the most common ways users try to break the filter.

When a user is caught manipulating the filter, they earn a “reputation” for actively trying to share PII. Then, the system automatically changes their settings to prevent them from sharing any numbers. Innocent users can still use numbers to talk about points, items, etc.

The best part is that users can change their reputation based on behavior. So, users with a “bad” reputation who stop attempting to share their phone numbers can earn a “good” reputation and regain their old settings. And it all happens in real time.
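
Reduced to a sketch (a simplified illustration of the general idea, not Two Hat’s production model), a reputation gate can be surprisingly small:

```typescript
// Simplified reputation gate: repeated PII attempts tighten a user's
// number-sharing settings; sustained clean behavior loosens them again.
// A sketch of the general idea, not Two Hat's production model.
interface UserReputation {
  piiAttempts: number;   // recent filter-manipulation attempts
  cleanMessages: number; // messages since the last attempt
}

// Users with a "bad" reputation lose the ability to send numbers.
function canShareNumbers(rep: UserReputation): boolean {
  return rep.piiAttempts < 3;
}

function recordMessage(rep: UserReputation, attemptedPII: boolean): void {
  if (attemptedPII) {
    rep.piiAttempts += 1;
    rep.cleanMessages = 0;
  } else {
    rep.cleanMessages += 1;
    // Reputation recovers in real time as behavior improves
    // (the 50-message threshold is an arbitrary assumption).
    if (rep.cleanMessages >= 50 && rep.piiAttempts > 0) {
      rep.piiAttempts -= 1;
      rep.cleanMessages = 0;
    }
  }
}
```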

Sanctions matter, too

As a best practice, we recommend using both user reputation and sanctions to moderate PII. Progressive sanctions give users the opportunity to change their behavior over time.

You can start by displaying a message to the user, explaining why sharing PII is dangerous. If they try again, suspend their account for 24 hours, then 72 hours — and so on.

The good thing is that immediate feedback almost always works. After the initial warning, most users won’t try again.
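
Encoded as data, the ladder might look like this. The warning, 24-hour, and 72-hour rungs come from this article; anything beyond them is an illustrative assumption.

```typescript
// Progressive sanctions: warn first, then escalate suspensions.
// The warning, 24h, and 72h rungs come from the article; the final
// one-week rung is an illustrative assumption.
const SANCTION_LADDER = [
  { action: "warn", message: "Sharing personal info is dangerous." },
  { action: "suspend", hours: 24 },
  { action: "suspend", hours: 72 },
  { action: "suspend", hours: 7 * 24 },
] as const;

function nextSanction(priorOffenses: number) {
  // Cap at the last rung instead of escalating forever.
  const rung = Math.min(priorOffenses, SANCTION_LADDER.length - 1);
  return SANCTION_LADDER[rung];
}
```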

Why does COPPA matter?

For us, one of the most important sections in the regulation is the definition of collecting PII: “Enabling a child to make personal information publicly available in identifiable form.”

The definition continues:

“An operator shall not be considered to have collected personal information under this paragraph if it takes reasonable measures to delete all or virtually all personal information from a child’s postings before they are made public and also to delete such information from its records,” from §312.2; emphasis ours

Kids don’t understand the inherent dangers of sharing PII online. They are persistent, and unless we put (reasonable) measures in place to prevent it, they will keep trying to post their phone numbers, email addresses, real names, and more.

You’ll never catch everything. But you can detect and block nearly all instances of PII without crippling your business, and without stifling actual conversation.

As operators of kid-friendly social products, it’s our responsibility to protect children from their own innocence.

COPPA was created to protect children’s privacy. But in the process, it also protects them from predators, grooming, and luring.

Step 4: Get a COPPA Safe Harbor to validate

COPPA compliance is no easy task. Every product is unique and requires individual attention.

As a final step, the industry best practice is to hire an FTC-approved COPPA Safe Harbor provider to review your strategy and product, as well as your third-party contracts and settings, to ensure you are and remain compliant.

Yes, it will cost several thousand dollars — but that’s far cheaper than losing hundreds of thousands of dollars in FTC fines and parental trust. It also saves on legal fees and gives you confidence in your engagement strategy.

You also get a seal to put on your app or site. The seal means that the Safe Harbor company you’ve chosen has given their stamp of approval to your strategy.

If the FTC ever looks at you and finds a potential issue, they will see that seal and give you a chance to cure the issue without the heavy stick of an enforcement action. You are essentially deemed compliant while you remain in good standing with an approved Safe Harbor. Of course, you need to keep it current, but it’s great protection.

Safe Harbor companies that we recommend:

Not only does PRIVO provide a system for verifiable parental consent, they are also a Safe Harbor and will review your site, app, or smart toy for COPPA, GDPR, SOPIPA, the Student Data Privacy Pledge, and other regulations protecting kids’ privacy online. It’s powerful that they offer two products to cover both of your needs.

kidSAFE initially provided only a seal certifying that kids’ sites and apps follow proper safety steps. Now they are an FTC-approved Safe Harbor and can also provide a kidSAFE+ COPPA-CERTIFIED seal. That added certification is highly advantageous for kids’ products.

kidSAFE reviewed Community Sift and helped us create a set of guidelines for under-13 products. They are experts when it comes to kids’ online privacy and safety.

You’ve heard this line before — “Rated ‘E’ for everyone.” The rating system created by the Entertainment Software Rating Board is clear, concise, and immediately recognizable. The ESRB is also an FTC-approved COPPA Safe Harbor provider.

Next steps

There you have it — your four-step beginner’s guide to COPPA compliance. To recap, your first step is to decide on a parental permission strategy.

Once you’ve done that, craft a readable privacy policy that anyone can understand.

Then, find the right moderation software to filter PII in chat/images/usernames.

Finally, select a Safe Harbor company to help validate your strategy and ensure that you’re compliant.

Easy as pie?

Maybe not — but while COPPA compliance may not be easy, it is important.

Editor’s note: Originally published in August 2017, updated April 2018

Want more best practices for keeping your online community COPPA compliant? Sign up for the Two Hat newsletter and never miss an update! 

You’ll receive invites to exclusive webinars and workshops, community management and moderation tips, and product updates.

We will never share your information with third parties and you can unsubscribe at any time.



Top Three Reasons You Should Meet us at Gamescom

Heading to Gamescom or devcom this year? It’s a huge conference, and you have endless sessions, speakers, exhibits, and meetings to choose from. Your time is precious — and limited. How do you decide where to go and who to talk to?

Here are three reasons we think you should meet with us while you’re in Cologne.

You need practical community-building tips.

Got trolls?

Our CEO & founder Chris Priebe is giving an awesome talk at devcom. He’ll be talking about the connection between trolls, community toxicity, and increased user churn. The struggle is real, and we’ve got the numbers to prove it.

Hope to build a thriving, engaged community in your game? Want to increase retention? Need to reduce your moderation workload so you can focus on fun stuff like shipping new features?

Chris has been in the online safety and security space for 20 years now and has learned a few lessons along the way. He’ll be sharing practical, time-and-industry-proven moderation strategies that actually work.

Check out Chris’s talk on Monday, August 21st, from 14:30 – 15:00.

You don’t want to get left behind in a changing industry.

This is the year the industry gets serious about user-generated content (UGC) moderation.

With recent Facebook Live incidents, new hate speech legislation in Germany, and the latest online harassment numbers from the Pew Research Center, online behavior is a hot topic.

We’ve been studying online behavior for years now. We even sat down with Kimberly Voll and Ivan Davies of Riot Games recently to talk about the challenges facing the industry in 2017.

Oh, and we have a kinda crazy theory about how the internet ended up this way. All we’ll say is that it involves Maslow’s hierarchy of needs…

So, it’s encouraging to see that more and more companies are acknowledging the importance of smart, thoughtful, and intentional content moderation.

If you’re working on a game/social network/app in 2017, you have to consider how you’ll handle UGC (whether it’s chat, usernames, or images). Luckily, you don’t have to figure it out all by yourself.

Because…

You deserve success.

And we love this stuff.

Everyone says it, but it’s true: We really, really care about your success. And smart moderation is key to any social product’s success in a crowded and highly competitive market.

Increasing user retention, reducing moderation workload, keeping communities healthy — these are big deals to us. We’ve been fortunate enough to work with hugely successful companies like Roblox, Supercell, Kabam, and more, and we would love to share the lessons we’ve learned and best practices with you.

We’re sending three of our very best Two Hatters/Community Sifters to Germany. Sharon has a wicked sense of humor (and the biggest heart around), Mike has an encyclopedic knowledge of Bruce Springsteen lore, and Chris — well, he’s the brilliant, free-wheeling brain behind the entire operation.

So, if you’d like to meet up and chat at Gamescom, Sharon, Mike, and Chris will be in Cologne from Monday, August 21st to Friday, August 25th. Send us a message at hello@twohat.com, and one of them will be in touch.

Want more articles like this? Subscribe to our newsletter and never miss an update!



Connect With us at the Crimes Against Children Conference in Dallas

For the second year in a row, Two Hat Security will be attending the Crimes Against Children Conference in Dallas, Texas as a sponsor. Founded in 1988, the conference brings together attendees from law enforcement, child protective services, and more to “provid[e] practical and interactive instruction to those fighting crimes against children and helping children heal.”

Last year, more than 4200 professionals attended CACC — a record for the conference and a sign of the growing need for these discussions, workshops, and training sessions.

Two Hat Security founder and CEO Chris Priebe and VP of Product Brad Leitch are hosting two sessions this year. Both sessions will provide investigators with a deeper understanding of the vital role artificial intelligence plays in the future of abuse investigations.

Session 1: Using Artificial Intelligence to Prioritize and Solve Crimes

Tuesday, August 8
1:45 PM – 3:00 PM
Location: Dallas D2

In this session, we explore what recent advances in artificial intelligence and machine learning mean for law enforcement. We’ll discuss how this new technology can be applied in a meaningful way to triage and solve cases faster. This is a non-technical session that will help prepare investigative teams for upcoming technology innovations.

Session 2: Beyond PhotoDNA — Detecting New Child Sexual Abuse Material With CEASE.ai

Wednesday, August 9
8:30 AM – 9:45 AM
Location: City View 8 (Exhibitor Workshop)

Traditionally, PhotoDNA has allowed organizations to detect already-categorized child sexual abuse material (CSAM). Sadly, with new digital content being so easy to create and distribute worldwide, investigators have seen an epidemic of brand-new, never-before-seen CSAM being shared online.

CEASE is an AI model that uses computer vision to detect these new images. Our collaboration with the Royal Canadian Mounted Police has given our data scientists access to a permanent data set of confirmed CSAM, which we are using to train the model.

However, it’s still a work in progress. If you are a member of the law enforcement community or the technology industry, we need your expertise and vast knowledge to help shape this groundbreaking system.

Stop by our booth

We look forward to meeting fellow attendees and discussing potential collaboration and partnership opportunities. Visit us at booth 41 on the 2nd floor, next to Griffeye.

As we learned at the Protecting Innocence Hackathon in July, “If we want to protect the innocence of children, we have a responsibility to be transparent and collaborative.”

You can sign up for both workshops through the conference website. Feel free to email us at hello@twohat.com to set up a meeting.

Want more articles like this? Subscribe to our newsletter and never miss an update!



How League of Legends Is Teaching High School Teens the Values of Sportsmanship

Ivan Davies of Riot Games has one of the coolest job descriptions ever.

“My job is to try and make a difference to the League of Legends player and wider community,” he says. “I work in a publishing office in Oceania, where I’m not told what to do by my Manager. I’m simply entrusted to make a difference; it’s then up to the local team to decide what direction we should take.”

For Ivan and his team, making a difference means tackling one of the biggest issues facing the gaming world today: How do you educate young players about good online behavior?

Following the Summoner’s Code

Riot Games has long been a proponent of sportsmanship. With 100 million monthly players across the globe, League of Legends is the biggest game in the industry. Because of its intensely competitive nature, it has become known for its sometimes heated atmosphere. Players are expected to abide by the Summoner’s Code, a comprehensive guide to being a good team player.

League of Legends in action

Despite encouraging the Summoner’s Code and being at the forefront of player behavior studies, Ivan notes that “At times, it’s felt like we could do more. Video games are a fundamental reflection of humanity: how we learn, how we interact, how we come to understand our world. We all “play” throughout our lives in some capacity or another. Video games just provide a particular sandbox… the reason they work so well is because of these parallels. The social and competitive nature of League of Legends taps into human fundamentals.”

Last year, Ivan and his team started to wonder what they could do outside of the in-game experience to positively shape player behavior. They realized that it’s not just the gaming industry that isn’t doing enough — it’s also the education sector. Students are online every day, at school and at home, and yet schools are doing very little to teach students about acceptable online behavior.

“Some schools don’t do enough to set students up for an online future. I’ve heard a number of schools hire an external speaker to talk to their students about cyberbullying. This talk may happen once a year purely to tick a box; a curriculum standard has been met, and online etiquette is not considered a priority for another year,” Ivan says.

“Teachers and the education sector have been slow to respond to this online world and setting students up for a future of online activity. The education sector is meant to set you up for life and at the moment not enough is being done to ensure online educational needs are being met.”

It’s all about sportsmanship

In 2016 Ivan and his team created League of Legends High School Clubs — an initiative that is now spreading across Australia and New Zealand. Like other after-school clubs (think AV, drama, or Model UN), League of Legends clubs are led by a dedicated teacher. Under the teacher’s supervision, students play League of Legends in groups at school and even participate in championship tournaments against other schools.

To help students understand and follow the Summoner’s Code, Ivan and his team have outlined six aspects of sportsmanship, which teachers and students discuss before, during, and after a game.

The six aspects of sportsmanship studied in LoL High School Clubs.

“A League of Legends High School Club is intended to promote authentic, relatable learning experiences,” Ivan says. “It provides an opportunity for students to explore and model the key values that exist in schools and in the curriculum. We’ve chosen to focus on sportsmanship and have provided a code of acceptable behavior for players to abide by in their pursuit of fair play.”

Helping teachers and students

Ivan and his team haven’t just worked diligently to promote the clubs — they’ve also built a remarkable set of teaching materials structured around the “Assessment for Learning” framework. Popular in the UK and Australia, “Assessment for Learning” emphasizes ongoing review and adjustment based on each student’s unique needs. Teaching materials include everything from discussion cards and self-evaluation sheets to essential information for school IT departments.

This connection to the tenets of education is no accident — it’s a particularly brilliant choice on the part of Ivan and his team. As he says, “The resources align with the national curriculum and Positive Behavior for Learning, an initiative in Oceania which many schools are looking to roll out. League of Legends High School Clubs is one way of implementing these initiatives.”

Online changes, offline improvements

The exciting news is that the clubs have a real effect on kids — and not just on their online behavior.

“A year ago, we had this hypothesis that League of Legends could teach right from wrong,” he says. “A club led by a dedicated Teacher can definitely provide those opportunities. Not only have Teachers seen students adopting sportsmanlike characteristics, which has led to outcomes like effective communication and leadership, but some Teachers are now starting to see this transfer out of the League of Legends High School clubs and into the wider school curriculum.”

In addition to the 30 schools already participating in clubs, Ivan delivered a professional development session to 26 teachers in Perth last year. As of July 2017, he has spoken to 130 different teachers across Oceania, and he’s eager to meet with more.

In the future, he hopes to expand the program throughout Oceania, adding more schools, teachers, and students to the already-growing list of participants. Not only that, he hopes that the education departments in Australia and New Zealand will soon recognize the benefits of the program — and potentially change the way they teach online etiquette to kids.

Why early digital education is crucial

“This is the place to teach online behavior,” Ivan says of high school. “I’ve always seen the education sector as a critical evolution point for young people. As teens begin to explore and experiment with the online world, we must think about how we can best support them on this journey. Let’s not shift the responsibility onto someone else or hope that they will learn online skills themselves.”

He hopes that the success of the project will send a strong signal to the world — that it’s time we tackle the problem of toxic online behavior. “This whole notion of ‘We’re going to wrap kids up in cotton wool. We’re going to remove them from the internet,’ is not an effective solution,” he cautions.

“What we have to do is meet them on their chosen journey and be prepared to walk alongside them, side by side, step by step. As parents and teachers, we need to allow students to inevitably trip up or fall, and as they do we should be prepared and able to provide support and guidance. We should help them to make sense of what happened and why, and then encourage them to continue walking until they are skilled enough to walk on their own.”

It’s clear that the time for early education is now. The Pew Research Center’s latest study reports that 40% of Americans have experienced online harassment, while 62% consider harassment a major problem. As Ivan points out, these numbers highlight just how serious the problem is. The clubs are only the first step.

The future is now

“We, as adults, educators, and teachers have to be prepared to act,” Ivan says. “Our children and our students look to us to set expectations of what good behavior looks like, and if we can’t find the courage, time or dedication to step up and make a difference — what hope does the next generation have? Now is the time for change. The future we hope for won’t exist unless we do something about the now.”

He’s hopeful for the future. “This is a hot topic of conversation. I spoke to three teachers yesterday, and I’m speaking to two more today.”

He adds, “I believe in a broad and balanced education system which embraces diversity and new opportunities that enhance understanding and student learning. We spend time on the things we care about, and the same goes for today’s students, many of whom are already invested in a digital world.

“We need to meet students where they are, and the more the education sector supports what we’re doing, the more likely we can collectively make a difference.”

Find out more about sportsmanship and League of Legends High School Clubs on their site. Don’t forget to download their fantastic Teacher’s Resources here.

Interested in starting a club at your school? Find out how.

Questions for Ivan and his team? Get in touch at OCE-Highschool@riotgames.com.

Want more articles like this? Subscribe to our newsletter and never miss an update!
