Free Webinar: Six Essential Pillars of a Healthy Online Community

Updated April 17th, 2018

Watch the recording!


Don’t miss the next webinar about building healthy online communities! Sign up for our newsletter and never miss an update!


Does your online game, virtual world, or app include social features like chat, usernames, or user-generated images? Are you struggling with abusive content, lack of user engagement, or skyrocketing moderation costs?

Building an engaging, healthy, and profitable community in your product is challenging — that’s why we formulated the Six Essential Pillars of a Healthy Online Community!

In this exclusive talk, industry experts share their six techniques for creating a thriving, engaged, and loyal community in your social product.

Join us on Wednesday, April 4th at 10:00 AM PDT/1:00 PM EDT!

In this free, one-hour webinar, you’ll learn how to:

  • Protect your brand using “safety by design”
  • Increase user trust with consistent messaging
  • Reduce moderator workload & empower meaningful work
  • Improve community health using real-time feedback

Save your seat today for this ultimate guide to community health and engagement!

Two Hat Security is Ready to Rocket for the Third Year in a Row

We are excited to announce that Two Hat Security has been featured on the ICT (Information and Communications Technology) Ready to Rocket List for the third year in a row!

Rocket Builders’ sixteenth annual list recognizes BC tech companies that are “best positioned to capitalize on the technology sector trends that will lead them to faster growth than their peers… [the lists] provide accurate predictions of private companies that will likely experience significant growth, venture capital investment or acquisition by a major player in the coming year.”

Read the full press release.

About Ready to Rocket

Ready to Rocket is a unique business recognition list that profiles B.C. technology companies with the greatest potential for revenue growth. Since 2003, the Ready to Rocket list has consistently predicted the revenue growth leaders and the companies most likely to attract investment.

About Rocket Builders

Rocket Builders is a management consulting firm providing sales and marketing services. With a focus on helping technology companies to grow and prosper, Rocket Builders has a proven track record of success with its clients. Since 2000, they have been engaged in market research, market planning, business development initiatives, strategic selling, and product launches for over 350 organizations.

About Two Hat Security

At Two Hat Security, we empower social and gaming platforms to build healthy, engaged online communities, all while protecting their brand from high-risk content. Our flagship software Community Sift is a powerful content filter and automated moderation tool that detects high-risk content in text, usernames, and images. Used by online games, social networks, virtual worlds, forums, messaging apps and more, Community Sift leverages an artificial intelligence model that helps social products increase user engagement, decrease moderation costs and workload, and protect communities from abuse.


Want more articles like this? Subscribe to our newsletter and never miss a blog!


Introducing The Fair Play Alliance

Today, we are thrilled to announce our involvement with the Fair Play Alliance (FPA), a cross-industry initiative spanning over 30 gaming companies whose mission is to foster fair play in online games, raise awareness of player-behaviour-related issues, and share research and best practices that drive lasting change. As a founding member of the Alliance, we are eager to collaborate with a wide range of industry experts to foster and empower healthy online communities.

Check out the official press release below for more information about the coalition.

SAN FRANCISCO, CA – Representatives of over 30 different gaming companies will meet during the 2018 Game Developers Conference (GDC) in San Francisco to discuss best practices in cultivating online gaming experiences free of harassment or abuse.

The Fair Play Alliance (FPA) is a coalition for developers that supports open collaboration, research, and best practices for encouraging healthy gaming communities and fair play. Key objectives include collaboration on initiatives aimed at improving online behavior in games and creating an atmosphere free of abuse and discrimination.

The Fair Play Summit, which takes place on Wednesday, March 21, will feature experts who have been working to understand and address disruptive behaviour in games, speaking on the state of the industry, what developers need to know, and practical methods to create constructive avenues for fair play and collaboration online.

Want to attend? Media and expo pass holders can see the keynote in Room 3020, West Hall from 9:30 to 10:30 am, and all following sessions in Room 306, South Hall from 11 am to 6 pm.

Press attendees
Attendance is free to published members of the press – please contact for further information.

For more information on the event, the Fair Play Alliance, or for interview requests:

Fair Play Alliance membership

  • Blizzard Entertainment, Inc.
  • CCP Games
  • Corillian
  • Discord Inc.
  • Epic Games, Inc.
  • Flaregames
  • Huuuge Games
  • Kabam
  • Ker-Chunk Games
  • Mixer
  • Owlchemy Labs
  • Playrix
  • Radial Games
  • Riot Games
  • Roblox Corporation
  • Rovio Entertainment Corp.
  • Space Ape Games
  • Spirit AI, Ltd.
  • Supercell
  • Two Hat
  • Twitch Interactive
  • Unity Technologies
  • Xbox
  • + additional silent partners



#ICANHELP Light the Fire: An Interview With Musician Lisa Heller

Singer, songwriter, and actress Lisa Heller is using her voice to change the world — online and offline.

Only 21 years old, Lisa is already making waves. She’s released three successful singles, was named Ambassador for the first annual #Digital4Good event held at Twitter HQ this summer, and is poised to release a new video “Light the Fire” to support the non-profit #ICANHELP in their newest campaign to promote positivity on social media.

As a teenager, Lisa suffered from anxiety and low self-esteem. She turned to songwriting for strength.

“I found my purpose by writing music,” she says. “I’m able to inspire other people. Now I know why I’m experiencing anxiety — it’s for a reason. So that I can inspire others, so they know there is someone else out there who is going through the same thing. That’s why I think this happened to me — so that I can be strong for other people, and talk about my experiences.”

Lisa took time out of her busy schedule (in addition to writing, performing, and recording music, she’s in her senior year at university) to talk to us about #ICANHELP and “Light the Fire.”

How did you get involved with #ICANHELP?

There’s a guy named Charlie Peake, who’s from Simsbury, Connecticut, where I’m from, and who went to Colgate University, where I go. He read an article about me and reached out to me about SCORE, a free nationwide mentorship program. He helped connect me with David Ryan Polgar, who is a board member for #ICANHELP, who connected me with Matt Soeth, co-founder of #ICANHELP. Everything clicked from there.

How did you become Ambassador for the #Digital4Good event at Twitter HQ?

Matt and I started talking about how it would be cool if I had a song that would fit well with the next online #ICANHELP campaign. I mentioned that I had this song “Light the Fire” that has a similar message as #ICANHELP. He asked me to talk at the #Digital4Good event in San Francisco and said we could also film a video for the song while I was there to tie it into #ICANHELP.

The students in the video are all worldwide winners who were nominated by their peers for the #Digital4Good program.


What was your experience at #Digital4Good?

It was pretty cool. The first half of the day students talked about their experiences having a positive impact online, and shared their accomplishments in middle school and high school.

I spoke about my “Hope” video and I got to announce that I was doing “Light the Fire” with the #ICANHELP campaign.

I stuck around for the rest of the day where we had all of these different panels, and I was able to talk with the students and people in tech. We all met in small groups and had some pretty cool conversations.

You’re active and popular on social media, with a combined 26k followers on Instagram and Facebook. Have you experienced harassment and abuse online?

The reason I started doing music was because I felt a sense of alienation growing up before I even posted anything online. Part of the reason I wanted to make music was to be a strong person who could stand up to people and show that it’s okay to be criticized because you can be the stronger, bigger person.

When I started posting videos, it was hard. Sometimes I would get negative feedback. When I released my “Hope” video, which is about kids with terminal illnesses, some people made some really horrible comments, like they wished these kids would pass away. But the amazing thing is that my fans would all stand up to those people, and it created this online community where the people who had made negative comments ended up being won over and even apologizing. They ended up supporting the song and the video.

As hard as it was at the beginning to read those negative comments, I’ve also become a stronger person. I’ve learned that often people make comments because they might be insecure themselves. And I’ve learned to handle it because I want to be that strong person for other people who might be experiencing harassment every day.

Let’s talk about “Light The Fire.” What is the message of the song?

I started writing “Light the Fire” a year and a half ago. The point of the song was to inspire others to get involved with whatever cause they believe in and to stand up for their beliefs.

The line “If we light the fire, build our own heat, we’ll brothers and sisters march the streets,” is about peacefully standing up for what you believe in, in order to spread a positive message. The line “Strike one match, that’s all we need,” expresses the idea that only one person needs to stand up, but if you work hard enough then other people will join in with that positive message, and you can spread it and build something really great. “Light the Fire” is about spreading positive messages and having a positive impact online and offline.


Formed by educators Matt Soeth and Kim Karr, #ICANHELP is a non-profit organization that educates and empowers students to use social media positively. To date, #ICANHELP has worked with students to take down over 800 pages dealing with harassment, impersonation, bullying, and more. They work closely with schools in training students on how to respond to cyber issues.

#ICANHELP on “Light the Fire”

#ICANHELP is excited to partner with Lisa Heller on “Light the Fire.” Lisa collaborated with #ICANHELP and our students to plan, record, and edit this video. #Digital4Good is about students inspiring students and “Light the Fire” is a perfect way to promote that message. Together we are powerful, together we can make a difference.

About Lisa Heller

21-year-old alternative pop singer Lisa Heller started writing music when she was only fourteen. Songs helped her work through her tough adolescent years battling anxiety and low self-esteem.

Her debut single “Hope” set the foundation for her career when the music video reached over 1.6 million views on YouTube and trended to #1 in 5 countries (top 10 in 7). Heller’s original songs share inspirational messages, such as her commentary on college hookup culture in “Midnight” and her experiences in overcoming anxiety in the old-school “Things You Never Said”. The Edit, a clothing line by Seventeen Magazine, has featured Heller as an influencer on social media.



Thinking of Building Your Own Chat Filter? Five Reasons You’re Wasting Your Time!

If you’re building an online community, whether a game or social network, flagging and dealing with abusive language and users is critical to success. Back in 2014, a Riot Games study suggested that users who experience abuse their first time in the game are potentially three times more likely to quit and never return.

“Chatting is a major step in our funnel towards creating engaged, paying users. And so, it’s really in Twitch’s best interests — and in the interest of most game dev companies and other social media companies — to make being social on our platform as pleasant and safe as possible.” – Ruth Toner, Twitch

At Two Hat, we found that smart moderation can potentially double user retention. And we’re starting to experience an industry-wide paradigm shift. Today, gaming and social companies realize that if they want to shape healthy, engaged, and ultimately profitable communities, they must employ some kind of chat filter and moderation software.

But that raises the question — should you build it yourself or use an outside vendor? Like anti-virus software, a chat filter is better left to a team dedicated, day in and day out, to keeping it up to date.

Here are a few things to consider before investing a great deal of time and expense in an in-house chat filter.

1. A blacklist/whitelist doesn’t work because language isn’t binary

Traditionally, most filters use a binary blacklist/whitelist. The thing is, language isn’t binary. It’s complex and nuanced.

For instance, in many older gaming communities, some swear words will be acceptable, based on context. You could build a RegEx tool to string match input text, and it would have no problem finding an f-bomb. But can it recognize the critical difference between “Go #$%^ yourself” and “That was #$%^ing awesome”?
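To make the problem concrete, here’s a toy Python sketch of a binary blacklist (the banned term is a neutral placeholder, not from any real filter). It gives the abusive message and the harmless one the exact same verdict — and a trivial misspelling slips straight through:

```python
import re

# Illustrative only: a naive substring blacklist with one banned term.
BLACKLIST = re.compile(r"badword", re.IGNORECASE)

def is_flagged(message: str) -> bool:
    """Binary verdict: flagged or not. No notion of context."""
    return bool(BLACKLIST.search(message))

print(is_flagged("Go badword yourself"))          # True  (abusive)
print(is_flagged("That was badwording awesome"))  # True  (harmless, same verdict)
print(is_flagged("b4dw0rd you"))                  # False (trivially evaded)
```

The filter can tell you a string occurred; it cannot tell you what the sentence meant.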

What if your players spell a word incorrectly? What if they use l337 5p34k (and they will)? What if they deliberately try to manipulate the filter?

It’s an endless arms race, and your users have way more time on their hands than you do.

Think about the hundreds of different variations of these phrases:

  • You should kill yourself / She deserves to die / He needs to drink bleach / etc.
  • You are a [insert racial slur here]

Imagine the time and effort it would take to enter every single variation. Now add misspellings. Now add l337 mapping. Now add the latest slang. Now add the latest latest slang.

It never ends.
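A quick back-of-the-envelope sketch shows why. Even a tiny, incomplete leetspeak substitution map (this one is invented for illustration; real obfuscation is far richer) multiplies every phrase into dozens of spellings:

```python
from itertools import product

# A deliberately small leetspeak map; real evasion uses far more tricks.
LEET = {"a": "a@4", "e": "e3", "i": "i1!", "o": "o0", "s": "s5$"}

def leet_variants(word: str) -> set:
    """Generate every spelling of a word under the substitution map."""
    pools = [LEET.get(ch, ch) for ch in word.lower()]
    return {"".join(combo) for combo in product(*pools)}

print(len(leet_variants("bleach")))          # 6
print(len(leet_variants("you should die")))  # 72
```

And that’s with only five mapped letters — add spacing tricks, punctuation, homoglyphs, and slang, and the variant space becomes effectively unbounded.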

Now, imagine using a filter that has access to billions of lines of chat across dozens of different platforms. By using a third-party filter, you’ll benefit from the network effect, detecting words and phrases you would likely never find on your own.

2. Keep your team focused on building an awesome product — not chasing a few bad actors around the block

“When I think about being a game developer, it’s because we love creating this cool content and features. I wish we could take the time that we put into putting reporting [features] on console, and put that towards a match history system or a replay system instead. It was the exact same people who had to work on both who got re-routed to work on the other.” – Jeff Kaplan, Blizzard Entertainment

Like anything else built in-house, someone has to maintain the filter as well as identify and resolve specific incidents. If your plan is to scale your community, maintaining your own filter will quickly become unmanageable. The dev and engineering teams will end up spending more time keeping the community safe than actually building the community and features.

Compare that with simply tapping into the RESTful API of a service provider that reliably uses AI and human review to keep abusive language definitions current and quickly processes billions of reports per day. Imagine letting community managers identify and effectively deal with the few bad actors while the rest of your team relentlessly improves the community itself.
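The integration surface on your side can be tiny. Here’s a hedged Python sketch — the endpoint URL, auth scheme, and payload shape are all invented for illustration; a real vendor’s API docs will define their own:

```python
import json
import urllib.request

# Hypothetical endpoint and credentials; consult your vendor's API
# documentation for real URLs, auth schemes, and response fields.
API_URL = "https://api.example-moderation.com/v1/moderate"
API_KEY = "your-api-key"

def build_moderation_request(text: str) -> urllib.request.Request:
    """Build a POST request sending one chat message for moderation."""
    payload = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + API_KEY,
        },
    )

# Sending it is one call away (not executed here):
#   with urllib.request.urlopen(build_moderation_request("gg wp")) as resp:
#       verdict = json.load(resp)
```

A few lines of glue code versus years of filter maintenance is the trade-off in a nutshell.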

3. Moderation without triage means drowning in user reports

There is a lot more to moderation than just filtering abusive chat. Filtering — regardless of how strict or permissive your community may be — is only the first layer of defense against antisocial behavior.

You’ll also need:

  • a way for users to report abusive behavior
  • an algorithm that bubbles the worst reports to the top for faster review
  • an automated process for escalating especially dangerous (and potentially illegal) content for your moderation team to review
  • various workflows to accurately and progressively message, warn, mute, and sanction accounts and (hopefully) correct user behavior
  • a moderation tool with content queues for moderators to actually review UGC
  • a live chat viewer
  • an engine to generate business intelligence reports…
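Even the “bubble the worst reports to the top” piece alone is a small system. As a rough sketch (the severity scores and report fields here are invented for illustration), a priority queue pops the most urgent report first regardless of arrival order:

```python
import heapq

# Invented severity scale: higher = more urgent (e.g. threats > spam).
reports = [
    {"id": 101, "severity": 2, "reason": "spam"},
    {"id": 102, "severity": 9, "reason": "threat of self-harm"},
    {"id": 103, "severity": 5, "reason": "harassment"},
]

# heapq is a min-heap, so push negated severity to pop worst-first.
queue = []
for r in reports:
    heapq.heappush(queue, (-r["severity"], r["id"], r))

worst = heapq.heappop(queue)[2]
print(worst["reason"])  # threat of self-harm
```

Scoring the reports in the first place — weighing reporter reliability, content category, and user history — is where the real work (and the real expertise) lives.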

“Invest in tools so you can focus on building your game with the community.”

That’s Lance Priebe, co-creator of the massively popular kids’ virtual world Club Penguin, sharing one of the biggest lessons he learned as a developer.

Focus on what matters to you, and on what you and your team do best — developing and shipping kickass new game features.

4. It’s obsolete before it ships

The more time and money you can put into your core product — improved game mechanics, new features, world expansions — the better.

Think of it this way. Would you build your own anti-virus software? Of course not. It would be outdated before launch. Researching, reviewing, and fighting the latest malware isn’t your job. Instead, you rely on the experts.

Now, imagine you’ve built your own chat filter and are hosting it locally. Every day, users find new ways around the filter, faster than you can keep up. That means every day you have to spend precious time updating the repository with new expressions. And that means testing and finally deploying the update… and that means an increase in game downtime.

Build your own chat filter, they said. “It’ll be fun,” they said.

This all adds up to a significant loss of resources and time — your time, your team’s time, and your players’ time.

5. Users don’t only chat in English

What if your community uses other languages? Consider the work that you’ll have to put into building an English-only filter. Now, double, triple, quadruple that work when you add Spanish, Portuguese, French, German, etc.

Word-for-word translation might work for simple profanity, but as soon as you venture into colloquial expressions (“let’s bang,” “I’m going to pound you,” etc) it gets messy.

In fact, many languages have complicated grammar rules that make direct translation impossible. Creating a chat filter in, say, Spanish, would require the expertise of a native speaker with a deep understanding of the language. That means hiring or outsourcing multiple language experts to build an internal multi-language filter.

And anyone who has ever run a company knows — people are awesome but they’re awfully expensive.


How complex are other languages? German has four grammar cases and three genders. Finnish uses 15 noun cases in the singular and 16 in the plural. And the Japanese language uses three independent writing systems (hiragana, katakana, kanji), all three of which can be combined in a single sentence.

Tl;dr, because grammar: Every language is complex in its own way. Running your English filter through a direct translation tool like Google Translate won’t result in a clean, accurate chat filter. In fact, it will likely alienate your community if you get it wrong.

Engineering time is too valuable to waste

Is there an engineering team on the planet that has the time (not to mention resources) to maintain an internally-hosted solution?

Dev teams are already overtaxed with overflowing sprint cycles, impossible QA workloads, and resource-depleting deployment processes. Do you really want to maintain another internal tool?

If the answer is “no,” luckily there is a solution — instead of building it yourself, rely on the experts.

Think of it as anti-virus software for your online community.

Talk to the experts

Consider Community Sift by Two Hat Security for your community’s chat filter. Specializing in identification and triage of high-risk and illegal content, we are under contract to process 4 billion messages every day. Since 2012 we have been empowering gaming and social platforms to build healthy, engaged communities by providing cost-effective, purposeful automated moderation.

You’ll be in good company with some of the largest online communities from Supercell, Roblox, Kabam, and many more. Simply call our secure RESTful API to moderate text, usernames, and images in over 20 of the most popular IRL and digital languages, all built and maintained by our on-site team of real live native speakers.

