Nice Needs No Filter: Celebrating Pink Shirt Day in Style!

On February 28, 2018, we encourage everyone to practice kindness and wear pink to symbolize that you do not tolerate bullying.

– Pink Shirt Day website

Every year, communities across the world celebrate Pink Shirt Day on February 28th. In 2007, a group of teens in Nova Scotia organized a high school protest in support of a new student who had been bullied for wearing pink. Inspired by their compassion, Pink Shirt Day raises awareness and money for anti-bullying charities.

Founded by CKNW Kids’ Fund, working with partners Boys & Girls Clubs and CKNW 980, the annual event gets bigger every year. In 2017 alone, funds raised through merchandise and community fundraising efforts supported programs that impacted more than 59,300 youth and children.

Nice Needs No Filter

This year’s theme, Nice Needs No Filter, is especially close to our hearts at Two Hat.

Founder and CEO Chris Priebe formed Two Hat Security and built our chat filter Community Sift with one big goal: to remove bullying from the internet. We believe that everyone should have the power to share without fear of harassment or abuse.

Movements like Pink Shirt Day and organizations like the CKNW Kids’ Fund shed a light on one of the biggest issues facing society today. We are so proud to stand alongside them as they fight for a better, more inclusive internet.

Find out how you can get involved.

Coming together as a community

Pink Shirt Day is a great excuse to hold a bake sale… (as if we needed an excuse to bring treats to work).

We started the day by visiting our friends at Burke Hair Lounge and getting free pink highlights! 

Then, we picked up cheese biscuits from the amazing Crystal at The Little Hobo Soup and Sandwich Shop.

At noon, it was bake sale time! It was chilly, but the sun came out (briefly) and, thanks to advertising help from Accelerate Okanagan and Kelowna Capital News, the turnout was impressive.


We’re excited to announce that we raised over $1,000 for anti-bullying charities across Western BC! Thank you to everyone who came out and supported this amazing cause. We’ll see you next year!

Oh, and one final message on Pink Shirt Day: Remember, nice needs no filter. Be kind online — and let’s fight for a better internet, together. 

Does Your Online Community Need a Chat Filter?

“Chatting is a major step in our funnel towards creating engaged, paying users. And so, it’s really in Twitch’s best interests — and in the interest of most game dev companies and other social media companies — to make being social on our platform as pleasant and safe as possible.”

– Ruth Toner, Data Scientist at Twitch, UX Game Summit 2017

You’ve probably noticed a lot of talk in the industry these days about chat filters and proactive moderation. Heck, we talk about filters and moderation techniques all the time.

We’ve previously examined whether you should buy moderation software or build it yourself and shared five of the best moderation workflows we’ve found to increase productivity.

Today, let’s take a step back and uncover why filtering matters in the first place.

Filtering: First things first

Using a chat, username, or image filter is an essential technique for any product with social features. Ever find yourself buried under a mountain of user reports, unable to dig yourself out? That’s what happens when you don’t use a filter.

Every game, virtual world, social app, or forum has its own set of community guidelines. Guidelines and standards are crucial in setting the tone of an online community, but they can’t guarantee good behavior.

Forget about Draconian measures. Gone are the days of the inflexible blacklist/whitelist, where every word is marked as either good or bad with no room for the nuances and complexity of language. Contextual filters that are designed around the concept of varying risk levels let you make choices based on your community’s unique needs.

Plenty of online games contain violence and allow players to talk about killing each other, and that’s fine — in context. And some online communities may allow users to engage in highly sexual conversations in one-on-one chat.

For many communities, their biggest concern is hate speech. Users can swear, sext, and taunt each other to their heart’s content, but racial and religious epithets are forbidden. With a filter that can distinguish between vulgarity and hate speech, you can grant your users the expressivity they deserve while still protecting them from abuse.

With a smart filter that leverages an AI/human hybrid approach, you can give those different audiences the freedom to express themselves.
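To make the risk-level idea concrete, here is a minimal sketch in Python. The categories, lexicon entries, and thresholds are all hypothetical placeholders (a real contextual filter analyzes phrases in context, not single words), but the structure shows how one policy can allow vulgarity while still blocking hate speech:

```python
# A minimal sketch of a risk-level chat filter.
# All categories, lexicon entries, and thresholds are hypothetical.
from dataclasses import dataclass
from enum import IntEnum

class Risk(IntEnum):
    LOW = 1      # mild vulgarity, trash talk, in-game "kill" chatter
    MEDIUM = 2   # sexual content, graphic violence
    HIGH = 3     # hate speech, threats

# Toy lexicon: real systems use context, not single-word lookups.
LEXICON = {
    "noob": Risk.LOW,
    "kill": Risk.LOW,        # violence is fine in a battle-game context
    "sext_term": Risk.MEDIUM,
    "slur_term": Risk.HIGH,  # placeholder for a hate-speech term
}

@dataclass
class Policy:
    """Per-community setting: allow anything at or below this risk level."""
    max_allowed: Risk

def filter_message(message: str, policy: Policy) -> bool:
    """Return True if the message passes the community's policy."""
    risk = max((LEXICON.get(word, Risk.LOW) for word in message.lower().split()),
               default=Risk.LOW)
    return risk <= policy.max_allowed

# A mature battle-game community allows swearing but forbids hate speech:
battle_game = Policy(max_allowed=Risk.MEDIUM)
print(filter_message("gg you noob", battle_game))  # True: trash talk is allowed
print(filter_message("slur_term", battle_game))    # False: hate speech is blocked
```

The key design point is that nothing is globally "good" or "bad" the way a blacklist/whitelist assumes; each community chooses its own `max_allowed` threshold.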

Key takeaway #1: A content filter and an automated moderation system are business-critical for any product with user-generated content and user interactions.

Social is king

“A user who experiences toxicity is 320% more likely to quit.” – Jeffrey Lin, Riot Games

Back in 2014, a Riot Games study found a correlation between abusive player behavior and user churn. In fact, the study suggested that users who experience abuse their first time in the game are potentially three times more likely to quit and never return.

In 2016 we conducted a study with an anonymous battle game, and interestingly, found that users who engaged in chat were three times more likely to keep playing after the first day.

While there is still a lot of work to be done in this field, these two preliminary studies suggest that social matters. Further studies may show slightly different numbers, but it’s well understood that negative behavior leads to churn in online communities.

When users chat, they form connections. And when they form connections, they return to the game.

But the flipside of that is also true: When users chat and their experience is negative, they leave. We all know it’s far more expensive to acquire a new user than to keep an existing one — so it’s critical that gaming and social platforms do everything in their power to retain new users. The first step? Ensure that their social experience is free from abuse and harassment.


Key takeaway #2: The benefits of chatting can be canceled out by user churn driven by exposure to high-risk content.


Reduce moderation workload and increase user retention

Proactive filtering and smart moderation don’t just protect your brand and your community from abusive content. They also have a major impact on your bottom line.

Habbo is a virtual hotel where users from around the world can design rooms, roleplay in organizations, and even open their own trade shops and cafes. When the company was ready to scale up and expand into other languages, they realized that they needed a smarter way to filter content without sacrificing users’ ability to express themselves.

By utilizing a more contextual filter than their previous blacklist/whitelist, Habbo managed to reduce their moderation workload by a whopping 70%. Without reportable content, there just isn’t much for users to report.

Friendbase is a virtual world where teens can chat, create, and play as friendly avatars. Like Habbo, the company launched its chat features with simple blacklist/whitelist filter technology. However, chat on the platform was quickly filled with sexism, racism, and bullying behavior. For Friendbase, this behavior led to high user churn.

By leveraging a smarter, more contextual chat filter, they were able to manage their users’ first experiences and create a healthier, more positive environment. Within six months of implementing new filtering and moderation technology, user retention by day 30 had doubled. And just like Habbo, user reports decreased significantly. That means fewer moderators are needed, and the work they do is far more meaningful.

Key takeaway #3: Build a business case and invest in solid moderation software to ensure your online community reaps the benefits of healthy interactions.

Does your online community need a chat filter?

[postcallout title=”Thinking of Building Your Own Chat Filter?” body=”If you’re thinking about building your own chat filter, here are 5 important things to consider.” buttontext=”Read More” buttonlink=””]

Ultimately, you will decide what’s best for your community. But the answer is almost always a resounding “yes.”

Many products with social features launch without a filter (or with a rudimentary blacklist/whitelist) and find that the community stays healthy and positive — until it scales. More users means more moderation, more reports, and more potential for abuse.

Luckily, you can prevent those growing pains by launching with a smart, efficient moderation solution that blends AI and human review to ensure that your community is protected from abuse and that users are leveraging your platform for what it was intended — connection, interaction, and engagement.
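One way to picture that AI-plus-human blend is as a routing step: high-confidence decisions are automated, and only the ambiguous middle band reaches a moderator. The thresholds and scoring model below are hypothetical illustrations, not a description of any particular product:

```python
# A minimal sketch of AI/human hybrid moderation routing.
# The thresholds and the scoring model are hypothetical.

def route(abuse_score: float) -> str:
    """Route a chat message by a classifier's abuse score in [0, 1].

    High-confidence decisions are automated in real time; only the
    uncertain middle band is escalated, which keeps the human
    moderation queue small as the community scales.
    """
    if abuse_score >= 0.95:
        return "block"         # confidently abusive: filter proactively
    if abuse_score <= 0.20:
        return "allow"         # confidently benign: deliver immediately
    return "human_review"      # uncertain: queue for a moderator

# Example: three messages scored by a (hypothetical) model
for score in (0.02, 0.55, 0.99):
    print(score, "->", route(score))
```

Because most messages fall into the confident bands, the human queue stays small even as chat volume grows, which is exactly the scaling problem described above.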



Want more articles like this? Subscribe to our newsletter and never miss a blog!


Download the High School League of Legends Teaching Resources

Last month, we announced our partnership with the High School League of Legends Clubs.

This month, we are thrilled to announce that we’ve released our first official teaching resources!

Online Risk and Digital Citizenship: Learning About Risk With High School League of Legends Clubs is an overview and endorsement of the clubs, focusing on the great strides Riot Games has made teaching students about online etiquette and sportsmanship.

Teaching A Team-Oriented Mindset & Resilience is a lesson plan for teachers leading clubs. It includes teaching objectives, a series of student activities, and discussion questions to bring up before, during, and after a match.

You can also download both resources on the High School League of Legends site.

We had a great time collaborating with the League of Legends team in Oceania on these resources. Teachers, we hope you find them invaluable in your classes, as you lead students along this journey. Students, we hope you learn about sportsmanship and digital citizenship — and have a lot of fun along the way.