Thinking of Building Your Own Chat Filter? Four Reasons You’re Wasting Your Time!


“When I think about being a game developer, it’s because we love creating this cool content and features. We want to make new maps, we want to make new heroes, we want to make animated shorts, that’s where our passion is.”

In September, the Overwatch team posted a new developer update video on YouTube. In it, game director Jeff Kaplan candidly discusses the challenges facing the development team as they add reporting features to the game and fix broken moderation features.

With so many resources focused on moderation, actual game development has been impacted:

“I wish we could take the time that we put into putting reporting [features] on console, and put that towards a match history system or a replay system instead. It was the exact same people that had to work on both who got re-routed to work on the other.”

That sucks.

Why use a chat filter in the first place?

Last December, Twitch introduced AutoMod, an automated language filtering platform that allows broadcasters to moderate their own channels in real time.

Ruth Toner, Data Scientist on the Community Success Team at Twitch, recently explained why the company chose to develop this feature:

“Chatting is a major step in our funnel towards creating engaged, paying users. And so, it’s really in Twitch’s best interests — and in the interest of most game dev companies and other social media companies — to make being social on our platform as pleasant and safe as possible.”

You’ve probably heard the Riot Games stat that says users who experience toxicity are three times more likely to leave a game and never return. At Two Hat, we found that smart moderation can potentially double user retention.

We’ve witnessed an industry-wide paradigm shift this year. Today, most gaming and social companies realize that if they want to shape healthy, engaged, and ultimately profitable communities, they have to employ some kind of chat filter and moderation software.

But that raises the question — should you build it yourself? Or use an outside vendor?

There are four strong arguments against building it yourself.

1. It’s not just a blacklist/whitelist

Traditionally, most filters use a binary blacklist/whitelist. The thing is, language isn’t binary. It’s complex and nuanced.

For instance, in many gaming communities with older audiences, some swear words are acceptable, depending on context. You could build a RegEx tool to string-match input text, and it would have no problem finding the f-word. But can it recognize the critical difference between “Go #$%^ yourself” and “That was #$%^ing awesome”?

What if your players spell a word incorrectly? What if they use l337 5p34k (and they will)? What if they deliberately try to manipulate the filter?
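To make the problem concrete, here’s a minimal sketch of that kind of string-matching filter in Python, using “frak” as a stand-in swear word (the patterns and messages are illustrative only):

```python
import re

# Naive blacklist filter: block a message if any banned pattern appears.
# "frak" is a stand-in swear word; entries are illustrative only.
BLACKLIST = [r"frak"]

def naive_filter(message: str) -> bool:
    """Return True if any blacklisted pattern appears in the message."""
    return any(re.search(pattern, message.lower()) for pattern in BLACKLIST)

print(naive_filter("Go frak yourself"))          # True:  hostile, correctly flagged
print(naive_filter("That was fraking awesome"))  # True:  friendly, wrongly flagged
print(naive_filter("Go fr4k y0urself"))          # False: one l33t swap slips through
```

The filter can’t tell hostility from enthusiasm, and a single character substitution defeats it entirely.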

It’s an endless arms race, and your users have way more time on their hands than you do.

Think about the hundreds of different variations of these phrases:

You should kill yourself / She deserves to die / He needs to drink bleach / etc
You are a [insert racial slur here]

Imagine the time and effort it would take to enter every single variation. Now add misspellings. Now add l337 mapping. Now add the latest slang. Now add the latest latest slang.

It never ends.
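For a sense of scale, here’s a rough back-of-envelope sketch; the substitution table below is made up and far smaller than real l33t-speak mappings:

```python
# How many l33t-speak spellings does one phrase have?
# Each character can be swapped for look-alikes (illustrative table).
LEET = {"a": "a4@", "e": "e3", "i": "i1!", "l": "l1", "o": "o0", "s": "s5$"}

def variant_count(phrase: str) -> int:
    """Count the spellings of a phrase under per-character substitutions."""
    count = 1
    for ch in phrase:
        count *= len(LEET.get(ch, ch))  # unmapped characters have one spelling
    return count

print(variant_count("kill yourself"))  # 192 spellings from this tiny table alone
```

And that’s before misspellings, spacing tricks, and whatever slang your players invent next week.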

Now, imagine using a filter that has access to billions of lines of chat across dozens of different platforms. By using a third-party filter, you’ll experience the network effect, detecting words and phrases you would likely never find on your own.

2. Users don’t only chat in English

What if your community uses other languages? Consider the work that you’ll have to put into building an English-only filter. Now, double, triple, quadruple that work when you add Spanish, Portuguese, French, German, etc.

Word-for-word translation might work for simple profanity, but as soon as you venture into colloquial expressions (“let’s bang,” “I’m going to pound you,” etc) it gets messy.

In fact, many languages have complicated grammar rules that make direct, word-for-word translation unworkable. Creating a chat filter in, say, Spanish would require the expertise of a native speaker with a deep understanding of the language. That means hiring or outsourcing multiple language experts to build an internal multi-language filter.

And anyone who has ever run a company knows — people are awesome but they’re awfully expensive.


How complex are other languages? German has four grammar cases and three genders. Finnish nouns decline in 15 grammatical cases. And the Japanese language uses three independent writing systems (hiragana, katakana, kanji), all three of which can be combined in a single sentence.

Tl;dr: every language is complex in its own way. Running your English filter through a direct translation tool like Google Translate won’t result in a clean, accurate chat filter. In fact, getting it wrong will likely alienate your community.

3. Effective moderation involves more than a chat filter

There is a lot more to moderation than just filtering abusive chat. Filtering — regardless of how strict or permissive your community may be — is only the first layer of defense against antisocial behavior.

You’ll also need:

- A way for users to report abusive behavior
- An algorithm that bubbles the worst reports to the top for faster review (see the sketch after this list)
- An automated process for escalating especially dangerous (and potentially illegal) content for your moderation team to review
- Workflows to accurately and progressively message, warn, mute, and sanction accounts, and (hopefully) correct user behavior
- A moderation tool with content queues for moderators to actually review UGC
- A live chat viewer
- An engine to generate business intelligence reports…
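Prioritizing reports alone is its own engineering problem. Here’s a minimal sketch of a report-triage queue in Python; the severity heuristic is a made-up placeholder, not how any real moderation system scores content:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Report:
    priority: float                            # only this field is compared
    reported_text: str = field(compare=False)
    report_count: int = field(compare=False)

def severity(text: str, report_count: int) -> float:
    # Placeholder heuristic: more reports means more urgent. A real system
    # would fold in user reputation, filter signals, and report history.
    return -float(report_count)  # negative so heapq pops the worst first

queue: list[Report] = []
for text, count in [("gg wp", 1), ("kill yourself", 7), ("u suck lol", 3)]:
    heapq.heappush(queue, Report(severity(text, count), text, count))

worst = heapq.heappop(queue)
print(worst.reported_text, worst.report_count)  # "kill yourself" 7
```

Even this toy version is one more piece of infrastructure someone on your team has to own and maintain.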

“Invest in tools so you can focus on building your game with the community.”

That’s Lance Priebe, co-creator of the massively popular kids’ virtual world Club Penguin, sharing one of the biggest lessons he learned as a developer.

Focus on what matters to you, and on what you and your team do best — developing and shipping kickass new game features.

4. It’s a pain in the butt to maintain

The more time and money you can put into your core product — improved game mechanics, new features, world expansions — the better.

Think of it this way. Would you build your own anti-virus software? Of course not. It would be outdated before launch. Researching, reviewing, and fighting the latest malware isn’t your job. Instead, you rely on the experts.

Now, imagine you’ve built your own chat filter and are hosting it locally. Every day, users find new ways around the filter, faster than you can keep up. That means every day you have to spend precious time updating a MySQL table with new expressions. And that means testing and finally deploying the update… and that means an increase in game downtime.
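To picture the grind, here’s a hedged sketch of that daily update, assuming a simple MySQL table of banned expressions (the schema, credentials, and expressions are all illustrative):

```python
# Daily chore for a self-hosted filter: push newly discovered evasions
# into the blacklist table. Assumes the mysql-connector-python package
# and a hypothetical banned_expressions(pattern) table.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost",
    database="chat_filter",  # hypothetical database
    user="moderator",
    password="...",          # placeholder credential
)
cursor = conn.cursor()

# Today's batch of freshly discovered filter evasions...
new_expressions = ["k1ll y0urs3lf", "kys", "k y s"]
cursor.executemany(
    "INSERT INTO banned_expressions (pattern) VALUES (%s)",
    [(expr,) for expr in new_expressions],
)
conn.commit()
conn.close()
# ...now re-test the filter, redeploy, and schedule the downtime.
# Repeat tomorrow.
```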

Build your own chat filter, they said. “It’ll be fun,” they said.

This all adds up to a significant loss of resources and time — your time, your team’s time, and your players’ time.

Engineering time is too valuable to waste

Is there an engineering team on the planet that has the time (not to mention resources) to maintain an internally-hosted solution?

Dev teams are already overtaxed with overflowing sprint cycles, impossible QA workloads, and resource-depleting deployment processes. Do you really want to maintain another internal tool?

If the answer is “no,” luckily there is a solution — instead of building it yourself, rely on the experts.

Think of it as anti-virus software for your online community.

Talk to the experts

Founded in 2012 by developer and Disney alum Chris Priebe, Two Hat Security empowers gaming and social platforms to build healthy, engaged online communities, all while protecting their brand and their users from high-risk content. 

Under contract to process 4 billion messages a day, our chat filter and moderation software Community Sift specializes in identifying and escalating manipulative, high-risk content for cost-effective and purposeful moderation.

We’ve helped communities across the globe grow, with chat filters in more than 20 of the most popular digital languages, all built and maintained by native speakers using in-house quality assurance tools.

Companies like Supercell, Roblox, Kabam, and more call our secure RESTful API to moderate text, usernames, and images in their online communities every day.
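For a rough idea of what that integration looks like, here’s a purely hypothetical sketch of calling a text-moderation REST API from Python; the endpoint, fields, and response shape are illustrative placeholders, not Community Sift’s actual API:

```python
import requests

# Hypothetical moderation call: the endpoint, headers, and payload are
# illustrative placeholders, not a real vendor API.
response = requests.post(
    "https://api.example.com/v1/moderate/text",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    json={"text": "that was fraking awesome", "language": "en"},
    timeout=2,  # chat moderation has to keep up with real-time play
)
print(response.json())  # e.g. a risk score and a filtered message
```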

Get in touch today to find out how Community Sift can take the pressure off your development team — and foster an engaged community in the process.

