Webinar: Image Moderation 101

Wondering about the latest industry trends in image moderation? Need to keep offensive and unwanted images out of your community — but no idea where to start?

Join us for 20 minutes on Wednesday, August 22 for an intimate chat with Carlos Figueiredo, Two Hat Director of Community Trust & Safety.

Register Now
In this 20-minute chat, we’ll cover:

  • Why image moderation is business-critical for sharing sites in 2018
  • An exclusive look at our industry-proven best practices
  • A sneak peek at the future of image moderation… will there be robots?

Sign up today to save your seat!

Community Manager Academy: Resources for Growing Healthy Online Communities

As an online Community Manager, you’re responsible for a seemingly endless list of tasks and projects. From managing a team of moderators to reporting on community engagement metrics, your responsibilities never end.

You don’t have time to seek out the latest trends in community management – you’re too busy compiling your weekly “time to resolution for tickets” report!

We want community managers and moderators to thrive in their jobs – after all, protecting users from abusive content and fostering healthy communities is our passion. We’re always looking for the best way to share tips and tricks, best practices, and walkthroughs.

To save you time – and keep you up to date on the latest news in the business – we’ve created Community Manager Academy, our version of school (minus exams, grades, and deadlines, so you know… fun school).

Our Community Manager resource center consists of on-demand webinars and downloadable content that can be accessed anytime, anywhere. You’ll find UGC moderation best practices, community health checklists, COPPA compliance guides, and more.

Take a minute to check out the page, and let us know what you think. What do you like? What do you not like? What topics would you like us to cover in the future? It’s your page — we would love to hear from you!

Social Media Slang Every Community Manager Should Know in 2018

We all know how quickly news travels online. But what about new slang? Just like news stories, words and phrases can go viral in the blink of an eye (or the post of a Tweet, if you will).

No one is more aware of the ever-evolving language of social media than online community managers. Moderators and community managers who review user-generated chat, comments, and usernames every day have to stay in the loop when it comes to new online slang.

Here are eight new words that our language and culture experts identified this month:

hundo p

With 100% certainty; absolutely. “This coffee is hundo p giving me life.”

trill

A combination of “true” and “real”. “To keep it trill, I need a break from reviewing usernames. I can’t look at another variation of #1ShawnMendesFan.”

otp

One True Pairing; the perfect couple you ship in fanfiction. “Link and Zelda are always and forever the otp. Don’t @ me.”

distractivated

Distracted in a way that motivates/inspires. “I was so distractivated today looking at Twitter for new slang, I mentally rearranged my entire apartment.”

JOMO

Joy of Missing Out; the opposite of FOMO. “I missed the catered lunch and Fortnite battle yesterday, but it’s okay because I was JOMOing in the park.”

ngl; tache

Not gonna lie; mustache. “I’m ngl, that new moderator who just started today has a serious Magnum PI tache going on.”

sus

Suspect. “These cat pics are pretty sus; no way does that cat have anime-size eyes.”

What’s an effective community management strategy to ensure that new phrases are added regularly? We recommend using a content moderation tool that automatically identifies trending terms and can be updated in real time.
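
If you’re wondering what “automatically identifies trending terms” can look like under the hood, here’s a minimal, hypothetical Python sketch: compare this week’s token frequencies against a historical baseline and flag anything that spikes. The thresholds and sample data are made up, and flagged terms would still go to a human language expert before being added to any filter.

```python
from collections import Counter

# Toy "trending term" detector: flag tokens whose frequency this week jumps
# sharply relative to a historical baseline. Thresholds are illustrative.
def trending_terms(this_week, baseline, min_count=5, ratio=3.0):
    now = Counter(token.lower() for token in this_week)
    before = Counter(token.lower() for token in baseline)
    flagged = []
    for term, count in now.items():
        if count >= min_count and count / (before[term] + 1) >= ratio:
            flagged.append(term)
    return flagged  # candidates for a human language expert to review

# Usage: feed in tokenized chat logs; review flagged terms before updating the filter.
print(trending_terms(["jomo"] * 12 + ["hello"] * 50, ["hello"] * 60 + ["jomo"]))
# -> ['jomo']
```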

Not sure how to choose the right solution for your community? Check out “What is the difference between a profanity filter and a content moderation tool?”

In the meantime, happy moderating (and try not to get too distractivated).

Affected by the Smyte Closure? Two Hat Security Protects Communities From Abusive Comments and Hate Speech

Statement from CEO and founder Chris Priebe: 

As many of you know, Smyte was recently acquired by Twitter and its services are no longer available, affecting many companies in the industry.

As CEO and founder of Two Hat Security, creators of the chat filter and content moderation solution Community Sift, I would like to assure both our valued customers and the industry at large that we are, and will always remain, committed to user protection and safety. For six years we have worked with many of the largest gaming and social platforms in the world to protect their communities from abuse, harassment, and hate speech.

We will continue to serve our existing clients and welcome the opportunity to work with anyone affected by this unfortunate situation. Our mandate is and will always be to protect the users on behalf of all sites. We are committed to uninterrupted service to those who rely on us.

If you’re in need of a filter to protect your community, we can be reached at hello@twohat.com.

Online Moderators: Ten Simple Steps to Decrease Your Stress

As a community manager or content moderator, you experience the dark side of the internet every day. Whether you are reviewing chat, social media, forum comments, or images, high-risk content can be draining — and you may not even realize the damage it’s doing.

Studies show that community teams on the front lines of chat, image, or video moderation are especially vulnerable to stress-related symptoms including depression, insomnia, vicarious trauma (also known as “compassion fatigue”), and even PTSD. It’s critical that you have the right tools and techniques at your disposal to support your mental health.

Therapist and wellness trainer Carol Brusca recently hosted a “Stress, Wellness, and Resilience” training session for Two Hat’s clients and partners. Here are her top 10 wellness tips for online moderators and community managers:

1. Talk to someone.
Having and using social supports is the number one indicator of resilience. Asking for help from someone who cares about you is a wonderful way to get through a difficult time.

Does your company health plan provide access to a mental health professional? Take advantage of it. There’s no shame in talking to a therapist. Sometimes, talking to a stranger can be even more effective than confiding in a loved one.

2. Learn to say no.
If we do not set boundaries with others, we can find ourselves feeling stressed out and overwhelmed. If you notice this might be a habit for you, try saying “no” once a day and see if you begin to feel better.

Of course, saying “no” at work isn’t always an option. But if you’re spending too much time reviewing high-risk content, talk to your manager. Ask if you can vary your tasks; instead of spending all of your workday reviewing user reports, break up the day with 15-minute gameplay breaks. Check out our blog post and case study about different moderation techniques you can use to avoid chat moderation burnout.

3. Go easy on yourself.
We are quick to criticize ourselves and what we have done wrong, but not as likely to give ourselves credit for what went right, or all the things we did well.

Remember that you work hard to ensure that your online community is healthy, happy, and safe. Pat yourself on the back for a job well done, and treat yourself to some self-care.

4. Remember, this too will pass.
There are very few situations or events in our lives that are forever. Try repeating this mantra during a stressful time: this struggle will pass. It will make getting through that time a little easier.

(Maybe just repeat it silently in your head. Your co-workers will thank you.)

5. Get plenty of sleep.
We need sleep to replenish and rejuvenate. Often when we are feeling stressed, we struggle with sleeping well. If this happens to you, make sure your bedroom is dark and cool; try some gentle music to help you get to sleep, or use an app that plays soothing sounds on a loop. If staying asleep is the problem, try having a notepad and pen by your bed to write down your worries as they come up.

Pro tip: Save the marathon 3:00 am Fortnite sessions for the weekend.

6. Have a hobby.
Having a hobby is a great distraction from the stressors of everyday life. If you can do something outside, all the better. For many people, being in nature automatically decreases stress.

Or, stick to video games. Playing Tetris has been shown in studies to help people who have experienced trauma.

7. Drink tea.
A large dose of caffeine causes a short-term spike in blood pressure. It may also cause your hypothalamic-pituitary-adrenal axis to go into overdrive. Instead of coffee or energy drinks, try green tea.

We know that the smell of a freshly-brewed pot of coffee is like catnip to most moderators… but hear us out. Green tea has less than half the caffeine of coffee and contains healthy antioxidants, as well as theanine, an amino acid that has a calming effect on the nervous system.

8. Laugh it off.
Laughter releases endorphins that improve mood and decrease levels of the stress-causing hormones cortisol and adrenaline. It literally tricks your nervous system into making you happy. Try a comedy movie marathon or a laughter yoga class (this is a real thing!).

And hey, a 10-minute meme break never hurt anyone.

9. Exercise.
Getting plenty of exercise will decrease stress hormones and increase endorphins, leaving you feeling more energized and happier.

Ever had a 30-second, impromptu dance party at your desk?

No, really!

Often referred to as the “stress hormone,” cortisol is released in our body when we’re under pressure. Excess cortisol can cause you to feel stress, anxiety, and tension. Exercise brings your cortisol levels back down to normal, allowing you to relax and think straight again.

So crank up a classic, stand up… and get down.

10. Try the “Three Good Things” exercise.
Each night, write down three good things that happened during the day. This practice makes you shift your perspective to more positive things in your life — which in turn can shift your mood from stressed to happy…

… even if the three good things are tacos for lunch, tacos for 2 pm snack, and tacos for 4 pm snack. Good things don’t have to be earth-shattering. Gratitude comes in all sizes.

So, whether you’re sipping a mug of green tea, talking to a professional, or shaking your groove thing in the name of science and wellness, never forget that a little self-care can go a long way.

At Two Hat, we empower gaming and social platforms to build healthy and engaged online communities with our content filter and automated moderation software Community Sift — and that can’t be done without healthy and engaged community teams.



Free Webinar: Six Essential Pillars of a Healthy Online Community

Updated April 17th, 2018

Watch the recording!

 

Don’t miss the next webinar about building healthy online communities! Sign up for our newsletter and never miss an update!

Does your online game, virtual world, or app include social features like chat, usernames, or user-generated images? Are you struggling with abusive content, lack of user engagement, or skyrocketing moderation costs?

Building an engaging, healthy, and profitable community in your product is challenging — that’s why we formulated the Six Essential Pillars of a Healthy Online Community!

In this exclusive talk, industry experts share their six techniques for creating a thriving, engaged, and loyal community in your social product.

Join us on Wednesday, April 4th at 10:00 AM PST/1:00 PM EST!

In this free, one-hour webinar, you’ll learn how to:

  • Protect your brand using “safety by design”
  • Increase user trust with consistent messaging
  • Reduce moderator workload & empower meaningful work
  • Improve community health using real-time feedback

Save your seat today for this ultimate guide to community health and engagement!


Thinking of Building Your Own Chat Filter? Five Reasons You’re Wasting Your Time!

If you’re building an online community, whether a game or social network, flagging and dealing with abusive language and users is critical to success. Back in 2014, a Riot Games study suggested that users who experience abuse their first time in the game are potentially three times more likely to quit and never return.

“Chatting is a major step in our funnel towards creating engaged, paying users. And so, it’s really in Twitch’s best interests — and in the interest of most game dev companies and other social media companies — to make being social on our platform as pleasant and safe as possible.” – Ruth Toner, Twitch

At Two Hat, we found that smart moderation can potentially double user retention. And we’re starting to experience an industry-wide paradigm shift. Today, gaming and social companies realize that if they want to shape healthy, engaged, and ultimately profitable communities, they must employ some kind of chat filter and moderation software.

But that raises the question — should you build it yourself or use an outside vendor? Like anti-virus software, a chat filter is often best left to a team dedicated, day in and day out, to keeping it updated.

Here are a few things to consider before investing a great deal of time and expense in an in-house chat filter.

1. An allow/disallow list doesn’t work because language isn’t binary
Traditionally, most filters use a binary allow/disallow list. The thing is, language isn’t binary. It’s complex and nuanced.

For instance, in many gaming communities with older audiences, some swear words are acceptable, depending on context. You could build a RegEx tool to string-match input text, and it would have no problem finding an f-bomb. But can it recognize the critical difference between “Go #$%^ yourself” and “That was #$%^ing awesome”?

What if your players spell a word incorrectly? What if they use l337 5p34k (and they will)? What if they deliberately try to manipulate the filter?

It’s an endless arms race, and your users have way more time on their hands than you do.
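
To make the problem concrete, here’s a minimal Python sketch of a naive deny-list filter. The word “badword” and the l33t map are placeholders, not anyone’s production rules. It catches the easy case, misses the l33t spelling until you bolt on normalization, and still can’t tell an insult from an exclamation.

```python
import re

# A naive deny-list filter: block any message containing a listed word.
# "badword" is a placeholder -- imagine a real profanity list here.
DENY_LIST = ["badword"]
DENY_RE = re.compile(r"\b(" + "|".join(map(re.escape, DENY_LIST)) + r")\b", re.IGNORECASE)

def naive_filter(message):
    """Return True if the message should be blocked."""
    return bool(DENY_RE.search(message))

# Bolting on l33t normalization helps for a while -- but it's an arms race.
LEET_MAP = str.maketrans({"4": "a", "3": "e", "1": "i", "0": "o", "5": "s", "7": "t", "@": "a", "$": "s"})

def normalized_filter(message):
    return bool(DENY_RE.search(message.translate(LEET_MAP)))

print(naive_filter("you are a badword"))        # True  -- the easy case
print(naive_filter("you are a b4dw0rd"))        # False -- l33t speak slips straight through
print(normalized_filter("you are a b4dw0rd"))   # True  -- until users invent a spelling you didn't map

# Worse, string matching is context-blind: both of these get the same verdict.
print(naive_filter("go badword yourself"))      # True -- abusive
print(naive_filter("that was badword awesome")) # True -- harmless, blocked anyway
```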

Think about the hundreds of different variations of these phrases:

“You should kill yourself / She deserves to die / He needs to drink bleach / etc”
“You are a [insert racial slur here]”

Imagine the time and effort it would take to enter every single variation. Now add misspellings. Now add l337 mapping. Now add the latest slang. Now add the latest latest slang.

It never ends.

Now, imagine using a filter that has access to billions of lines of chat across dozens of different platforms. By using a third-party filter, you’ll benefit from the network effect, detecting words and phrases you would likely never find on your own.

2. Keep your team focused on building an awesome product — not chasing a few bad actors around the block

“When I think about being a game developer, it’s because we love creating this cool content and features. I wish we could take the time that we put into putting reporting [features] on console, and put that towards a match history system or a replay system instead. It was the exact same people that had to work on both who got re-routed to work on the other.” – Jeff Kaplan, Blizzard Entertainment

Like anything else built in-house, someone has to maintain the filter as well as identify and resolve specific incidents. If your plan is to scale your community, maintaining your own filter will quickly become unmanageable. The dev and engineering teams will end up spending more time keeping the community safe than actually building the community and features.

Compare that with simply tapping into the RESTful API of a service provider that reliably uses AI and human review to keep abusive language definitions current and quickly process billions of reports per day. Imagine letting community managers identify and effectively deal with the few bad actors while the rest of your team relentlessly improves the community itself.
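
In practice, that integration can be as small as one HTTP call per message. A rough sketch follows; the endpoint, request fields, and response shape are hypothetical stand-ins rather than the actual Community Sift API.

```python
import requests

# Hypothetical moderation endpoint -- the URL, request fields, and response
# shape are illustrative stand-ins, not any specific vendor's API.
MODERATION_URL = "https://moderation.example.com/v1/classify"
API_KEY = "your-api-key"

def moderate_message(user_id, text):
    """Send one chat line to the vendor and return its verdict."""
    response = requests.post(
        MODERATION_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"user_id": user_id, "text": text, "content_type": "chat"},
        timeout=2,  # chat is latency-sensitive; decide how to fail on timeout
    )
    response.raise_for_status()
    return response.json()  # e.g. {"allowed": false, "topics": ["hate_speech"], "severity": 7}

verdict = moderate_message("player-123", "example chat line")
if not verdict.get("allowed", True):
    pass  # drop, mask, or queue the message for review, per your policy
```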

3. Moderation without triage means drowning in user reports
There is a lot more to moderation than just filtering abusive chat. Filtering — regardless of how strict or permissive your community may be — is only the first layer of defense against antisocial behavior.

You’ll also need:

  • A way for users to report abusive behavior
  • An algorithm that bubbles the worst reports to the top for faster review
  • An automated process for escalating especially dangerous (and potentially illegal) content for your moderation team to review
  • Workflows to accurately and progressively message, warn, mute, and sanction accounts and (hopefully) correct user behavior
  • A moderation tool with content queues where moderators can actually review UGC
  • A live chat viewer
  • An engine to generate business intelligence reports…
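
To illustrate just one piece of that list, report triage, here’s a toy Python sketch: score each incoming report and keep a priority queue so the most severe items surface for human review first. The categories, weights, and inputs are invented for the example.

```python
import heapq
import itertools
from dataclasses import dataclass, field

# One way to "bubble the worst reports to the top": a priority queue keyed on a
# severity score. Categories, weights, and inputs are invented for the example.
CATEGORY_WEIGHTS = {"spam": 1, "profanity": 2, "harassment": 5, "self_harm": 9, "illegal": 10}

@dataclass(order=True)
class Report:
    priority: float                       # lower sorts first, so we store the negated score
    report_id: int = field(compare=False)
    reason: str = field(compare=False)

def severity(reason, reporter_trust, reports_against_target):
    """Toy score: category weight scaled by reporter trust, boosted by repeat reports."""
    return CATEGORY_WEIGHTS.get(reason, 1) * reporter_trust + reports_against_target

_ids = itertools.count()
queue = []

def submit_report(reason, reporter_trust, reports_against_target):
    score = severity(reason, reporter_trust, reports_against_target)
    heapq.heappush(queue, Report(-score, next(_ids), reason))  # heapq is a min-heap

submit_report("profanity", reporter_trust=0.5, reports_against_target=1)
submit_report("self_harm", reporter_trust=0.9, reports_against_target=3)

print(heapq.heappop(queue).reason)  # "self_harm" surfaces first for human review
```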

“Invest in tools so you can focus on building your game with the community.”

That’s Lance Priebe, co-creator of the massively popular kids’ virtual world Club Penguin, sharing one of the biggest lessons he learned as a developer.

Focus on what matters to you, and on what you and your team do best — developing and shipping kickass new game features.

4. It’s obsolete before it ships
The more time and money you can put into your core product — improved game mechanics, new features, world expansions — the better.

Think of it this way. Would you build your own anti-virus software? Of course not. It would be outdated before launch. Researching, reviewing, and fighting the latest malware isn’t your job. Instead, you rely on the experts.

Now, imagine you’ve built your own chat filter and are hosting it locally. Every day, users find new ways around the filter, faster than you can keep up. That means every day you have to spend precious time updating the repository with new expressions. And that means testing and finally deploying the update… and that means an increase in game downtime.


“Build your own chat filter,” they said. “It’ll be fun,” they said.

This all adds up to a significant loss of resources and time — your time, your team’s time, and your players’ time.

5. Users don’t only chat in English
What if your community uses other languages? Consider the work that you’ll have to put into building an English-only filter. Now, double, triple, quadruple that work when you add Spanish, Portuguese, French, German, etc.

Word-for-word translation might work for simple profanity, but as soon as you venture into colloquial expressions (“let’s bang,” “I’m going to pound you,” etc) it gets messy.

In fact, many languages have complicated grammar rules that make direct translation unreliable at best. Creating a chat filter in, say, Spanish would require the expertise of a native speaker with a deep understanding of the language. That means hiring or outsourcing multiple language experts to build an internal multi-language filter.

And anyone who has ever run a company knows — people are awesome but they’re awfully expensive.


How complex are other languages? German has four grammar cases and three genders. Finnish uses 15 noun cases in the singular and 16 in the plural. And the Japanese language uses three independent writing systems (hiragana, katakana, kanji), all three of which can be combined in a single sentence.

TL;DR: grammar. Every language is complex in its own way. Running your English filter through a direct translation tool like Google Translate won’t result in a clean, accurate chat filter. In fact, it will likely alienate your community if you get it wrong.

Engineering time is too valuable to waste
Is there an engineering team on the planet that has the time (not to mention resources) to maintain an internally-hosted solution?

Dev teams are already overtaxed with overflowing sprint cycles, impossible QA workloads, and resource-depleting deployment processes. Do you really want to maintain another internal tool?

If the answer is “no,” luckily there is a solution — instead of building it yourself, rely on the experts.

Think of it as anti-virus software for your online community.

Talk to the experts
Consider Community Sift by Two Hat Security for your community’s chat filter. Specializing in identification and triage of high-risk and illegal content, we are under contract to process 4 billion messages every day. Since 2012 we have been empowering gaming and social platforms to build healthy, engaged communities by providing cost-effective, purposeful automated moderation.

You’ll be in good company with some of the largest online communities by Supercell, Roblox, Kabam, and many more. Simply call our secure RESTful API to moderate text, usernames, and images in over 20 of the most popular IRL and digital languages, all built and maintained by our on-site team of real live native speakers.



Does Your Online Community Need a Chat Filter?

“Chatting is a major step in our funnel towards creating engaged, paying users. And so, it’s really in Twitch’s best interests — and in the interest of most game dev companies and other social media companies — to make being social on our platform as pleasant and safe as possible.”

– Ruth Toner, Data Scientist at Twitch, UX Game Summit 2017

You’ve probably noticed a lot of talk in the industry these days about chat filters and proactive moderation. Heck, we talk about filters and moderation techniques all the time.

We’ve previously examined whether you should buy moderation software or build it yourself and shared five of the best moderation workflows we’ve found to increase productivity.

Today, let’s take a step back and uncover why filtering matters in the first place.

Filtering: First things first

Using a chat, username, or image filter is an essential technique for any product with social features. Ever find yourself buried under a mountain of user reports, unable to dig yourself out? That’s what happens when you don’t use a filter.

Every game, virtual world, social app, or forum has its own set of community guidelines. Guidelines and standards are crucial in setting the tone of an online community, but they can’t guarantee good behavior.

Forget about Draconian measures. Gone are the days of the inflexible blacklist/whitelist, where every word is marked as either good or bad with no room for the nuances and complexity of language. Contextual filters that are designed around the concept of varying risk levels let you make choices based on your community’s unique needs.

Plenty of online games contain violence and allow players to talk about killing each other, and that’s fine — in context. And some online communities may allow users to engage in highly sexual conversations in one-on-one chat.

For many communities, their biggest concern is hate speech. Users can swear, sext, and taunt each other to their heart’s content, but racial and religious epithets are forbidden. With a filter that can distinguish between vulgarity and hate speech, you can grant your users the expressivity they deserve while still protecting them from abuse.

With a smart filter that leverages an AI/human hybrid approach, you can give those different audiences the freedom to express themselves.
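
Here’s a simplified Python illustration of that idea. The risk tiers and per-community thresholds are invented for the example; the point is that each community draws its own line instead of sharing one global allow/deny list.

```python
from enum import IntEnum

# Illustrative risk tiers -- real systems use finer-grained topic and severity labels.
class Risk(IntEnum):
    SAFE = 0
    MILD_VULGARITY = 2
    SEXUAL_CONTENT = 4
    TAUNTING = 6
    HATE_SPEECH = 8
    ILLEGAL = 10

# Each community sets its own threshold instead of one global allow/deny list.
COMMUNITY_THRESHOLD = {
    "kids_virtual_world": Risk.SAFE,          # nothing above the safest tier
    "teen_battle_game": Risk.MILD_VULGARITY,  # mild swearing tolerated, nothing sexual
    "mature_shooter": Risk.TAUNTING,          # swearing and trash talk allowed, hate speech never is
}

def allowed(community, message_risk):
    """Show a message only if its risk is at or below the community's threshold."""
    return message_risk <= COMMUNITY_THRESHOLD[community]

print(allowed("mature_shooter", Risk.MILD_VULGARITY))      # True
print(allowed("mature_shooter", Risk.HATE_SPEECH))         # False
print(allowed("kids_virtual_world", Risk.MILD_VULGARITY))  # False
```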

Key takeaway #1: A content filter and automated moderation system is business-critical for products with user-generated content and user interactions.

Social is king

“A user who experiences toxicity is 320% more likely to quit.” – Jeffrey Lin, Riot Games

Back in 2014, a Riot Games study found a correlation between abusive player behavior and user churn. In fact, the study suggested that users who experience abuse their first time in the game are potentially three times more likely to quit and never return.

In 2016 we conducted a study with an anonymous battle game, and interestingly, found that users who engaged in chat were three times more likely to keep playing after the first day.

While there is still a lot of work to be done in this field, these two preliminary studies suggest that social matters. Further studies may show slightly different numbers, but it’s well understood that negative behavior leads to churn in online communities.

When users chat, they form connections. And when they form connections, they return to the game.

But the flipside of that is also true: When users chat and their experience is negative, they leave. We all know it’s far more expensive to acquire a new user than to keep an existing one — so it’s critical that gaming and social platforms do everything in their power to retain new users. The first step? Ensure that their social experience is free from abuse and harassment.

 

Key takeaway #2: The benefits of chatting can be canceled out by user churn driven by exposure to high-risk content.

 

Reduce moderation workload and increase user retention

Proactive filtering and smart moderation don’t just protect your brand and your community from abusive content. They also have a major impact on your bottom line.

Habbo is a virtual hotel where users from around the world can design rooms, roleplay in organizations, and even open their own trade shops and cafes. When the company was ready to scale up and expand into other languages, they realized that they needed a smarter way to filter content without sacrificing users’ ability to express themselves.

By utilizing a more contextual filter than their previous blacklist/whitelist, Habbo managed to reduce their moderation workload by a whopping 70%. Without reportable content, there just isn’t much for users to report.

Friendbase is a virtual world where teens can chat, create, and play as friendly avatars. Similar to Habbo, the company launched their chat features with a simple blacklist/whitelist filter technology. However, chat on the platform was quickly filled with sexism, racism, and bullying behavior. For Friendbase, this behavior led to high user churn.

By leveraging a smarter, more contextual chat filter they were able to manage their users’ first experiences and create a healthy, more positive environment. Within six months of implementing new filtering and moderation technology, user retention by day 30 had doubled. And just like Habbo, user reports decreased significantly. That means fewer moderators are needed to do less work. And not only that — the work they do is far more meaningful.

Key takeaway #3: Build a business case and invest in solid moderation software so your online community gets the full benefit of healthy interactions.

Does your online community need a chat filter?

Ultimately, you will decide what’s best for your community. But the answer is almost always a resounding “yes.”

Many products with social features launch without a filter (or with a rudimentary blacklist/whitelist) and find that the community stays healthy and positive — until it scales. More users means more moderation, more reports, and more potential for abuse.

Luckily, you can prevent those growing pains by launching with a smart, efficient moderation solution that blends AI and human review to ensure that your community is protected from abuse and that users are leveraging your platform for what it was intended — connection, interaction, and engagement.

 

 

Want more articles like this? Subscribe to our newsletter and never miss a blog!
