Doing The Math: Does Moderation Matter in Online Safety?

Welcome back to our series about the cost of doing nothing. Feeling lost? Take a minute to read the first two posts, The Other Reason You Should Care About Online Toxicity and A Tale of Two Online Communities.

Today we test our theory: when social products do nothing about toxicity, they lose money. Using AI Warzone and Trials of Serathian (two totally-made-up-but-awesome online games) as examples, we’ll run their theoretical financials through our mathematical formula to see how they perform.

Remember — despite being slightly different games, AI Warzone and Trials of Serathian have similar communities. They’re both competitive MMOs, are targeted to a 13+ audience, and are predominantly male.

But they differ in one key way. Our post-apocalyptic robot battle game AI Warzone proactively moderates the community, and our epic Medieval fantasy Trials of Serathian does nothing.

Let’s take a look at the math.

The Math of Toxicity

In 2014, Jeffrey Lin from Riot Games presented a stat at GDC that turned the gaming world on its head. According to their research, users who experience toxicity are 320% more likely to quit. That’s huge. To put that number in further perspective, consider this statistic from a 2015 study:

52% of MMORPG players reported that they had been cyber-victimized, and 35% said they had committed cyberbullying themselves.

A majority of players have experienced toxicity. And a surprising number of them admit to engaging in toxic behavior themselves.

We’ll take those numbers as our starting point. Now, let’s add a few key facts — based on real data — about our two fictional games to fill in the blanks:

  • Each community has 1 million users
  • Each community generates $13.51 in revenue from each user
  • The base monthly churn rate for an MMO is 5%, regardless of moderation
  • According to the latest Fiksu score, it costs $2.78 to acquire a new user
  • They’ve set a 10% Month over Month growth target

So far, so good — they’re even.

Now let’s add toxicity into the mix.

Even with a proactive moderation strategy in place, we expect AI Warzone users to experience about 10% toxicity. It’s a complex battle game where tension is built into the game mechanics, so there will be conflict. Users in Trials of Serathian, our community that does nothing to mitigate that tension, experience a much higher rate of toxicity: 30%.

Toxicity-exposed users are 320% more likely to quit, which works out to roughly 4.2 times the 5% base churn rate, or 21%. Using a weighted average, we’ll raise AI Warzone’s churn rate from 5% to 6.6% (90% of users churning at 5%, 10% at 21%). And we’ll raise Trials of Serathian’s to 9.8% (70% at 5%, 30% at 21%).
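One way to reproduce these blended rates, assuming “320% more likely to quit” means exposed users churn at 4.2 times the base rate (that multiplier is our reading of the stat, and all names here are ours):

```python
# Blended churn: unexposed users churn at the base rate, while users
# exposed to toxicity churn at an elevated rate.

BASE_CHURN = 0.05       # 5% monthly base churn for an MMO
MULTIPLIER = 1 + 3.20   # "320% more likely to quit" -> 4.2x the base rate

def blended_churn(exposure):
    """Weighted-average churn for a community where `exposure` is the
    fraction of users who experience toxicity."""
    elevated = BASE_CHURN * MULTIPLIER  # ~21% for exposed users
    return (1 - exposure) * BASE_CHURN + exposure * elevated

ai_warzone = blended_churn(0.10)  # ~6.6% with proactive moderation
serathian = blended_churn(0.30)   # ~9.8% doing nothing
```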

Taking all of these numbers into account, we can calculate the cost of doing nothing using a fairly simple formula, where U is this month’s total users and U₁ is next month’s total users:

U₁ = U - (U × Loss Rate) + Users Acquired through Advertising
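In code, one month of that update might look like this (a minimal sketch; the function and constant names are ours):

```python
# One month of the update U1 = U - (U * loss_rate) + acquired, where
# "acquired" is however many users must be bought to hit the 10%
# month-over-month growth target.

CPI = 2.78            # Fiksu cost to acquire one user
GROWTH_TARGET = 0.10  # 10% MoM growth target

def next_month(users, loss_rate):
    """Return (next_users, churned, acquired, ad_spend) for one month."""
    churned = users * loss_rate
    next_users = users * (1 + GROWTH_TARGET)
    acquired = next_users - (users - churned)  # replace churn, then grow
    return next_users, churned, acquired, acquired * CPI

# AI Warzone's first month at its 6.6% blended churn rate:
users, churned, acquired, spend = next_month(1_000_000, 0.066)
```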

Using our formula to calculate user churn and acquisition costs, let’s watch what happens in their first quarter.

Increased User Churn = Increased Acquisition Costs

In their first quarter, AI Warzone loses 218,460 users. And to meet their 10% growth rate target, they spend $1,527,498 to acquire more.

Trials of Serathian, however, loses 324,380 users (remember, their toxicity rate is much higher). And they have to spend $1,821,956 to acquire more users to meet the same growth target.

Let’s imagine that AI Warzone spends an additional $60,000 in that first quarter on moderation. Even with that added cost, they still come out $234,457 ahead.
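These quarterly figures fall out of iterating the monthly formula three times (a sketch; the names are ours):

```python
# Q1 simulation: both games start at 1M users, pay $2.78 per acquired
# user, and top up to a 10% month-over-month growth target.

CPI = 2.78
GROWTH_TARGET = 0.10

def first_quarter(users, loss_rate, months=3):
    """Return (total_churned, total_ad_spend) over `months` months."""
    total_churned = total_spend = 0.0
    for _ in range(months):
        churned = users * loss_rate
        target = users * (1 + GROWTH_TARGET)
        acquired = target - (users - churned)
        total_churned += churned
        total_spend += acquired * CPI
        users = target
    return total_churned, total_spend

aw_churn, aw_spend = first_quarter(1_000_000, 0.066)  # AI Warzone
ts_churn, ts_spend = first_quarter(1_000_000, 0.098)  # Trials of Serathian
savings = (ts_spend - aw_spend) - 60_000              # less moderation costs
```

The $234,457 saved is simply the gap in acquisition spend minus the $60,000 moderation budget.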

That’s a lot. Not enough to break a company, but enough to make executives nervous.

Let’s check back in at the end of the year.

The Seven Million Dollar Difference

We gathered a few key stats from our two communities.

When Trials of Serathian does nothing, their EOY results are:

  • Churn rate: 9.8%
  • User Attrition: -8,672,738
  • Total Profits (after acquisition costs): $39,784,858

And when AI Warzone proactively moderates, their EOY results are:

  • Churn rate: 6.6%
  • User Attrition: -5,840,824
  • Total Profits (after acquisition costs): $47,177,580

AI Warzone deals with toxicity in real time and loses nearly 3 million fewer users in the process. They can devote more of their advertising budget to acquiring new users, and their userbase compounds month over month. The end result? They collect $7,392,722 more in profits than Trials of Serathian, which does nothing.

Userbase growth with constant 30% revenue devoted to advertising.
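That chart’s model can be sketched as follows. Two loud assumptions here are ours, not the article’s: we treat the $13.51 as annual revenue per user spread evenly across months, and we reinvest 30% of each month’s revenue into acquisition at $2.78 per install. The exact end-of-year figures above won’t fall out of this sketch, but the shape does: lower churn compounds into a larger userbase and more retained profit.

```python
# Each month: earn revenue, spend 30% of it acquiring users at $2.78
# apiece, and lose a fixed fraction of users to churn.

CPI = 2.78
MONTHLY_ARPU = 13.51 / 12  # assumption: $13.51 is annual revenue per user
AD_SHARE = 0.30

def simulate_year(users, churn_rate, months=12):
    """Return (final_users, cumulative_profit) after `months` months."""
    profit = 0.0
    for _ in range(months):
        revenue = users * MONTHLY_ARPU
        ad_spend = revenue * AD_SHARE
        profit += revenue - ad_spend           # profit after acquisition
        users += ad_spend / CPI - users * churn_rate
    return users, profit

aw_users, aw_profit = simulate_year(1_000_000, 0.066)  # moderated
ts_users, ts_profit = simulate_year(1_000_000, 0.098)  # unmoderated
```

Under these assumptions, every point of churn the moderated community avoids is ad budget it doesn’t have to spend twice.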

And what does AI Warzone do with $7 million more in revenue? Well, they develop and ship new features, fix bugs, and even start working on their next game. AI Warzone: Aftermath, anyone?

These communities don’t actually exist, of course. And there are a multitude of factors that can affect userbase growth and churn rate. But it’s telling, nonetheless.

And there are real-world examples, too.

Sticks and Stones

Remember the human cost that we talked about earlier? Money matters — but so do people.

We mentioned Twitter in The Other Reason You Should Care About Online Toxicity. Twitter is an easy target right now, so it’s tempting to forget how important the social network is, and how powerful it can be.

Twitter is a vital platform for sharing new ideas and forging connections around the globe. Crucially, it’s a place where activists and grassroots organizers can assemble and connect with like-minded citizens to effect real political change. The Arab Spring in 2011 and the Women’s March in January of this year are only two examples out of thousands.

But it’s become known for the kind of abuse that Lily Allen experienced recently — and for failing to deal with it adequately. Twitter is starting to do something — over the last two years, they’ve released new features that make it easier to report and block abusive accounts. And earlier this week even more new features were introduced. The question is, how long can a community go without doing something before the consequences catch up to them?

Twitter’s user base is dwindling, and their stock is plummeting, in large part due to their inability to address toxicity. Can they turn it around? We hope so. And we have some ideas about how they can do it (stay tuned for tomorrow’s post).

What Reddit Teaches Us About Toxicity and Churn

Reddit is another real-world example of the cost of doing nothing.

In collaboration with Simon Fraser University, we provided the technology for an independent study of 180 subreddits, using a public Reddit data set. In their academic paper, “The Impact of Toxic Language on the Health of Reddit Communities,” the SFU researchers analyze the link between toxicity and community growth.

They found a correlation between an increase in toxic posts and a decrease in community growth. Here is just one example:

The blue line shows high-risk posts decreasing; the red line shows the corresponding increase in community growth.

It’s a comprehensive study and well worth your time. You can download the whitepaper here.

What Now?

Using our formula, we can predict how a proactive moderation strategy can impact your bottom line. And using our two fictional games as a model, we can see how a real-world community might be affected by toxicity.

AI Warzone chose to engineer a healthy community — and Trials of Serathian chose to do nothing.

But what does it mean to “engineer a healthy community”? And what strategies can you leverage in the real world to shape a troll-free community?

In tomorrow’s post, we examine the moderation techniques that AI Warzone used to succeed.

Spoiler alert: They work in real games, too.

Originally published on Medium
