Sky: Children of the Light is the Kind of Game We Need Right Now

“I play Sky with my nephew; he’s six years old. We have a routine: he gets home from school, and on the evenings I’m home from work, we play for a while. He loves Sky and he’s such a kind boy. Sky is such a beautiful place where he uses all his candles making friends.”

“I moved out of my parents’ house two years ago, and I don’t get to see my siblings very often. However, my little sister and I always find some time to play the game at the same time so we can keep in touch.”

“I’ve met so many awesome people on Sky; most of them live on the other side of the world. We keep track of each other’s time zones so we can play together. I have Sky friends who don’t speak the same language as me, but that’s what the cute emotes are for!”

Those are just a few of the tweets from Sky: Children of the Light fans, in answer to the question “Do you play #thatskygame with family, long-distance friends, or your significant other? 🥰 We are so inspired by these stories and would love to hear more from people who use Sky to connect with loved ones.”

Sky is a new social adventure game from thatgamecompany, the studio responsible for some of the most innovative and beloved games of the last 10+ years, including Journey, Flower, and Flow.

Featuring inventive gameplay built around compassion, gorgeous music, and a world populated with broken constellations, ancestor spirits, and Candles, Sky is the gaming experience we need right now. It’s a compelling example of a game designed with a singular purpose: connection and community.

The results speak for themselves – Sky has won multiple awards, including iPhone Game of the Year 2019, the Audience Award at the Game Developers Choice Awards 2020, and most recently the Pocket Gamer People’s Choice award. With over 10 million downloads already on iOS, Sky was just released for Android last week and will be available on the Nintendo Switch this summer.

We’ve partnered with Sky’s Community Developer Robert Hornbek to bring you a short webinar exploring how the studio designed its newest game with positive social features at the core, and how the team maintains that healthy community spirit through intentional and purposeful moderation.

In this webinar, you’ll learn:

  • Why thatgamecompany is committed to player safety (5:00 – 6:32)
  • How Sky: Children of the Light leverages innovative social features for better user engagement and experience (12:38 – 19:08, 25:08 – 28:30)
  • Their moderation best practices built with Two Hat (19:09 – 24:38)
  • Three pieces of advice for teams launching a new social game (28:35 – 33:15)

About thatgamecompany

thatgamecompany is a game studio dedicated to creating timeless interactive entertainment that inspires human connection worldwide. It is the creator of the critically acclaimed games Sky, Journey, Flower, and Flow.



Join a Webinar: Taking Action on Offensive In-Game Chat

I’m excited to announce that Two Hat is co-hosting an upcoming webinar with the International Game Developers Association (IGDA) on Friday, February 21st, 2020.

The incredible Liza Wood (check out her bio below), our Director of Research and Data Science, will be joining me as we present “Defining, Identifying and Actioning Offensive Chat: Approaches and Frameworks.”

We will start by examining why defining, identifying, and actioning offensive chat matters to game development, backed by tangible supporting statistics. Then we will provide an overview of the Five Layers of Community Protection.

Here’s what you can expect to get out of it:

  • Compelling arguments for adding player safeguarding mechanisms to your game’s social features
  • Actionable takeaways for creating internal alignment on categories and terminology
  • Action plans for identifying and addressing disruptive behavior

We hope you will join us on February 21st at 3 pm PST on IGDA’s Twitch Channel. Mark your calendars!

To celebrate this collaboration with the IGDA, we’re offering exclusive early access to our brand-new Content Moderation Best Practices PDF, full of practical guidance that you can start applying today. Download it ahead of the full release later this month by filling out the form below.

When you sign up, we will also send you an email reminder on the 20th so you don’t miss the webinar. See you there!



About Liza Wood

Liza brings a wealth of experience and a remarkable body of work in the games industry. After 13 years in video game development, she joined Two Hat Security as Director of Research and Data Science in August 2019. There she leads a team of researchers who help customers and partners build safe and healthy online communities by removing negative interactions to make room for positive human connections. Prior to this new phase of her career, she was the Executive Producer of Disney’s Club Penguin Island, the successor to Club Penguin, where she saw firsthand the positive impact that online communities can have.

About the IGDA

We are encouraged by and fully believe in IGDA’s mission to support and empower game developers around the world. Having worked for a gaming organization and co-founded the Fair Play Alliance, I strongly believe in the power of games to create meaningful and life-changing experiences for billions of players collectively. And that starts with supporting the dedicated professionals who are committed to creating those experiences.

The Changing Landscape of Automated Content Moderation in 2019

Is 2019 the year that content moderation goes mainstream? We think so.

Things have changed a lot since 1990, when Tim Berners-Lee invented the World Wide Web. A few short years later, the world started to surf the information highway – and we’ve barely stopped to catch our collective breath since.

Learn about the past, present, and future of online content moderation in an upcoming webinar

The internet has given us many wonderful things over the last 30 years – access to all of recorded history, an instant global connection that bypasses national, religious, and racial lines, Grumpy Cat – but it’s also had unprecedented and largely unexpected consequences.

Rampant online harassment, an alarming rise in child sexual abuse imagery, urgent user reports that go unheard – it’s all adding up. Now that well over half of Earth’s population is online (4 billion people as of January 2018), we’re finally starting to see an appetite to clean up the internet and create safe spaces for all users.

The change started two years ago.

Mark Zuckerberg’s 2017 manifesto hinted at what was to come:

“There are billions of posts, comments, and messages across our services each day, and since it’s impossible to review all of them, we review content once it is reported to us. There have been terribly tragic events — like suicides, some live streamed — that perhaps could have been prevented if someone had realized what was happening and reported them sooner. There are cases of bullying and harassment every day, that our team must be alerted to before we can help out. These stories show we must find a way to do more.”

In 2018, the industry finally realized that it was time to find solutions to the problems outlined in Facebook’s manifesto. The question was no longer, “Should we moderate content on our platforms?” and instead became, “How can we better moderate content on our platforms?”

Learn how you can leverage the latest advances in content moderation in an upcoming webinar

The good news is that in 2019, we have access to the tools, technology, and years of best practices to make the dream of a safer internet a reality. At Two Hat, we’ve been working behind the scenes for nearly seven years now (alongside some of the biggest games and social networks in the industry) to create technology that auto-moderates content so accurately that we’re on the path to “invisible AI” – filters so effective that you don’t even notice they’re running in the background.
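
To make that idea concrete, here’s a toy sketch, in Python, of the proactive-filtering pattern described above: every message is scored against a policy before anyone sees it, so well-behaved players never notice the filter at all. This is purely illustrative and not Two Hat’s technology; the blocklist, scores, and thresholds are invented placeholders.

    # Toy illustration (not Two Hat's technology) of proactive filtering:
    # every message is classified *before* it is displayed, so the
    # moderation layer stays invisible to well-behaved users.

    # Hypothetical severity policy: terms mapped to a 0-3 risk score.
    BLOCKLIST = {"slur_example": 3, "spamlink.example": 2}

    def classify(message: str) -> int:
        """Return the highest risk score triggered by the message."""
        return max((score for term, score in BLOCKLIST.items()
                    if term in message.lower()), default=0)

    def moderate(message: str) -> str:
        """Decide what happens before the message reaches other players."""
        risk = classify(message)
        if risk >= 3:
            return "block"   # never displayed; optionally escalated
        if risk == 2:
            return "queue"   # held for human review before display
        return "allow"       # displayed instantly; filter stays invisible

    print(moderate("gg everyone!"))            # -> allow
    print(moderate("visit spamlink.example"))  # -> queue

A production system would of course swap the keyword lookup for trained classifiers, but making the allow/queue/block decision before display is what distinguishes proactive moderation from reactive, report-driven moderation.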

On February 20th, we invite you to join us for a very special webinar, “Invisible AI: The Future of Content Moderation”. Two Hat CEO and founder Chris Priebe will share his groundbreaking vision of artificial intelligence in this new age of chat, image, and video moderation.

In it, he’ll discuss the past, present, and future of content moderation, expanding on why the industry shifted its attitude towards moderation in 2018, with a special focus on the trends of 2019.

He’ll also share exclusive, advance details about Two Hat’s big announcements (see the links below).

We hope you can make it. Give us 30 minutes of your time, and we’ll give you all the information you need to make 2019 the year of content moderation.

PS: Another reason you don’t want to miss this – the first 25 attendees will receive a free gift! ; )


Read about Two Hat’s big announcements:

Two Hat Is Changing the Landscape of Content Moderation With New Image Recognition Technology

Two Hat Leads the Charge in the Fight Against Child Sexual Abuse Images on the Internet

Two Hat Releases New Artificial Intelligence to Moderate and Triage User-Generated Reports in Real Time


How Do You Calculate the ROI of Proactive Moderation in Chat?

On Tuesday, October 30th, I’m excited to be talking to Steve Parkis, a senior tech and entertainment executive who drove amazing growth in key products at Disney and Zynga, about how chat has a positive effect on user retention and overall revenue. It would be great to have you join us — you can sign up here.

Until then, I would like to get the conversation started here.

There is a fundamental understanding across online industries that encouraging prosocial, productive interactions and curbing antisocial, disruptive behavior in our online communities are both important.

The question I’ve been asking myself lately is this: do we have the numbers to prove that proactive moderation and related approaches are business-critical?

In my experience, our industries (games, apps, social networks, etc.) lack the studies and numbers to prove that encouraging productive interactions and tackling negative ones have a meaningful impact on user engagement, retention, and growth.

This is why I’m on a mission this quarter to create new resources, including a white paper, that will shed light on this matter, and hopefully help as many people as possible in their quest to articulate this connection.

First steps and big questions
We already know that chat and social features are good for business; we have plenty of metrics around this. But the key piece we’re missing is the ROI of proactive moderation and other community health measures. Here’s where I need your help, please:

  • How have you measured the success of filtering and other approaches to tackling disruptive behavior (think spam, fraud, hate speech, griefing, etc.) as it relates to increased user retention and growth in your communities?
  • Have you measured the effects of implementing human and/or automated moderation in your platforms, be it related to usernames, user reports, live chat, forum comments, and more?
  • Why have you measured this?

I believe the way we are currently operating is self-sabotage. By not measuring and surfacing the business benefits of proactive moderation and other measures to tackle antisocial, disruptive behavior, our departments are usually seen as cost centers rather than key contributors to revenue generation.

I believe that our efforts are crucial to removing blockers to growth on our platforms and to fostering stronger user engagement and retention.

Starting the conversation
I’ve talked to many of you and I’m convinced we feel the same way about this and see similar gaps. I invite you to email your comments and thoughts to carlos.figueiredo@twohat.com.

Your feedback will help inform my next article as well as my next steps. So what’s in it for you? First, I’ll give you a shoutout (if you want) in the next piece on this topic, and I’ll give you exclusive access to the resources once they’re ready, with credit where it’s due. You’ll also have my deepest gratitude : ) And you know you can count on me for help with any of your projects!

To recap, I would love to hear how you and your company are measuring the return on investment of the measures (human- and/or technology-driven) you’ve implemented to curb negative, antisocial behavior on your platforms.

How are you thinking about this, what are you tracking, and how are you analyzing this data?
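
To seed the discussion, here’s a minimal sketch, in Python, of one way to frame the math: compare retention between a cohort exposed to proactive moderation and a control cohort, then weigh the incremental revenue against the program’s cost. Every figure below (cohort sizes, ARPU, costs) is an invented placeholder, not real data or an established methodology.

    # Hypothetical 30-day cohorts: one with proactive moderation, one without.
    # All numbers are illustrative placeholders, not real measurements.

    def retention_rate(cohort_size: int, retained: int) -> float:
        """Fraction of a cohort still active at the end of the window."""
        return retained / cohort_size

    COHORT_SIZE = 10_000

    control = retention_rate(COHORT_SIZE, 2_100)  # moderation off
    treated = retention_rate(COHORT_SIZE, 2_520)  # moderation on

    lift = (treated - control) / control          # relative retention lift

    arpu = 4.50                # placeholder: avg revenue per retained user ($)
    extra_users = (treated - control) * COHORT_SIZE
    moderation_cost = 1_500.0  # placeholder: tooling + staffing for the window

    roi = (extra_users * arpu - moderation_cost) / moderation_cost

    print(f"Retention lift: {lift:.1%}")  # 20.0%
    print(f"Estimated ROI: {roi:.2f}x")   # 0.26x

Even a back-of-the-envelope calculation like this gives stakeholders a starting point; a real study would need to control for confounders like seasonality, marketing pushes, and player mix.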

Thanks in advance for your input. I look forward to reading it!



Upcoming Webinar: Yes, Your Online Game Needs a Chat Filter

Are you unconvinced that you need a chat filter in your online game, virtual world, or social app? Undecided if purchasing moderation software should be on your product roadmap in 2018? Unsure if you should build it yourself?

You’re not alone. Many in the gaming and social industries are still uncertain if chat moderation is a necessity.

On Wednesday, January 31st at 10:00 am PST, Two Hat Community Trust & Safety Director Carlos Figueiredo will share data-driven evidence proving that you must make chat filtering and automated moderation a business priority in 2018.

In this quick 30-minute session, you’ll learn:

  • Why proactive moderation is critical to building a thriving, profitable game
  • How chat filtering combined with automation can double user retention
  • How to convince stakeholders that moderation software is the best investment they’ll make all year