How #ICANHELP is Empowering the Next Generation of Digital Citizens

Two Hat believes that everyone should be free to share online without fear of harassment or abuse. We also believe that making this vision a reality is a shared responsibility.

That’s why we have allied ourselves with diverse organizations including non-profits, government agencies, private companies, and industry alliances to share best practices, produce online safety resources, and spread the word about proactive, purposeful content moderation. One of those organizations is the California-based non-profit #ICANHELP.

We recently sat down with Matt Soeth, co-founder and executive director of #ICANHELP, to discuss the organization’s upcoming initiatives with the NY Yankees, his thoughts on social media legislation, and #Digital4Good, their annual event celebrating student achievements.

===

Carlos Figueiredo: Tell us about your organization, #ICANHELP.

Matt Soeth: #ICANHELP educates and empowers students to use social media positively.

We train students to be digital first responders. When they see something online, we want them to know how to report content (when necessary), how to respond to negative content, and, in the words of students, how to respond to all the “drama.”

At the same time, we work with students to build positive social media campaigns. We train educators and administrators in best practices for modeling and guiding students as they develop their digital identities.

CF: What initiatives are you working on right now?

MS: We are excited to announce our partnership with the NY Yankees as part of their Bronx Education All Star Day. About 9,000 students and educators were at the game on May 29, recognized for academic achievement, civic engagement, and leadership. In the fall, we will be working with Bronx schools, using our curriculum and resources to help students develop social projects for the 2019-2020 school year.

We have two new online courses for teachers and administrators that we are excited to share. Our teacher course walks educators through building up their social media presence, managing student social media teams, and the best ways to model and guide students in creating and managing content. Our admin course will help anyone looking for policy examples and guidance on how to respond to and investigate social media incidents. We walk everyone through how to create an incident response plan for social media incidents, as well as how to work with law enforcement, social media companies, and gaming companies to get content removed that violates a platform’s terms of service.

We are also excited to announce a collaboration with the Well Being Trust, the foundation for Providence/St. Joseph Health, to create a mental health and wellness curriculum around digital wellbeing. So much of what we see in digital wellbeing today is focused heavily on devices, and we are looking to develop tools that help young people and educators talk about stress, anxiety, and online negativity.

CF: How can people get involved?

MS: Check out our website. You can always contact us through our site. We are very active on social media: @icanhelp on Twitter, and @icanhelpofficial on Facebook, Instagram, and YouTube. We are sharing content all the time, so if schools are looking for ideas, that’s the place to connect with us.

That being said – share! So much of what we do is word of mouth and we have students all over doing some amazing work. We’d love for more people to know those stories.

If you are an educator or parent and care about this topic, please reach out and share our resources, invite us to your community or school, help us grow the conversation and keep it going.

CF: What is your take on the social media legislation being introduced around the world – Online Harms and Duty of Care in the UK, Sharing of Abhorrent Violent Material in Australia, the Christchurch Call?

MS: All of these actions are leading to new policy and regulation that hold companies accountable for the content on their sites. The challenge will come in enforcing these laws and regulations, as that part is still unclear. The intent behind them, though, is clear: making the internet a better and safer place for users, particularly youth.

The one challenge I do have with all of this is the emphasis on government regulation and corporate responsibility. Whenever there is a major social incident – offline, that then goes viral and plays out online – we as users react. In this case, with a landslide of recent incidents, we got the white paper, the push to identify and remove terrorist content, and so on. It solves the problem now, but I often wonder if it gets to the systemic underlying issues causing all of these problems. For example, there is so much talk about cyberbullying, but kids are still more likely to be bullied in person than online. Responding to cyberbullying is good, and needs to happen, but regulating companies is not enough. If the internet is going to be a better place, it needs to be a collective effort: users, nonprofits, content experts, education institutions, companies, you name it. It takes all of us.

CF: Can you give us a sneak peek at #Digital4Good 2019?

MS: We are really excited about #Digital4Good 2019. It’s being held at Facebook HQ in Menlo Park, CA. We have our winners selected and will be sharing more about them soon. It’s a diverse group of students from all over the US covering a range of topics and projects around bullying, race, homelessness, robotics, leadership development, and cancer research.

These students will be sharing their stories live on September 16, 2019. We’ve invited a few schools to attend, though seating is limited. You can tune into the live stream and see the event as it happens (or watch it again later). To get notified about the live stream, fill out the form on the page. We will tell you when the event is happening and share the schedule of student speakers.

CF: Thanks for sitting down with us, Matt!

MS: My pleasure, Carlos!

===



How the First Text Made the World a Better Place

December 3rd, 1992. An engineer named Neil Papworth sits down at his computer and composes a simple message. A few buildings over, a Vodafone executive named Richard Jarvis relaxes at his office Christmas party. He glances down at the glowing green screen of his Orbitel 901 handset.

“Merry Christmas,” he reads.

He can’t text back — his clumsy, walkie-talkie-esque phone with its soft rubber keypad won’t be ready for that for another year — so he sends word back to Neil at the office the old-fashioned way. We don’t know whether he sends an email or dispatches a messenger between buildings, but the message is clear:

It worked!

And with that, a communication revolution is born.

Cell phones have changed a lot over the years.

Have you ever thought about where you would be without that first text? Without the example and popularity of texting, we probably wouldn’t have social media as we experience it today. It’s astonishing how quickly the communication tools that we take for granted today — forums, blogs, and social media like Facebook, Twitter, and Snapchat — developed after the invention of the SMS text in 1992.

And it all started with a simple statement: Merry Christmas.

In the spirit of the season, let’s look at the ways chat has changed the world — the gifts of SMS, if you will. And in 25 years, there have been a few noteworthy ones.

It’s All About Connection

Some people have argued that texting, IM, and game chat have driven us apart, but evidence suggests just the opposite. Have you noticed that, when there’s a screen separating us, we tend to share our deepest feelings, connect faster, and show greater kindness and empathy? In short, we’re vulnerable.

We talk about toxic disinhibition a lot — people are more comfortable saying cruel, abusive things in chat that they would likely never say to someone in the real world. Psychology professor John Suler identified toxic disinhibition when he described the online disinhibition effect, which explains why people say and do things online that they would never do in real life. But the flipside of toxic disinhibition is benign disinhibition, which we tend to forget about. And it’s a wonderful thing: users who experience benign disinhibition are more open, more giving, and more likely to share.

The beauty of benign disinhibition? Online chat has opened up communication channels for people who would previously find it nearly impossible to connect with others. The shy or introverted, people with speech impediments, the hard of hearing, the socially stigmatized — before the emergence of widespread online chat, marginalized people had very few places to turn.

And there are so many places to turn.

Is everyone online connected in positive, harmonious ways? Of course not. But enough of us have found common ground through the written (texted?) word that it hardly matters.

One Encyclopedia to Rule Them All!

The internet is the biggest, most comprehensive knowledge base in the world. And it’s all due to the great connection and sharing co-op that began with that first SMS text.

There are approximately 4.54 billion pages on the web and well over a billion websites. To put that in perspective: in 1994, there were about 3,000 websites. Consider that the famously well-stocked Library of Alexandria (we like to refer to it as “the internet of classical antiquity”) only held between 200,000 and 700,000 books. Wikipedia comprises only a tiny fraction of cyberspace, yet it’s made up of nearly 35 million articles in 288 different languages.

Would the internet be this big without chat rooms, forums, and comments sections? Probably not. Imagine all of the new ideas that have sprung forth, fully formed like Athena from the head of Zeus, when users across continents, cultures, and even languages chat! That is the fundamental difference between physical spaces like the great library of ancient Egypt and virtual spaces like the global internet of 2017. Sometimes, size does matter.

Ever had a question that you had to answer right that second? Think of all the questions you’ve just had to research over the years. Questions like who’s written the most Tweets? What’s the fastest speed that a cheetah can run? What was the first message ever sent through AOL Messenger? And just what is Scotland’s national animal?

The answers to all of these questions are literally at your fingertips, thanks to the collaborative work of millions of people from different cultures and languages chatting, learning, and creating. And chat — the legacy of SMS — is at the heart of that cross-continent teamwork.

The Power of Empathy

There’s a secondary benefit to all of this information being shared across cultures — the more we read and the more we learn about other cultures, the more we cultivate empathy. Brené Brown says that “Empathy moves us to a place of courage and compassion. Through it, we come to realize that our perspective is not the perspective.”

When we gain perspective and learn greater empathy, we understand that the people who share the world with us are indeed just that — people. Not political foes, not rivals, not “them.” Just people, like us. We’re a lot less inclined to go to war with nations when we recognize that they’re made up of people with the same hopes and dreams.

We may be a few blogs and international forums away from world peace, but hey — the more you know, right? A little knowledge is a powerful thing.

The Gifts of Chat

Twenty-five years ago, an engineer sat down at his computer, typed fourteen letters, and sent a text to a colleague that changed the world. That simple holiday message drastically changed the way we communicate. The internet was in its infancy in 1992, and it learned a lot from that first SMS text.

Forums, message boards, social media — they’re all the offspring of Neil Papworth’s first text to Richard Jarvis. Chat has deepened our connection to the global community and broadened our knowledge with its vast collections of information. What does the future of chat look like to you? Will virtual reality finally make the written language obsolete? Will the chatrooms of the future abandon words altogether? How much smarter can smartphones get? Who knows what the next big innovation in communication will look like.

It’s the holiday season, when we look back and honor the past, savor the present, and look forward to the future. Today, when you text your spouse or chat with a friend over Google Hangouts, take a minute to think about the history of chat and how it’s affected your life — and changed the world.

And before we forget — Merry Christmas and Happy Holidays!

Still wondering about the answers to those questions? Flip your screen over:

¡uɹoɔᴉun ǝɥ┴
˙ɥ/ɯʞ 0ᄅƖ – 0ƖƖ
Ɛ> ¡ʍʍʍ∀ ”˙noʎ ssᴉɯ puɐ noʎ ǝʌo˥ ˙ǝɯ sᴉ ʇᴉ …pǝɹɐɔs ǝq ʇ’uop“
˙sʇǝǝʍ┴ 000’Ɩㄥㄥ’ㄥƐ ʇuǝs pɐɥ SIH┴ƎNƎΛ@ ‘9Ɩ0ᄅ ʎɐW ɟo s∀

 

Want more stories like this? Subscribe to our mailing list and never miss a blog!



Why Should Social Networks Encourage Digital Citizenship?

“Digital citizenship, and promoting a respectful yet vibrant environment is a multi-pronged effort.” — David Ryan Polgar

If one social network doesn’t prevent us from harassing strangers, then can we be expected to behave any differently when we switch platforms? If an online game is designed to create tension, then can we really be held responsible when we lash out at our teammates?

In other words — do social products have an ethical responsibility to encourage good citizenship?

To unpack this tricky topic, we turn to renowned writer, speaker, commentator and real-life Tech Ethicist David Ryan Polgar. He sat down with Two Hat Security’s Director of Community Trust & Safety Carlos Figueiredo to discuss the complex and sometimes divisive subject of social products and social responsibility.

Press play to listen:

Highlights & key quotes

On responsibility:

If [companies] want to have a sustainable business, they need to consider that this is business-critical. — Carlos Figueiredo

I think what we’re realizing is that the environment and the structures that we create are dramatically influential on human behavior… From a company standpoint, they now have that responsibility to try to prompt us towards the better use of their product. — David Ryan Polgar

 

On the “attention economy”:

We are giving a lot of time to [social media] companies. Does that create a responsibility because we are giving our time to them? — CF

It’s not a typical business-consumer relationship. Facebook and Twitter and Snapchat operate in this quasi-public space. And that’s very similar in law, what they’ve done with freedom of speech in a mall… Now, we, the general public, are expecting a voice in the way these companies operate. — DRP

 

On forging industry-wide alliances:

It’s so easy for us to think that each of these companies should take care of their own. Of course I believe that, but I also think that it’s time for the industry to have a wide discussion, and to have coalitions and alliances. If there is a consistency and a coherence amongst different companies, suddenly we can’t just have users and players jump from one platform to the other and bring bad behavior. — CF

Unless you have those coalitions, everybody is reinventing the wheel. You’re spending a lot of time, energy, and research in private endeavors instead of sharing, and having this open environment where we’re saying: as a community, as this collective, and as an industry, this is something we need to combat. — DRP

 

On the future:

I think the tide is turning in terms of digital citizenship, fair play, and sportsmanship when it comes to eSports and games. It’s financially smart and it’s ethically smart for the industry to talk about this. — CF

Social media is like a knife. It can be used to inflict pain or stab the truth. But it can also be used to carve a future that’s more socially just, more connected, and more intellectually curious. It’s like any tool.

The way to push social media forward, to build a better web, and to capitalize on what we know the internet should be, is to take that collective action, where people, businesses, and organizations come together and say “Here’s what we want — now how can we get there? How can we share the knowledge, how can we use the tools that we have and create new tools to build this better web?” — DRP

 

About the speakers

David Ryan Polgar

David Ryan Polgar has carved out a unique and pioneering career as a “Tech Ethicist.” With a background as an attorney and college professor, he transitioned in recent years to focus entirely on improving how children, teens, and adults utilize social media & tech. David is a tech writer (Big Think, Quartz, and IBM thinkLeaders), speaker (3-time TEDx, The School of The New York Times), and frequent tech commentator (SiriusXM, AP, Boston Globe, CNN.com, HuffPost). He has experience working with startups and social media companies (ASKfm), and co-founded the global Digital Citizenship Summit (held at Twitter HQ in 2016). Outside of writing and speaking, David currently serves in a Trust & Safety role for the teen virtual world Friendbase. He is also a board member for the non-profit #ICANHELP, which led the first #Digital4Good event at Twitter HQ on September 18th.

His forward-thinking approach to online safety and digital citizenship has been recognized by various organizations and outlets across the globe and was recently singled out online by the Obama Foundation.

Follow David on Twitter and LinkedIn.

Carlos Figueiredo

Carlos Figueiredo leads Two Hat Security’s Trust & Safety efforts, collaborating with clients and partners to challenge our views of healthy online communities.

Born and raised in Brazil, Carlos has been living in Canada for almost 11 years where he has worked directly with online safety for the last 9 years, helping large digital communities with their mission to stay healthy and engaged. From being a moderator himself to leading a multi-cultural department that was pivotal to the safety of global communities across different languages and cultures, Carlos has experienced the pains and joys of on-screen interactions.

He’s interested in tackling the biggest challenges of our connected times and thrives on collaborating and creating bridges in the industry. Most recently, he moderated the Tech Power Panel at #Digital4Good. On Wednesday, October 18th he’s presenting a free online workshop called Your Must-Have Moderation Strategy: Preparing for Breaking News & Trending Topics.

Follow Carlos on Twitter and LinkedIn.

About Two Hat Security

At Two Hat Security, we empower social and gaming platforms to build healthy, engaged online communities, all while protecting their brand and their users from high-risk content. Want to increase user retention, reduce moderation, and protect your brand?

Get in touch today to see how our chat filter and moderation software Community Sift can help your product encourage good digital citizenship.

Want more articles like this? Subscribe to our newsletter and never miss an update!



When Social Networks Put Online Safety First, We All Win

 “If we’re looking at the current zeitgeist, you have a consumer base that’s looking toward tech companies to showcase moral guidance.” — David Ryan Polgar

Users are fed up.

Tired of rampant harassment and abuse in social media, consumers have finally begun to demand safer online spaces that encourage and reward good digital citizenship. And they’re starting to hold social networks accountable for dangerous behavior on their platforms.

But what exactly are online safety and digital citizenship? And what can social networks do to make safety an industry standard?

We spoke with Trust & Safety experts David Ryan Polgar of Friendbase and Carlos Figueiredo of Two Hat Security to get their thoughts on changing attitudes in the industry — and the one thing that social networks can do today to inspire civility and respect on their platform.

Click play to listen:

Highlights & key quotes

On safety:

“Online safety… is very similar to driving. There are lots of dangers to getting on the road, but that doesn’t mean we don’t get on the road.” — David Ryan Polgar

“It’s important for us to consider not just safety, but what is a healthy online experience? It’s okay to have a certain amount of risk that will vary from community to community… We don’t want to focus just on the dangers and risks.” — Carlos Figueiredo

On “safety by design”:

“There are lots of examples where a company scaled up quickly and aggressively got millions of users, but they didn’t necessarily have the features in place to have a safe experience. We want safety, but we also want vibrancy, that happy mix  — what I call a ‘Goldilocks zone.’ And the danger is, once you get labeled as a place that allows for toxic behavior, it’s very difficult to alter that perception, even when you change some of the tools.” — DRP

“Whenever possible, safety needs to be a product and design consideration from the very beginning… by having this proactive approach, you can prevent a lot of issues.” — CF

On setting a positive tone in your product:

“I think the big thing is intuitive tools. That’s always been a big complaint for a lot of individuals. Once you have a problem online, is it intuitive to report it? And then, potentially more importantly, what’s the protocol after that’s been reported?” — DRP

“One thing that I would definitely recommend that people start doing is, if they don’t have an individual or a team in charge of community well-being or community safety, have somebody where at least a big chunk of time is dedicated to this – and a team, even better. Put that as a key priority of your product. Employ really solid people who understand your community.” — CF

Online safety & digital citizenship resources

David is a board member for the non-profit #ICANHELP, which holds the first annual #Digital4Good event next month at Twitter HQ. This highly-anticipated event brings together students, representatives from the tech industry, and teachers to discuss and celebrate positive tech and media use.

Learn more on the #ICANHELP website, and follow @icanhelp and #Digital4Good on Twitter. 

Don’t miss the live-streamed event on Monday, September 18th. Carlos will be moderating a panel with three very special guests (more info to come!). They’ll be talking about player behavior in online games.

Two Hat Security is hosting an exclusive webinar about community building on Wednesday, September 13th. In The Six Essential Pillars of Healthy Online Communities, Carlos shares the six secrets to creating a thriving, engaged, and loyal community in your social product. Whether you’re struggling to build a new community or need advice shaping an existing product, you don’t want to miss this. Save your seat today!

David is a prolific writer who thoughtfully examines the ethical consequences of emerging technology. Recent pieces include Alexa, What’s the Future of Conversational Interface? and Has Human Communication Become Botified? Follow @TechEthicist on Twitter for insights into online safety, digital citizenship, and the future of tech.




How To Prevent Offensive Images From Appearing in Your Social Platform

If you manage a social platform like Instagram or Tumblr, you’ll inevitably face the task of removing offensive UGC (user-generated content) from your website, game, or app.

At first, this is simple, with only the occasional inappropriate image or three to remove. Since it seems like such a small issue, you just delete the offending images as needed. However, as your user base grows, so does the percentage of users who refuse to adhere to your terms of use.

There are some fundamental issues with human moderation:

  • It’s expensive. Manual review costs much more, as each image needs to be reviewed by flawed human eyes.
  • Moderators get tired and make mistakes. As you throw more pictures at people, they tend to get sick of looking for needles in haystacks and start to get fatigued.
  • Increased risk. If your platform allows for ‘instant publishing’ without an approval step, then you take on the additional risk of exposing users to offensive images.
  • Unmanageable backlogs. The more users you have, the more content you’ll receive. If you’re not careful, you can overload your moderators with massive queues full of stuff to review.
  • Humans aren’t scalable. When you’re throwing human time at the problem, you’re spending human resource dollars on work that doesn’t move your product forward.
  • Stuck in the past. If you’re spending all of your time moderating, you’re wasting precious time reacting to things rather than building for the future.

At Two Hat, we believe in empowering humans to make purposeful decisions with their time and brain power. We built Community Sift to take care of the crappy stuff so you don’t have to. That’s why we’ve worked with leading professionals and partners to provide a service that automatically assesses and prioritizes user-generated content based on probable risk levels.

Do you want to build and maintain your own anti-virus software and virus signatures?

Here’s the thing — you could go and build some sort of image classification system in-house to evaluate the risk of incoming UGC. But here’s a question for you: would you create your own anti-virus system just to protect yourself from viruses on your computer? Would you make your own project management system just because you need to manage projects? Or would you build a bug-tracking database system just to track bugs? In the case of anti-virus software, that would be kind of nuts. After all, if you create your own anti-virus software, you’re the first one to get infected with new viruses as they emerge. And humans are clever… they create new viruses all the time. We know because that’s what we deal with every day.

Offensive images are much like viruses. Instead of having to manage your own set of threat signatures, you can just use a third-party service and decrease the scope required to keep those images at bay. By using an automated text and image classification system on your user-generated content, you can protect your users at scale, without the need for an army of human moderators leafing through the content.
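To make the idea of a pre-publish gate concrete, here is a minimal sketch in Python. The response fields and the 0-to-8 risk scale are illustrative assumptions, not Community Sift’s actual API schema:

```python
# Sketch of a pre-publish gate around a hypothetical third-party image
# classification API. The "risk" and "topics" fields and the 0 (benign)
# to 8 (most severe) risk scale are illustrative assumptions.

def gate_image(classification: dict, max_allowed_risk: int = 4) -> str:
    """Decide what to do with an uploaded image based on the
    classifier's response, before the image is ever published."""
    risk = classification.get("risk", 0)
    topics = classification.get("topics", [])
    if risk > max_allowed_risk:
        # Severe content: block outright, never shown to other users.
        return "block"
    if risk == max_allowed_risk or topics:
        # Borderline content: don't publish yet; queue for a human.
        return "queue"
    return "publish"

# A response the hypothetical API might return for a borderline upload.
response = {"risk": 4, "topics": ["weapons"]}
print(gate_image(response))  # "queue"
```

Because the decision happens before publication, users are never exposed to the worst content, and human moderators only see the borderline queue.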

Here are some offensive image types we can detect:

  • Pornography
  • Graphic Violence
  • Weapons
  • Drugs
  • Custom Topics

Example image analysis result

Some benefits of an automated threat prevention system like Community Sift:

  • Decreased costs. Reduces moderation queues by 90% or more.
  • Increased efficiency. Queues are prioritized and sorted by risk, enabling purposeful moderation.
  • Empowers automation. Instead of pre-moderating or reacting after inappropriate images are published, you can let the system filter or prevent the images from being posted in the first place.
  • Increased scalability. You can grow your community without worrying about the scope of work required to moderate the content.
  • Safer than managing it yourself. In the case of Community Sift, we’re assessing images, videos, and text across multiple platforms. You gain a lot from the network effect.
  • Shape the community you want. You can educate your user base proactively. For example, instead of just accepting inbound pornographic images, you can warn the user that they are uploading content that breaks your terms of use. A warning system is one of the most practical ways to encourage positive user behavior in your app.
  • Get back to what matters. Instead of trying to tackle this problem, you can focus on building new features and ideas. Let’s face it… that’s the fun stuff, and that’s where you should be spending your time — coming up with new features for the community that’s gathered together because of your platform.

The latest release of the Community Sift image classification service has been built from the ground up with our partners using machine learning and artificial intelligence. This new incarnation of the image classifier was trained on millions of images so it can distinguish between, for example, a pornographic photo and a picture of a skin-colored donut.

Classifying images can be tricky. In earlier iterations of our image classification service, the system wrongly believed that plain, glazed donuts and fingernails were pornographic since both image types contained a skin tone color. We’ve since fixed this, and the classifier is now running at a 98.14% detection rate and a 0.32% false positive rate for pornography. The remaining 1.86%? Likely blurry images or pictures taken from a distance.
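For the curious, the two percentages above come from standard confusion-matrix arithmetic. A worked example with made-up evaluation counts (not our real evaluation data):

```python
# Worked example of the two metrics quoted above, using made-up counts
# rather than real evaluation data.

def detection_rate(true_positives: int, false_negatives: int) -> float:
    """Share of genuinely pornographic images the classifier catches
    (also called recall)."""
    return true_positives / (true_positives + false_negatives)

def false_positive_rate(false_positives: int, true_negatives: int) -> float:
    """Share of clean images wrongly flagged as pornographic."""
    return false_positives / (false_positives + true_negatives)

# Hypothetical evaluation set: 10,000 pornographic and 10,000 clean images.
tp, fn = 9_814, 186      # caught vs. missed pornographic images
fp, tn = 32, 9_968       # wrongly flagged vs. correctly passed clean images

print(f"{detection_rate(tp, fn):.2%}")        # 98.14%
print(f"{false_positive_rate(fp, tn):.2%}")   # 0.32%
```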

On the image spectrum, some content is so severe it will always be filtered — that’s the 98.14%. Other content you will see again and again, and it requires action against the user, like a ban or suspension — that’s where we factor in user reputation. The more high-risk content users post, the closer we look at their content.

Some images are on the lower end of the severity spectrum. In other words, there is less danger if they appear on the site briefly, are reported, and then removed — that’s the 1.86%.
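The severity-and-reputation logic described above can be sketched roughly like this; the numeric thresholds and the repeat-offender count are illustrative assumptions, not Community Sift internals:

```python
# Sketch of the severity spectrum described above: severe content is
# always filtered, mid-severity content triggers action against repeat
# offenders, and low-severity content is left to user reports.
# Thresholds and the reputation mechanic are illustrative assumptions.

def moderate(severity: int, prior_high_risk_posts: int) -> str:
    """severity: 0 (harmless) .. 10 (always filtered)."""
    if severity >= 8:
        return "filter"                  # the "always filtered" band
    if severity >= 4:
        # Mid-severity: look harder at users who keep posting it.
        if prior_high_risk_posts >= 3:
            return "suspend user"
        return "filter and warn"
    return "allow"                       # rely on reports and removal

print(moderate(9, 0))   # filter
print(moderate(5, 4))   # suspend user
print(moderate(1, 4))   # allow
```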

By combining the image classifier with the text classifier, Community Sift can also catch less-overt pornographic content. Some users may post obscene text within a picture instead of an actual photo, while other users might try to sneak in a picture with an innuendo, but with a very graphic text description.
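One simple way to combine the two classifiers, sketched here as an assumption rather than a description of Community Sift internals, is to take the higher of the two risk scores, so that either channel alone is enough to flag a post:

```python
# Sketch of combining the image and text classifiers on a single post:
# take the maximum of the two risk scores, so graphic text overlaid on
# an innocuous image (or vice versa) still gets flagged. The 0-1 score
# scale and the threshold are illustrative assumptions.

def combined_risk(image_score: float, text_in_image_score: float) -> float:
    return max(image_score, text_in_image_score)

def should_flag(image_score: float, text_score: float,
                threshold: float = 0.8) -> bool:
    return combined_risk(image_score, text_score) >= threshold

# Innocuous photo carrying obscene overlaid text:
print(should_flag(image_score=0.1, text_score=0.92))  # True
```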

Keeping on top of incoming user-generated content is a huge amount of work, but it’s absolutely worth the effort. In studies conducted by our Data Science team, we’ve observed that users who engage in social interactions are three times more likely to keep using your product and to stay in your community.

By creating a social platform that allows people to share ideas and information, you have the ability to create connections between people from all around the world.

Community is built through connections between like-minded individuals who bond over shared interests, and those relationships grow stronger and harder to break when people come together through shared beliefs. MMOs like World of Warcraft and Ultima Online mastered the art of gaming communities, resulting in long-term businesses rather than short-term wins.

To learn more about how we help shape healthy online communities, reach out to us anytime. We’d be happy to share more about our vision to create a harassment-free, healthy social web.

Using Kindness to Grow Healthy Online Communities

For every 28 articles published on bullying, only one is published on kindness. Does this strike you as odd?

It gets weirder. Let’s operate on the assumption that kindness is the antithesis of bullying. Studies have found that friendly teachers and welcoming learning environments result in less bullying.

Makes sense. And yet, when it comes to changing bad behavior, the bulk of our efforts continue to focus on the problem instead of the solution. Are we going about this all wrong?

A lot of very smart people have been asking themselves that very question. Allow me to introduce you to a little-known, up-and-coming field of psychology called — you guessed it…

Positive psychology

Positive psychology studies the strengths that enable human beings to thrive. Psychology has traditionally focused on human suffering while treating mental illness. Positive psychology sets itself apart by studying what makes life worth living. Instead of focusing solely on healing psychological damage, it strives to cultivate the conditions that enable us to live fulfilling and meaningful lives.

Consider healthy food. Eating well is a proactive approach to looking after one’s health. In the same way, when we foster positivity we take a proactive approach to making the world a better place.

An interesting shift is occurring in schools right now. Instead of “What is wrong that needs fixing?” educators are asking themselves “How can we nurture the strengths and attributes of students?” This shift is in response to the success that positive psychology has had in reducing negative affect, increasing life satisfaction, and fostering creative thinking. If kindness becomes the standard, all anyone has to do is continue the trend. Those who are shown kindness are more likely to be kind to others. It’s an old concept — pay it forward — and it works.

You can see why it’s important that we study not only the phenomenon of bullying but also its counterpart, kindness… perhaps with the 28:1 ratio reversed. When we focus on the positive, we shift the conversation from the negative.

Imagine putting this into practice in your community. What would it look like?

Saving the world with kindness

“If we could change ourselves, the tendencies in the world would also change. As a man changes his own nature, so does the attitude of the world change towards him.” — Mahatma Gandhi

If you watch closely, you might notice you or a friend subtly imitating a conversation partner. In psychology, this is called “mirroring,” the subconscious tendency for one person to imitate the body language, speech pattern, or attitude of another. Copying another person’s nonverbal signals creates connection and builds rapport.

The Golden Rule encourages us to treat others as we would like to be treated. If we treat others with kindness and respect, is it fair to expect that they respond in kind (see what I did there)?

For most people, the answer is yes. It’s in our nature to imitate. Imitation allows for the transfer of cultural artifacts like customs, behaviors, and traditions, and plays a huge role in the creation of culture.

Kindness has the potential to catch fire in our culture and shape our communities for the better, but it has to start somewhere. Why not with you? And why not your community?

Polarization

The words “be kind” are a lot more effective than “do not bully others.” The “do not” is often lost in translation, and the message becomes “bully others.” Psychologists call this reactance theory. When someone feels that their freedom to choose is under threat, they can become motivated to do the opposite, no matter how irrational the choice might seem. Again, if we switch the message from the negative to the positive, people are far more likely to listen and respond. What if, as community managers, we rewarded our most positive users instead of only punishing our most negative users?

Of course, if ordering people to be kind always succeeded, we’d all be singing kumbaya around a campfire, and there would be no need for anyone to write a blog post about it.

To understand this better, let’s look at complementary and non-complementary behavior.

Breaking the cycle

“The self is not something ready-made, but something in continuous formation through choice of action.” — John Dewey, philosopher, psychologist, and educational theorist

Complementary behavior refers to our tendency to treat others as they treat us — the old Golden Rule. Sometimes, though, we are presented with situations that require unexpected reactions – what psychologists call non-complementary behavior. Non-complementary behavior doesn’t happen often because, simply put, it’s hard.

It’s also instinct: when someone makes you feel bad, how often are you inclined to make them feel good in return?

Well, that’s what makes non-complementary behavior so powerful and revolutionary. An episode of NPR’s podcast “Invisibilia” tells the story of a group of friends who were confronted by an armed robber in their backyard. No one at the party had cash. The robber now had to choose between leaving empty-handed or making good on his threat of violence. Tension grew until one of the women did something unexpected: she offered him a glass of wine.

It worked. The robber pocketed his gun and took the glass. The tension broke, and the situation was transformed. The end result? A highly improbable group hug.

Could you offer a glass of wine to the person holding a gun to your head? How committed are you to shaping a community that values kindness above all else?

Kindness is hard work. But as we’ve seen in study after study, and example after example, it’s worth it. Your community will be stronger for it.

How to contribute to a culture of kindness

“To achieve greatness, start where you are. Use what you have. Do what you can.” — Arthur Ashe, tennis champion

The next time you are on the receiving end of somebody’s kindness, consider doing more than simply thanking them — instead, pay it forward. Give back a little kindness of your own to the community at large. Try implementing it in your online community. Reward your good users for positive behavior with an extra item. Encourage them to pay it forward.

Most of us strive to live in harmony with our community. The movement to perform random acts of kindness, while exemplary, isn’t necessarily the fastest way to build habits around kindness. It could be argued that acts of kindness are not random at all but in fact completely natural and inherent to the human spirit. Instead, it is the intentional act of following those impulses that requires conscious cultivation — and some hard work.

Ask yourself the question — what does kindness in the community mean to me? How does it benefit the community as a whole, and how can I cultivate it?

As we’ve learned, when you focus on the solution instead of the problem, the results are extraordinary.

From Brexit to Bulbasaur – On the Evolution of Language

There is something magical at the heart of language, isn’t there? At the intersection of noun, verb, and clause exists endless creativity and invention. Language encourages the artist in all of us.

We are more connected now than ever before, and we as individuals are shaping language to fit our needs, whether we’re busy texters, meme creators, or blog commenters. If we’re online, we are using language in unique ways. We reconstruct language every day.

The Great Vowel Shift

Language has always been in a state of transformation. Words come into and go out of style, and phrases expand and contract (what linguist Guy Deutscher calls “expressiveness” and “erosion”). Witness the Great Vowel Shift that happened in England between the 14th and 17th centuries, in which long stressed vowel sounds changed so completely that spoken Middle English is almost a different language.

You would have a very hard time understanding Chaucer if you talked to him today, although his great collection The Canterbury Tales is still famous – not to mention gloriously readable in written Middle English – today. The Great Vowel Shift led to the first attempts to standardize spelling and punctuation in the 1600s, a process that continues to this day.

The Two Hat Security Language & Culture team is responsible for building and maintaining the dictionary that classifies words and phrases based on their risk, subject, and context. Like the proto-linguists of the 17th century, they are inveterate listeners and watchers, students of pop culture with a passion for language in all its complexities and quirks.

Had the Language & Culture team existed in Elizabethan England, imagine the hours of overtime it would take to keep up with Shakespeare, inventor of roughly 1700 words, and the premier language builder of his time! 

Since the introduction of the World Wide Web in 1991 and the first user-friendly browser a year after that, language has undergone another Great Shift – not of pronunciation like the Great Vowel Shift, but of invention, like Shakespeare.

A new kind of language

We live in an age of memes that spread, evolve, and disappear within days, if not hours. New apps and social media platforms spring up overnight and with them, new terms and new ways of using familiar words. Friend is now a verb. Birds aren’t the only ones who tweet. It’s more imperative than ever that we – at Two Hat Security, in the Language & Culture and Client Success Teams, and anyone who manages an online community – recognize that language is a living, breathing, ever-shifting work of art.

Consider the last six months, and the words that have entered, and in many cases re-entered the cultural lexicon!

From Brexit to the Bataclan and Nice; from woke to #blacklivesmatter; from af to bae; not to mention the ten new terms that emerged in the time it took to write this blog – we have to stay on top of the trends. The ultimate goal in any community – and our mission statement here at Two Hat Security – is to maximize expressivity and minimize toxicity, and we can only do that if we speak the language of the now.

We haven’t even mentioned emoticons yet – the newest and most dynamic language trend. Also, so cute! ʕ•ᴥ•ʔ

From Brexit to Bulbasaur

In the last month alone Language & Culture has added an entire set of Pokémon Go words – gotta add ‘em all! – to the dictionary, including those delightfully whimsical names Bulbasaur, Charmander, Butterfree, and more. The Republican and Democratic conventions unearthed terms like bully pulpit, bellwether, and gerrymander, and added new words like Servergate. There is no doubt the Rio Olympics and the US election will prove just as linguistically rich in the coming months!

Language is alive, and change is a constant in life. Creative, unruly, and entirely human, it follows the ebb and flow of culture, politics, and technology. Two Hat Security and the Language & Culture team are both spectators and participants in the great language experiment.

Language & Culture does the hard work, but we encourage you as a community manager, a moderator, an app designer, or simply part of the community to do the rest. As humans, we are all part of this grand experiment called language – let’s build it together. Let’s make it magical.

About Two Hat Security

At Two Hat Security, we empower social and gaming platforms to build healthy, engaged online communities, all while protecting their brand and their users from high-risk content.

Want to increase user retention, reduce moderation, and protect your brand? Get in touch today to see how our chat filter and moderation software Community Sift can help you build a community that stays on top of language trends.


Bulbasaur image: By Criszoe (Own work) [CC BY-SA 4.0 (https://creativecommons.org/licenses/by-sa/4.0)], via Wikimedia Commons



Keeping Kids Safe Online

Digital communication offers many incredible ways to connect, discover, and share messages with others around the world. However, there are risks inherent in the conversations, social media, games, apps, and websites found online.

What are the best ways to keep kids safe online?

  • As a parent or guardian, educate yourself about the risks kids face online
  • Educate your child about those risks in an age-appropriate way
  • Explore and experience the games, apps, and websites your child visits
  • Activate all child safety settings for your computer’s operating system, search engines, games and game consoles, and cellphones
  • Set clear rules and enforce them
  • Establish limits for time online per day
  • Follow your child on social media, but respect their space and be careful not to stalk them
  • Encourage your child to behave as a good online citizen and to maintain a positive reputation
  • Tell your child to NEVER share their passwords or personal information
  • Encourage your child to share any questions they have or any unusual encounters they experience while online
  • Keep devices in a common area of the home
  • Engage your child in conversation about what they are doing, and have been doing, on the web
  • Be a model of positive online behavior
  • Encourage and engage in offline activities

Make being on the internet a positive experience for your child and for others by teaching them to be positive, responsible online citizens.

Inspiration for this post:

Seven Steps to Good Digital Parenting
Dear Parents – a Message From Miss Florida 2015