How Maslow’s Hierarchy of Needs Explains the Internet


Online comments.

Anonymous egg accounts.

Political posts.

… feeling nauseous?

Chances are, you shuddered slightly at the words “online comments.”

Presenting Exhibit A, from a Daily Mail article about puppies:

It gets worse. Presenting Exhibit B, from Twitter:


The internet has so much potential. It connects us across borders, cultural divides, and even languages. And oftentimes that potential is fulfilled. Remember the Arab Spring in 2011? It probably wouldn’t have happened without Twitter connecting activists across the Middle East.

Writers, musicians, and artists can share their art with fans across the globe on platforms like Medium and YouTube.

After the terror attacks in Manchester and London in May, many Facebook users used the Safety Check feature to reassure family and friends that they were safe from danger.

Every byte of knowledge that has ever existed is only a few taps away, stored, improbably, inside a device that fits in the palm of a hand. The internet is a powerful tool for making connections, for sharing knowledge, and for conversing with people across the globe.

And yet… virtual conversations are so often reduced to emojis and cat memes. Because who wants to start a real conversation when it’s likely to dissolve into insults and vitriol?

A rich, fulfilling, and enlightened life requires a lot more.

So what’s missing?

Maslow was onto something…

Remember Maslow’s hierarchy of needs? It probably sounds vaguely familiar, but here’s a quick refresher if you’ve forgotten.

Abraham Maslow, then a psychology professor at Brooklyn College (he later moved to Brandeis University in Massachusetts), published his groundbreaking paper “A Theory of Human Motivation” in 1943. In this seminal paper, he identified and described five basic levels of human need. Each need forms a solid base under the next, and each, when achieved, leads to the next, creating a pyramid. Years later, he expanded on this hierarchy of human needs in his 1954 book Motivation and Personality.

The hierarchy looks like this:

  • Physiological: The basic physical requirements for human survival, including air, water, and food; then clothing, shelter, and sex.
  • Safety: Once our physical needs are met, we require safety and security. Safety needs include economic security as well as health and well-being.
  • Love/belonging: Human beings require a sense of belonging and acceptance from family and social groups.
  • Esteem: We need to be respected and valued by others.
  • Self-actualization: The ultimate. When we self-actualize, we become who we truly are.

According to Maslow, our supporting needs must be met before we can become who we truly are — before we reach self-actualization.

So what does it mean to become yourself? When we self-actualize, we’re more than just animals playing dress-up — we are fulfilling the promise of consciousness. We are human.

Sorry, what does this have to do with the internet?

We don’t stop being human when we go online. The internet is just a new kind of community: the logical evolution of the offline communities we started forming when modern humans first emerged in Africa about 200,000 years ago. We’ve had countless chances since then to reassess, reevaluate, and refine our offline community etiquette, which gives offline communities a distinct head start over the internet.

Merriam-Webster’s various definitions of “community” are telling:

  • people with common interests living in a particular area;
  • an interacting population of various kinds of individuals (such as species) in a common location;
  • a group of people with a common characteristic or interest living together within a larger society

Community is all about interaction and common interests. We gather together in groups, in public and private spaces, to share our passions and express our feelings. So, of course, we expect to experience that same comfort and kinship in our online communities. After all, we’ve already spent some 200,000 years cultivating strong, resilient communities and achieving self-actualization.

But the internet too often fails us, because people are afraid to do just that: to share what matters to them openly. Those of us who aspire to online self-actualization are too often drowned out by trolls. Which leaves us with emojis and cat memes: communication without connection.

So how do we bridge that gap between conversation and real connection? How do we reach the pinnacle of Maslow’s hierarchy of needs in the virtual space?

Conversations have needs, too

What if there was a hierarchy of conversation needs using Maslow’s theory as a framework?

On the internet, our basic physical needs are already taken care of, so this pyramid starts with safety.

So what do our levels mean?

  • Safety: Offline, we expect to encounter bullies from time to time. And we can’t get upset when someone drops the occasional f-bomb in public. But we do expect to be safe from targeted harassment, from repeated racial, ethnic, or religious slurs, and from threats against our bodies and our lives. We should expect the same when we’re online.
  • Social: Once we are safe from harm, we require places where we feel a sense of belonging and acceptance. Social networks, forums, messaging apps, online games — these are all communities where we gather and share.
  • Esteem: We need to be heard, and we need our voices to be respected.
  • Self-actualization: The ultimate. When we self-actualize online, we blend the power of community with the blessing of esteem, and we achieve something bigger and better. This is where great conversation happens. This is where user-generated content turns into art. This is where real social change happens.

Problem is, online communities are far too often missing that first level. And without safety, we cannot possibly move on to social.

The problem with self-censorship

In the 2016 study Online Harassment, Digital Abuse, and Cyberstalking in America, researchers found that nearly half (47%) of Americans have experienced online harassment. That’s big — but it’s not entirely shocking. We hear plenty of stories about online harassment and abuse in the news.

The real kicker? Over a quarter (27%) of Americans reported that they had self-censored their posts out of fear of harassment.

If we feel so unsafe in our online communities that we stop sharing what matters to us most, we’ve lost the whole point of building communities. We’ve forgotten why they matter.

How did we get here?

There are a few reasons. No one planned the internet; it grew organically, site by site and network by network. And because no one planned it, no one ever created a set of rules.

And the internet is still so young. Think about it: Communities have been around since we started to walk on two feet. The first written language emerged in Sumer about 5,000 years ago. The printing press was invented roughly 600 years ago. The telegraph has been around for nearly 200 years. Even the telephone, one of the greatest modern advances in communication, has a solid 140 years of etiquette development behind it.

The internet as we know it today — with its complex web of disparate communities and user-generated content — is only about 20 years old. And with all due respect to 20-year-olds, it’s still a baby.

We’ve been stumbling around in this virtual space with only a dim light to guide us, which has led to the standardization of some… less-than-desirable behaviors. Kids who grew up playing MOBAs (multiplayer online battle arenas) have come to accept toxicity as a byproduct of online competition. Those of us who use social media expect to encounter hate speech that would once have been unimaginable when we scroll through our feeds.

And, of course, we all know to avoid the comments section.

Can self-actualization and online communities co-exist?

Yes. Because why not? We built this thing — so we can fix it.

Three things need to happen if we’re going to move from social to esteem to self-actualization.

Industry-wide paradigm shift

The good news? It’s already happening. Every day there’s a new article about the dangers of cyberbullying and online abuse. More and more social products realize that they can’t allow harassment to run free on their platforms. The German parliament recently backed a plan to fine social networks up to €50 million if they don’t remove hate speech within 24 hours.

Even the Obama Foundation has a new initiative centered around digital citizenship.

As our friend David Ryan Polgar, Chief of Trust & Safety at Friendbase, says:

“Digital citizenship is the safe, savvy, ethical use of social media and technology.”

Safe, savvy, and ethical: As a society, we can do this. We’ve figured out how to do it in our offline communities, so we can do it in our online communities, too.

A big part of the shift includes a newfound focus on bringing empathy back into online interactions. To quote David again:

“There is a person behind that avatar and we often forget that.”

Thoughtful content moderation

The problem with moderation is that it’s no fun. No one wants to comb through thousands of user reports, review millions of potentially horrifying images, or monitor a mind-numbingly long live-chat stream in real time.

Too much noise + no way to prioritize = unhappy and inefficient moderators.

Thoughtful, intentional moderation is all about focus. It’s about giving community managers and moderators the right techniques to sift through content and ensure that the worst stuff — the targeted bullying, the cries for help, the rape threats — is dealt with first.

Automation is a crucial part of that solution. With artificial intelligence getting more powerful every day, social products can let computers do the heavy lifting first instead of forcing their moderation teams to review every post unnecessarily.
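To make that concrete, here’s a minimal sketch of AI-assisted triage: an automated classifier assigns each report a risk score, and human moderators pull the highest-risk items first. The classifier, names, and scores here are all hypothetical stand-ins, not a real product API.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Report:
    priority: float               # negated risk score, so the worst item pops first
    text: str = field(compare=False)

def risk_score(text: str) -> float:
    """Stand-in for a trained classifier; returns 0.0 (benign) to 1.0 (severe)."""
    severe_terms = {"threat", "kill"}
    return 1.0 if any(t in text.lower() for t in severe_terms) else 0.1

queue: list[Report] = []
for post in ["nice cat meme", "I will kill you", "mild insult"]:
    heapq.heappush(queue, Report(priority=-risk_score(post), text=post))

worst = heapq.heappop(queue)      # moderators see the most severe report first
print(worst.text)                 # -> "I will kill you"
```

The point of the priority queue is exactly the “too much noise + no way to prioritize” problem: instead of reviewing posts in arrival order, moderators always work from the most urgent report down.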

The content moderation strategy will be slightly different for every community. But there are a few best practices that every community can adopt:

  • Know your community resilience. This is a step that too many social products forget to take. Every community has a tolerance level for certain behaviors. Can your community handle the occasional swear word — but not if it’s repeated 10 times? Resilience will tell you where to draw the line.
  • Use reputation to treat users differently. Behavior tends to repeat itself. If you know that a user posts things that break your community guidelines, you can place tighter restrictions on them. Conversely, you can give engaged users the ability to post more freely. But don’t forget that users are human; everyone deserves the opportunity to learn from their mistakes. Which leads us to our next point…
  • Use behavior-changing techniques. Strategies include auto-messaging users before they hit “send” on posts that breach community guidelines, and publicly honoring users for their positive behavior.
  • Let your users choose what they see. The ESRB has the right idea. We all know what “Rated E for Everyone” means — we’ve heard it a million times. So what if we designed systems that allowed users to choose their experience based on a rating? If you have a smart enough system in the background classifying and labeling content, then you can serve users only the content that they’re comfortable seeing.
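Two of the practices above, reputation-based restrictions and user-chosen content ratings, can be sketched in a few lines of code. Everything here is illustrative: the thresholds, reputation scale, and rating labels are assumptions, not real Community Sift parameters.

```python
RATING_ORDER = ["everyone", "teen", "mature"]  # hypothetical labels, least to most permissive

def allowed_to_post(user_reputation: int, post_risk: float) -> bool:
    """Low-reputation users face tighter restrictions on risky content."""
    threshold = 0.3 if user_reputation < 50 else 0.7
    return post_risk <= threshold

def visible_to(post_rating: str, user_preference: str) -> bool:
    """Show a post only if its rating is within the user's chosen comfort level."""
    return RATING_ORDER.index(post_rating) <= RATING_ORDER.index(user_preference)

# A trusted user can post edgier content than a brand-new account...
print(allowed_to_post(user_reputation=90, post_risk=0.5))   # True
print(allowed_to_post(user_reputation=10, post_risk=0.5))   # False
# ...and a "teen" viewer sees "everyone" posts but not "mature" ones.
print(visible_to("everyone", "teen"))   # True
print(visible_to("mature", "teen"))     # False
```

Note the design choice: neither function deletes anything. Reputation shapes what gets through, and ratings shape what each user sees, so the same post can exist safely at different comfort levels.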

It all comes back to our hierarchy of conversation needs. If we can provide that first level of safety, we can move beyond emojis and cats, and on to the next level.

Early digital education

The biggest task ahead of us is also the most important — education. We didn’t have the benefit of 20 years of internet culture, behavior, and standards when we first started to go online. We have those 20 years of mistakes and missteps behind us now.

Which means that we have an opportunity with the next generation of digital citizens to reshape the culture of the internet. In fact, strides are already being made.

Riot Games (the studio that makes the hugely popular MOBA League of Legends) has started an initiative in Australia and New Zealand that’s gaining traction. Spearheaded by Rioter Ivan Davies, the League of Legends High School Clubs teaches students about good sportsmanship through actual gameplay.

It’s a smart move — kids are already engaged when they’re playing a game they love, so it’s a lot easier to slip some education in there. Ivan and his team have even created impressive teaching resources for teachers who lead the clubs.

Google recently launched Be Internet Awesome, a program that teaches young children how to be good digital citizens and explore the internet safely. In the browser game Interland, kids learn how to protect their personal information, be kind to other users, and spot phishing scams and fake sites. And similar to Riot, Google has created a curriculum for educators to use in the classroom.

In addition, non-profits like the Cybersmile Foundation and the UK Safer Internet Centre use social media to reach kids and teens directly.

Things are changing. Our kids will likely grow up to be better digital citizens than we ever were. And it’s unlikely that they will tolerate the bullying, harassment, and abuse that we’ve put up with for the last 20 years.

Along with a paradigm shift, thoughtful moderation, and education, if we want change to happen, we have to celebrate our communities. We have to talk about our wins, our successes… and especially our failures. Let’s not beat ourselves up if we don’t get it right the first time. We’re figuring this out.

We’re self-actualizing.

It’s time for the internet to grow up

Is this the year the internet achieves its full potential? From where most of us in the industry sit, it’s already happening. People are fed up, and they’re ready for a change.

This year, social products have an opportunity to decide what they really want to be. They can be the Wild West, where too many conversations end with a (metaphorical) bullet. Or they can be something better. They can be spaces that nurture humanity — real communities, the kind we’ve been building for the last 200,000 years.

This year, let’s build online communities that honor the potential of the internet.

That meet every level in our hierarchy of needs.

That promote digital citizenship.

That encourage self-actualization.

This year, let’s start the conversation.

***

At Two Hat Security, we empower social and gaming platforms to build healthy, engaged online communities, all while protecting their brand and their users from high-risk content.

Want to increase user retention, reduce moderation, and protect your brand?

Get in touch today to see how our chat filter and moderation software Community Sift can help you build a community that promotes good digital citizenship — and gives your users a safe space to connect.

Want more articles like this? Subscribe to our newsletter and never miss an update!
