Design Around These Edtech Challenges Before They Become Barriers To Success

Educational technology in the classroom is one of the latest trends to sweep the startup and tech industry. Designed to connect teachers, students, and parents, edtech platforms grow in popularity, not to mention profitability, every day. In fact, according to TechCrunch, in the first 10 months of 2017, global investors staked $8.15 billion in edtech ventures.

The most effective edtech platforms include communication features like online forums, message boards, or chat options. Why add social features to an educational product? They give students and teachers an additional, highly engaging level of interaction, much like the one they would experience in a traditional classroom setting.

As you can imagine, that opens the door to all sorts of challenges. How do you stop people from using profanity in your forum? Forget profanity – what about abusive language? The gaming industry has been dealing with online bullying and harassment for years – so what can edtech designers learn from them?

Pillar #1: Privacy by Design
GDPR and COPPA. CIPA and FERPA. Privacy regulations are everywhere. With Mark Zuckerberg’s recent testimony before the US Senate and the European Parliament, and today’s General Data Protection Regulation deadline heightening industry awareness of data privacy issues, we expect to see new and enhanced privacy regulations introduced over the next few years.

Going forward, any company that collects personal information – whether it’s an email address, birthdate, IP address, or more – will be expected to embed robust and transparent privacy features into their product.

Due to the especially sensitive nature of children’s personal information, kids’ products are already strictly regulated by COPPA, FERPA, and now GDPR.

While it may seem daunting at first, compliance doesn’t have to be scary, and it’s not insurmountable. In our recent webinar “The ROI of COPPA” (watch it on-demand), we explored the surprising benefits of building COPPA-compliant connected products for kids, including increased user engagement, retention, and LTV.

If you’re getting into the edtech business, it’s critical that user privacy be part of your plan from the start. Need guidance? We recommend working with a Safe Harbor company to ensure compliance.

Check out our beginner’s guide to COPPA compliance for a list of reputable companies that can help (they can assist with GDPR, too).

Pillar #2: Safety by Design
When it comes to kids’ online platforms, safety should never be an afterthought.

Any product that allows children to interact online (including games, social networks, messaging apps, and virtual worlds) is responsible for creating a safe environment where users are protected from bullying, harassment, hate speech, and child exploitation.

Regulations like COPPA and CIPA already mandate that under-13 products (and in the case of CIPA, K-12 libraries and schools) protect children from sharing and seeing Personally Identifiable Information (PII) and harmful online content.

Regardless of demographic, edtech products – from learning management systems to tutoring apps – must make user safety a core feature of their offering.

Here are the two fundamental kinds of content that you should address in your platform:

1. Personally Identifiable Information (PII)
COPPA says that you must take all reasonable measures to prevent users from sharing PII, which can include full name, phone number, email address, home address, and even images of children, like a profile pic. But COPPA and FERPA compliance aren’t the only reasons you should protect kids from sharing PII.

Children don’t always understand the dangers of exposing their personal information online. To prevent safety breaches, you need a text and image filter sophisticated enough to identify PII, which is notoriously difficult to detect thanks to persistent filter manipulation by young, savvy users. Wondering how hard it is to build a smart filter in-house? Check out the “Filter and redact PII” section of our beginner’s guide to COPPA.

2. High-risk content
We’re all familiar with the dangers of the internet. Abusive comments, bullying, harassment, sexual language and images, and illegal content like CSAM (child sexual abuse material) – they all put children online at risk.

When you include social features and user-generated content like chat, messaging, forums, and images in your edtech product, it’s important that you include a mechanism to proactively filter and moderate dangerous content. And if it’s built into your platform from the design phase onwards, you can avoid the challenges of implementing safety features at the last minute.

As with privacy and compliance, we don’t recommend doing it all yourself. Third-party software will save you loads of development time (and money) by providing battle-tested profanity filtering, subversion detection, context-based content triaging, and flexible options for multiple languages and demographics.
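If it helps to picture what proactive moderation looks like in practice, here is a minimal sketch of a publish-time hook written in Python. The endpoint, payload fields, and risk labels are hypothetical placeholders, not any particular vendor’s API:

```python
# A minimal sketch of a "filter before publish" hook for user-generated content.
# The moderation endpoint, payload fields, and risk labels are hypothetical;
# swap in the API of whichever vendor you choose.
import requests

MODERATION_URL = "https://moderation.example.com/v1/classify"  # placeholder endpoint


def publish_message(user_id: str, text: str) -> dict:
    """Classify a chat message before it reaches other students."""
    response = requests.post(
        MODERATION_URL,
        json={"user_id": user_id, "text": text, "language": "en"},
        timeout=2,
    )
    result = response.json()  # e.g. {"risk": "high", "topics": ["pii", "bullying"]}

    if result.get("risk") == "high":
        return {"published": False, "reason": "blocked_by_filter"}
    if result.get("risk") == "medium":
        # Publish but queue for human review, or hold back, depending on policy.
        return {"published": True, "queued_for_review": True}
    return {"published": True}
```

The key design choice is that nothing reaches other students until the classifier has answered; anything the service can’t confidently clear goes to a human review queue rather than straight to the feed.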

After all, you’re the education and technology expert. Your focus should be on creating the best possible experience for your users, and building content that is engaging, sticky, and informative.

Edtech is a growing industry with endless opportunities for new and established brands to make their mark with innovative products. Techniques for engaging students are changing every day, with new approaches like BYOD (Bring Your Own Device) and after-school esports leagues (check out these teaching resources) entering the classroom.

Online games, virtual worlds, and social networks have spent years figuring out how to keep children safe on their platforms.

So why not take a page from their playbook, and make Privacy by Design and Safety by Design two of the pillars of your edtech platform?


The ROI of COPPA Compliance

Join us on Thursday, May 10th for an exclusive webinar, The ROI of COPPA. You’ll learn about the benefits of the children’s online privacy regulation, including increasing user retention, fostering engagement, and boosting profits.

If you can’t make it live, sign up anyway and we’ll send you a recording.

We’ve partnered with our friends at PRIVO, the industry experts in children’s online privacy and delegated consent management, to shed light on the business benefits of building connected products for under-13 users.

As we explained in our Four Step Beginner’s Guide to COPPA, obvious reasons to get compliant include avoiding FTC fines, bad press, and loss of user trust. But what else should you know about the regulation? And can COPPA actually be good for business?

This webinar is ideal for anyone who wants to better understand COPPA restrictions, requirements, and ultimately how compliance drives engagement, retention, and revenue.

Building an education app? Participating in the Designed for Families program for Google Play? Unsure if it’s even worth designing an online product for young children and families? You don’t want to miss this.

Save your spot today!

The ROI of COPPA
Thursday, May 10th | 10:00 AM – 11:00 AM PST | 1:00 PM – 2:00 PM EST

A Four-Step Beginner’s Guide to COPPA Compliance

There are few things more rewarding in the digital space than creating a great kids’ product. The market is booming, and there are endless possibilities for new, innovative products. Kids (and parents) today are looking for unique, engaging, and above all safe apps and sites where they can play, share, and connect with other kids across the world.

But if you plan to market your product to kids under 13 in the US, by law you have to ensure that it follows the Children’s Online Privacy Protection Act (COPPA). The act regulates how you collect and store personal information of under-13 users in your product.

And it’s a big deal. Companies like Hasbro, Mattel, Viacom, and JumpStart Games were fined in 2016 for violating COPPA, and three new lawsuits were announced this month. If you watched the last season of Silicon Valley, you know from Dinesh and the rest of the Pied Piper crew that COPPA fines are hefty. Privacy regulations are a hot topic in 2018: GDPR takes effect in the EU on May 25th, and the ESRB has proposed modifications to its Safe Harbor program, with public comments due on May 9th.

COPPA, like most regulations, can be confusing to navigate, especially if you’re new to the industry. That’s why we’ve put together this four-step guide to help you unpack the legalese.

(Register to watch our on-demand webinar The ROI of COPPA. In it, we explore the benefits of compliance, including increased engagement, retention, and profits.)

Let’s get the easy stuff out of the way first.

What is COPPA?

The Children’s Online Privacy Protection Act (COPPA) protects the online privacy of children under the age of 13 in the US. It’s a large and complex rule (you can read it in its entirety here), but there are a few key points you should know:

  • COPPA only applies to children in the US under the age of 13 (U13). If you are a US-based company, you are expected to protect U13 users globally. However, if your company is based outside the US, you are only legally obligated under COPPA to protect American children on your platform — or risk a potentially crippling fine from the FTC or even a US State Attorney General.
  • Children outside the US are covered by similar emerging regulations, like the GDPR and its consent requirements for minors. Those laws may also apply to US companies operating in those countries.
  • COPPA only applies to children under the age of 13. Other international regulations have different cutoff ages. For example, GDPR lets member states set the age for parental consent requirements anywhere from 13 to 16.
  • COPPA is designed to protect children’s online privacy and data security — it doesn’t prevent cyberbullying or profanity.
  • COPPA is not just about removing private information. It’s also about having parental consent to collect, use, disclose, track, or share private information.
  • COPPA also applies to third-party plugins and services you use. This is a tricky area that requires proper due diligence; see the class action lawsuits recently filed in this area as an example.

Please note: Two Hat Security is not a law firm and cannot provide legal advice. While we provide technology that helps companies achieve compliance, you should still always seek independent legal counsel. We recommend reaching out to a Safe Harbor provider like PRIVO, the kidSAFE Seal Program, or the ESRB (more about these companies below).

In addition, the FTC has put together a Six-Step Compliance Plan that is a great reference and another helpful starting point.

What is PII (Personally Identifiable Information)?

PII is any information that can be used to identify a person in real life. This list is from the FTC’s page:

  • full name;
  • home or other physical address, including street name and city or town;
  • online contact information like an email address or other identifier that permits someone to contact a person directly — for example, an IM identifier, VoIP identifier, or video chat identifier;
  • screen name or user name where it functions as online contact information;
  • telephone number;
  • Social Security number;
  • a persistent identifier that can be used to recognize a user over time and across different sites, including a cookie number, an IP address, a processor or device serial number, or a unique device identifier;
  • a photo, video, or audio file containing a child’s image or voice;
  • geolocation information sufficient to identify a street name and city or town; or
  • other information about the child or parent that is collected from the child and is combined with one of these identifiers.

Personal information isn’t just a user’s real name. It also includes cookie numbers and geolocation.

Let’s get into the meat and potatoes — the four crucial steps to get started.

Step one is a big one.

Step 1: Choose your strategy

There are a few options here. Each one has its pros and cons.

Option 1: Don’t have under-13 users in your product

This is a common strategy used by large sites and apps like Facebook, YouTube, and Twitter. To be in this category, your product cannot be child-directed. So if you’ve created a product that is clearly designed for children, partner with kid-directed products, or market yourself as a place for youth, this option won’t work.

However, if you qualify as a general audience product, you can reduce your risk by providing an age gate and blocking U13 users. Typically that means asking users to enter their birth date when they sign up. If you do this, you’ll also need a mechanism to block users from first selecting a child’s birthdate and then changing their answer.

As well, you will need a strategy to manage accounts that you discover are under 13 — think of kids who post pictures from their birthday party with the caption “I’m 10 now!” If that’s reported, you have to build a process to deal with it.
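To make the age-gate and follow-up handling concrete, here is a minimal sketch in Python. It assumes a persistent store keyed by a device or account identifier; the field names, cutoff logic, and review hook are illustrative only:

```python
# A minimal age-gate sketch, assuming a persistent store keyed by a device or
# account identifier. Field names and storage are illustrative only.
from datetime import date
from typing import Optional

UNDERAGE_CUTOFF = 13
age_gate_store = {}  # stand-in for a real datastore


def years_old(birthdate: date, today: Optional[date] = None) -> int:
    today = today or date.today()
    return today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )


def check_age_gate(device_id: str, birthdate: date) -> bool:
    """Return True if signup may proceed."""
    record = age_gate_store.setdefault(
        device_id, {"first_birthdate": birthdate, "blocked": False}
    )
    # The first birthdate entered is the one that counts, so a blocked user
    # can't simply retry with an older date.
    if not record["blocked"] and years_old(record["first_birthdate"]) < UNDERAGE_CUTOFF:
        record["blocked"] = True
    return not record["blocked"]


def flag_underage_account(account_id: str) -> None:
    """Hook for an account that is reported or discovered to be under 13."""
    record = age_gate_store.setdefault(account_id, {"first_birthdate": None})
    record["blocked"] = True  # e.g. suspend and route to your COPPA review queue
```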

So, if your product is clearly designed for children, you will have to go with one of the following two options.

Option 2: Verifiable Parental Consent

Works if you need users to share personal information

Parental consent is a necessity if your site or app is a social network for under-13 users who share their real identity. As well, if your site or app is designed for video or photo sharing, getting a parent’s permission is a smart option. With the photo regulations in COPPA, it’s much smarter to err on the side of caution.

Parental consent is tricky. Accepted methods include faxing or mailing in a signature, speaking to a customer support rep, or making a payment of one dollar or more. The FTC recently approved two new options — knowledge-based challenge questions and driver’s license verification (which comes with its own security risks!).

Option 3: Email plus

Works if your product doesn’t require that users share personal information

The third option is to add a parental permission step to the sign-up process. This first requires that you build an age gate. Then, if the user is under 13, you require that they enter their parent’s email address or another online identifier to request consent before gaining access to their account. After that, you send an email with proper disclosure notices and an activation link to their parents. Animal Jam, Club Penguin, and many Disney products use this strategy.
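As a rough sketch (with a placeholder mailer callback, token store, and activation URL rather than any specific vendor’s API), the email plus flow might look like this in Python:

```python
# A rough sketch of the "email plus" flow. The mailer callback, token store,
# and activation URL are placeholders, not a specific vendor's API.
import secrets

pending_consent = {}  # activation token -> child account id


def request_parental_consent(child_account_id: str, parent_email: str, send_email) -> None:
    """Create the child account in a locked state and email the parent."""
    token = secrets.token_urlsafe(32)
    pending_consent[token] = child_account_id
    activation_link = f"https://edtech.example.com/consent/{token}"  # placeholder URL
    send_email(
        to=parent_email,
        subject="Your child wants to join our classroom app",
        body=(
            "Here is what we collect, how we use it, and how to withdraw consent...\n"
            f"To approve the account, click: {activation_link}"
        ),
    )


def activate_account(token: str) -> bool:
    """Parent clicked the activation link: unlock the child's account."""
    child_account_id = pending_consent.pop(token, None)
    if child_account_id is None:
        return False
    # Mark the account as consented, and record the timestamp and notice version.
    return True
```

The important detail is that the child’s account stays locked until the activation link is used, and the email itself carries the required disclosure notices.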

The downside to this option? It can be bad for business. Every time you add a step to the registration process, you lose more users. In some cases, you can lose up to 50% of all candidates during the email sign-up process.

But adding parental permission during the signup process is just the first step. You still have to ensure that kids can’t disclose their personal information once they’re logged in.

Here’s how COPPA defines “disclosure”:

(2) Making personal information collected by an operator from a child publicly available in identifiable form by any means, including but not limited to a public posting through the Internet, or through a personal home page or screen posted on a Web site or online service; a pen pal service; an electronic mail service; a message board; or a chat room.

To translate: You can’t let your under-13 users post “My name is Joe Rogers and I live at 2345 Markum street. Will you be my friend?” At that point, you have collected PII and broadcast it publicly. This also applies to profile pictures if they contain GPS information or a child’s face.

To prevent disclosure, you’ll need to combine the email plus option with a chat and/or image filter that checks every post for PII. More on that later.

Wait, there’s more!

Another factor to consider is what kind of third-party apps and services you are using. Google Analytics and others have a config setting that can turn off user-specific behavior tracking. If you’re building a mobile app, remember to scan all your plugins to ensure they are compliant.

Which option do you choose?

Whatever decision you make will be based on your unique product.

Fortunately, there are a few companies that can help you implement whichever strategy you choose.

Full disclosure: We work with all three companies below, which is why we feel comfortable recommending their services.

Recommended products

AgeCheq offers a developer-friendly service that handles nearly all of the parent/child and technical complexities of COPPA compliance for you. After you provide accurate answers to a focused privacy survey and drop a few lines of code into your website or app, COPPA parental notice and consent (and ongoing consent management) are handled with a single sign-on parental dashboard that supports multiple children and (of course) multiple publishers.

AgeCheq has put a lot of work into streamlining the steps parents and children must take to enjoy your content while still complying with COPPA. DragonVale World and WolfQuest are two very popular apps that use the AgeCheq service. If you expect to have EU users, AgeCheq also offers a universal service called ConsentCheq CDK that provides GDPR, ePrivacy, and COPPA consent with a single integration.

Kids Web Services (KWS) is a self-service platform that helps developers build COPPA and GDPR-K compliant digital products for kids and parents, while increasing engagement and conversion. Building awesome digital experiences for the under-13 market is hard: you focus on building fun, engaging products, and KWS ensures everything you do is transparent and compliant by design with global data privacy laws such as COPPA and GDPR.

 

PRIVO offers an FTC-approved kid and family customer identity and permission management platform. Their platform includes registration and account creation, identity proofing, sliding-scale parental consent, single sign-on, and an account management and permission portal.

From their site: “PRIVO helps you build trust with your users, partners, and regulators by helping you meet or exceed federal and international data regulations. PRIVO can act as your fractional Chief Privacy Officer.”

They are a COPPA Safe Harbor provider as well, so they can also review your compliance (more about Safe Harbor later).

Step 2: Have a clear privacy policy and notices

Like most people, we start to go cross-eyed when looking at legal documents. The FTC outlines the components of a COPPA compliant privacy policy here.

One of the most important takeaways? Be clear and avoid legalese. AgeCheq and PRIVO can help with that.

Step 3: Filter or redact PII

Here is where a chat filter and automated moderation software like Community Sift become business-critical.

Whenever an under-13 user gives you text, images, or video, you need to ensure that it doesn’t contain PII (unless you’ve received verifiable parental consent).

Kids are persistent. And they’re very savvy — they know their way around simple blacklists and unsophisticated filters.

Some examples of kids sharing PII that we’ve seen:

“My add dress is one two 3 on homer street”

“Ping me at two 5 oh seven seven 55”

“250” (line 1)
“Seven” (line 2)
“Seven” (line 3)
“55” … (you get the idea)

Here’s the problem: Most filters confuse these intentional manipulations with innocent things like:

“9 8 7 6 5 4 3 2 1 go”

“My stats are 456.344.2222.222 with XP of 233”

“I have 3 cats 1 dog 4 hamsters and 67 goldfish”

You don’t want to prevent kids from talking about their pets or sharing their achievements. But can you do that without blocking all numbers?
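For illustration, here is a deliberately naive Python sketch of the digit-normalization idea. It catches the spelled-out examples above, but as noted in the comments, a raw digit count also flags countdowns and game stats, which is exactly why the reputation layer described in the next section matters:

```python
# A deliberately simple sketch of digit normalization across consecutive chat
# lines. A raw digit count also flags countdowns and game stats, so production
# filters layer context and user reputation on top (see the next section).
import re
from typing import List

WORD_DIGITS = {
    "zero": "0", "oh": "0", "one": "1", "two": "2", "three": "3", "four": "4",
    "five": "5", "six": "6", "seven": "7", "eight": "8", "nine": "9",
}


def normalize_digits(text: str) -> str:
    """Map spelled-out digits to numerals and keep only the digits."""
    tokens = re.findall(r"[a-z]+|\d", text.lower())
    return "".join(
        tok if tok.isdigit() else WORD_DIGITS[tok]
        for tok in tokens
        if tok.isdigit() or tok in WORD_DIGITS
    )


def looks_like_phone_number(recent_lines: List[str], min_digits: int = 7) -> bool:
    """Join the user's last few lines and check for a phone-number-length run."""
    digits = "".join(normalize_digits(line) for line in recent_lines)
    return len(digits) >= min_digits


# "Ping me at two 5 oh seven seven 55"             -> "2507755" -> flagged
# "I have 3 cats 1 dog 4 hamsters and 67 goldfish" -> "31467"   -> not flagged
```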

Reputation matters

Community Sift is a reputation-based system, which is the best way to deal with endlessly complicated filter tricks. With 1.8 million language patterns and linguistic templates added to the system, we find the most common ways users try to break the filter.

When a user is caught manipulating the filter, they earn a “reputation” for actively trying to share PII. Then, the system automatically changes their settings to prevent them from sharing any numbers. Innocent users can still use numbers to talk about points, items, etc.

The best part is that users can change their reputation based on behavior. So, users with a “bad” reputation who stop attempting to share their phone numbers can earn a “good” reputation and regain their old settings. And it all happens in real time.
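As a toy illustration (not Community Sift’s actual implementation), a reputation-driven setting might look something like this, with strikes that decay over time so users can earn back their old settings:

```python
# A toy illustration of reputation-driven filter settings. This is not
# Community Sift's implementation, just the general shape of the idea.
import time


class UserReputation:
    def __init__(self, decay_seconds: float = 7 * 24 * 3600):
        self.strikes = 0.0
        self.last_update = time.time()
        self.decay_seconds = decay_seconds

    def _decay(self) -> None:
        """Strikes fade over time, so good behavior restores old settings."""
        elapsed = time.time() - self.last_update
        self.strikes = max(0.0, self.strikes - elapsed / self.decay_seconds)
        self.last_update = time.time()

    def record_pii_attempt(self) -> None:
        self._decay()
        self.strikes += 1.0

    @property
    def numbers_allowed(self) -> bool:
        """Users with a 'bad' reputation temporarily lose the ability to post numbers."""
        self._decay()
        return self.strikes < 3.0
```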

Sanctions matter, too

As a best practice, we recommend using both user reputation and sanctions to moderate PII. Progressive sanctions give users the opportunity to change their behavior over time.

You can start by displaying a message to the user, explaining why sharing PII is dangerous. If they try again, suspend their account for 24 hours, then 72 hours — and so on.

The good thing is that immediate feedback almost always works. After the initial warning, most users won’t try again.
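A progressive sanction ladder can be as simple as a lookup table keyed by the user’s violation count. The thresholds and durations below are illustrative, not prescriptive:

```python
# A minimal sanction ladder, assuming you track how many PII violations a user
# has accumulated. The thresholds and durations are illustrative.
from datetime import timedelta
from typing import Optional, Tuple

SANCTION_LADDER = [
    ("warn", None),                    # first offense: explain why sharing PII is risky
    ("suspend", timedelta(hours=24)),  # second offense
    ("suspend", timedelta(hours=72)),  # third offense
    ("suspend", timedelta(days=14)),   # repeat offenders
]


def next_sanction(violation_count: int) -> Tuple[str, Optional[timedelta]]:
    """Pick the sanction for this violation, capped at the top of the ladder."""
    index = max(0, min(violation_count, len(SANCTION_LADDER)) - 1)
    return SANCTION_LADDER[index]


# next_sanction(1) -> ("warn", None)
# next_sanction(3) -> ("suspend", timedelta(hours=72))
```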

Why does COPPA matter?

For us, one of the most important sections in the regulation is the definition of collecting PII: “Enabling a child to make personal information publicly available in identifiable form.”

The definition continues:

“An operator shall not be considered to have collected personal information under this paragraph if it takes reasonable measures to delete all or virtually all personal information from a child’s postings before they are made public and also to delete such information from its records,” from §312.2.2; emphasis ours

Kids don’t understand the inherent dangers of sharing PII online. They are persistent, and unless we put (reasonable) measures in place to prevent it, they will keep trying to post their phone numbers, email addresses, real names, and more.

You’ll never catch everything. But you can detect and block nearly all instances of PII without crippling your business, and without stifling actual conversation.

As operators of kid-friendly social products, it’s our responsibility to protect children from their own innocence.

COPPA was created to protect children’s privacy. But in the process, it also protects them from predators, grooming, and luring.

Step 4: Get a COPPA Safe Harbor to validate

COPPA compliance is no easy task. Every product is unique and requires individual attention.

As a final step, the industry best practice is to hire an FTC-approved COPPA Safe Harbor provider to review your strategy and product, as well as your third-party contracts and settings, to ensure you are and remain compliant.

Yes, it will cost several thousand dollars — but it’s far cheaper than losing hundreds of thousands of dollars in FTC fines, not to mention parental trust. It also saves on legal fees and gives you confidence in your engagement strategy.

You also get a seal to put on your app or site. The seal means that the Safe Harbor company you’ve chosen has given their stamp of approval to your strategy.

If the FTC ever looks at you and finds a potential issue, they will see that seal and give you a chance to cure it without the heavy stick of enforcement action. You are essentially deemed compliant while you are in good standing with an approved Safe Harbor. Of course, you need to keep it current, but it’s great protection.

Safe Harbor companies that we recommend:

Not only does PRIVO provide a system for verifiable parental consent, they are also a Safe Harbor and will review your site, app, or smart toy for COPPA, GDPR, SOPIPA, the Student Data Privacy Pledge, and other regulations protecting kids’ privacy online. It’s a powerful combination: they offer two products to cover both of your needs.

kidSAFE initially provided only a seal certifying that kids’ sites and apps follow proper safety steps. Now they are an FTC-approved Safe Harbor as well, and can also provide a kidSAFE+ COPPA-CERTIFIED seal. That added safety certification is highly advantageous for kids’ products.

kidSAFE reviewed Community Sift and helped us create a set of guidelines for under-13 products. They are experts when it comes to kids’ online privacy and safety.

 

You’ve heard this line before — “Rated ‘E’ for everyone.” The rating system created by the Entertainment Software Rating Board is clear, concise, and immediately recognizable. They are also a COPPA compliance and Safe Harbor provider.

Next steps

There you have it — your four-step beginner’s guide to COPPA compliance. To recap, your first step is to decide on a parental permission strategy.

Once you’ve done that, craft a readable privacy policy that anyone can understand.

Then, find the right moderation software to filter PII in chat/images/usernames.

Finally, select a Safe Harbor company to help validate your strategy and ensure that you’re compliant.

Easy as pie?

Maybe not — but while COPPA compliance may not be easy, it is important.

Editor’s note: Originally published in August 2017, updated April 2018

Want more best practices for keeping your online community COPPA compliant? Sign up for the Two Hat newsletter and never miss an update! 

You’ll receive invites to exclusive webinars and workshops, community management and moderation tips, and product updates.

We will never share your information with third parties and you can unsubscribe at any time.



Community Sift Achieves kidSAFE Certification to Protect Kids’ Social Apps

Community Sift, our risk classification tool designed to protect social platforms from trolls, cyberbullying, and abuse, is proud to announce that it has achieved kidSAFE certification.

The kidSAFE Seal Program is an independent safety certification service and seal-of-approval program designed exclusively for child-friendly websites and technologies, including online game sites, educational services, virtual worlds, social networks, mobile apps, tablet devices, connected toys, and other similar online and interactive services. kidSAFE also audits and certifies the practices of third-party vendors that service this industry.

“We are thrilled to add Community Sift to our growing network of trusted brands and service providers,” said Shai Samet, Founder and President of kidSAFE. “We continue to be highly impressed with their service, technology, and know-how, but, most importantly, with their unwavering dedication to continue refining and improving their filtering tools. We look forward to working with Community Sift and sharing in their mission to help protect children online.”

“One of Community Sift’s main goals has always been to keep kids safe online,” said Chris Priebe, CEO of Two Hat Security and founder of Community Sift. “We built a stronger and smarter classification system to ensure that kids online are safe from predators, bullies, and abuse. We appreciate the recognition from Shai and his team at kidSAFE to further validate the huge difference we’re making in online safety.”

Community Sift underwent rigorous testing and independent review to achieve kidSAFE certification. The review covered areas such as:

  • Safety measures for chat, community, and social features
  • Rules and educational info about online safety
  • Procedures for handling safety issues and complaints
  • Parental controls over child’s account
  • Age-appropriate content, advertising, and marketing

About the kidSAFE Seal Program

The kidSAFE Seal Program is an independent safety certification service and seal-of-approval program designed exclusively for child-friendly websites and technologies, including kid-targeted game sites, educational services, virtual worlds, social networks, mobile apps, connected devices, and other similar interactive services and technologies. Visit http://www.kidsafeseal.com for more information.

About Two Hat Security

Two Hat Security is a software company committed to protecting the right to share ideas and information online without fear of harassment or abuse. Founded in 2012, the company remains dedicated to leveraging the power of artificial intelligence to empower humans with meaningful data to make decisions.

About Community Sift

Community Sift is changing the world, one online community at a time. Born out of a desire to end harassment on the social web, Community Sift combines artificial intelligence with human experience to classify and filter User Generated Content (UGC) on a scale of risk, all while taking context into account. Working with a variety of online communities and apps, including ROBLOX, Animal Jam, and PopJam, Community Sift is committed to keeping all users safe in a connected world.

Why Does COPPA Compliance Matter?

Every day we’re inundated with news stories about how to keep kids safe, both in the real world and online. It’s overwhelming. It’s exhausting. Above all, it’s terrifying.

Are we fated to let our children wander in some lawless, retrograde technological wasteland? How do we protect them? Are there safe spaces, or is the internet — and social media in particular — the 21st-century version of the Wild West?

When I was five, I was at the park in my tiny, rural hometown where 4-H was the club to join, and the local skating rink was the coolest place to hang out after school. That image in your head, of a close-knit farming community where everyone knows everyone and no one locks their doors at night? That’s my hometown.

I was playing on the slide when a man approached me with a camera and asked my name. I told him, and he asked if he could take my picture. I said yes, he took a picture of me, smiling and waving, on the slide, and he left. When I told my mother — who at the time was at the other end of the park with my younger sister and had seen none of this — she was horrified. I assured her that I only told the man my first name (apparently those kindergarten “stranger danger” warnings work!), although I didn’t think twice about him taking my picture. He asked nicely; he was an adult, and so I said yes.

Children are inherently trusting. And that’s a good thing; we want them to be open and empathetic. We don’t want them to know that the world is so often a scary, confusing place. To quote John Milton, “Innocence, once lost, can never be regained.”

It was the mid-80s, long before helicopter parents and attachment parenting, so my mother didn’t go into hysterics. She simply held my hand, and we left the park, perhaps with slightly more haste than usual.

The next day my picture was in the paper, accompanied by the caption “Five-year-old Leah, who didn’t want to give her last name, enjoys the slide on a sunny day at the park.”

There wasn’t a lot going on in my town.

My story ended well and was my first and only brush with small-town fame. I’m pretty sure my mother still has the newspaper clipping, now yellow and brittle with age, in an album somewhere. But here’s the thing: it didn’t have to end that way. There are many ways it could have ended, and not one of them is good. When children, in their innocence and good nature, give out personal information to strangers — whether in real life or online — they put themselves at risk.

At Community Sift we aren’t willing to take those risks. Whether it’s sharing a real name (especially a full name), asking for a picture, or providing links to potentially unsafe websites, we will do everything in our power — and our risk classification tool is a remarkably powerful thing — to keep them safe. We’ve spent years developing this technology because we know that it matters. We even have the name to prove it. The original iteration of Community Sift was called Potty Mouth Chat Filter. Kinda gross, but kinda perfect. It’s all about kids for us, and it has been since we formed Two Hat Security in 2012.

Safety is at the heart of what we do, and the reason we do it. We aim to remove bullying from the Internet. We aim to keep kids safe from bullies, from predators, and from anyone who would seek to do them harm.

And to that end, we are very excited and incredibly proud to announce that Community Sift is kidSAFE certified!

Check out the nifty logo and link in the lower right of the Community Sift website.

So, what exactly does that mean for us, and for you as a community manager, an app developer, a social media maker? It means that, in addition to being industry leaders in content classification and filtering technology, we are also recognized leaders in children’s online safety. We keep kids safe by protecting their personally identifiable information (called PII by those in the know) and ensuring that they aren’t exposed to bullying, profanity, and child exploitation.

Fast-forward thirty years, and in a nice piece of symmetry, I have a five-year-old niece. At four she was a seasoned YouTuber, and at five she’s newly obsessed with Pokémon Go. She knows how to get to mom’s Facebook page. She can’t read yet, but she’s a sharp kid, and she’s learning. Within the year she’ll be reading and writing, and like any parent figure, I worry. What kinds of things will she see online? What kinds of things will she write? What happens when she gets a phone, an iPad, her own computer? It’s scary, but to be honest (and I say this with sincerity), I don’t want her in a space that isn’t protected and safe — and Community Sift is the solution that I trust.

 

Learn more about how Community Sift keeps kids safe using our unique chat filter software.
Learn more about the kidSAFE Seal Program.