Client Spotlight: Kidzworld

With many schools shut down indefinitely and the summer break approaching soon, it’s more important than ever that children have safe online spaces to share and make new friends.

We recently caught up with Executive Vice President James Achilles and Community Manager Jordan Achilles of Kidzworld to discuss how they are keeping kids connected.

First of its kind
The first truly safe and secure kids’ social network, Kidzworld began life in 2001 as an online magazine for kids, long before kid-friendly content was widely available online. In 2007, Kidzworld recognized that kids wanted more than just online content – they were also looking for safe spaces to chat and make new friends. Ahead of the explosion of kid-oriented social networks, Kidzworld introduced their own moderated chat room, forums, and profiles for young internet users.

Kidzworld Logo

Like many organizations at the time, Kidzworld originally used an in-house blacklist/whitelist to moderate their social features, with moderators manually entering new words and phrases to the lists based on community trends. “At one point, we even manually turned the chat room on and off, based on when our moderators were available to watch the chat in real-time,” says James Achilles, Executive Vice President at Kidzworld Media.

As technology evolved and the needs of a children’s social network changed, Kidzworld looked to new solutions to make their moderation process smarter and more efficient, and to provide their users with a safe platform that still allowed freedom of expression.

Early adopters

James Achilles

An early adopter of Two Hat’s chat filter and content moderation platform Community Sift, Kidzworld met with CEO and founder Chris Priebe in 2012 to help build the chat filter using their data. In 2017 they officially came on board. “We wanted to see how we could evolve as a filter and allow the kids so much more freedom,” says James. “That’s when we came to Community Sift because it allowed the kids to say certain words but only within a certain context.”

What is the biggest change in moderation that Kidzworld has seen over the years? “The freedom and flexibility with words and phrases which didn’t exist when we started,” James says.

The team also uses other techniques to enforce community guidelines.

Jordan Achilles

“We have auto-messaging set up through Community Sift,” says Online Community & Web Content Manager Jordan Achilles. “If a user hits a certain threshold, they get a message, on both the negative and the positive side. There are messages that say, ‘What you’re saying isn’t allowed, review the rules.’ But on the flip side, we can reward the user with a message like, ‘You’ve been communicating well! You are now a trusted user,’ which gives them more freedom to chat.”

The community itself is generally positive, adds Jordan. “They come to me online all the time to let me know if they saw a message that just didn’t feel right, or if someone is asking weird questions. That is so beneficial to us. We have a reporting system that is really great: the kids can just report one message, and they know that I’m going to look at the rest of the messages and see what the other user’s intention is.”

A virtual playground
Today, thanks to this robust moderation platform, Kidzworld is a bustling online community, made up of kids from across the globe.

“They come here to catch up with each other, go in the chat room and be silly or go in the forums and do different role plays,” says Jordan. “The roleplay forums are where the strongest community of friends exist because they rely on each other to have that communication for the story threads, these fantasy stories that they’ve created. They create these stories with each other each day; one person posts and then they respond to each other, creating a full story.”

The Kidzworld role-playing forums are truly wonderful. Full of interactive, text-only stories set in TV sitcoms, hospitals, the worlds of Marvel and Harry Potter, school, and original worlds, they are places where a child’s imagination can run wild.

“The forums are where we’ve seen a huge change with the flexibility of the filter,” says Jordan. “Some of the stuff that they’re saying, random characters or different personality traits, our previous filter would block and reject.”

The roleplay forums in Kidzworld

It also helps that the Kidzworld team has full access to the moderation platform and can update it in real-time. “If a weird obscure name that they’ve created for the roleplay character is blocked, I can approve it and even add it to the filter so it’s not blocked again,” Jordan says.

“It’s so cool to see their imaginations go. And that’s why we’re so happy to give them this space,” adds Jordan. “They can be these different people that they want to be online and it creates a space for them to let their imagination run wild and write stories and be someone that they can’t necessarily be in real life.”

Looking ahead
Asked what the future holds for Kidzworld, James Achilles says, “We love seeing more and more kids on the site. It is great for them to take advantage of all the opportunities on the site. The kids that are here love it and they’re consistent users. We would like to be more widely known for everything we have to offer kids. We are always looking for ways to improve the site. Right now we are working on some new technology, in partnership with Community Sift, that we know the kids are going to love.”

Gary, the Kidzworld mascot


Learn more about Kidzworld’s commitment to safety in the parent and teacher resources section of their website. Read about their safety guidelines here.

And don’t forget to check out the kid-friendly content, from quizzes to movie reviews, and everything in between!

A Four-Step Beginner’s Guide to COPPA Compliance

There are few things more rewarding in the digital space than creating a great kids’ product. The market is booming, and there are endless possibilities for new, innovative products. Kids (and parents) today are looking for unique, engaging, and above all safe apps and sites to play, share, and connect with other kids across the world.

But if you plan to market your product to kids under 13 in the US, by law you have to ensure that it follows the Children’s Online Privacy Protection Act (COPPA). The act regulates how you collect and store personal information of under-13 users in your product.

And it’s a big deal. Companies like Hasbro, Mattel, Viacom, and JumpStart Games were fined in 2016 for violating COPPA. And three new lawsuits were announced this month. If you saw the last season of Silicon Valley, like Dinesh and the rest of the Pied Piper crew, you know that COPPA fines are hefty. Privacy regulations are a hot topic in 2018. GDPR is set to take effect in the EU on May 25th. And ESRB has proposed modifications to Safe Harbor, with public comments due on May 9th.

COPPA, like most regulations, can be confusing to navigate, especially if you’re new to the industry. That’s why we’ve put together this four-step guide to help you unpack the legalese.

(Register to watch our on-demand webinar The ROI of COPPA. In it, we explore the benefits of compliance, including increased engagement, retention, and profits.)

Let’s get the easy stuff out of the way first.

What is COPPA?

The Children’s Online Privacy Protection Act (COPPA) protects the online privacy of children under the age of 13 in the US. It’s a large and complex rule (you can read it in its entirety here), but there are a few key points you should know:

  • COPPA only applies to children in the US under the age of 13 (U13). If you are a US-based company, you are expected to protect U13 globally. However, if your company is based outside the US, you are only legally obligated under COPPA to protect American children on your platform — or risk a potentially crippling fine from the FTC or even a US State Attorney General.
  • Children outside the US will be subject to similar emerging regulations like the GDPR and its minors’ consent requirements. Those laws might also apply to US companies operating in those countries.
  • COPPA only applies to children under the age of 13. Other international regulations have different cutoff ages. For example, GDPR may allow member states to choose an age up to 15 for parental consent requirements.
  • COPPA is designed to protect children’s online privacy and data security — it doesn’t prevent cyberbullying or profanity.
  • COPPA is not just about removing private information. It’s also about having parental consent to collect, use, disclose, track or share private information.
  • COPPA also applies to third-party plugins and services you use. This is a very tricky situation and requires proper due diligence. See the current class action lawsuits recently filed in this area as an example.

Please note: Two Hat Security is not a law office and cannot provide legal advice. While we provide technology that helps companies achieve compliance, you should still always seek independent legal counsel. We recommend reaching out to a Safe Harbor provider like PRIVO, kidSAFE seal, or ESRB (more about these companies below).

In addition, the FTC has put together a Six-Step Compliance Plan that is a great reference and another helpful starting point.

What is PII (Personally Identifiable Information)?

PII is any information that can be used to identify a person in real life. This list is from the FTC’s page:

  • full name;
  • home or other physical address, including street name and city or town;
  • online contact information like an email address or other identifier that permits someone to contact a person directly — for example, an IM identifier, VoIP identifier, or video chat identifier;
  • screen name or user name where it functions as online contact information;
  • telephone number;
  • Social Security number;
  • a persistent identifier that can be used to recognize a user over time and across different sites, including a cookie number, an IP address, a processor or device serial number, or a unique device identifier;
  • a photo, video, or audio file containing a child’s image or voice;
  • geolocation information sufficient to identify a street name and city or town; or
  • other information about the child or parent that is collected from the child and is combined with one of these identifiers.

Personal information isn’t just a user’s real name. It also includes cookie numbers and geolocation.

Let’s get into the meat and potatoes — the four crucial steps to get started.

Step one is a big one.

Step 1: Choose your strategy

There are a few options here. Each one has its pros and cons.

Option 1: Don’t have under-13 users in your product

This is a common strategy used by large sites and apps like Facebook, YouTube, and Twitter. To be in this category, your product cannot be child-directed. So if you’ve created a product that is clearly designed for children, partner with kid-directed products, or market yourself as a place for youth, this option won’t work.

However, if you qualify as a general audience, you can reduce your risk by providing an age gate and blocking U13. Typically that means asking users to enter their birth date when they sign up. If you do this, you’ll also need a mechanism to block users from first selecting a child’s birthdate, then changing their answer.
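To make the mechanics concrete, here is a minimal sketch of an age gate in Python. Everything here is an illustrative assumption, not a reference implementation: the device identifier, the in-memory flag, and the function names are all invented, and a real system would persist the "already answered as under 13" flag server-side so it survives app reinstalls.

```python
from datetime import date

def is_under_13(birth_date: date, today: date) -> bool:
    """True if the user has not yet turned 13 as of `today`."""
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return age < 13

class AgeGate:
    """Neutral age gate: once a device submits an under-13 birth date,
    later signup attempts from that device are rejected, even if the
    user backs out and re-enters an older date."""

    def __init__(self) -> None:
        self._blocked_devices: set[str] = set()

    def attempt_signup(self, device_id: str, birth_date: date, today: date) -> bool:
        if device_id in self._blocked_devices:
            return False  # a child's birth date was already entered here
        if is_under_13(birth_date, today):
            self._blocked_devices.add(device_id)
            return False
        return True
```

The key design point is the remembered flag: without it, a child who is refused can simply try again with an older birth date, defeating the gate entirely.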

As well, you will need a strategy to manage accounts that you discover are under 13 — think of kids who post pictures from their birthday party with the caption “I’m 10 now!” If that’s reported, you have to build a process to deal with it.

So, if your product is clearly designed for children, you will have to go with one of the following two options.

Option 2: Verifiable Parental Consent

Works if you need users to share personal information

Parental consent is a necessity if your site or app is a social network for under-13 users who share their real identity. As well, if your site or app is designed for video or photo sharing, getting a parent’s permission is a smart option. With the photo regulations in COPPA, it’s much smarter to err on the side of caution.

Parental consent is tricky. Recommended steps include faxing or mailing in a signature, speaking to a customer support rep, or making a payment of one dollar or more. The FTC recently approved two new options — knowledge-based challenge questions and verifying a driver’s license (which carries its own security risks!).

Option 3: Email plus

Works if your product doesn’t require that users share personal information

The third option is to add a parental permission step to the sign-up process. This first requires that you build an age gate. Then, if the user is under 13, you require that they enter their parent’s email address or another online identifier to request consent before gaining access to their account. After that, you send an email with proper disclosure notices and an activation link to their parents. Animal Jam, Club Penguin, and many Disney products use this strategy.
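As a rough sketch of that flow, the example below models an "email plus" consent step: the child's account stays locked until a parent clicks an emailed activation link. The token store is in memory and the email sender is stubbed out; these, along with the URL and all names, are simplifying assumptions. A real system would persist and expire tokens and include the legally required disclosure notices in the email.

```python
import secrets

class EmailPlusConsent:
    """Minimal sketch of an 'email plus' parental consent flow."""

    def __init__(self, send_email):
        self.send_email = send_email       # callable(parent_email, link)
        self.pending: dict[str, str] = {}  # one-time token -> child account
        self.activated: set[str] = set()

    def request_consent(self, child_account: str, parent_email: str) -> str:
        """Generate a one-time token and email the parent an activation link."""
        token = secrets.token_urlsafe(16)
        self.pending[token] = child_account
        # The real email must carry the proper COPPA disclosure notices.
        self.send_email(parent_email, f"https://example.com/consent?token={token}")
        return token

    def confirm(self, token: str) -> bool:
        """Called when the parent clicks the link; tokens are single-use."""
        account = self.pending.pop(token, None)
        if account is None:
            return False
        self.activated.add(account)
        return True

    def can_access(self, child_account: str) -> bool:
        return child_account in self.activated
```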

The downside to this option? It can be bad for business. Every time you add a step to the registration process, you lose more users. In some cases, you can lose up to 50% of all candidates during the email sign-up process.

But adding parental permission during the signup process is just the first step. You still have to ensure that kids can’t disclose their personal information once they’re logged in.

Here’s how COPPA defines “disclosure“:

(2) Making personal information collected by an operator from a child publicly available in identifiable form by any means, including but not limited to a public posting through the Internet, or through a personal home page or screen posted on a Web site or online service; a pen pal service; an electronic mail service; a message board; or a chat room.

To translate: You can’t let your under-13 users post “My name is Joe Rogers and I live at 2345 Markum street. Will you be my friend?” At that point, you have collected PII and broadcast it publicly. This also applies to profile pictures if they contain GPS information or a child’s face.

To prevent disclosure, you’ll need to combine the email plus option with a chat and/or image filter that checks every post for PII. More on that later.
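As a rough illustration of what such a text filter does, here is a deliberately naive PII redactor. The patterns are simplified assumptions: in production they would both miss cleverly disguised PII and over-trigger on innocent text, which is exactly why dedicated moderation tooling exists.

```python
import re

# Illustrative-only patterns; real coverage must be far broader.
PII_PATTERNS = [
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[email removed]"),
    # 7-11 digits with optional spaces, dots, or dashes between them
    (re.compile(r"\b\d(?:[\s.-]?\d){6,10}\b"), "[number removed]"),
    (re.compile(r"\b\d+\s+\w+\s+(?:street|st|avenue|ave|road|rd)\b", re.I),
     "[address removed]"),
]

def redact_pii(message: str) -> str:
    """Replace obvious PII in a child's post before it is made public."""
    for pattern, replacement in PII_PATTERNS:
        message = pattern.sub(replacement, message)
    return message
```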

Wait, there’s more!

Another factor to consider is what kind of third-party apps and services you are using. Google Analytics and others have a config setting that can turn off user-specific behavior tracking. If you’re building a mobile app, remember to scan all your plugins to ensure they are compliant.

Which option do you choose?

Whatever decision you make will be based on your unique product.

Fortunately, there are three companies that can help you implement whichever strategy you choose.

Full disclosure: We work with all three companies below, which is why we feel comfortable recommending their services.

Recommended products

AgeCheq offers a developer-friendly service that handles nearly all of the parent/child and technical complexities of COPPA compliance for you. After you provide accurate answers to a focused privacy survey and drop a few lines of code into your website or app, COPPA parental notice and consent (and ongoing consent management) are handled with a single sign-on parental dashboard that supports multiple children and (of course) multiple publishers.

AgeCheq has put a lot of effort into streamlining the effort required by parents and children to enjoy your content, while still complying with COPPA. DragonVale World and WolfQuest are two very popular apps that use the AgeCheq service. If you may have EU users, AgeCheq also offers a universal service called ConsentCheq CDK that provides GDPR, ePrivacy and COPPA consent with a single integration.

Kids Web Services (KWS) is a self-service platform that makes it easy for developers to build engaging digital products for kids that comply with global data privacy laws such as COPPA and GDPR-K, while increasing engagement and conversion. Building awesome digital experiences for the under-13 market is hard; KWS lets you focus on building fun experiences for kids and parents while ensuring everything you do is transparent and compliant by design.


PRIVO offers an FTC-approved kid and family customer identity and permission management platform. Their platform includes registration and account creation, identity proofing, sliding-scale parental consent, single sign-on, and an account management and permission portal.

From their site: “PRIVO helps you build trust with your users, partners, and regulators by helping you meet or exceed federal and international data regulations. PRIVO can act as your fractional Chief Privacy Officer.”

They are a COPPA Safe Harbor provider as well, so they can also review your compliance (more about Safe Harbor later).

Step 2: Have a clear privacy policy and notices

Like most people, we start to go cross-eyed when looking at legal documents. The FTC outlines the components of a COPPA compliant privacy policy here.

One of the most important takeaways? Be clear and avoid legalese. AgeCheq and PRIVO can help with that.

Step 3: Filter or redact PII

Here is where a chat filter and automated moderation software like Community Sift becomes business-critical.

Whenever an under-13 user gives you text, images, or video you need to ensure that it doesn’t contain PII (if you haven’t received verifiable parental consent).

Kids are persistent. And they’re very savvy — they know their way around simple blacklists and unsophisticated filters.

Some examples of kids sharing PII that we’ve seen:

“My add dress is one two 3 on homer street”

“Ping me at two 5 oh seven seven 55”

“250” (line 1)
“Seven” (line 2)
“Seven” (line 3)
“55” … (you get the idea)

Here’s the problem: Most filters confuse these intentional manipulations with innocent things like:

“9 8 7 6 5 4 3 2 1 go”

“My stats are 456.344.2222.222 with XP of 233”

“I have 3 cats 1 dog 4 hamsters and 67 goldfish”

You don’t want to prevent kids from talking about their pets or sharing their achievements. But can you do that without blocking all numbers?
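One building block is normalizing spelled-out digits before measuring how long a digit run is. The sketch below is a simplified assumption of how that might work (the word list and thresholds are invented). Notice that it would still misfire on an innocent countdown like “9 8 7 6 5 4 3 2 1 go”, which is precisely why pattern matching on its own isn’t enough.

```python
import re

# Map spelled-out digits (including the common "oh" for zero) to numerals
WORD_DIGITS = {
    "zero": "0", "oh": "0", "one": "1", "two": "2", "three": "3",
    "four": "4", "five": "5", "six": "6", "seven": "7", "eight": "8",
    "nine": "9",
}

def normalize_numbers(message: str) -> str:
    """Rewrite spelled-out digits so 'two 5 oh seven' becomes '2 5 0 7'."""
    tokens = re.findall(r"[a-z]+|\d+", message.lower())
    return " ".join(WORD_DIGITS.get(t, t) for t in tokens)

def looks_like_phone_number(message: str) -> bool:
    """Flag messages whose normalized digits form a 7-to-11 digit run."""
    normalized = normalize_numbers(message)
    digit_runs = re.findall(r"\d(?:\s?\d)+", normalized)
    longest = max((len(run.replace(" ", "")) for run in digit_runs), default=0)
    return 7 <= longest <= 11
```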

Reputation matters

Community Sift is a reputation-based system, which is the best way to deal with endlessly complicated filter tricks. With 1.8 million language patterns and linguistic templates added to the system, we find the most common ways users try to break the filter.

When a user is caught manipulating the filter, they earn a “reputation” for actively trying to share PII. Then, the system automatically changes their settings to prevent them from sharing any numbers. Innocent users can still use numbers to talk about points, items, etc.

The best part is that users can change their reputation based on behavior. So, users with a “bad” reputation who stop attempting to share their phone numbers can earn a “good” reputation and regain their old settings. And it all happens in real time.
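A toy version of reputation-driven settings might look like the following. To be clear, the strike and recovery thresholds here are invented for illustration; they are not Community Sift’s actual logic.

```python
class UserReputation:
    """Sketch of a reputation-driven filter setting: repeated PII attempts
    downgrade a user to a numbers-restricted mode, and sustained clean
    behavior earns the normal settings back."""

    PII_STRIKE_LIMIT = 3          # attempts before numbers are restricted
    CLEAN_STREAK_TO_RECOVER = 20  # clean messages needed to recover

    def __init__(self) -> None:
        self.pii_strikes = 0
        self.clean_streak = 0

    def record_message(self, attempted_pii: bool) -> None:
        if attempted_pii:
            self.pii_strikes += 1
            self.clean_streak = 0
        else:
            self.clean_streak += 1
            if self.clean_streak >= self.CLEAN_STREAK_TO_RECOVER:
                self.pii_strikes = 0  # reputation recovered

    @property
    def numbers_allowed(self) -> bool:
        return self.pii_strikes < self.PII_STRIKE_LIMIT
```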

Sanctions matter, too

As a best practice, we recommend using both user reputation and sanctions to moderate PII. Progressive sanctions give users the opportunity to change their behavior over time.

You can start by displaying a message to the user, explaining why sharing PII is dangerous. If they try again, suspend their account for 24 hours, then 72 hours — and so on.
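The escalation ladder above can be sketched as a simple lookup. The steps beyond 72 hours, and the cap, are illustrative assumptions; tune them to your own community guidelines.

```python
from datetime import timedelta

# Illustrative escalation ladder: warn first, then suspend for
# increasing periods, capping at the harshest step.
SANCTIONS = [
    ("warn", timedelta(0)),
    ("suspend", timedelta(hours=24)),
    ("suspend", timedelta(hours=72)),
    ("suspend", timedelta(days=7)),
]

def next_sanction(prior_violations: int) -> tuple[str, timedelta]:
    """Return the sanction for a user with `prior_violations` on record."""
    step = min(prior_violations, len(SANCTIONS) - 1)
    return SANCTIONS[step]
```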

The good thing is that immediate feedback almost always works. After the initial warning, most users won’t try again.

Why does COPPA matter?

For us, one of the most important sections in the regulation is the definition of collecting PII: “Enabling a child to make personal information publicly available in identifiable form.”

The definition continues:

“An operator shall not be considered to have collected personal information under this paragraph if it takes reasonable measures to delete all or virtually all personal information from a child’s postings before they are made public and also to delete such information from its records,” from §312.2.2; emphasis ours

Kids don’t understand the inherent dangers of sharing PII online. They are persistent, and unless we put (reasonable) measures in place to prevent it, they will keep trying to post their phone numbers, email addresses, real names, and more.

You’ll never catch everything. But you can detect and block nearly all instances of PII without crippling your business, and without stifling actual conversation.

As operators of kid-friendly social products, it’s our responsibility to protect children from their own innocence.

COPPA was created to protect children’s privacy. But in the process, it also protects them from predators, grooming, and luring.

Step 4: Get a COPPA Safe Harbor to validate

COPPA compliance is no easy task. Every product is unique and requires individual attention.

As a final step, the industry best practice is to hire an FTC-approved COPPA Safe Harbor site to review your strategy and product, as well as your 3rd party contracts and settings, to ensure you are and remain compliant.

Yes, it will cost several thousand dollars — but it’s far cheaper than losing hundreds of thousands of dollars in FTC fines and parental trust; in addition to saving on legal fees and providing you confidence in your engagement strategy.

You also get a seal to put on your app or site. The seal means that the Safe Harbor company you’ve chosen has given their stamp of approval to your strategy.

If the FTC ever looks at you and finds a potential issue, they will see that seal and give you a chance to cure any issues without the heavy stick of enforcement action. You are essentially deemed compliant while you are in good standing with an approved Safe Harbor. Of course, you need to keep it current, but it’s a great protection.

Safe Harbor companies that we recommend:

Not only does PRIVO provide a system for verifiable parental consent, they are also a Safe Harbor and will review your site, app, or smart toy for COPPA, GDPR, SOPIPA, the Student Data Privacy Pledge, and other regulations protecting kids’ privacy online. It is pretty powerful that they offer two products to cover both of your needs.

kidSAFE initially only provided a seal certifying that kids’ sites and apps have followed proper safety steps. Now they are an FTC-approved Safe Harbor and can also provide a kidSAFE+ COPPA-CERTIFIED Seal. That added safety certification is highly advantageous for kids’ products.

kidSAFE reviewed Community Sift and helped us create a set of guidelines for under-13 products. They are experts when it comes to kids’ online privacy and safety.


You’ve heard this line before — “Rated ‘E’ for everyone.” The rating system created by the Entertainment Software Rating Board is clear, concise, and immediately recognizable. They are also a COPPA compliance and Safe Harbor provider.

Next steps

There you have it — your four-step beginner’s guide to COPPA compliance. To recap, your first step is to decide on a parental permission strategy.

Once you’ve done that, craft a readable privacy policy that anyone can understand.

Then, find the right moderation software to filter PII in chat/images/usernames.

Finally, select a Safe Harbor company to help validate your strategy and ensure that you’re compliant.

Easy as pie?

Maybe not — but while COPPA compliance may not be easy, it is important.

Editor’s note: Originally published in August 2017, updated April 2018

Want more best practices for keeping your online community COPPA compliant? Sign up for the Two Hat newsletter and never miss an update! 

You’ll receive invites to exclusive webinars and workshops, community management and moderation tips, and product updates.

We will never share your information with third parties and you can unsubscribe at any time.
