Three Powerful Lessons We Learned at the Protecting Innocence Hackathon

You rarely hear about them, but every day brave investigators across the globe review the most horrific stories and images imaginable. It's called child sexual abuse material (known as CSAM in the industry), and it hides in the dark corners of the internet, waiting to be found.

The scope is dizzying. The RCMP-led National Child Exploitation Coordination Centre (NCECC) alone received 27,000 cases in 2016. And right now, it’s nearly impossible for officers to review those cases fast enough to prioritize the ones that require their immediate attention.

That’s why, on July 6th and 7th, volunteers from law enforcement, academia, and the tech industry came together to collaborate on solving this problem, perhaps the biggest problem of our time — how do we quickly, accurately, and efficiently detect online CSAM? Artificial intelligence gets smarter and more refined every day. How can we leverage those breakthroughs to save victimized children and apprehend their abusers?

Along with event co-sponsors the RCMP, Microsoft, and Magnet Forensics, we had a simple goal at the Protecting Innocence Hackathon: to bring together the brightest minds in our respective industries to answer these questions.

We ended up learning a few valuable lessons along the way.

It starts with education

Participants across all three disciplines learned from each other. Attendees from the tech industry and academia were given a crash course in grooming and luring techniques (as well as the psychology behind them) by law enforcement, the people who study these tactics every day.

Make no mistake, these were tough lessons to learn — but with a deeper understanding of how predators attract their victims, we can build smarter, more efficient systems to catch them.

Law enforcement, in turn, learned about machine learning and artificial intelligence techniques, which gave them a deeper understanding of the challenges facing data scientists, not to mention the need for robust and permanent datasets.

It’s crucial that we learn from each other. But that’s just the first step.

Nothing important happens without collaboration

Too often our industries are siloed, with every company, university, and agency working on a different project. Bringing professionals together from across these disciplines and encouraging them to share their diverse expertise, without reservations or fear, was a huge accomplishment, and an important lesson.

This isn't a problem that can be solved alone. This is a 25-million-images-a-year problem. This is a problem that crosses industry, cultural, and country lines.

If we want to protect the innocence of children, we have a responsibility to be transparent and collaborative.

Just do it

Education and collaboration are commendable and necessary — but they don’t add up to much without actual results. Once you have the blueprints, you have no excuse not to build.

The great news? The five teams and 60+ participants made real, tangible progress.

Collectively, the teams built the following:

  • A proposed standard for internationally classifying and annotating child sexual exploitation images and text
  • A machine learning aggregation blueprint for both text and image classification
  • Machine learning models to detect sexploitation conversations, as well as image detection for age, anime, indoor and outdoor scenes, nudity, and CSAM (a rough sketch of how such signals might be combined appears below)
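
To make that last point a little more concrete, here is a minimal, purely hypothetical Python sketch of how scores from separate text and image models might be rolled up into a single review-priority score. The signal names and weights are our own placeholders for illustration; they are not the blueprint or models the teams actually produced.

```python
# Hypothetical sketch only: illustrates the general idea of combining
# independent text and image classifier outputs into one priority score.
# Signal names and weights are placeholders, not the hackathon's blueprint.
from dataclasses import dataclass

@dataclass
class Signals:
    luring_text: float  # probability from a conversation (text) classifier
    nudity: float       # probabilities from independent image classifiers
    age_minor: float
    csam: float

# Assumed weights, for illustration; a real system would tune and validate these.
WEIGHTS = {"luring_text": 0.2, "nudity": 0.15, "age_minor": 0.25, "csam": 0.4}

def priority_score(s: Signals) -> float:
    """Weighted combination of per-model probabilities, clamped to [0, 1]."""
    score = (WEIGHTS["luring_text"] * s.luring_text
             + WEIGHTS["nudity"] * s.nudity
             + WEIGHTS["age_minor"] * s.age_minor
             + WEIGHTS["csam"] * s.csam)
    return max(0.0, min(1.0, score))

# A case with strong image signals rises to the top of the review queue.
print(priority_score(Signals(luring_text=0.1, nudity=0.9, age_minor=0.8, csam=0.7)))
```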

We cannot overstate the importance of these achievements. They are the first steps towards building the most comprehensive and accurate CSAM detection system the world has seen.

Not only that, the proposed global standard for classifying text and images, if accepted, will lead to even more accurate detection.

The future of CSAM detection is now

We actually learned a fourth lesson at the hackathon, perhaps the most powerful of them all: Everyone wants to protect and save children from predators. And they’re willing to work together, despite their differences, to make that happen.

At Two Hat Security, we’re using the knowledge shared by our collaborators to further train our artificial intelligence model CEASE and to refine our grooming and luring detection in Community Sift. And we’ll continue to work alongside our partners and friends in law enforcement, academia, and the tech industry to find smart solutions to big problems.

There are challenges ahead, but if everyone continues to educate, collaborate, and create, projects like CEASE and events like Protecting Innocence can and will make great strides. We hope that the lessons we learned will be applied by any agency, company, or university that hopes to tackle this issue.

Thank you again to our co-sponsors the RCMP, Microsoft, and Magnet Forensics. And to the Chief Enforcers, Code Warriors, and Data Mages who gave their time, their expertise, and their fearlessness to this event: your contributions are invaluable. You're changing the world.

And to anyone who labors every day, despite the heartbreak, to protect children — thank you. You may work quietly, you may work undercover, and we may never know your names, but we see you. And we promise to support you, in every way we can.



How a Hackathon Hopes to Stop Online Child Exploitation

Every year, the National Center for Missing & Exploited Children (NCMEC) reviews 25,000,000 images containing child sexual abuse material (CSAM).

How do you conceptualize a number like 25,000,000? It’s unthinkable.

For perspective, there are just over 24,000,000 people in Australia. The population of a large country — that’s how many vile, horrific, and disturbing images NCMEC had to review in a year.

In 2016, the Internet Watch Foundation found 57,335 URLs containing confirmed child sexual abuse imagery. 57,335: roughly the population of a mid-sized city.

Still not convinced of the epidemic?

How about this? Over half of the children depicted on those 57,335 URLs were aged 10 or younger.

We’ve all been ten years old. Many of us have ten-year-old children, or nieces, or nephews.

Now that’s unthinkable.

Protecting the helpless

These images aren’t going away.

That’s why we’ve spearheaded a hackathon taking place on July 6th and 7th in Vancouver, British Columbia, Canada. Sponsored by the RCMP, Microsoft, Magnet, and Two Hat Security, the Protecting Innocence Hackathon is an attempt to build a bridge between three diverse disciplines — law enforcement, academia, and the technology sector — for the greater good.

The goal is to work together to build technology and global policy that helps stop online child exploitation.

Teams from across all three industries will gather to work on a variety of projects, including:

  • designing a text classification system to identify child luring conversations (a rough sketch of this idea follows the list)
  • training an image classification system to identify child exploitation media
  • coordinating on a global protocol for sharing CSAM evidence between agencies
  • and more…
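
As a rough illustration of the first project, here is a minimal, purely hypothetical Python sketch of what a text classifier for flagging luring-style conversations could look like. It is not the system being built at the hackathon; the example messages and labels are placeholders, and a real deployment would need a large, carefully governed dataset plus human review of every flag.

```python
# Hypothetical sketch: TF-IDF features plus logistic regression to flag
# luring-style messages. The training data here is a tiny placeholder set.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder examples: 1 = suspicious, 0 = benign.
messages = [
    "don't tell your parents about our chats",
    "this is our little secret, ok?",
    "good game, want to queue again tomorrow?",
    "nice build, how did you unlock that level?",
]
labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(messages, labels)

# High-probability messages would be routed to human moderators for review.
print(model.predict_proba(["let's keep this between us"])[0][1])
```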

We are hopeful that by encouraging teamwork and partnerships across these three vital industries, we will come closer to ridding the internet of online child exploitation.

The beauty of a hackathon is that it’s a tried and true method for hacking away at tough problems in a short period. The time box encourages creativity, resourcefulness, critical thinking — and above all, collaboration.

We’re honored to be working on this. And we’re indebted to the RCMP, Microsoft, Magnet, and all the hackers attending for their selfless contributions.

Protecting the innocent

Forget incomprehensible numbers like 25,000,000 or 57,335. We’re doing this for all the ten-year-olds who’ve been robbed of their innocence.

Today, it’s easier than ever for predators to create and share pictures, videos, and stories. And every time those pictures, videos, and stories are shared, the victim is re-victimized.

It gets worse every year. The Internet Watch Foundation found that reports of child sexual abuse imagery rose by 417% between 2013 and 2015.

At Two Hat Security, we’re doing our part to fight the spread of illegal and immoral content. In collaboration with the RCMP and universities across the country, and with a generous grant from Mitacs, we’re building CEASE, an artificial intelligence model that can detect new CSAM.

But we can’t solve this problem alone.

So this July 6th and 7th, we salute the code warriors, chief enforcers, and data mages who are coming together to make a real difference in the world.

We hope you will too.


***

Want to know more about CEASE? Read about the project here.

We believe in a world free of online bullying, harassment, and child exploitation. Find out how we’re making that vision a reality with our high-risk content detection system Community Sift.

We work with companies like ROBLOX, Animal Jam, and more to protect their communities from dangerous content.
