You rarely hear about them, but every day brave investigators across the globe review the most horrific stories and images you could ever imagine. It’s called child sexual abuse material (known in the industry as CSAM), and it hides in the dark corners of the internet, waiting to be found.
The scope is dizzying. The RCMP-led National Child Exploitation Coordination Centre (NCECC) alone received 27,000 cases in 2016. And right now, it’s nearly impossible for officers to review those cases fast enough to prioritize the ones that require their immediate attention.
That’s why, on July 6th and 7th, volunteers from law enforcement, academia, and the tech industry came together to collaborate on solving perhaps the biggest problem of our time: how do we quickly, accurately, and efficiently detect CSAM online? Artificial intelligence gets smarter and more refined every day. How can we leverage those breakthroughs to save victimized children and apprehend their abusers?
Along with event co-sponsors the RCMP, Microsoft, and Magnet Forensics, we had a simple goal at the Protecting Innocence Hackathon: to bring together the brightest minds in our respective industries to answer these questions.
We ended up learning a few valuable lessons along the way.
It starts with education
Participants across all three disciplines learned from each other. Attendees from the tech industry and academia were given a crash course in grooming and luring techniques (as well as the psychology behind them) from law enforcement, the people who study them every day.
Make no mistake, these were tough lessons to learn — but with a deeper understanding of how predators attract their victims, we can build smarter, more efficient systems to catch them.
Law enforcement studied the techniques of machine learning and artificial intelligence, which in turn gave them a deeper understanding of the challenges facing data scientists, not to mention the need for robust and permanent datasets.
It’s crucial that we learn from each other. But that’s just the first step.
Nothing important happens without collaboration
Too often our industries are siloed, with every company, university, and agency working on a different project. Bringing professionals together from across these disciplines and encouraging them to share their diverse expertise, without reservations or fear, was a huge accomplishment, and an important lesson.
This isn’t a problem that can be solved alone. This is a 25-million-images-a-year problem. This is a problem that crosses industry, cultural, and national lines.
If we want to protect the innocence of children, we have a responsibility to be transparent and collaborative.
Just do it
Education and collaboration are commendable and necessary — but they don’t add up to much without actual results. Once you have the blueprints, you have no excuse not to build.
The great news? The five teams and 60+ participants made real, tangible progress.
Collectively, the teams built the following:
- A proposed international standard for classifying and annotating child sexual exploitation images and text
- A machine learning aggregation blueprint for both text and image classification
- Machine learning models to detect sexploitation conversations, as well as image detection for age, anime, indoor and outdoor settings, nudity, and CSAM (a sketch of how such signals might be combined follows this list)
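To make the aggregation idea concrete, here is a minimal Python sketch of how scores from separate per-category classifiers could be blended into a single review-priority signal, so that officers see the highest-risk material first. Everything here is a hypothetical illustration: the `ImageScores` fields, the `triage_priority` function, and all the weights are our assumptions for this post, not the teams’ actual models or code.

```python
from dataclasses import dataclass

# Hypothetical per-image scores, each in [0, 1], produced by separate
# classifiers like the categories the teams prototyped (age, anime,
# nudity, indoor/outdoor). Names and weights are illustrative only.
@dataclass
class ImageScores:
    minor_age: float  # likelihood the subject appears to be a minor
    nudity: float     # likelihood of nudity
    anime: float      # likelihood the image is drawn/animated
    indoor: float     # likelihood of an indoor setting

def triage_priority(s: ImageScores) -> float:
    """Blend per-category scores into one review-priority score.

    A simple weighted combination: the co-occurrence of apparent-minor
    and nudity signals dominates, an indoor setting nudges the score,
    and the anime signal discounts drawn content.
    """
    core = s.minor_age * s.nudity      # co-occurrence matters most
    context = 0.1 * s.indoor           # weak contextual boost
    discount = 1.0 - 0.5 * s.anime     # drawn content scores lower
    return min(1.0, (0.8 * core + context) * discount)

if __name__ == "__main__":
    queue = [
        ("img_001", ImageScores(0.92, 0.88, 0.03, 0.75)),
        ("img_002", ImageScores(0.10, 0.95, 0.01, 0.20)),
        ("img_003", ImageScores(0.85, 0.15, 0.90, 0.40)),
    ]
    # Sort the review queue so the highest-risk items surface first.
    for name, scores in sorted(queue, key=lambda kv: -triage_priority(kv[1])):
        print(f"{name}: priority={triage_priority(scores):.2f}")
```

The design choice worth noting is the multiplication of the apparent-minor and nudity signals: neither alone should dominate a triage queue, but together they should, which is exactly the kind of prioritization logic an aggregation blueprint has to pin down.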
We cannot overstate the importance of these achievements. They are the first steps towards building the most comprehensive and accurate CSAM detection system the world has seen.
Not only that: the proposed global standard for classifying text and images, if adopted, will lead to even more accurate detection.
The future of CSAM detection is now
We actually learned a fourth lesson at the hackathon, perhaps the most powerful of them all: Everyone wants to protect and save children from predators. And they’re willing to work together, despite their differences, to make that happen.
At Two Hat Security, we’re using the knowledge shared by our collaborators to further train our artificial intelligence model CEASE and to refine our grooming and luring detection in Community Sift. And we’ll continue to work alongside our partners and friends in law enforcement, academia, and the tech industry to find smart solutions to big problems.
There are challenges ahead, but if everyone continues to educate, collaborate, and create, projects like CEASE and events like Protecting Innocence can and will make great strides. We hope that the lessons we learned will be applied by any agency, company, or university that hopes to tackle this issue.
Thank you again to our co-sponsors the RCMP, Microsoft, and Magnet Forensics. And to the Chief Enforcers, Code Warriors, and Data Mages who gave their time, their expertise, and their fearlessness to this event — your contributions are invaluable. You’re changing the world.
And to anyone who labors every day, despite the heartbreak, to protect children — thank you. You may work quietly, you may work undercover, and we may never know your names, but we see you. And we promise to support you, in every way we can.