Connect With us at the Crimes Against Children Conference in Dallas

For the second year in a row, Two Hat Security will be attending the Crimes Against Children Conference in Dallas, Texas as a sponsor. Founded in 1988, the conference brings together attendees from law enforcement, child protective services, and more to “provid[e] practical and interactive instruction to those fighting crimes against children and helping children heal.”

Last year, more than 4,200 professionals attended CACC — a record for the conference and a sign of the growing need for these discussions, workshops, and training sessions.

Two Hat Security founder and CEO Chris Priebe and VP of Product Brad Leitch are hosting two sessions this year. Both sessions will provide investigators with a deeper understanding of the vital role artificial intelligence plays in the future of abuse investigations.

Session 1: Using Artificial Intelligence to Prioritize and Solve Crimes

Tuesday, August 8
1:45 PM – 3:00 PM
Location: Dallas D2

In this session, we explore what recent advances in artificial intelligence and machine learning mean for law enforcement. We’ll discuss how this new technology can be applied in a meaningful way to triage and solve cases faster. This is a non-technical session that will help prepare investigative teams for upcoming technology innovations.

Session 2: Beyond PhotoDNA — Detecting New Child Sexual Abuse Material With CEASE.ai

Wednesday, August 9
8:30 AM – 9:45 AM
Location: City View 8 (Exhibitor Workshop)

Traditionally, PhotoDNA has allowed organizations to detect previously identified and categorized child sexual abuse material (CSAM). Sadly, with new digital content so easy to create and distribute worldwide, investigators are seeing an epidemic of brand-new, never-before-seen CSAM being shared online.

CEASE is an AI model that uses computer vision to detect these new images. Our collaboration with the Royal Canadian Mounted Police has given our data scientists access to a permanent data set of confirmed CSAM, which we are using to train the model.
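
To make the distinction concrete, here is a simplified, hypothetical sketch (not the actual PhotoDNA or CEASE implementation): hash matching can only flag images whose fingerprints already appear in a database of confirmed material, while a trained computer vision model can assign a risk score to images it has never seen before.

```python
# Hypothetical sketch only -- not the PhotoDNA or CEASE implementation.
# Hash matching flags material already on record; a trained classifier
# can score brand-new images that no database has seen before.
import hashlib

# In practice this would be loaded from a database of confirmed material.
KNOWN_FINGERPRINTS: set = set()

def matches_known_material(image_bytes: bytes) -> bool:
    """Flag an image only if its fingerprint is already on record."""
    fingerprint = hashlib.sha256(image_bytes).hexdigest()
    return fingerprint in KNOWN_FINGERPRINTS

def score_new_material(image_bytes: bytes, model) -> float:
    """Ask a trained computer vision model how likely an unseen image is CSAM.

    `model` is a stand-in for any image classifier (for example, a
    convolutional network) that returns a probability between 0 and 1.
    """
    return model.predict_probability(image_bytes)
```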

However, it’s still a work in progress. If you are a member of the law enforcement community or the technology industry, we need your expertise and vast knowledge to help shape this groundbreaking system.

Stop by our booth

We look forward to meeting fellow attendees and discussing potential collaboration and partnership opportunities. Visit us at booth 41 on the 2nd floor, next to Griffeye.

As we learned at the Protecting Innocence Hackathon in July, “If we want to protect the innocence of children, we have a responsibility to be transparent and collaborative.”

You can sign up for both workshops through the conference website. Feel free to email us at hello@twohat.com to set up a meeting.

Want more articles like this? Subscribe to our newsletter and never miss an update!



Three Powerful Lessons We Learned at the Protecting Innocence Hackathon

You rarely hear about them, but every day brave investigators across the globe review the most horrific stories and images imaginable. This content is called child sexual abuse material (known as CSAM in the industry), and it hides in the dark corners of the internet, waiting to be found.

The scope is dizzying. The RCMP-led National Child Exploitation Coordination Centre (NCECC) alone received 27,000 cases in 2016. And right now, it’s nearly impossible for officers to review those cases fast enough to prioritize the ones that require their immediate attention.

That’s why, on July 6th and 7th, volunteers from law enforcement, academia, and the tech industry came together to collaborate on solving this problem, perhaps the biggest problem of our time — how do we quickly, accurately, and efficiently detect online CSAM? Artificial intelligence gets smarter and more refined every day. How can we leverage those breakthroughs to save victimized children and apprehend their abusers?

Along with event co-sponsors the RCMP, Microsoft, and Magnet Forensics, we had a simple goal at the Protecting Innocence Hackathon: to bring together the brightest minds in our respective industries to answer these questions.

We ended up learning a few valuable lessons along the way.

It starts with education

Participants across all three disciplines learned from each other. Attendees from the tech industry and academia were given a crash course in grooming and luring techniques (as well as the psychology behind them) from law enforcement, the people who study them every day.

Make no mistake, these were tough lessons to learn — but with a deeper understanding of how predators attract their victims, we can build smarter, more efficient systems to catch them.

Law enforcement studied the techniques of machine learning and artificial intelligence — which in turn provided them with a deeper understanding of the challenges facing data scientists, not to mention the need for robust and permanent datasets.

It’s crucial that we learn from each other. But that’s just the first step.

Nothing important happens without collaboration

Too often our industries are siloed, with every company, university, and agency working on a different project. Bringing professionals together from across these disciplines and encouraging them to share their diverse expertise, without reservations or fear, was a huge accomplishment, and an important lesson.

This isn’t a problem that can be solved alone. This is a 25-million-images-a-year problem. This is a problem that crosses industry, cultural, and national lines.

If we want to protect the innocence of children, we have a responsibility to be transparent and collaborative.

Just do it

Education and collaboration are commendable and necessary — but they don’t add up to much without actual results. Once you have the blueprints, you have no excuse not to build.

The great news? The five teams and 60+ participants made real, tangible progress.

Collectively, the teams built the following:

  • A proposed standard for internationally classifying and annotating child sexual exploitation images and text
  • A machine learning aggregation blueprint for both text and image classification (a simplified illustration follows this list)
  • Machine learning models to detect sexploitation conversations, as well as image detection for age, anime, indoor and outdoor scenes, nudity, and CSAM
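
To give a sense of what an aggregation blueprint might look like (the teams’ actual designs are not public, so the weights and scores below are made up for illustration): independent text and image classifiers each produce a score, and a simple weighted combination can turn those scores into a single triage priority for investigators.

```python
# Illustration only -- the hackathon teams' actual blueprint is not public.
# A simple way to aggregate independent classifier outputs is a weighted
# combination of their scores, which can then be used to rank cases.
from dataclasses import dataclass

@dataclass
class CaseScores:
    text_score: float   # e.g. grooming/luring text classifier output, 0..1
    image_score: float  # e.g. age/nudity/CSAM image classifier output, 0..1

def priority(case: CaseScores, text_weight: float = 0.4, image_weight: float = 0.6) -> float:
    """Combine per-model scores into a single triage score between 0 and 1."""
    return text_weight * case.text_score + image_weight * case.image_score

# Cases with the highest combined score would be surfaced to investigators first.
cases = [CaseScores(0.2, 0.9), CaseScores(0.8, 0.1), CaseScores(0.7, 0.95)]
ranked = sorted(cases, key=priority, reverse=True)
```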

We cannot overstate the importance of these achievements. They are the first steps towards building the most comprehensive and accurate CSAM detection system the world has seen.

Not only that, the proposed global standard for classifying text and images, if accepted, will lead to even more accurate detection.

The future of CSAM detection is now

We actually learned a fourth lesson at the hackathon, perhaps the most powerful of them all: Everyone wants to protect and save children from predators. And they’re willing to work together, despite their differences, to make that happen.

At Two Hat Security, we’re using the knowledge shared by our collaborators to further train our artificial intelligence model CEASE and to refine our grooming and luring detection in Community Sift. And we’ll continue to work alongside our partners and friends in law enforcement, academia, and the tech industry to find smart solutions to big problems.

There are challenges ahead, but if everyone continues to educate, collaborate, and create, projects like CEASE and events like Protecting Innocence can and will make great strides. We hope that the lessons we learned will be applied by any agency, company, or university that hopes to tackle this issue.

Thank you again to our co-sponsors the RCMP, Microsoft, and Magnet Forensics. And to the Chief Enforcers, Code Warriors, and Data Mages who gave their time, their expertise, and their fearlessness to this event — your contributions are invaluable. You’re changing the world.

And to anyone who labors every day, despite the heartbreak, to protect children — thank you. You may work quietly, you may work undercover, and we may never know your names, but we see you. And we promise to support you, in every way we can.

Want more articles like this? Subscribe to our newsletter and never miss an update!
