Two Hat’s CEASE.ai Technology Integrates with Griffeye Analyze to Help Investigators Rescue Child Sexual Abuse Victims Faster

With this technical integration, law enforcement agencies worldwide can now easily access cutting-edge artificial intelligence to aid in child sexual abuse investigations

KELOWNA, British Columbia, August 12, 2019: Technology company Two Hat Security announced today that CEASE.ai, an artificial intelligence model that can detect, sort, and prioritize new, previously uncatalogued child sexual abuse material (CSAM) for investigators, is now available for law enforcement agencies using the Griffeye Analyze platform.

“A technology partnership between CEASE.ai and Griffeye has been a goal for us since the beginning,” said Two Hat CEO and founder Chris Priebe. “The aim is to provide this technology to law enforcement agencies worldwide that already use Griffeye Analyze in their investigations. CEASE.ai is designed to not only protect investigators’ mental health, which can be severely affected by viewing these horrific images, but also to help them find and rescue innocent victims faster.”

Built in collaboration with Canadian law enforcement, and with support from Canada’s Build in Canada Innovation Program and from Mitacs in partnership with top Canadian universities, CEASE.ai uses multiple artificial intelligence models to detect and prioritize new images containing child abuse. After investigators run their caseload against a hash list of known images, they can rescan the remaining items through the CEASE.ai plugin to flag new, uncatalogued images.
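For illustration only, here is a minimal sketch of that two-stage triage flow in Python. Every name in it (file_hash, triage, the injected score callable) is our own invention, not the CEASE.ai or Griffeye Analyze API:

    # Illustrative sketch of the two-stage triage described above; all names
    # are hypothetical and do not reflect the actual CEASE.ai or Griffeye APIs.
    import hashlib
    from pathlib import Path
    from typing import Callable

    def file_hash(path: Path) -> str:
        # Real systems use robust perceptual hashes (e.g., PhotoDNA) rather than
        # a cryptographic hash like this one, which misses re-encoded copies.
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def triage(case_dir: Path, known_hashes: set, score: Callable[[Path], float]):
        # Stage 1: drop items that match the hash list of known images.
        unmatched = [p for p in case_dir.iterdir()
                     if p.is_file() and file_hash(p) not in known_hashes]
        # Stage 2: score the remaining items with an AI model and surface
        # the highest-risk images first for investigator review.
        return sorted(((p, score(p)) for p in unmatched),
                      key=lambda pair: pair[1], reverse=True)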

“We’re thrilled to integrate CEASE.ai with the Analyze platform,” said Griffeye CEO Johann Hofmann. “We strongly believe that artificial intelligence is the future of technology to fight child sexual abuse, and this is an opportunity for us to work with a company that builds state-of-the-art artificial intelligence and get it into the hands of our law enforcement community. This will help them speed up investigations and free up time to prioritize investigative work such as victim identification.”

The growing volume of child sexual abuse material has put investigators under enormous pressure. According to its 2018 annual report, the Internet Watch Foundation processed 229,328 reports in 2018, a 73% increase over the 2017 figure of 132,636. With increasingly large caseloads containing anywhere from hundreds of thousands to one or two million images, investigators struggle to sort and manually review all of the material. CEASE.ai aims to reduce that workload significantly.

“If we seize a hard drive that has 28 million photos, investigators need to go through all of them,” said Sgt. Arnold Guerin, who works in the technology section of the Canadian Police Centre for Missing and Exploited Children (CPCMEC). “But how many are related to children? Can we narrow it down? That’s where this project comes in: we can train the algorithm to recognize child exploitation.”

Two Hat has also made CEASE.ai available to social platforms, to prevent illegal images from being uploaded and shared on their networks. Learn more about how CEASE.ai helps law enforcement detect and prioritize new child sexual abuse material on the Two Hat website.

About Two Hat Security

Two Hat’s AI-powered content moderation platform classifies, filters, and escalates more than 30 billion human interactions per month, including messages, usernames, images, and videos, all in real time. With an emphasis on surfacing online harms including cyberbullying, abuse, hate speech, violent threats, and child exploitation, Two Hat enables clients across a variety of social networks to foster safe and healthy user experiences.

In addition, Two Hat believes that removing illegal content is a shared responsibility among social platforms, technology companies, and law enforcement. To that end, the company works with law enforcement to train AI to detect new child exploitation material.

www.twohat.com

About Griffeye

Griffeye provides one of the world’s premier software platforms for digital media investigations. Used by law enforcement, defense, and national security agencies across the globe, the platform gives investigators and intelligence professionals a leg up on ever-increasing volumes of image and video files.

www.griffeye.com

Connect With Us at the Crimes Against Children Conference in Dallas

For the second year in a row, Two Hat Security will be attending the Crimes Against Children Conference in Dallas, Texas, as a sponsor. Founded in 1988, the conference brings together attendees from law enforcement, child protective services, and more to “provid[e] practical and interactive instruction to those fighting crimes against children and helping children heal.”

Last year, more than 4,200 professionals attended CACC — a record for the conference and a sign of the growing need for these discussions, workshops, and training sessions.

Two Hat Security founder and CEO Chris Priebe and VP of Product Brad Leitch are hosting two sessions this year. Both sessions will provide investigators with a deeper understanding of the vital role artificial intelligence plays in the future of abuse investigations.

Session 1: Using Artificial Intelligence to Prioritize and Solve Crimes

Tuesday, August 8
1:45 PM – 3:00 PM
Location: Dallas D2

In this session, we explore what recent advances in artificial intelligence and machine learning mean for law enforcement. We’ll discuss how this new technology can be applied in a meaningful way to triage and solve cases faster. This is a non-technical session that will help prepare investigative teams for upcoming technology innovations.

Session 2: Beyond PhotoDNA — Detecting New Child Sexual Abuse Material With CEASE.ai

Wednesday, August 9
8:30 AM – 9:45 AM
Location: City View 8 (Exhibitor Workshop)

Traditionally, PhotoDNA has allowed organizations to detect already-catalogued child sexual abuse material (CSAM). Sadly, with new digital content so easy to create and distribute worldwide, investigators have seen an epidemic of brand-new, never-before-seen CSAM being shared online.

CEASE is an AI model that uses computer vision to detect these new images. Our collaboration with the Royal Canadian Mounted Police has given our data scientists access to a permanent data set of confirmed CSAM, which we are using to train the model.
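This post does not describe CEASE’s architecture, but as a rough, purely hypothetical illustration of the general approach it names (training a computer-vision model on a labeled image set), a transfer-learning sketch in PyTorch might look like the following. Every name below is our own assumption, not part of CEASE:

    # Generic transfer-learning sketch: fine-tune a pretrained vision model as a
    # binary image classifier. Purely illustrative; not the CEASE architecture.
    import torch
    import torch.nn as nn
    from torchvision import models

    # Start from an ImageNet-pretrained backbone and replace its final layer
    # with a single-logit head for a binary "flag / don't flag" decision.
    model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
    model.fc = nn.Linear(model.fc.in_features, 1)

    criterion = nn.BCEWithLogitsLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    def train_epoch(loader):
        # One pass over a DataLoader yielding (image_batch, label_batch).
        model.train()
        for images, labels in loader:
            optimizer.zero_grad()
            logits = model(images).squeeze(1)
            loss = criterion(logits, labels.float())
            loss.backward()
            optimizer.step()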

However, it’s still a work in progress. If you are a member of the law enforcement community or the technology industry, we need your expertise and vast knowledge to help shape this groundbreaking system.

Stop by our booth

We look forward to meeting fellow attendees and discussing potential collaboration and partnership opportunities. Visit us at booth 41 on the 2nd floor, next to Griffeye.

As we learned at the Protecting Innocence Hackathon in July, “If we want to protect the innocence of children, we have a responsibility to be transparent and collaborative.”

You can sign up for both workshops through the conference website. Feel free to email us at hello@twohat.com to set up a meeting.

Three Powerful Lessons We Learned at the Protecting Innocence Hackathon

You rarely hear about them, but every day brave investigators across the globe review the most horrific stories and images you can ever imagine. It’s called child sexual abuse material (known as CSAM in the industry), and it hides in the dark corners of the internet, waiting to be found.

The scope is dizzying. The RCMP-led National Child Exploitation Coordination Centre (NCECC) alone received 27,000 cases in 2016. And right now, it’s nearly impossible for officers to review those cases fast enough to prioritize the ones that require their immediate attention.

That’s why, on July 6th and 7th, volunteers from law enforcement, academia, and the tech industry came together to collaborate on solving this problem, perhaps the biggest problem of our time — how do we quickly, accurately, and efficiently detect online CSAM? Artificial intelligence gets smarter and more refined every day. How can we leverage those breakthroughs to save victimized children and apprehend their abusers?

Along with event co-sponsors the RCMP, Microsoft, and Magnet Forensics, we had a simple goal at the Protecting Innocence Hackathon: to bring together the brightest minds in our respective industries to answer these questions.

We ended up learning a few valuable lessons along the way.

It starts with education

Participants across all three disciplines learned from each other. Attendees from the tech industry and academia were given a crash course in grooming and luring techniques (as well as the psychology behind them) from law enforcement, the people who study them every day.

Make no mistake, these were tough lessons to learn — but with a deeper understanding of how predators attract their victims, we can build smarter, more efficient systems to catch them.

Law enforcement studied the techniques of machine learning and artificial intelligence — which in turn provided them with a deeper understanding of the challenges facing data scientists, not to mention the need for robust and permanent datasets.

It’s crucial that we learn from each other. But that’s just the first step.

Nothing important happens without collaboration

Too often our industries are siloed, with every company, university, and agency working on a different project. Bringing professionals together from across these disciplines and encouraging them to share their diverse expertise, without reservations or fear, was a huge accomplishment, and an important lesson.

This isn’t a problem that can be solved alone. This is a 25-million-images-a-year problem. This is a problem that crosses industry, cultural, and country lines.

If we want to protect the innocence of children, we have a responsibility to be transparent and collaborative.

Just do it

Education and collaboration are commendable and necessary — but they don’t add up to much without actual results. Once you have the blueprints, you have no excuse not to build.

The great news? The five teams and 60+ participants made real, tangible progress.

Collectively, the teams built the following:

  • A proposed standard for internationally classifying and annotating child sexual exploitation images and text
  • A machine learning aggregation blueprint for both text and image classification
  • Machine learning models to detect sexploitation conversations, as well as image detection for age, anime, indoor and outdoor settings, nudity, and CSAM

We cannot overstate the importance of these achievements. They are the first steps towards building the most comprehensive and accurate CSAM detection system the world has seen.

Not only that, the proposed global standard for classifying text and images, if accepted, will lead to even more accurate detection.

The future of CSAM detection is now

We actually learned a fourth lesson at the hackathon, perhaps the most powerful of them all: Everyone wants to protect and save children from predators. And they’re willing to work together, despite their differences, to make that happen.

At Two Hat Security, we’re using the knowledge shared by our collaborators to further train our artificial intelligence model CEASE and to refine our grooming and luring detection in Community Sift. And we’ll continue to work alongside our partners and friends in law enforcement, academia, and the tech industry to find smart solutions to big problems.

There are challenges ahead, but if everyone continues to educate, collaborate, and create, projects like CEASE and events like Protecting Innocence can and will make great strides. We hope that the lessons we learned will be applied by any agency, company, or university that hopes to tackle this issue.

Thank you again to our co-sponsors the RCMP, Microsoft, and Magnet Forensics. And to the Chief Enforcers, Code Warriors, and Data Mages who gave their time, their expertise, and their fearlessness to this event — your contributions are invaluable. You’re changing the world.

And to anyone who labors every day, despite the heartbreak, to protect children — thank you. You may work quietly, you may work undercover, and we may never know your names, but we see you. And we promise to support you, in every way we can.
