Frequently Asked Questions

For Law Enforcement

What is CEASE for Law Enforcement?

Built in collaboration with Canadian law enforcement, and with support from the Government of Canada’s Build in Canada Innovation Program and Mitacs partnerships with top Canadian universities, CEASE is an ensemble of neural networks that uses multiple AI models to detect new images containing child abuse.

For investigators, our easy-to-use plugin helps reduce workloads by filtering, sorting, and removing non-CSAM, allowing them to focus their efforts on new child abuse images. Investigators upload case images, run their hash lists to eliminate known material, then let the AI identify, suggest a label, and prioritize images that contain previously uncatalogued CSAM.
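The hash-list step of that workflow can be sketched in a few lines. This is an illustrative sketch only, not the product's API: `sha1_of` and `filter_known` are hypothetical names, and plain SHA-1 stands in for whatever hash type your agency's list actually uses.

```python
import hashlib
from pathlib import Path

def sha1_of(path: Path) -> str:
    """Hex SHA-1 digest of a file, read in chunks to bound memory use."""
    h = hashlib.sha1()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def filter_known(images, known_hashes):
    """Split case images into already-catalogued material (hash matches)
    and unknown material left for AI classification."""
    known, unknown = [], []
    for img in images:
        (known if sha1_of(img) in known_hashes else unknown).append(img)
    return known, unknown
```

Only the `unknown` list would then be passed on for classification and prioritization.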

Better tools for overworked investigators and reduced mental stress will help them reach innocent victims faster.

Does CEASE work with videos as well as images?

Videos are on our spring 2019 roadmap.

In the meantime, you can extract frames from the video and send them into the system yourself, or use another tool that already handles videos.
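Extracting frames yourself is typically done with a tool such as ffmpeg. The sketch below (assuming ffmpeg is installed; the function name and sampling rate are illustrative, not part of the product) builds a command that samples frames at a fixed rate into numbered JPEGs.

```python
def ffmpeg_frame_cmd(video_path: str, out_dir: str, fps: int = 1) -> list:
    """Build an ffmpeg command that samples `fps` frames per second
    from a video into zero-padded, numbered JPEG files."""
    return [
        "ffmpeg", "-i", video_path,
        "-vf", f"fps={fps}",          # sampling rate, frames per second
        f"{out_dir}/frame_%06d.jpg",  # numbered output pattern
    ]

# The resulting command list can be executed with
# subprocess.run(ffmpeg_frame_cmd("case.mp4", "frames"), check=True)
```

The extracted JPEGs can then be submitted to the system like any other case images.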

Does your solution operate on a network, or is it standalone?

CEASE is currently installed in a secure, air-gapped or firewalled network environment.

In the future, we expect CEASE could be installed standalone on a single computer with a powerful GPU.

What are the GPU server requirements?

The basic requirement is an NVIDIA GPU with at least 10 GB of memory.
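One way to verify this requirement is to query total GPU memory with `nvidia-smi --query-gpu=memory.total --format=csv,noheader` and compare it against 10 GB. The helper below is an illustrative sketch that parses that command's output (one line per GPU, e.g. `12288 MiB`); the function name and default threshold are ours, not the product's.

```python
def meets_gpu_requirement(nvidia_smi_output: str, required_mib: int = 10240) -> bool:
    """Return True if any GPU listed in the nvidia-smi output has at
    least `required_mib` MiB of total memory (10240 MiB = 10 GB)."""
    totals = [
        int(line.split()[0])  # "12288 MiB" -> 12288
        for line in nvidia_smi_output.strip().splitlines()
    ]
    return any(t >= required_mib for t in totals)
```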

Is CEASE a hash list?

No. CEASE identifies new, previously uncatalogued CSAM. It works in tandem with existing hash lists, sorting the remaining content after you’ve eliminated your list of known images.

Can I evaluate CEASE before I purchase?

Yes, please contact us for details.

What image categories is your AI trained to detect?

CEASE sorts images into three categories: CSAM, adult pornography, and common.

How was CEASE developed?

CEASE was built in collaboration with Canadian law enforcement, and with support from the Government of Canada’s Build in Canada Innovation Program and Mitacs partnerships with top Canadian universities.

What sets CEASE apart from similar solutions?

Several factors make CEASE unique:

  • The model uses three detectors (CSAM, adult pornography, and common) instead of just one to sort images for more efficient prioritization.
  • CEASE is funded by a multi-year research grant and is a core Two Hat product, supported by ongoing development and maintenance.
  • Our long-term relationship with law enforcement has given us unprecedented access to ongoing training materials. Because of this collaboration, we don’t need to destroy our materials after 90 days, allowing for exceptional accuracy, precision, and recall.
  • The two hats of Two Hat Security are law enforcement and social platforms. By working with us, you are making the system stronger for social platforms and helping prevent these images from being uploaded in the first place.

Will I receive training?

You’ll have access to a step-by-step guide, and our success team will support you through the process.

  • Remote and on-site training is available. Please contact us for more information.

Why is CEASE important?

Every year, the National Center for Missing & Exploited Children reviews 25 million child sexual abuse images.

Every day, predators share images online. And every year, the numbers are increasing.

With increasingly large caseloads containing anywhere from hundreds of thousands to 1-2 million images, investigators struggle to sort and manually review all material. The result? It takes longer to identify and rescue victims. Investigators are overworked, overwhelmed, and face increasing emotional trauma.

With CEASE for Law Enforcement, we aim to:

  • Identify and rescue victims of child abuse faster
  • Reduce investigators’ manual workload
  • Protect investigators’ mental health and reduce trauma

Better tools for overworked investigators and reduced mental stress will help them reach innocent victims faster.

Is it secure?

As a security company, we understand many of the questions that your security team will ask prior to installing CEASE.

To simplify the solution, we designed CEASE so that it stands on its own without any outside dependencies. Security steps include:

  • Download the Docker images from the internet.
  • Firewall off your machine so that it allows no outbound communication to any other machines on your network or beyond.
  • Firewall the machine so that it allows only one incoming port.
  • Send your requests to that port from your classification software; the system will only be able to reply on that same port.
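As a rough sketch of those firewall steps on Linux, using iptables and a hypothetical service port 8443 (the actual port depends on your installation):

```shell
# Default-deny in both directions: no outbound traffic, no inbound
# traffic except the single service port (8443 is hypothetical).
iptables -P INPUT DROP
iptables -P OUTPUT DROP
iptables -A INPUT  -p tcp --dport 8443 -m conntrack --ctstate NEW,ESTABLISHED -j ACCEPT
iptables -A OUTPUT -p tcp --sport 8443 -m conntrack --ctstate ESTABLISHED -j ACCEPT
```

With these rules, the machine can only answer requests arriving on the one open port and cannot initiate connections to anything else.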

What does CEASE stand for?

CEASE stands for Child Exploitation Analysis System for Enforcement.

Don't see your question here? Contact us today.
