Artificial intelligence to detect new child sexual abuse material & save victims faster.

Trained on real CSAM, our AI scans, identifies, and flags new images containing child abuse with unprecedented accuracy.

For Law Enforcement

  • Identify and rescue victims of child abuse faster
  • Reduce investigators' manual workload
  • Protect investigators' mental health and reduce trauma

By working with both law enforcement and social networks, we not only help investigators rescue victims faster but also provide early CSAM detection at the source.

Built in collaboration with the RCMP through the federally funded Build in Canada Program, supported by a generous Mitacs grant, and developed with leading Canadian universities, CEASE is an ensemble of neural networks that combines multiple AI models to detect new images containing child abuse.
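As a rough illustration of the ensemble approach, the sketch below combines the scores of several independently trained classifiers into a single weighted decision. The member models, weights, and averaging rule are assumptions for the example, not CEASE's actual architecture.

```python
# A minimal sketch of ensembling image classifiers, assuming each member
# model exposes a predict() that returns P(image contains abusive content).
# Names, weights, and the weighted-average rule are illustrative only.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class EnsembleMember:
    name: str
    predict: Callable[[bytes], float]  # probability in [0, 1]
    weight: float = 1.0

def ensemble_score(image: bytes, members: List[EnsembleMember]) -> float:
    """Combine per-model probabilities into one weighted-average score."""
    total_weight = sum(m.weight for m in members)
    return sum(m.weight * m.predict(image) for m in members) / total_weight

# Usage: flag an image for human review when the combined score
# clears a threshold, e.g. ensemble_score(img, members) >= 0.9.
```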

How It Works

For investigators, our easy-to-use plugin helps reduce workloads by filtering, sorting, and removing non-CSAM, allowing them to focus their efforts on new child abuse images. Investigators upload case images, run their hash lists to eliminate known material, then let the AI identify, suggest a label for, and prioritize images that contain previously uncatalogued CSAM. Better tools and reduced mental stress help overworked investigators reach innocent victims faster.
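A hypothetical version of that triage loop might look like the following: filter out images whose hashes match known material, then rank what remains by model score so the likeliest new CSAM surfaces first. The SHA-256 exact-match hashing here is a stand-in for whatever hash lists (exact or perceptual) an agency actually runs.

```python
# Illustrative investigator triage: drop known material via hash lists,
# then sort the unknown images by classifier score, highest first.
# hashlib.sha256 stands in for the agency's real hashing scheme.
import hashlib
from typing import Callable, Dict, List, Set, Tuple

def triage_case(images: Dict[str, bytes],
                known_hashes: Set[str],
                score: Callable[[bytes], float]) -> List[Tuple[str, float]]:
    # Eliminate images already catalogued on the hash lists.
    unknown = {name: data for name, data in images.items()
               if hashlib.sha256(data).hexdigest() not in known_hashes}
    # Highest-scoring (most likely new CSAM) images come first for review.
    return sorted(((name, score(data)) for name, data in unknown.items()),
                  key=lambda pair: pair[1], reverse=True)
```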

As technology evolves, so does the frightening reality of online child sexual abuse. Leading social platforms are looking to Two Hat to help solve this problem. What started as a partnership with law enforcement has now evolved to working alongside social networks to identify and label uploaded images containing child abuse using the CEASE API endpoint. Then, platforms can escalate images to law enforcement and reach innocent victims faster than ever before.
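In practice, a platform integration might resemble the client call sketched below, submitting each upload to a classification endpoint at ingestion time. The URL, request fields, and response shape are placeholders, not Two Hat's published API.

```python
# Hypothetical client for an image-classification endpoint; the URL and
# payload format are assumptions for illustration only.
import requests

def classify_upload(image_bytes: bytes, api_key: str) -> dict:
    resp = requests.post(
        "https://api.example.com/cease/v1/classify",  # placeholder endpoint
        headers={"Authorization": f"Bearer {api_key}"},
        files={"image": ("upload.jpg", image_bytes, "image/jpeg")},
        timeout=10,
    )
    resp.raise_for_status()
    # Assumed response shape: {"label": "...", "confidence": 0.97}
    return resp.json()
```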

Want to help keep children safe? Contact us today


Contact Us

Hello! Send us your question, and we'll get back to you as soon as possible.
