Frequently Asked Questions

What is CEASE for Law Enforcement?

Built in collaboration with the RCMP through the federally funded Build in Canada Program, supported by a generous Mitacs grant, and developed with leading Canadian universities, CEASE is an ensemble of neural networks that uses multiple AI models to detect new images containing child abuse.

For investigators, our easy-to-use plugin helps reduce workloads by filtering, sorting, and removing non-CSAM, so they can focus their efforts on new child abuse images. Investigators upload case images, run their hash lists to eliminate known material, then let the AI identify, suggest labels for, and prioritize images that contain previously uncatalogued CSAM; a rough sketch of that workflow follows.
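
In code terms, the triage workflow might look like the minimal sketch below. The `classify` function, the use of SHA-256 for the hash list, and the file layout are all assumptions for illustration; the actual plugin's interface is not public.

```python
import hashlib
from pathlib import Path

def classify(image_bytes: bytes) -> tuple[str, float]:
    """Stand-in for the CEASE model: return a suggested label and a priority score."""
    raise NotImplementedError  # supplied by the real plugin

def triage(case_dir: Path, known_hashes: set[str]) -> list[tuple[Path, str, float]]:
    """Drop images already on the hash list, then label and rank the rest."""
    new_material = []
    for path in sorted(case_dir.glob("*.jpg")):
        data = path.read_bytes()
        digest = hashlib.sha256(data).hexdigest()
        if digest in known_hashes:
            continue  # known material: already catalogued, no review needed
        label, priority = classify(data)
        new_material.append((path, label, priority))
    # Highest-priority images first, so investigators see likely new CSAM soonest
    return sorted(new_material, key=lambda item: item[2], reverse=True)
```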

Better tools and reduced mental stress help overworked investigators reach innocent victims faster.

What is CEASE for Social Networks?

As with CEASE for Law Enforcement, CEASE is an ensemble of neural networks that uses multiple AI models to detect new images containing child abuse, built in collaboration with the RCMP through the federally funded Build in Canada Program, a generous Mitacs grant, and leading Canadian universities.

As technology evolves, so does the frightening reality of online child sexual abuse. Leading social platforms are looking to Two Hat to help solve this problem.

What started as a partnership with law enforcement has evolved into work alongside social networks, which use the CEASE API endpoint to identify and label uploaded images containing child abuse. Platforms can then escalate flagged images to law enforcement and reach innocent victims faster than ever before.
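
One possible integration is sketched below. The endpoint URL, authentication scheme, and response shape are hypothetical; consult Two Hat for the real CEASE API contract.

```python
import requests

# Hypothetical endpoint and response shape, shown for illustration only.
CEASE_URL = "https://api.example.com/cease/v1/classify"

def check_upload(image_bytes: bytes, api_key: str) -> dict:
    """Submit an uploaded image and return the suggested label."""
    response = requests.post(
        CEASE_URL,
        headers={"Authorization": f"Bearer {api_key}"},
        files={"image": ("upload.jpg", image_bytes, "image/jpeg")},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # e.g. {"label": "csam", "confidence": 0.97}
```

A platform would call something like this from its upload pipeline and, when the returned label indicates CSAM, route the image into its law-enforcement escalation process.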

What sets CEASE apart from competitors?

Two key differentiators set CEASE apart from similar solutions:

  1. Our collaboration with the RCMP through the federally-funded Build in Canada Program gave us unique access to a permanent dataset of confirmed CSAM images. CEASE was trained on this dataset, allowing for greater accuracy in our model.
  2. Unlike competitors, CEASE uses three detectors instead of one to sort and prioritize content for law enforcement into three buckets: CSAM, adult content, and common images (see the sketch after this list).
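
As a rough illustration of the three-bucket design, one simple way such an ensemble could route an image is to let each detector score it for its own class and pick the most confident one. The detector names and scores below are invented for the example.

```python
def route(scores: dict[str, float]) -> str:
    """Pick the bucket whose detector reports the highest confidence."""
    return max(scores, key=scores.get)

# Each detector scores the same image for its own class, in [0, 1].
scores = {"csam": 0.08, "adult": 0.85, "common": 0.42}
print(route(scores))  # -> adult
```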
