Built in collaboration with Canadian law enforcement, and with support from the Government of Canada’s Build in Canada Innovation Program and Mitacs partnerships with top Canadian universities, CEASE.ai is an ensemble of neural networks that detects new images containing child sexual abuse.
For investigators, our easy-to-use plugin helps reduce workloads by filtering, sorting, and removing non-CSAM, allowing them to focus their efforts on new child abuse images. Investigators upload case images, run their hash lists to eliminate known material, then let the AI identify, label, and prioritize images containing previously uncatalogued CSAM.
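The plugin's internal interface is not public, but the triage logic described above can be sketched in a few lines: hash each file, drop anything already on the known-material list, and rank what remains by model confidence. In the illustration below, `classify` is a hypothetical stand-in for the CEASE.ai model call, and the use of SHA-1 for the hash list is an assumption for demonstration only.

```python
import hashlib
from pathlib import Path

def classify(path: Path) -> tuple[str, float]:
    """Placeholder for the CEASE.ai model call; returns (label, confidence).

    The real model assigns labels such as csam / adult / common; this stub
    exists only so the sketch is self-contained and runnable.
    """
    return ("common", 0.0)

def sha1_of(path: Path) -> str:
    """Hash a file so it can be matched against a known-material hash list."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def triage(case_dir: str, known_hashes: set[str]) -> list[tuple[Path, str, float]]:
    """Drop known material, then rank the remaining images by model confidence."""
    unknown = [p for p in Path(case_dir).iterdir()
               if p.is_file() and sha1_of(p) not in known_hashes]
    results = [(p, *classify(p)) for p in unknown]
    # Highest-confidence suggestions first, so investigators see the
    # likeliest new material at the top of the queue.
    return sorted(results, key=lambda r: r[2], reverse=True)
```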
Currently, you can extract frames from a video and send them into the system yourself, or use another tool that already handles videos.
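As a minimal sketch of the manual route, the snippet below uses OpenCV to sample frames from a video into JPEGs, which could then be uploaded like any other case images. The one-in-thirty sampling rate and the output naming are illustrative assumptions, not CEASE.ai requirements.

```python
import os
import cv2  # OpenCV: one common way to turn a video into still frames

def extract_frames(video_path: str, out_dir: str, every_n: int = 30) -> int:
    """Save every Nth frame as a JPEG; returns the number of frames written."""
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    saved = index = 0
    while True:
        ok, frame = cap.read()
        if not ok:                # end of video or read error
            break
        if index % every_n == 0:  # roughly one frame per second at 30 fps
            cv2.imwrite(f"{out_dir}/frame_{index:06d}.jpg", frame)
            saved += 1
        index += 1
    cap.release()
    return saved
```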
CEASE.ai is currently installed in a secure, air-gapped or firewalled network environment.
In the future, we expect that CEASE.ai will be installable standalone on a single computer with a powerful GPU.
The basic requirement is an NVIDIA GPU with at least 10 GB of memory.
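If you want to confirm that a machine meets that bar before scheduling an install, one quick check from Python uses PyTorch's CUDA utilities. This is purely an illustration; CEASE.ai does not ship or require this script.

```python
import torch  # assumes PyTorch with CUDA support is installed

def meets_requirement(min_gib: float = 10.0) -> bool:
    """Check for an NVIDIA GPU with at least `min_gib` GiB of memory."""
    if not torch.cuda.is_available():
        return False
    total_bytes = torch.cuda.get_device_properties(0).total_memory
    return total_bytes / (1024 ** 3) >= min_gib

print("GPU OK" if meets_requirement() else "GPU does not meet the 10 GB requirement")
```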
No. CEASE.ai identifies new, previously uncatalogued CSAM. It works in tandem with existing hash lists, sorting the remaining content after you’ve eliminated known images with your hash list.
Yes, please contact us for details.
CEASE.ai sorts images into three categories: CSAM, adult pornography, and common (everyday, non-explicit) images.
CEASE.ai was built in collaboration with Canadian law enforcement, with support from the Government of Canada’s Build in Canada Innovation Program and Mitacs partnerships with top Canadian universities.
Multiple factors make CEASE.ai unique:
You’ll have access to a step-by-step guide, and our success team will support you through the process.
Every year, the National Center for Missing & Exploited Children reviews 25 million child sexual abuse images.
Every day, predators share images online. And every year, the numbers are increasing.
With increasingly large caseloads containing anywhere from hundreds of thousands to two million images, investigators struggle to sort and manually review all the material. The result? It takes longer to identify and rescue victims. Investigators are overworked, overwhelmed, and face mounting emotional trauma.
With CEASE.ai for Law Enforcement, we aim to give overworked investigators better tools and reduce their mental stress, helping them reach innocent victims faster.
As a security company, we understand many of the questions that your security team will ask prior to installing CEASE.ai.
To simplify the solution, we designed CEASE.ai to stand entirely on its own, with no outside dependencies. Security measures include:
CEASE.ai stands for Child Exploitation Analysis System for Enforcement.