His job? Hack anything with a Mickey Mouse logo on it. A former lead developer on the safety and security elements for the children’s virtual world Club Penguin, Chris was passionate about online security. And as a father of four, his motivation was deeply personal.
Bullied as a child himself, Chris was particularly sensitive to online harassment. At Club Penguin, safety was built into the product from the very beginning. But he was deeply disturbed that much of the industry chose to ignore the growing problem of online abuse — and very few people were talking about how to fix it. It was time to change that.
Sitting at his desk on the fourth floor of the Club Penguin studio in Kelowna, Chris had an epiphany.
If hackers leave behind signatures breaking into a system, maybe toxic users leave behind language patterns. What if those could be used to find the worst human behavior on social networks?
With this flash of insight, Chris’s goal was clear: He would do everything he could to remove bullying from the internet.
On March 31st, 2012, Chris left Disney to make his vision a reality. Like many tech companies, Two Hat Security began life in its founder’s basement. Chris spent countless days and nights coding what would eventually become Community Sift, a text classification system that identified high-risk content like bullying, rape threats, and grooming. In the process, he developed a new kind of artificial intelligence model that processes language using a blend of context, user reputation, risk level, subject matter, and hidden meanings.
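To make the idea of blending multiple signals concrete, here is a minimal, purely illustrative sketch of a weighted risk-scoring approach. Every signal name, weight, and threshold below is hypothetical — this is the general technique, not Community Sift’s actual model.

```python
# Toy illustration of blending several signals into one risk score.
# All weights, signal names, and thresholds are hypothetical.

SIGNAL_WEIGHTS = {
    "language_risk": 0.4,    # how risky the words themselves are
    "context": 0.25,         # surrounding conversation raises or lowers risk
    "user_reputation": 0.2,  # the user's history of flagged messages
    "hidden_meaning": 0.15,  # obfuscation, coded phrases, leetspeak
}

def blended_risk(signals: dict) -> float:
    """Weighted sum of per-signal scores, each expected in [0, 1]."""
    return sum(weight * signals.get(name, 0.0)
               for name, weight in SIGNAL_WEIGHTS.items())

def classify(signals: dict, threshold: float = 0.6) -> str:
    """Label a message high-risk if its blended score crosses the threshold."""
    return "high-risk" if blended_risk(signals) >= threshold else "low-risk"
```

The key design point is that no single signal decides the outcome: risky words from a trusted user in a benign context can score lower than mild words from a user with a history of abuse.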
Today, Two Hat Security has left the basement and moved into the top floor of a high-rise. Community Sift is the industry leader in high-risk content detection and moderation, protecting some of the biggest online games, virtual worlds, and social products on the internet. With our latest project, CEASE, we are collaborating with the RCMP and leading academic partners to train a groundbreaking new AI model to detect child sexual abuse material (CSAM).
In five years, the company has grown from just one employee to a dedicated team of engineers, data scientists, linguists, and community professionals.
Chris’s original vision is still our guiding principle. We believe in a world free of online bullying, harassment, and child exploitation. And every day, we work to make that vision a reality.