Two Hat Director of Community Trust & Safety Carlos Figueiredo shares his thoughts on the Australian eSafety Office's Safety by Design initiative for social products.
Chat, comments, profile pics, and videos are all great tools for user retention and community engagement. But what about the risks? Learn why content moderation matters more than ever.
Affected by the Smyte Closure? Two Hat Security Protects Communities From Abusive Comments and Hate Speech
Protect your community and your brand from abuse, harassment, and hate speech with our content moderation API. Statement from our CEO and founder.
Disappearing users. Overworked moderation teams. Frustrated developers. Sound familiar? Help your online community avoid this fate with our top five recommended moderation workflows.
We’ve been stumbling around in this virtual space for just over 20 years with only a dim light to guide us, which has led to the standardization of some… less-than-desirable behaviors. So what now?
The internet has enriched our lives in countless ways. But has it also made us meaner?
There are a few key steps that Community Managers can take to reinforce and reward positive behavior in their communities. Find out what we recommend!
For many women, logging onto social media is inherently dangerous. Online communities are notoriously hostile towards women. So what can social networks do to change that?
Every community is different and requires different techniques. But there are a few guiding principles that work for just about every product, from social networks to online games to forums.
Do traditional moderation techniques like crowdsourced reporting and muting actually work? Are there more effective strategies? And what does it mean to engineer a healthy community?