The regulatory landscape is changing rapidly. In the last two months, we have seen major changes in the UK and Australia, with more countries potentially to follow, including France and Canada. And just this week, 18 countries and 8 major tech companies signed the Christchurch Call, pledging to eliminate terrorist and violent extremist content online.
As part of my job as a Trust and Safety professional, I’ve been studying the UK Online Harms white paper, which proposes a statutory duty of care that would hold companies accountable for online harms on their platforms. Online harms would cover anything from illegal activity and content to behaviours that are “harmful but not necessarily illegal.”
It’s an important read, and I encourage everyone in the industry to spend time reviewing the Department for Digital, Culture, Media & Sport’s proposal, because it could very well become the basis for similar legislation around the world.
All of this has me thinking: how can platforms be proactive and embed purposeful content moderation into the core of their DNA?
As an industry, none of us want hate speech, extremism, or abuse on our platforms, but how prepared are we to comply with changing regulations?
Where are our best practices?
Are we prepared to deal with the growing challenge of maintaining healthy spaces online?
The changes are complex but also deeply important.
The eSafety Commissioner in Australia has identified three Safety by Design (SbD) principles and is creating a framework for SbD, with a white paper set to be published in the coming months. It’s exciting to see them proactively establishing best practice guidance for online safety.
Organizations like the Fair Play Alliance (FPA) are also taking a proactive path, looking at how the very design of products (online games, in this particular case) can encourage productive, positive interactions while mitigating abuse and harassment.
Over the past year, I’ve been consulted on pioneering initiatives and have participated in roundtables and industry panels on these topics. I also co-founded the FPA along with industry friends, and I’ve seen positive results first hand as more and more companies come together to drive lasting change in this space. Now I want to do something else that can hopefully bring value: something tangible that I can offer my industry friends today.
To that end, I’m offering free community audits to any platform that is interested.
I will examine your community, identify areas of potential risk, and provide you with a personalized community analysis, including recommended best practices and tips to maximize positive social interactions and user engagement.
Of course, I can’t provide legal advice, but I can share tips and best practices based on my years of experience working with social and gaming companies across the globe, first at Disney Online Studios and now at Two Hat.
I believe fostering healthy online spaces and protecting users is a shared responsibility. I’m already walking many companies through the audit process, and I look forward to providing as much value as I possibly can.
If you’re concerned about community health, user safety, and compliance, let’s talk.