YOLO: Life in the Fast Lane Drives a Vision for Safety


In its first 48 hours, YOLO acquired 1 million users, a plague of cyberbullying, a scalable content moderation solution, and a new vision for the future.

YOLO might never have existed at all if not for a weekend experiment. Gregoire Henrion and his cofounders weren’t especially interested in an anonymity app; they were just curious what they could build over an idle couple of days. But when YOLO hit the App Store, it found instant traction and caught a ride on a viral loop via Snapchat.

“We had a million users in two days,” says Henrion. Unfortunately, the anonymous nature of the app was also providing a platform for cyberbullying, which spread like wildfire. “We hadn’t thought of it before, because we’d never dreamed of the scale. But even after one day, we knew it was a big issue.”

Desperate for a solution, Henrion reached out to peers in Paris’s app ecosystem. “I spoke with a friend at Yubo who used Two Hat’s Community Sift and recommended it,” he says. “He connected me with Sharon, their account executive at Two Hat, who instantly went to work for YOLO.”

“Within a day, we went from having lots of bad behaviors, to being safe as could be.”

At the time, YOLO was in the midst of a feeding frenzy of meetings, media, and monetization that only the developers of a viral app sensation can truly understand. “We were doing funding calls and everything else – it was crazy – but Two Hat sorted out what we needed, and the implementation was completed within hours.”

YOLO initially went with very strict guidelines before easing settings based on user feedback. “We wanted to fix what was wrong, and we didn’t want to be associated with bad behaviors,” says Henrion. “By experimenting with policies and settings, we find we are able to deal with 95 to 99 percent of the issues.”

In YOLO’s configuration, inappropriate messages or comments simply are not shared, but the offending party doesn’t know this. In the content moderation industry, this is known as a false send. The bully only sees that they aren’t getting any attention back, which is often enough for them to stop and go away. “Now, we have the app tuned so that the filters are super-efficient,” says Henrion.
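For readers curious what a false send looks like in practice, here is a minimal, hypothetical sketch in Python. It is not YOLO’s or Two Hat’s actual implementation; the function names, the placeholder word list, and the response format are all illustrative assumptions. The key design choice is that the sender receives the same “sent” acknowledgement whether or not the message was delivered, so a blocked sender gets no signal that their message was filtered.

```python
# Hypothetical sketch of a "false send" moderation flow.
# All names and the placeholder word list are illustrative assumptions,
# not Two Hat's Community Sift API.

from dataclasses import dataclass


@dataclass
class Verdict:
    allowed: bool
    reason: str = ""


def check_message(text: str) -> Verdict:
    """Stand-in for a real moderation call (e.g., a classification service)."""
    blocked_terms = {"insult", "slur"}  # placeholder policy, not a real filter list
    if any(term in text.lower() for term in blocked_terms):
        return Verdict(allowed=False, reason="policy_violation")
    return Verdict(allowed=True)


def deliver(recipient_id: str, text: str) -> None:
    """Stand-in for actually showing the message to the recipient."""
    print(f"delivering to {recipient_id}: {text}")


def send_message(sender_id: str, recipient_id: str, text: str) -> dict:
    verdict = check_message(text)
    if verdict.allowed:
        deliver(recipient_id, text)  # normal delivery path
    # Whether delivered or silently dropped, the sender gets an identical
    # "sent" response -- the false send that denies the bully any reaction.
    return {"status": "sent"}
```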

“We have a lot of control now; our users are happy, and we are super happy.”

Moving forward, YOLO plans to apply what the company has learned about anonymity and social media to carve a new approach to online safety. “When we look at user behavior now, one of the secrets of YOLO being successful is that even if the user’s name is hidden, you still see the face of the user on their profile pic. This bit of exposure – you don’t know me but you can see my face – is very often enough to make users regulate their behaviors. That’s naturally ‘Safe by Design’ because it’s our normal behavior.”

“If you want to create value you have to make something secure. We’re not naive anymore. We know all the bad things that can happen in social.”

YOLO envisions its community, and others like it, as places where Safety by Design encourages users to change their behavior: why bully, harass or cajole in a community if life on mute is the only possible outcome?

“Anonymity alone is not a sustainable approach to managing communities,” says Henrion. “Two Hat’s Community Sift gives us tools to help shift user behavior, the security system to deal with those who cause trouble, and a solution we know scales quickly.”

 


We’re currently offering no-cost, no-obligation Community Audits for social networks that want an expert consultation on their community moderation practices.

Our Director of Community Trust & Safety will examine your community, locate areas of potential risk, and provide you with a personalized community analysis, including recommended best practices and tips to maximize user engagement.

Sign up today and we’ll be in touch.
