Here at Yubo, we’re always looking for ways to innovate in online safety. In our previous blog, we explained how we take our responsibility to younger Yubo users very seriously. Since then, we’ve continued working with online safety expert Annie Mullins OBE to put a number of safeguards in place on the app and we’ve recently taken an important step forward by introducing real-time intervention in live streams.
Using a combination of algorithms and human moderators, we monitor activity on Yubo so we can identify users who violate our community guidelines and who could put themselves or others in danger while live streaming. We then take immediate action, sending a message to the user explaining how they have crossed the line, how to change their behavior and what action we’ll take if they don’t comply. Our options include temporarily disabling the live stream and suspending their account.
Yubo is the only social video app to intervene in this way to help keep teens safer. Yes, it’s a bold move. Yes, some young people are surprised when they receive a warning message from us. But we believe the most important thing is to support and protect our users and make the Yubo community an even better place to be.
In this blog, we take a closer look at live intervention and explore how it promotes behavioral change by encouraging Yubo users to stay safer and be more responsible as well as helping to reduce the risk of sexual exploitation and other issues. We also talk to one of our most experienced moderators and find out how our teenage users respond to his messages about inappropriate behavior or unsuitable content they have posted.
“We know there’s more that social networks can do and some sites like Yubo already have good practice on live streaming.” Andy Burrows, Associate Head of Child Safety Online, NSPCC (2018)
Educate and engage
Just as teenagers might experiment and test boundaries in real life, the same happens in the digital world. Online, young people might have already seen inappropriate behavior or content and aren’t always sure what’s acceptable. Whether it’s filming themselves in their underwear or making nasty comments about someone, it’s important that they understand and reflect upon the consequences their behavior could have for them and others. Furthermore, as a company, we must be able to take action when we see inappropriate or illegal behavior and content within live streams on Yubo (which we call Lives).
Our community guidelines are essential in setting standards for user behavior — our users have to agree to abide by them and we always refer to them when intervening in live streams. We make it very clear in these guidelines that we have zero tolerance for nudity and posing in underwear in Lives, for example.
If these guidelines aren’t followed, it’s our policy to engage and educate our younger users rather than punish them. That means using moments when rules have been broken to help them understand and reflect on what they have done wrong and nudge them to change their behavior, both in that moment and in the future. We’re already seeing the positive impact of real-time in-app warnings and our approach has been welcomed by law enforcement, charities and other online safety advocates.
As our online safety advisor Annie Mullins OBE says,
“I’ve worked with Yubo putting live interventions in place to help prevent its users from putting themselves at risk. This is an important step change in the child online safety arena and one that other social media companies should consider following. The more real-time interventions with young people that take place, the more we’ll be able to support and educate them. Young people make mistakes — that’s part of growing up — but it’s important that social media providers offer safety nets for them and nudge them in the right direction when their behavior could put them at risk.”
Consultant Child and Adolescent Psychiatrist Dr Richard Graham adds,
“The future of the digital world now rests upon how willing we are to use behavioral insights to stop the bad from spoiling the good. Yubo is leading the way with live intervention, and there is no better way to learn than in the moment.”
Observations of a moderator
Our global team of moderators deals with a variety of situations in live streams, including young people posing in their underwear, taking their clothes off, harassing other users and threatening to self-harm.
We asked one of our moderators, Tom*, to explain why it’s so important to intervene in Lives.
“The content is being broadcast live online so it makes sense that we intervene immediately before any damage is done. The faster we get involved, the better,” he says.
Despite receiving a popup about our community guidelines when they enter live streams on Yubo and knowing that Lives can be seen by anyone (not just their friends), some users still choose to push the limits.
So, why do young people behave this way online? There are a number of motivations — some do it to show off or flirt, others because they see their idols and other celebrities acting a certain way or because someone is encouraging them or putting them under pressure to do it.
“It’s often about getting attention, which they might not get in real life,” comments Tom. “Going live is a good way of reaching lots of people and getting lots of views. The guys who go shirtless want to show off their muscles, for example.”
The potential risks of live streaming include being bullied, being coerced into sexually explicit activity and being contacted by strangers with ill intent.
Tom explains how younger users often become so immersed in the app, they don’t think about what they are doing.
“People often get caught up with being ‘in the moment’ and don’t understand the immediate risks or that the content might not go away after the Live. If they video themselves in their underwear, someone could capture that, publish it somewhere else and use it against them,” he points out.
If a Yubo moderator sees something happening that violates our community guidelines and could put someone at risk, they send a warning message to the user. The message might say that the Live will be shut down in one minute or it might warn the user that they will be suspended from the app if they don’t change their behavior.
We’re the only social video app to do this as we think it’s important that our younger users understand what the problem is and that we encourage them to change what they’re doing. After all, the best way to learn is in the moment and through real life experience.
We then take the appropriate action, which might include removing the content, suspending or closing down the user’s account and reporting potentially illegal activity to the police and other authorities, such as the Internet Watch Foundation (IWF). It’s our aim to help users change their behavior before it gets to that point but, in the end, we’ll do what is right for them and other users.
“Sadly, one of the things we come across is young people who are self-harming or suicidal,” says Tom. “Even if the people watching the Live try to help, they aren’t trained professionals and could unintentionally make the situation worse. Seeing someone talking about self-harming could also trigger similar feelings in other users.”
We don’t allow self-harm content to remain on Yubo and we explain to our younger users why this is the case. Live intervention means we can help someone in that crucial moment and direct them to support organizations.
“I will venture to say that… this app is shaping up to be a leader in live-streamed video safety.” Anne Collier, NetFamilyNews (August 2017)
Support and protect
On the whole, Yubo users respond positively to real-time intervention. They aren’t used to other apps communicating in this way but they generally receive the notifications well and comply with our requests. If someone receives a warning as they are getting undressed on camera, for example, they usually stop as soon as they see the message and put their clothes back on.
Tom notes that users often thank him for intervening as it means they don’t have to deal with a difficult situation themselves. If they are feeling under pressure to behave a certain way online, it’s good that they know we’re there to keep an eye on things and help them to manage those expectations.
Many of our younger users tell their friends about our rules and how the Yubo moderators are there to help — a clear sign that although young people like to be independent, they still need support. Indeed, educating one teenager can have a powerful network effect. We’ve received lots of positive comments about live intervention from our younger users, including “I think it’s useful, just keeps the community safer and cleaner I guess” and “It’s a really good idea, it gives people a chance to change their behavior and if not then they face the consequences.”
With research by Childnet revealing that more than one in 10 eight- to 17-year-olds have ‘gone live’, live streaming is here to stay. We believe real-time intervention is a vital tool in helping young people have a safer experience on social video apps.
In Anne Collier’s words, it’s important to build a digital community of “guided practice” — one where young people can be confident, happy, respectful and safe. With our vision to be a leader in online safety, we’ll continue to develop innovative safety features for Yubo so we can do exactly that.
“I applaud Yubo for extensively reworking its safety features to make its platform safer for teens.” Julie Inman Grant, Australian eSafety Commissioner (March 2018)
Five ways Yubo helps young people
- Yubo has a minimum age limit of 13 and users must provide their real details, including their mobile number. We use technology and moderators to identify underage profiles and block them.
- Yubo users can set their own preferences, including whether to show their location and deciding who they talk to.
- Every user must agree to abide by our community guidelines, which include rules about bullying, nudity, fake profiles and other inappropriate or illegal behavior.
- We use innovative technology and human moderators to check what’s happening on Yubo and take action if we see our rules being broken.
- It’s easy to report any problems or concerns to us, via the flag icon in the app or on our Safety Centre. We aim to respond to reports within 24 hours and prioritize emergencies.