Meta’s Virtual World, Tackling Toxic Gaming and Safeguarding News
Last Updated on 17th February 2023
Read the script below
Hello and welcome to Safeguarding Soundbites, the podcast that brings you all the latest safeguarding news and updates. This week, we’re catching up on Meta’s virtual world, talking about AI-generated sexual abuse images and finding out how one gaming company is taking on toxic gaming, plus more.
First up, it’s the news and this week we start in the virtual reality world as Meta wants to make Horizon Worlds available to young people aged 13 to 17. A leaked internal memo says the company wants to open up the VR app to younger users as early as next month.
The virtual reality world has not been without controversy during its two-year existence, with reports of sexual harassment and abuse occurring on the app. Meta has implemented tools to help combat such incidents, such as the ‘Safe Zone’ – a protective bubble users can activate so that no one can ‘virtually’ interact with them. Meta has also announced that it will be expanding its content ratings policies to cover events too, meaning worlds within the game will need clear age ratings. It’s unclear what further safeguarding measures will be put in place to protect young people from adult and harmful content. If you’d like to learn more, search our website ineqe.com for our guide to Horizon Worlds.
Sam Altman, CEO of OpenAI, the company behind ChatGPT, has spoken out about his fears for artificial intelligence’s future and how it might be used to create so-called ‘revenge porn’, also known as image-based sexual abuse. Altman said that he has been watching with concern as open-source image generators are used to create this kind of harmful imagery, and that he thinks it’s causing huge and predictable harm.
While image-based sexual abuse is unfortunately nothing new, the ability to create AI-generated versions is something we’ve only seen in recent years. As AI capability continues to improve, it’s extremely worrying that those advancements could be used to produce image-based sexual abuse material. Sharing non-consensual deepfake porn looks set to become a criminal offence as part of the Online Safety Bill, but there’s currently no word on whether that extends to those who create it too.
A study has shown what we all might have already suspected… when it comes to which app children and young people are using the most, TikTok is way in the lead. The new research by Qustodio shows children and young people spend 60% longer on the video-sharing platform than they do watching videos on rival YouTube. TikTok also overtook all other video and social media apps, like Netflix, Snapchat and Instagram, with an average of 107 minutes spent on the app per day. If you’re wondering what all the fuss is about, head over to our social media section on ineqe.com to find articles and guides on all things TikTok.
Game developer Ubisoft is taking on toxic gaming with a new partnership with the police. The Newcastle Ubisoft Customer Relationship Centre will be working with other global centres to tackle behaviours such as racism, bullying and rape jokes. Staff will monitor player reports and help requests, and work with Northumbria Police, who will be sharing their expertise on harmful online interactions. They will also be putting in place a method of fast-tracking information to the police in extreme cases.
Toxic behaviour in gaming is an ongoing concern for many parents and carers whose children and young people enjoy gaming. Many worry about the gaming culture their child is exposed to, and it seems harmful and abusive behaviour has come to be accepted as a normal part of that culture. We hope this partnership will be a step towards a safer, kinder online gaming world in the future, but remember, many games have safety settings that can help today. Visit the online safety section at ineqe.com to find advice, top tips and signposting to empower you and the children in your care to be safer online.
This week, our Online Safety Experts have been researching JusTalk, the free messaging and video-calling app. After reports that children and young people have been receiving harmful messages, we took a look at what the app is, what the risks are and how you can help the children and young people in your care be safer on apps like JusTalk. Visit our website to read that research.
That’s all from Safeguarding Soundbites this week. Remember to check out our website at ineqe.com for all those articles, guides and resources I mentioned, plus many more. I’ll be back next week; in the meantime, you can find us on social media by searching for INEQE Safeguarding Group. Stay safe and speak to you next time.