
Read the script below

Tyla: Hello and welcome back to another episode of Safeguarding Soundbites, with me, Tyla.

Natalie: And me, Natalie. As always, we’ll be bringing this week’s need-to-know news on all things safeguarding.

Tyla: This week we’re talking AI, vaping, esports and more.

Natalie: Shall we dive in then?

Tyla: Yes! What’s grabbed your attention in the news this week, Natalie?

Natalie: So the first thing I want to talk about is the news that Facebook and Instagram – so, Meta, basically – are going to be introducing tech to help detect and label fake AI images that appear on their platforms. Now, Meta already labels AI images that have been generated by its own AI systems, but not images that were created elsewhere and uploaded. So, the aim looks to be to cover that sort of content, too. Meta have also asked users to label their own AI-generated audio and video content, because their tool doesn’t work for this. They’ve added that if users don’t do that, there might be penalties.

Tyla: This links in, actually, with the next news story I was going to bring up, which is about a video of Joe Biden that was circulating on Meta’s platforms. So the video was a fake, and the Oversight Board, which is the board that oversees Meta –

Natalie: imaginatively named!

Tyla: Yep, does exactly what it says on the tin! But yes, the board reviewed this particular incident and video and said that Meta made the right call in deciding not to remove it, as it didn’t violate its ‘manipulated media’ policy. Instead, the board said this type of content needs to be labelled.

Natalie: Ah ha! So this is possibly the catalyst.

Tyla: Or a very large part of the decision, anyway! But the board also added that Meta’s policy on manipulated media is “incoherent”!

Natalie: Not a great description.

Tyla: No, so they are now reviewing their policy.

Natalie: Sounds like a good idea! Actually, while we’re on the subject of AI, I want to mention that this week ministers in the UK were warned against waiting for an AI-involved scandal before taking steps to regulate. In fact, it was put to them as ‘don’t wait for a Post Office-style scandal’. This comes after the government hosted a global AI safety summit late last year, where major tech companies agreed to work together on testing their most sophisticated AI models. The government is also providing regulators with 10 million pounds to tackle AI risks and come up with an approach by the end of April.

Tyla: We’ll have to make a mental note to check back in with that in April then!

Let’s move on from AI to the news of an alleged hack of the Class Charts app, which saw parents at a school in Billingham receiving explicit comments about their children. The app allows teaching staff and parents to view different information about individual pupils, and the incident has raised concerns over potential vulnerabilities. It follows an alleged data breach which allowed some users to access details about children who were not connected to them.

Natalie: Scary stuff, actually, that someone could be viewing the private and personal information of a child or young person.

Next news story now and it’s about a 14-year-old girl who has reported that she was groomed by an older man who posed as a teenager and offered to buy her vapes. This then developed into a sexually exploitative relationship. Girls Out Loud, a charity that works with vulnerable young people, said that promising vapes has become an increasingly common tactic used to lure children.

Now, a quick ad break and then we’ll be right back.

[Ad Break]
Do you know your streamers from your scrims? COD from Counter-Strike? A clan from a LAN? And no, I’m not just making up words now…I’m talking about esports! Our upcoming Safeguarding in Esports course is for parents and safeguarding professionals who want a better understanding of competitive online videogaming. The live webinar will give you the knowledge you need to help create a safer environment for young gamers, learn about the unique challenges of safeguarding in esports and explore parallels with traditional sports safeguarding. Sign up today to secure your spot by visiting the webinar page at ineqe.com.

Natalie: And speaking of esports…guess what our theme for today’s safeguarding success story is?!

Tyla: Hmmm….could it be…esports?!

Natalie: You’re so smart! It is. The NSPCC are hosting two events this year with the aim of promoting safeguarding in gaming. Their Game Safe Festival is taking place right now and runs until the 11th of February, and on the 9th of February they held a ‘Safeguarding in Esports’ conference.

Tyla: Much needed. We know that esports come with some concerning risks for children and young people – we won’t go through all of them! But take the use of loot boxes in games, for example, which could encourage unhealthy spending habits and, essentially, introduce children to the premise of gambling.

Natalie: That’s right. There’s also the potential that children and young people watching might view inappropriate content or something distressing – a player or streamer could be playing a game that’s not age-appropriate for a child who’s watching along. And as you said, Tyla, we won’t go through all of the risks and areas of concern with esports but all of those are outlined on our course, which I’d personally highly recommend for any parent or safeguarding professional to attend.

Tyla: I agree. But awesome stuff from the NSPCC and we hope everyone who went along had a great experience. Now, that’s all from us. As ever, you can follow us on social media by searching for INEQE. And remember to visit ineqe.com to find out more about the esports course as well as our full range of available training and webinars.

Natalie: And if you haven’t checked out the latest episode of The Online Safety Show, the new safeguarding show for children and young people, make sure to visit theonlinesafetyshow.com and while you’re there, take part in our survey about the Online Safety Act!

Until next time…goodbye and –

Both: Stay safe.
