
Last Updated on 5th April 2024

Read the script below

Hello and welcome to Safeguarding Soundbites with me, Tyla…

And me, Natalie!

Safeguarding Soundbites is the podcast for catching up on all the week’s online and digital safeguarding news…

and what we’ve been up to here at Ineqe Safeguarding Group and Safer Schools.

So, what’s coming up in today’s episode, Natalie?

This week, we’re chatting social media news, finding out what’s in Ofcom’s latest media use and attitudes report, and discussing our new youth vaping article.

First up, let’s check in with social media news. Discord have begun rolling out a new voice messaging feature. The social platform, which is popular with gamers, has introduced the ability for users to send voice notes through direct messages and on servers. Discord is a free online platform that hosts voice, video and text chat on different servers – online communities based around interests, hobbies, TV shows, games and just about anything else you can think of! Users could already connect via voice chat, but that worked more like a phone call; the new voice messaging feature works more like voice notes on platforms such as WhatsApp.

Over on TikTok, the platform has come under criticism for failing to disclose the true scale of content related to self-harm, eating disorders and suicide. Researchers, including the Center for Countering Digital Hate, found that while TikTok readily makes viewership data available for other topics, it does not publish equivalent data for this kind of harmful content.

And the researchers also found 50 different hashtags relating to suicide, self-harm and eating disorders – with videos under those hashtags viewed over 57.9 billion times.

And that’s very worrying, Tyla, when you think of the number of young people who are using the app.

Yes, absolutely – in fact, we’ll be talking more about the Ofcom report in a minute, but the report shows that about half of all 3-17-year-olds are using TikTok.

I think it’s important to note here that TikTok’s minimum age is 13, too.

Yes, and it’s research like this that shows why it’s not only important to make sure children and young people are using platforms that are age-appropriate but also to know what to do if your child has come across harmful content on TikTok or any other platform.

We actually wrote an article on that recently, which can be found on our website, ineqe.com. But would you run us through a few things parents and carers can do in that situation?

Yes, so first off, staying calm is really important – as is giving that child or young person time to tell you what’s happened, without interruption.

So giving plenty of space.

Exactly. Give reassurance and support and know how to identify help and further support if they need it.

Thank you – as I said, that article can be found on our website too; it’s called ‘A review of harmful content online’.

Moving on to other social media news – although still on the topic of harmful content – YouTube have announced they are updating their guidelines on eating disorder content. The video-sharing platform has banned any eating disorder content that could prompt users to imitate behaviours, for example videos about calorie restriction. Videos that don’t contain sufficient educational or factual information could also be age-restricted and have a panel with crisis resource information added to them.

It’s interesting to see platforms updating their guidelines and to be having these conversations about harmful content in light of the Online Safety Bill.

Yes, and that’s what I wanted to talk about next – we had an update this week about the Online Safety Bill. Michelle Donelan, the Science, Innovation and Technology Secretary, announced that under the bill, social media bosses who repeatedly fail to remove harmful content, like suicide and self-harm videos, will face up to two years in jail.

And that’s if they have been requested by Ofcom numerous times to correct flaws on their platform that can cause serious harm to children?

Yes, so it’s not a ‘straight to jail, do not pass go’ measure. It applies when social media executives have disregarded enforceable requirements and not complied with their legal responsibilities to protect children from risk on their platform or site.

The Online Safety Bill has been in the news for other reasons this week, as the heads of the messaging apps WhatsApp and Signal have jointly spoken out against the bill. In an open letter, they raised concerns about privacy and safety, warning that the bill could see end-to-end encryption effectively outlawed.

For any listeners who might not know, end-to-end encryption is a security method that means only the sender and the intended recipient can read a message, yes?

That’s it. And it’s used on both Signal and WhatsApp. So, if I send you a message on WhatsApp, only you and I can access that message; the data is protected all the way between our devices. That has caused a lot of controversy, especially in terms of the Online Safety Bill. On one hand, privacy campaigners think it’s important for people to have that assurance – that data is private between people. On the other, some believe that end-to-end encryption enables things like the sharing of child sexual abuse imagery.
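For anyone who wants a more concrete picture of the idea, here is a minimal sketch of public-key encryption between two people, written in Python using the PyNaCl library. It is purely illustrative: the names and the message are made up, and real messaging apps such as WhatsApp and Signal are built on the far more sophisticated Signal Protocol rather than anything this simple.

```python
# A minimal sketch of the idea behind end-to-end encryption, using PyNaCl.
# Illustrative only: real messaging apps layer key agreement, forward
# secrecy and much more on top of this basic idea.
from nacl.public import PrivateKey, Box

# Each person generates a key pair; the private key never leaves their device.
alice_private = PrivateKey.generate()  # hypothetical sender
bob_private = PrivateKey.generate()    # hypothetical recipient

# The sender encrypts with their private key and the recipient's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"See you at the school gates at 3pm")

# Whoever relays the ciphertext (the messaging service, for example) cannot
# read it. Only the recipient, holding their private key, can decrypt it.
receiving_box = Box(bob_private, alice_private.public_key)
print(receiving_box.decrypt(ciphertext))  # b'See you at the school gates at 3pm'
```

In other words, the service carrying the message only ever sees scrambled data – which is exactly the assurance privacy campaigners want to preserve, and the property the open letter argues the bill puts at risk.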

So what is happening with the bill now?

It’s currently going through the committee stage, which means it’s being read through line by line, and suggested amendments are being analysed and voted on.

Can we expect more changes and additions?

It’s very likely, so make sure you keep listening to Safeguarding Soundbites to stay in the loop!

Moving on now to other news, the government’s new emergency alert system will be tested this Sunday, 23rd April, so depending on when you’re listening to this, it may have already happened!

But as we’re recording, it hasn’t!

True! At 3pm, phones across the UK will receive a test alert, consisting of an alert message on your home screen, an alarm sound and vibration for up to 10 seconds. The emergency alert system will be used to send out alerts about life-endangering events, such as floods and extreme weather. To find out more about the test and the emergency alert system, visit our website at ineqe.com and head to the Online Safety section.

In England, new government guidance may mean that schools have to tell parents if pupils question their gender identity, use a new name or wear a uniform designed for a gender different from the one they were assigned at birth. The guidance will also allow teachers to refuse to use a pupil’s preferred pronouns, and single-sex schools will be able to refuse entry to transgender pupils.

Although the guidance includes an exception for cases where sharing gender identity information could cause harm at home, LGBTQ+ charity Stonewall are concerned that many teachers and schools might not have all the information about a child’s home environment.

Here at Ineqe, we’ve recently been receiving reports of children as young as eight years old vaping. Once seen mostly as a method of quitting cigarettes, vaping has become a popular habit in its own right.

And as the trend continues to rise amongst young people, our Online Safety Experts took a look at youth vaping, including what it is, why young people vape and some top tips if you’re worried about a young person in your care vaping. Find that on our website, ineqe.com.

A new survey has shown that more than 90% of teachers say safeguarding referrals have risen. The survey of teachers and senior leaders in the UK was carried out by the NSPCC and teaching union NASUWT. The results also revealed that teachers have seen an increase of 87% in neglect referrals and 84% in emotional abuse referrals.

The Ofcom Media Use and Attitudes report was released last week. The annual report looks into how children and young people aged 3-17 are using and understanding media, such as TV, gaming platforms and the internet. It also looks at how parents monitor, and feel about, their child’s media use. Amongst many other things, this year’s report revealed how children and young people are experiencing the online world, both positively and negatively. For example, over 80% of both parents and children aged 12-17 feel going online is helpful for schoolwork and homework. But 71% of 5- to 15-year-olds have seen hateful content online.

You can read more about that and other key findings in our article on 2023’s Ofcom Media Use and Attitudes report. Find that on our website!

And that’s everything from us this week so thank you for listening!

You can follow us on social media by searching for ‘Ineqe Safeguarding Group’.
If you want to find out more about any of the safety settings we have spoken about today, make sure you head over to the Safety Centre at OurSafetyCentre.com.

Join us next time to find out more about how we can all help keep children and young people safer online.

Bye!
