Last Updated on 5th April 2024
Read the script below.
Hello and welcome to Safeguarding Soundbites. If you’re looking for a roundup of this week’s safeguarding news and advice from online safety experts, you’re in the right place! This week, we’ll be catching up with YouTube livestreaming and the latest safeguarding news in the world of social media.
First up, YouTube have released a new feature for their livestreaming function. Called Go Live Together, it allows users to stream live with other creators. It was announced last year but is being rolled out slowly and is currently available on some users’ iOS and Android devices. Livestreaming has become popular over the years, with children and young people enjoying both watching other people stream and streaming themselves. But many parents and carers might question whether livestreaming is safe, with concerns around their child or young person giving away too much information, interacting with strangers through YouTube’s Go Live Together, and the kinds of activities taking place while they’re streaming. Our Online Safety Experts have created a guide that outlines all of these risks and more, plus more information on what livestreaming is, why young people like to stream, and top tips. Find that on our website at ineqe.com.
One potential reason children and young people might be drawn to livestreaming is a desire for connection. Childline have reported that calls to their helpline from children under 11 who are struggling with loneliness have increased by 71 percent in five years. While this is no surprise after several years of pandemic restrictions and isolation, it is still shocking to hear, and has sparked concern about the state of children’s mental wellbeing in the UK. To read more about this, visit Children and Young People Now.
If you were listening last week, you’ll remember we spoke about Twitter tightening their rules around violent content on the platform. Unfortunately, new research from the BBC has found contrary results, with hundreds of accounts that were banned for spreading abuse now allowed back on the platform. Over 1,100 Twitter accounts were reinstated despite having previously been banned for abuse or spreading misinformation. Nearly 100 of those accounts promoted hate and violence, including depictions of rape and misogynistic abuse. And despite Twitter’s claims to have a “zero tolerance” policy towards material that features or promotes child sexual exploitation, the BBC found a number of accounts containing drawings showing just that.
The BBC also found exclusive data showing that harassment and trolling have intensified, alongside a 69% increase in users following misogynistic and abusive profiles. Twitter insiders have spoken to the BBC, saying they can no longer protect users from trolling, state-co-ordinated disinformation and child sexual exploitation.
Snapchat have been accused of not doing enough when it comes to removing underage users from their platform. In preparation for the Online Safety Bill, Ofcom asked both TikTok and Snapchat for information on how many suspected accounts belonging to under-13-year-olds they had removed. Whilst TikTok came back with an average of 180,000 a month, Snapchat reported only 60 per month.
A spokesperson for the company told news agency Reuters that the figures misrepresented the scale of work the company does to keep under-13s off its platform but declined to provide more details on those measures.
But while TikTok may have won that particular round, they’ve come under fire for being too slow in moderating eating disorder and suicide content. More than two dozen organisations, including the NSPCC and the Molly Rose Foundation, have written to the platform, urging it to strengthen its policies after research suggested that the app’s algorithm pushes this type of harmful content directly to teenagers.
The Molly Rose Foundation was set up by the father of Molly Russell, who died by suicide. It was later discovered that she had viewed thousands of images online relating to suicidal ideation and self-harm. If the child or young person in your care has been exposed to any material related to these harmful topics, firstly – stay calm. Remember to approach them with understanding and love, not panic or anger. Open the conversation by asking open questions, rather than questions with a ‘yes’ or ‘no’ answer. For example, ask them how they are feeling. Make sure they know where they can turn to for help, even if that might be someone else rather than you. Research organisations that you can signpost them to or that you can call for help, such as Childline or Beat.
And finally, if you’re listening to this on the day we release, it’s Friday! But are you listening to it while you work from home? If you’re one of the estimated 87% of workers who work from home on a Friday, have you ever kept your child off school whilst doing so? MPs heard this week from the Children’s Commissioner about concerns over the number of children in England who are missing school because ‘mum and dad’ are staying home from the office. The increase in Friday absences since the pandemic has triggered an inquiry by the Education Select Committee.
That’s everything from me this week – thanks for listening and I’ll be back again next week! In the meantime, you can follow us on social media by searching for INEQE Safeguarding Group. Speak to you next time!