
Last Updated on 26th September 2023


As we get further into 2022, many of today’s top social media apps and platforms have started rolling out new features with the intention of protecting their users. Instagram’s new features mainly focus on screen time limits, account security, and content control – areas that its parent company Meta was accused of neglecting last year.

Instagram is growing in popularity with children and young people. This has led to a rise in children under 13 lying about their age to create a profile. Our online safety experts have looked at all of Instagram’s new features to help parents, carers, school staff, and safeguarding professionals make young people safer on the platform.

As some of these features are currently in testing or development, we will continue to monitor their progress.

1. ‘Take a Break’ Feature

What is it?

This feature was launched in December 2021 after months of testing. Users can choose to enable a notification system that interrupts their in-app activity with a ‘Take a Break’ message after a set period: 10, 20, or 30 minutes. The message includes a list of suggested alternative activities, such as ‘Go for a walk’, ‘Listen to some music’, or ‘Write down what you’re thinking’. Testing showed that 90% of users kept this feature enabled.

Areas of Concern

  • Settings like this may encourage young people to rely on technology, rather than their own judgement, to manage their digital wellbeing.
  • It does not educate users about the impact of excessive screen time.
  • This feature is easy to turn off, skip over, or ignore.

2. Sensitive Content Control

What is it?

This setting allows users to limit and control the type of content they are exposed to on their Instagram ‘Explore’ page. There are three options to choose from:

  • Allow: Users may see more photos and videos that could be inappropriate and/or offensive for younger audiences. This option is not available to users under the age of 18.
  • Limit: Users may see some photos and videos that could be inappropriate and/or offensive for younger audiences. This option is the default setting for all users.
  • Limit Even More: Users may see fewer photos and videos that could be inappropriate and/or offensive for younger audiences.

Remember: Instagram’s minimum age requirement is 13. Some children may lie about their age and, if they say they are over 18, could be exposed to sensitive content. Talk to the children in your care about age restrictions and why it’s important to follow them.

Areas of Concern

  • This type of filter is never 100% foolproof.
  • Young people may try to ‘break’ a search term by adding different letters (such as å, ë, and ô) as well as symbols like $ or @ to show results that might otherwise be blocked (e.g. $åf£).

Potential Risks

  • Children and young people may still be exposed to inappropriate, upsetting, or harmful content, even with this filter enabled.

3. Activity Dashboard

What is it?

This feature creates ‘one place to manage all activity’ on the app. The dashboard provides an overview of an individual’s Instagram account activity. It includes time spent on the app, URLs visited, and search history.

Users can also download a copy of all personal information they have shared with Instagram since their account was created. If they would like to delete or archive their posts, they are able to do this in bulk through a checklist on the dashboard.

Potential Risks

  • In our conversations with young people, they’ve told us they are more likely to delete or archive a post if it doesn’t get ‘enough likes’, sometimes reposting at a time when they know more of their peers are online. This reinforcement of ‘like culture’ may negatively impact anyone’s mental health and wellbeing, but has been shown to be especially harmful to younger users.
  • The pressure of ‘curating’ a feed may lead to obsessive behaviour and influence the choices a young person makes (e.g. not doing or enjoying something because it’s not ‘Instagrammable’).
  • Posts with potentially harmful or upsetting content can be archived or deleted after being posted, preventing them from being reported by other users.

4. Security Checkup

What is it?

The aim of this security checkup is to guide individuals through securing their account. It uses step-by-step instructions to:

  • Check login activity
  • Review profile information
  • Confirm accounts that share a login
  • Update account recovery contact information (email addresses and phone numbers)
  • Set up two-factor authentication for extra security

Areas of Concern

  • Young people may choose to skip this process as it involves multiple steps, or continually put it off ‘until they have more time to do it’.
  • Settings like this can create an overreliance on technology for account security and online safety.
  • None of these steps fully protect someone from negative effects if their account is compromised.

5. Friend’s Help

Feature Status: In development and internal testing

What do we know so far?

This is a new form of account recovery currently in testing. The idea is that a user can choose two friends to confirm their identity after entering an old password they used on their account. Their friends will not have access to their account. Instead, chosen friends will be prompted by an alert on Instagram to confirm the person’s identity within 24 hours. Instagram recommends contacting the chosen friends to tell them it’s a genuine request. If they do not respond, the user will be able to try one more time with two different friends.

As this feature is still in development, our online safety experts were not able to test it.

Areas of Concern

  • Young people could be purposefully locked out of their account through a prank or joke, with their chosen friends denying verification ‘for fun’.
  • Other teenagers could potentially take advantage of this feature to gain access to someone else’s account.

6. Account Status

Feature Status: In testing*

What do we know so far?

This new section will allow individuals to check the status of reports they have made on the app, as well as see any reports made against their own account or restrictions placed on it. They will also be notified here if their account is at risk of being disabled. If a young person feels there has been a mistake, they are able to ‘Request a Review’ to have the action reconsidered by Instagram moderators.

Areas of Concern

  • With 1.22 billion people using Instagram each month, it is easy for things to be missed. Some users may find their content restricted even while other accounts post similar content without consequence. This can be frustrating, especially for young people, and might negatively affect their overall mood or mental health. It may also cause them to move to less moderated platforms and websites such as Reddit or 4chan, which could expose them to inappropriate and harmful content.

*As this feature is still in testing, our current information is subject to change.

7. Parental Controls

Feature Status: In development*

What do we know so far?

While there is only limited information on Instagram’s plans for this feature, it is clear the platform intends to introduce parental controls. The hope is to eventually allow parents and carers to monitor how much time their young person spends on Instagram and to set time limits. Instagram also plans to release an Educational Hub, which will give parents and carers tips and tutorials about children’s social media use.

What is a ‘Finsta’? A shortened term for ‘Fake Instagram’: a secondary ‘secret’ Instagram account created in order to post more ‘real and unfiltered’ content.

Areas of Concern

  • ‘Finstas’ are currently popular among young people, and added involvement from parents and carers may encourage a young person to create one. This means that a parent or carer may not get an accurate understanding of how their young person is actually using Instagram.
  • A young person may be frustrated by the amount of time their parent/carer allows them to be on the platform, especially if their friends are not experiencing similar restrictions. This may cause a strain on family relationships, as well as negatively impacting the young person’s mood or mental health.

*As this feature is still in development, our current information is very likely to change.

8. Addressing Harmful Content

Feature Status: In development*

What do we know so far?

This is less a single feature and more a wider change to how Instagram works. By changing how content appears on individual accounts, it aims to take stronger action against posts containing bullying or hate speech. Essentially, if a user reports a post, they will be less likely to see similar posts at the top of their feed.

Areas of Concern

  • This will only work for individual posts, meaning accounts that regularly share harmful or upsetting content are not addressed.
  • This feature relies heavily on a user reporting content they find upsetting or harmful themselves, which young people are less likely to do.

Potential Risks

  • Similar content will still be visible, even if it is at a lower place on someone’s feed. This means young people can still be exposed to content they have reported.

*As this feature is still in development, our current information is very likely to change.

9. Story Likes

Feature Status: Gradual Release*

What do we know so far?

This feature will allow individuals to ‘like’ an Instagram story. Previously, reacting to a story would automatically send a direct message (DM). Now users will be able to react to a story quickly and without ‘clogging up’ someone’s DMs. These likes will not be public. An individual will have to consult the ‘view sheet’ to see who has liked their story and how many likes it has.

As this is a gradual release, our online safety experts were not able to access this feature. It is unclear how it will change the current Stories format.

Areas of Concern

  • Interactions that would previously have been easy to spot or avoid might now be missed if a user does not check their view sheet. For example, someone inappropriately reacting to a young person’s story can now simply view and like it without directly interacting with them, leaving less chance for the young person to block and report someone who might be viewing their content inappropriately.
  • ‘Liking’ is a much more passive action than reacting through a DM. This may introduce added pressure for a young person to curate the ‘perfect’ story to get more likes.

*As this feature is still being rolled out, our current information is very likely to change.


Our Top Tips

To help you support the young person in your care with Instagram’s new features and navigating the platform, our online safety experts have created the following list of tips and guidance.

  • Set up and explore your own Instagram profile to understand the appeal it has for your young person. Even if you do not follow their account, you’ll have an idea of what the platform and these features are like, as well as the risks your young person might experience.
  • Discuss the importance of safety and privacy settings. It is important young people know how to enable these settings, but even more important that they understand why those options are there. You can use our Safety Centre to walk and talk them through many of the features currently available on Instagram.
  • Encourage the young person in your care to set their account to ‘private’. This means they will have to manually accept or reject follow requests, and their posts, stories, and reels will only be visible to approved followers. Explain why it’s important to only accept requests from people they know in real life.
  • If a child has parental permission to use Instagram despite being under the age of 13, it is important that they correct their age once they turn 13. This is to ensure the app knows their true age and processes their data appropriately.
  • Remind them that what they see on Instagram is only a small glimpse of ‘real life’. People often exaggerate how good or bad things are to get a like or a comment. You can check out our resource on Sad Fishing to help explain.
  • Ask the young person in your care who they would talk to if they saw something on Instagram that made them uncomfortable. You can use our Trusted Adult resource to help.
