Last Updated on 5th April 2024


18th August 2023


Our online safety experts have received reports from our Safer School partners about a peer support app that could be reinforcing harmful behaviour. Vent markets itself towards children and young people as a platform where they can express themselves, “chill out” and have their mood “lifted”. We have reviewed and tested this app and found that it features unhealthy and potentially dangerous behaviours, some of which are age-inappropriate or illegal.


We’ve produced this brief for parents, carers and safeguarding professionals to learn about what Vent is, the risks it poses to those in their care, and how they can respond.

Here’s what you need to know…

What is Vent?

  • Vent is an online platform designed to be a “social diary” for users to share their feelings with one another. It was created by TalkLife Limited, another peer support network for mental health.
  • The app’s intention is to create positive, supportive, and understanding communities for people all over the world.
  • It operates as a ‘constant feed’ of users’ posts, drawn from the general public, special interest groups, one-to-one chats, or a private diary.
  • The app has been criticised for creating an echo chamber of negative emotions and comments, and for enabling harmful peer-to-peer ‘support’.
  • There are very few effective safety settings on Vent, and all moderation seems to be based on users reporting inappropriate posts.
  • Vent is available for mobile devices on the App Store and the Google Play store. A monthly paid subscription that allows access to all features is also available.

Age Rating

Vent states that users can only post to the app if they are ‘aged 13 years or older’. However, other ratings suggest 16+ or 17+ age limits, as the app may include “suggestive themes” such as profanity/crude humour, mild sexual content, nudity, and drug use references – content that is not suitable for Vent’s suggested age rating.

Our testers also noted that there is ineffective age verification on this platform. Users just need an email and a password to create an account and are only asked to tick a box that says, “I am over 16 years of age.”

What are the key functions?

  • Hidden features – Users can choose to hide topics that may be triggering for them, such as ‘Sexual’, ‘Violence’, and ‘Self-Harm’.
  • User feed – Users see text posts from other users in either the public ‘General’ section or the special interest ‘Groups’ section.
  • Choose Your Mood – Users must select how they are feeling from an ‘emotion pack’ featuring various emotions from ‘Cynical’ to ‘Killer’.
  • User posts – All personal posts are text-based, and can be posted to Public, Private (My Diary), or Mutuals.
  • Comments – On all public or group posts, users are able to comment and react to what others have posted.
  • Wellness Centre – Users can track their moods throughout each day over the week to see their progress and can take ‘self-directed’ learning modules that seek to explain complex emotions and feelings.

‘I Need Help’

Within the user profile section, users can find an ‘I Need Help’ option. This is meant to provide anyone who is “having a really difficult time” with anonymous support from a trained provider. Vent includes instructions for a 24/7 text line it calls ‘Vent Crisis Messenger’. After testing the app, we discovered this is actually connected to the Shout helpline.

Our online safety experts reached out to Shout with concerns about the use of their helpline in the Vent app. Shout’s spokesperson then confirmed that Shout does not have any relationship with Vent, and that the signposting had been done without Shout’s knowledge or consent.


What are the risks?

  • There is an abundance of inappropriate and explicit content, including harmful or triggering topics such as sexual fetishes, eating disorders, and self-harm methods. These appeared on the app even after our testers chose to block these topics.
  • There is a large presence of negative and unhealthy language within user posts. Groups include ‘Eating Disorders’, ‘Depression’, and ‘Abuse’, and user posts carry similar tags.
  • As these open forums allow users to ‘vent’ without specific moderation, others may begin to feed into the more negative posts rather than try to support the user or report them.
  • Peer support and self-help apps have become popular due to limitations on in-person help. However, the absence of professional advice could mean a young person develops unhealthy coping mechanisms.
  • A young person may begin to rely on, or give, inaccurate or harmful advice that fits with their own views and beliefs. This may inspire or encourage self-destructive behaviours.
  • Some of the posts contain graphic or explicit details of harmful behaviours and habits, which could lead to irreversible damage or medical emergencies if a young person attempts to harm themselves.
  • Young people might use these apps to try and ‘fix’ problems that need professional help because they are embarrassed, confused, scared, or feel misunderstood.
  • If a child is vulnerable, they could be more easily swayed by those who ‘understand them’, introducing harms such as bullying or grooming by other users who wish to exploit their emotional state.

Our Advice/Top Tips

If you are worried that a child or young person in your care may be using Vent or a similar website/app, don’t panic. Our online safety experts have curated the following advice to help you support those in your care:

  • Stay calm and talk. To encourage healthy discussion, try not to panic if they name the website or app. Be inquisitive, listen to what they say and remind them that sending anonymous messages or talking to strangers online can put them at risk of exposure to harmful content.
  • Teach online safety. Talk to the young person in your care about staying safe online and how to protect themselves, especially when encountering new or ‘trendy’ platforms that their peers may be using.
  • Discuss reporting/blocking. Ask them what they would do if another person made them feel uncomfortable or worried, or if they see something they don’t like, and work on a plan of action together. This should include functions like blocking and reporting.
  • Talk about consent. It’s important that children and young people feel empowered when navigating relationships, both online and offline. Use our helpful resource to discuss the importance of consent with them!
  • Remind them of privacy. Encourage them to make more informed online decisions by talking through the importance of privacy. Mention that they should not be sharing personal details with others online – even friends!
  • Tell them you care. If a child or young person in your care is struggling with something, remind them that you are always there for them. Remember that just trying to engage with them about their day could make all the difference!
  • Encourage other supports. If they feel they need someone to talk to, discuss different forms of support they can access, such as talking with a Trusted Adult or seeking professional help. Remind them that they can always use supports like Childline for advice.


Who are your Trusted Adults?

The Trusted Adult video explains who young people might speak to and includes examples of trusted adults, charities and organisations.

Discussing Online Life With Your Child

Use our video for guidance and advice around constructing conversations about the online world with the children in your care.
