NEWS – 1 July 2022

Only one in six young people flag harmful content online


UK – Two-thirds of young people have encountered potentially harmful content online, but only around one in six go on to report it, Ofcom has found.


The findings come as the government’s Online Safety Bill continues to make its way through Parliament. Ofcom will be responsible for enforcing these new laws and has already started regulating UK video-sharing platforms such as TikTok, Snapchat and Twitch.

Ofcom’s Online Experiences Tracker shows that a majority of younger people aged between 13 and 24 (65%) believe the overall benefits of being online outweigh the risks. But around the same proportion – 67% – have encountered potentially harmful content.

Younger people said the most common potential harms they came across online were offensive or ‘bad’ language (28%); misinformation (23%); scams, fraud and phishing (22%); unwelcome friend or follow requests (21%); and trolling (17%).

A significant number of young people (14%) also encountered bullying, abusive behaviour and threats; violent content; and hateful, offensive or discriminatory content targeted at a group or individual based on their specific characteristics.

The research reveals a worrying gap between the 67% of young people who encounter potentially harmful content online and those who flag or report it to the services. Fewer than one in five young people (17%) take action to report potentially harmful content when they see it.

Younger participants say the main reason for not reporting is that they did not see the need to do anything (29%), while one in five (21%) do not think it will make a difference. Over one in ten (12%) say they don’t know what to do, or whom to inform.

User reporting is one important way to ensure more people are protected from harm online. For example, TikTok’s transparency report shows that of the 85.8 million pieces of content removed in the fourth quarter of 2021, nearly 5% were removed as a result of users reporting or flagging content.

In the same period, Instagram reported 43.8 million content removals, of which about 6.6% were removed at users’ requests.

To help galvanise more young internet users to report potentially harmful content, Ofcom has joined forces with social media influencer Lewis Leigh and behavioural psychologist Jo Hemmings to launch a new campaign.

The social media campaign aims to reach young people on the sites and apps they use regularly to highlight the importance of reporting posts they may find harmful.

Hemmings commented: “What is clear from the research is that while a potential harm experienced just once may have little negative impact, when experienced time and time again, these experiences can cause significant damage.

“Worryingly, nearly a third of 13-to-17-year-olds didn’t report potentially harmful content because they didn’t consider it bad enough to do something about. This risks a potentially serious issue going unchallenged.

“That is why I’m working with Ofcom to help encourage people to think about the content they or their children are being exposed to online, and report it when they do, so that the online world can be a safer space for everyone.”

Anna-Sophie Harling, online safety principal at Ofcom, added: “As we prepare to take on our new role as online safety regulator, we’re already working with video sites and apps to make sure they’re taking steps to protect their users from harmful content.

“Our campaign is designed to empower young people to report harmful content when they see it, and we stand ready to hold tech firms to account on how effectively they respond.”
