News | 8 April 2019

Government outlines plans to tackle online harms


UK – The government has outlined a new regulatory framework for online safety, proposing that companies should be fined for failing to protect users from online harms.


The plans include a new statutory duty of care that would see companies forced to take more responsibility for people’s safety and address harms caused by content or activity on their services.

Enforcement could include 'substantial fines' issued by an independent regulator, which would also have the power to impose liability on individual senior managers if companies fail in their duty of care towards users.

Harms with a clear legal definition, including child sexual exploitation and terrorist content, are covered, but the scope also extends to less clearly defined harms, such as cyberbullying and trolling, and to underage exposure to legal content.

The regulatory framework would apply to any company that allows people to share or access user-generated content or interact with others online, meaning social media platforms, file hosting sites, public discussion forums, messaging services and search engines would all need to comply.

Other proposals include giving the regulator the power to require annual transparency reports from companies, outlining what harmful content is present on their platforms and how they plan to address it. These reports would then be published online and made publicly available by the regulator.

The government has invited organisations to submit their responses to its online harms white paper as part of a consultation, with a deadline of 1st July 2019.

Jeremy Wright, digital secretary, said: "The era of self-regulation for online companies is over. Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough. Tech can be an incredible force for good and we want the sector to be part of the solution in protecting their users. However those that fail to do this will face tough action."

Paul Bainsfair, director general, IPA, welcomed the proposals. He said: "The IPA has been consistently calling for measures to maximise brand safety on online platforms. Reputable brands and advertising agencies would never want to be associated with harmful content or to unwittingly fund antisocial or terrorist activity. We have a vested interest in working with government to solve the issues identified in this white paper."
