NEWS | 14 May 2021

Clarity needed for online harms bill


UK – A UK online harms bill needs clear guidelines on what constitutes harmful online material if the law is to be successful, BCS, the Chartered Institute for IT, has said.


BCS also said that more work was needed to equip everyone with the skills and opportunity to use the internet safely, to increase online access, and to strike a balance between limiting online harm and maintaining freedom of speech.

The government included an Online Safety Bill in the Queen’s Speech on Wednesday, which will seek to provide better regulation of social media and other online services.

Among the proposals are powers for Ofcom to fine companies up to £18m or 10% of global annual turnover, whichever is higher, and to block access to websites that fail in their duty of care to users.

The law seeks to restrict illegal and harmful content online, while also strengthening people’s rights to freedom of expression online.

Dr Bill Mitchell, director of policy at BCS, said: “The proposed bill will be challenging in practice unless there is robust and objective guidance for social media platforms on what legal online content counts as harmful and must be removed.

“That must be informed by a comprehensive public debate on how we balance the need to limit online harm and at the same time nurture freedom of speech and the freedom to disagree in a civilised manner, which underpins a democratic society.”

Mitchell said BCS fully supported the government’s focus on tackling child abuse, and racist and misogynistic abuse online, and added that “there are people from a wide range of backgrounds who need more help with online safety, and indeed access to the benefits of the internet”.

Professor Andy Phippen, professor of IT ethics and digital rights at Bournemouth University and a member of the BCS Law Specialist Group Committee, said: “We can’t make kids safe online, but we can make them informed about risks and help them mitigate them. Industry is responsible for providing them with the tools to help mitigate risk, but can’t solve the problem on its own.

“Education, civil society and many other sectors are key not only to understanding the nature of these problems, but also to delivering better digital education and guidance and to building functional resilience to online harms.”
