
NEWS | 12 March 2019

Public should think like hackers


UK – Everyone needs to think more like a hacker to protect themselves online, according to Carl Miller, author of the new book Death of the Gods, speaking at Impact 2019.

Miller, research director at thinktank Demos’ Centre for the Analysis of Social Media (CASM), said the public should be thinking about opsec – operational security, the process of determining whether you could be undermined by your online activities – as part of their everyday lives.

“Everyone needs to think more like hackers. Spend a Sunday doing hostile reconnaissance on yourself. It sounds so technical, but as an idea, it has to become much more of an everyday habit, like making sure our houses are secure or not walking down a dimly lit street,” said Miller.

Adopting this way of thinking is essential as power is now shifting in multiple ways, from crime and warfare to politics and media – and cybersecurity is just one area where the law hasn’t caught up with technology.

Miller said: “The Crime Survey for England and Wales started asking about cybercrime in 2016, and the numbers were astonishing. It estimated that as much crime was happening through the internet as all other crimes put together. You are more likely to receive a virus than be a victim of all kinds of violent crime altogether.”

During a panel session, Miller, the session’s keynote speaker, and cybercrime and information experts discussed the implications of power shifting due to the impact of technology.

Some of the key takeaways from the session were:

Big problems require cross-border partnerships
In the process of researching the book, Miller accompanied police to a cybercrime raid. The only surprise about this experience, he said, was that the perpetrator was caught.

Miller said: “I don’t think it was because the police didn’t have skills – it came down to geography. Time and time again, they found the perpetrators were in one country, the victims were in another and the police cybercrime teams across countries couldn’t reach across borders to bring this to a British courtroom.”

Governments and regulators are lagging far behind…
Chris Monteiro, cybercrime researcher, compared today’s data-driven economy to fossil fuels. He said: “It’s a very quick, efficient way of kick-starting the economy, and it’s having all of these long-term effects we didn’t previously consider. The back of the system has been siphoned off for years now by crooks, destroying lives and economies in some cases. I think there’s a complete lack of awareness from governments and regulators that this is happening. The emphasis on policing – ‘We’ve caught the bad guys, it’s OK’, is completely wrong.”

Our moral value systems need to adapt to keep pace with technology, Miller added: “Tech and the power that flows through it is disappearing into the distance and all the human norms and institutions that shape our lives are simply not able to catch up. If we cannot find a way of making our moral architecture as fast-flowing and rapid as our technological one, the norms we have been used to will rapidly disappear.”

…But regulation may restrict freedom of expression
Tighter regulation means governments have the potential to take more control over expression and freedom of speech, according to Jodie Ginsberg, chief executive of the Index on Censorship. This is particularly problematic when deliberating the question of what constitutes harm, she said. “We have to be careful what we wish for – the narrative is that the internet is a pit of porn and abuse, and that conveniently allows governments to have control of what can and can’t be said. A whole swathe of speech would become outlawed [if regulators are given more power]. When we talk about something like harm, there are very few concrete definitions. It could mean harm as in someone being offended.”

People tend to fall into the trap of thinking problematic content can be easily identified, she added. “We think that regulation would only get rid of the nasty stuff, but it’s almost impossible to define it in a way without expression of opinion suddenly very easily falling under the definition of what’s harmful. That troubles me.”
