It’s no secret that one of the chief concerns in cybersecurity today is the ‘insider threat’, which sits on the ‘human’ side of the discipline. The argument is a valid one: an organization can invest heavily in sophisticated security technology, but if its members of staff are not educated in cybersecurity, it will always be vulnerable to attack.
There’s no doubt that the spate of high-profile data breaches we saw last year has really opened people’s eyes to the issue of security, but there’s still a worrying, yet unsurprising, lack of knowledge about what safe security practice is.
Let’s look at the TalkTalk breach as a prime example. Thanks to its widespread media coverage, the hack sent the country into something of a panic: friends and relatives were on the phone to anyone they knew who worked in, or had any knowledge of, IT security, asking what a cyber-breach is, what it means for their data, what happens next, and so on. The panic and general surprise that followed really highlighted how little the average internet user knows about cybersecurity.
Fast forward to today and we appear to be in much the same position. No doubt people are more aware of what a cyber-attack is, and they are certainly more conscious of the fact that their data can be maliciously targeted.
Yet we are still seeing the same mistakes being made. You only have to look at the first and second entries on SplashData’s “worst passwords of 2015” list (123456 and password, respectively – again!) to see that people either aren’t getting the message about how to practice safe security or, more likely, simply don’t understand it.
Dr Jessica Barker is an independent consultant who specializes in the ‘human’ side of cybersecurity and uses her expertise to advise institutions and individuals on the steps they can take to keep their information safe in the context of our changing relationship with the internet. With a background in sociology and psychology, Jessica is fascinated with exploring the ways in which messages of cybersecurity, if conveyed correctly, can not only raise awareness but actually change behavior.
This week, Dr Barker was a guest speaker at the Cyber Security Challenge’s Cyber Insights Camp at the University of Greenwich, a three-day camp aimed at strengthening the skills of students and recent graduates from computing and other appropriate disciplines, and at promoting career opportunities with the businesses involved and in the wider cybersecurity profession.
Before she presented her talk, ‘Cybersecurity – Myths and Monsters’, I managed to share a few words with her on the current state of play of the ‘human’ side of cybersecurity and how it is affecting the industry.
“In the last year it’s been interesting that there’s been a big rise in awareness, so cybersecurity is in the press a lot more, people are a lot more aware of it, but behaviors haven’t really changed,” she said.
“Everybody thinks of cybersecurity as being very technical and everybody looks at trying to fix it with kit, firewalls and spending more money on products, whereas really the biggest weakness is the human factor. For example, we’ve seen a huge rise in spear phishing emails leading to CEO fraud, so it just goes to show that you can have all of the technical defenses in the world but if you’ve got members of staff who click on phishing links then that’s the way an attacker is going to get into your organizations.”
I entirely agree with Jessica here: hackers know that a simple phishing email can lead them to the elusive ‘big payout’, so they will continue to target members of staff who, in many cases, still have no real grasp of how phishing works or what signs to look out for. After all, the average user may see no danger at all in clicking a suspicious link in an email.
For me, this points to one thing: education. Staff need to be taught about security and empowered with the confidence to raise a concern with their superiors if they receive an email or piece of spam they suspect is malicious. At the same time, it is important not to bombard staff with technical language they do not understand, and instead to find ways of conveying the message that let the average internet user see and understand the full picture.
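To make those ‘signs to look out for’ a little more concrete, here is a minimal, purely illustrative sketch of the kind of heuristic an awareness session might walk staff through: a link whose visible text shows one address while the underlying href points somewhere else, or a domain that merely resembles a trusted one. The trusted-domain list and the sample snippet are made-up assumptions for the example, not a recommendation from anyone quoted in this piece.

```python
import re
from urllib.parse import urlparse

# Hypothetical list of domains the organization actually uses; anything that
# merely *resembles* one of these is worth a second look.
TRUSTED_DOMAINS = {"example-corp.com", "example-bank.co.uk"}

# Very rough pattern for HTML anchors: capture the real href and the visible text.
ANCHOR_RE = re.compile(r'<a[^>]+href="([^"]+)"[^>]*>(.*?)</a>', re.IGNORECASE | re.DOTALL)

def suspicious_links(html_body: str) -> list:
    """Flag links that display one address but point to another, or use look-alike domains."""
    findings = []
    for href, text in ANCHOR_RE.findall(html_body):
        shown = text.strip()
        actual_host = urlparse(href).netloc.lower()
        # Sign 1: the visible text is a URL, but the link actually goes elsewhere.
        if shown.startswith("http") and urlparse(shown).netloc.lower() != actual_host:
            findings.append(f"Displays {shown!r} but links to {actual_host!r}")
        # Sign 2: a look-alike domain, e.g. 'example-corp-secure.com' imitating 'example-corp.com'.
        elif actual_host and actual_host not in TRUSTED_DOMAINS and any(
            trusted.split(".")[0] in actual_host for trusted in TRUSTED_DOMAINS
        ):
            findings.append(f"Look-alike domain: {actual_host!r}")
    return findings

print(suspicious_links(
    '<a href="http://example-corp-secure.com/login">http://example-corp.com/login</a>'
))
```

No filter like this catches everything, of course; the point is that the underlying ‘signs’ can be explained to staff in plain language, without the jargon.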
In an email to Infosecurity, Dr Adrian Davis, Managing Director EMEA at (ISC)2, expressed a similar view, arguing that educating staff about cybersecurity is a far more valuable defensive strategy than simply pouring money into IT infrastructure.
“You can buy lots of security technology, but if you don’t have the staff to implement or use the technology – or staff who can understand the value of that technology – then it could turn out to be a waste of money,” he said.
“Investing in staff at all levels, enabling them to be digitally-literate and security aware and recruiting people with these skills is just as valuable as buying technology and making direct information security investments. Cybersecurity is a people issue and having good staff, well trained, is vital.”
Moving on to Dr Barker’s presentation, her interest in the ‘user’ (the human) and their insecure behavior on the internet was a central theme. She explored some of her own research into information security and how awareness affects behavior online, drawing some fascinating comparisons between well-known fictional and mythical characters (Dracula, Frankenstein, Gollum) and cybercrime.
Dr Barker explained that one of the most effective (and straightforward) techniques for protecting data is the use of two-factor authentication, although her recent survey of 1000 people revealed that 80% did not know what it was and 70% were not using it.
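As a rough illustration of why two-factor authentication is both effective and straightforward, the sketch below generates a time-based one-time password (TOTP, the scheme standardized in RFC 6238 and used by most authenticator apps) with nothing but Python’s standard library. The base32 secret shown is a made-up example; in practice the secret is provisioned during enrolment and lives in an authenticator app or hardware token rather than hand-rolled code.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(shared_secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238): HMAC-SHA1 over the current time step."""
    key = base64.b32decode(shared_secret_b32, casefold=True)
    # Number of 30-second intervals since the Unix epoch, packed as an 8-byte counter.
    counter = struct.pack(">Q", int(time.time()) // interval)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): read 4 bytes starting at the offset in the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Made-up example secret; the server holds the same secret and derives the same code.
print(totp("JBSWY3DPEHPK3PXP"))
```

Because the verifying server computes the same six digits from its copy of the secret and the current time, a password stolen in a phishing attack is no longer enough on its own to get into the account.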
She also argued that we are currently facing a ‘hybrid’ of technology and society, with human beings struggling to work with technology that they often don’t understand and aren’t adequately trained to use, which is “causing lots of harm” and leading to data breaches.
Dr Barker added: “Most of the data breaches that the ICO investigates were caused by human error and could have been very easily avoided if there had been training.”
A particularly interesting theory that Jessica touched upon, which she referred to as the ‘Pygmalion’ effect, suggests that as human beings our behavior is influenced by the expectations we are presented with. She elaborated by referencing a piece of research from the 1970s in which a group of students was divided in two, with one half told they were on course to perform poorly and the other told they were on course to perform well.
The results of the study found that all of the students who were told they would fail did indeed fail, and the students who were told they would perform well did. Whilst the ethics of such a study would certainly be questionable today, it is fascinating to see how the way you treat people, the expectations you have of them and the confidence you instill in them actually affect their behavior and performance.
This suggests that by encouraging staff and empowering them with knowledge about cybersecurity, giving them the confidence to spot a potentially malicious act, organizations can mitigate the risks surrounding the ‘insider threat’.