When Human Vulnerabilities Become Technical Ones
Date published: 02 May 2017

The human and technical elements of cyber security can seem entirely distinct from one another. We often talk about the subject in a one-sided way, pigeonholing the cause of breaches as either a human vulnerability or a technical failing. It is also rare for professionals to bridge the gap: few are specialists in, or equally comfortable traversing, both spheres. We talk about people, process and technology, but we rarely acknowledge the interdependencies.
This approach is, of course, far too simplistic. When it comes to cyber security, any divide between technology and the people who create and use it is blurred. Indeed, the very term has man and machine at its core. The history of the word 'cyber' can be traced from the Ancient Greek kubernao, meaning to steer a ship, through the Latin guberno, meaning to govern. In the 1940s, the mathematician Norbert Wiener used 'cybernetics' to mean "control and communication theory, whether in the machine or in the animal", before William Gibson coined 'cyberspace' in his literary works Burning Chrome and Neuromancer. The evolution of the term therefore reflects the interplay of human and technical elements that defines cyber security today.
The human vulnerabilities in cyber security are often framed in terms of the people who use technology. Users are described as the 'weakest link', a label attributed to poor password management, clicking on malicious links or falling victim to phishing emails. Too often we fail to acknowledge that an excessive security burden is placed on the people using, rather than the companies developing, technology.
Likewise, too often we fail to acknowledge that we are all ‘users’ and that those of us working within the industry are just as susceptible to human frailties, which in turn can cause serious cyber security problems.
Imposter syndrome and human vulnerabilities
Let's look at Imposter Syndrome: the feeling that you do not know as much, or are not as skilled, as people seem to think, no matter how outwardly successful you are. It is a nagging sense that you are a fraud, even with years of experience, training and knowledge under your belt. Imposter Syndrome seems to be quite prevalent among cyber security professionals, so let's consider how these human vulnerabilities can manifest as technical ones.
Imagine Bob takes on a new role, part of which involves managing a firewall. Bob respected his predecessor, Alice, and was always confident in her abilities. However, when Bob reviews the configuration of the firewall, he notices that some of the rules don’t make sense to him and he doesn’t understand why Alice took the approach she did. If you were in Bob’s shoes, what would you do?
He could approach Alice with his questions, but will he be held back by the fear of exposing a lack of understanding to someone he admires? Maybe he worries that she will tell their peers about a basic gap in his knowledge. Or perhaps he is intimidated, concerned that Alice will feel he is questioning her work and doubting her technical ability. So he considers asking other people in the team whether they can explain why the firewall was configured as it was. But will Bob keep quiet, worried that the rest of the team may question his knowledge and even his ability to do his job?
Many people, if they were Bob in this scenario, would assume that they are 'stupid'. They would convince themselves that the firewall must be correctly configured, even if it doesn't seem to be, and stay quiet for fear of being exposed as a 'fraud'. While this kind of response is understandable, at best it wastes an opportunity to learn and develop (if there was a sensible rationale for the way the firewall was configured, Bob may never understand it unless he speaks up) and at worst it puts the organisation at greater risk (if the firewall was in fact poorly configured, it will remain so).
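To make the scenario concrete, here is a hypothetical rule set of the kind Bob might inherit (the addresses, ports and ordering are invented for illustration). Rules are evaluated top-down and the first match wins, so the broad permit in rule 10 means the more specific deny beneath it can never match:

```python
# Hypothetical firewall rule set, invented for illustration.
# Rules are evaluated top-down; the first matching rule wins.
rules = [
    {"seq": 10, "action": "permit", "src": "10.0.0.0/8",  "dst": "any",            "port": "any"},
    {"seq": 20, "action": "deny",   "src": "10.1.2.0/24", "dst": "172.16.0.10/32", "port": "3389"},  # never matches: shadowed by rule 10
    {"seq": 30, "action": "permit", "src": "any",         "dst": "172.16.0.20/32", "port": "443"},
    {"seq": 40, "action": "deny",   "src": "any",         "dst": "any",            "port": "any"},
]
```

Is rule 10 deliberately broad, or does it accidentally shadow the deny in rule 20? That is precisely the sort of question Bob is afraid to ask.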
The culture of the organisation
How you act, and how you are treated, often depends on the culture you are operating within. The culture of your organisation will have a bearing on your behaviour, and the culture of the cyber security profession will influence you too. In turn, of course, behaviour feeds into culture. With this in mind, the responsibility is on all of us to work towards creating a positive and empowering environment, in which it is OK to say "I don't understand" and to ask questions. By recognising that human vulnerabilities exist in all of us, and by creating an environment where dealing with those vulnerabilities is supported, we will help to minimise technical vulnerabilities, too.
This is a long-term solution; it takes time for culture to change. While we work towards positive cultural change, or in environments where that kind of empowerment is not being championed by senior leadership, we need other ways to manage human vulnerabilities.
This is where automation tools such as Nipper Studio can help. A tool like Nipper Studio would provide Bob with an evidence-based, objective report around which he can structure a discussion with his colleagues and senior management. This way, Bob can ask questions without worrying that he is exposing a personal lack of knowledge. Vulnerabilities often come down to a combination of factors relating to people, process and technology; when we combine solutions based on people, process and technology, we stand the best chance of minimising those vulnerabilities before threats are realised.
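To illustrate the kind of objective evidence such a report can surface, here is a minimal sketch of an automated rule-shadowing check over the hypothetical rule set above. It is illustrative only, written in Python for clarity, and is not how Nipper Studio or any particular product works internally:

```python
import ipaddress

def covers(broad: str, narrow: str) -> bool:
    """True if the 'broad' address range contains the 'narrow' one."""
    if broad == "any":
        return True
    if narrow == "any":
        return False
    return ipaddress.ip_network(narrow).subnet_of(ipaddress.ip_network(broad))

def port_covers(broad: str, narrow: str) -> bool:
    """True if the 'broad' port specification contains the 'narrow' one."""
    return broad == "any" or broad == narrow

def find_shadowed(rules):
    """Report rules that can never match because an earlier rule already
    catches all of their traffic (first-match-wins evaluation)."""
    findings = []
    for i, later in enumerate(rules):
        for earlier in rules[:i]:
            if (covers(earlier["src"], later["src"])
                    and covers(earlier["dst"], later["dst"])
                    and port_covers(earlier["port"], later["port"])):
                findings.append(f"Rule {later['seq']} ({later['action']}) is "
                                f"shadowed by rule {earlier['seq']} ({earlier['action']}).")
                break
    return findings

# The rule set from the earlier sketch, repeated so this snippet runs on its own.
rules = [
    {"seq": 10, "action": "permit", "src": "10.0.0.0/8",  "dst": "any",            "port": "any"},
    {"seq": 20, "action": "deny",   "src": "10.1.2.0/24", "dst": "172.16.0.10/32", "port": "3389"},
    {"seq": 30, "action": "permit", "src": "any",         "dst": "172.16.0.20/32", "port": "443"},
    {"seq": 40, "action": "deny",   "src": "any",         "dst": "any",            "port": "any"},
]

for finding in find_shadowed(rules):
    print(finding)  # -> Rule 20 (deny) is shadowed by rule 10 (permit).
```

An objective finding like "rule 20 is shadowed by rule 10" shifts the conversation from "I don't understand your configuration" to "the report says this rule can never match". No one's competence is on trial; the evidence starts the discussion.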