Why we need to rethink the human factor
For twelve days in November, Cyber Work will be releasing a new episode every single day. In these dozen episodes, we'll discuss career strategies, hiring best practices, team development, security awareness essentials, the importance of storytelling in cybersecurity, and answer some questions from real cybersecurity professionals and newcomers.
– Get your FREE cybersecurity training resources: https://www.infosecinstitute.com/free
– View Cyber Work Podcast transcripts and additional episodes: https://www.infosecinstitute.com/podcast
Transcript
[00:00:00] CS: Hello and welcome to the Cyber Work with Infosec podcast. For 12 days in November, Cyber Work is premiering a new episode every single day. In these dozen episodes, we'll discuss hiring best practices, career strategies and team development. In today's episode, you'll learn about changing the culture of security awareness. Back on October 5th, Cyber Work introduced you to Bruce Hallas, author, speaker and host of the Rethinking the Human Factor podcast, to talk about his security awareness journey and his strategies. In today's episode, taken from Infosec's Inspire Virtual Conference, Bruce joins host Kristin Zurovitch to speak about the ways that companies can move their security awareness strategies from a have-to mindset, as in, "I have to remember to do this, because IT will yell at me if I don't," to a want-to mindset, in which security becomes not just a check mark on a to-do list, but something that everyone in your company takes personal ownership of even after the security training modules have been finished. We hope you enjoy this 30-minute conversation between Bruce and Kristin.
If you want to learn cybersecurity or move up the ladder in your career, we're giving all Cyber Work listeners a free month of access to hundreds of courses and hands-on cyber ranges with Infosec Skills. Infosec Skills is aligned to the work roles and knowledge and skill statements in the NICE Workforce Framework and can help you at any stage of your career. Be sure to use the code cyberwork when signing up. More details can be found in the episode description below. Catch new episodes of Cyber Work every Monday at 1 p.m. Central Time on our YouTube channel, or on audio wherever you like to get your podcasts.
And now let's start the show.
[00:01:43] KZ: Let's kick things off. I think maybe a good way would be to talk a little bit about the security journey that brought you to where you are today and a little bit about what prompted you to kick off the research you did about Rethinking the Human Factor.
[00:01:58] BH: Okay. So very briefly, on joining the industry, I came in from quite a different background to the majority of people. I originally trained in law, finance and marketing, and after about 10 years I came across an opportunity which led to me joining the industry. And I've been here for over 20 years now, so I'd like to think I'm getting something right, or maybe people are just too thick-skinned. Anyway, an early lesson for me, the thing that really helped me in working with both my own board of directors at the time and with our clients, was that because of my legal training, my marketing training and my finance training, I was able to have conversations with the finance director, the marketing director and the company lawyer or advocate, and relate to them why information security was something they should be interested in. And this was a really powerful tool for all the IT managers I was speaking with day to day, because it was a language that they hadn't been trained in.
And so I would work with them: we would understand what it meant from an IT perspective, but I would help translate that into meaningful benefits for all those different stakeholders within the business, and that helped bring the business on board. The majority of my work for the first 14 years was around governance, risk and compliance. So I would help boards understand the risk to their key performance indicators, and the ones I was particularly focused on were cash flow and profitability. We would then work with the board and all those different stakeholders, and they would say, "We're happy with that. We're not happy with that." And that would then give the people at an operational level who were responsible for controls something to move towards. And so they would design technical controls, and they would design policies, processes and procedures.
And that was exactly the point where I started to realize the importance of the human aspect here, because we would write all these policies. The board would say, "This is our expectation," and we'd have an employee user policy or supplier policies, whatever. Some of them we could enforce technically, but actually for a fair amount of it, we were really relying on people to do as we asked them. And regularly we would see that people wouldn't necessarily do as we asked them to do, which is very frustrating, because then the finance director in me was thinking, "Hold on. We've invested all this in having our own teams, bringing in consultants, maybe outsourcing elements of security to various vendors, and yet we're still having this recurring problem. We haven't sorted the problem. It still exists."
And as a finance-trained person, I'm thinking, "Well, the return on investment on that isn't particularly sharp." So I started to question the value of what it is we do, and that really focused my mind: for all the work I do in governance, risk and compliance, it really comes down to whether or not people choose to behave in line with policy. And that, seven years ago, kicked me off on a research path, which I called Rethinking the Human Factor, where I rethought pretty much every aspect of what I do when it comes to information security, including the bit that we call education and awareness.
[00:05:51] KZ: Fantastic, and that's actually what drove our conversation for today. So I'm interested. Basically, the conclusion you've come to is that moving from that have-to culture to a want-to culture really is based in influence, right? And I know through your research you actually explored several different behavioral theories tied specifically to influence, things like choice architecture and nudge theory and so forth, but that when applying those kinds of things to security, or more specifically to security education, you can actually significantly change behaviors or reduce risk. And I'd love if you could share a little bit about, first of all, what these theories are. And then secondly, a couple of examples of how these types of theories have been applied in other situations to really shift and influence behaviors to be less of that "I have to do this" versus "I want to do this."
[00:06:51] BH: Okay. Okay. So I think what we're talking about here, when you mention the phrase nudge theory and behavioral economics, is one part of what's come out of my research in terms of how we might want to set ourselves up around behavior. And yeah, there are lots of different techniques. I think the underlying thing that came from my research is that we sort of assume that if we present people with information, they'll process that information in a logical way, and that they'll come through that process and arrive at the same conclusions which drove us to actually write the policy in the first place, or write the training in the first place.
And so that's based on the assumption that we're logical as humans. I think, anecdotally, most of us know that in day-to-day life we come across lots of examples where people aren't particularly logical, and we think, "Why did they do that?" Now, it's interesting, because over the last 30 or 40 years, social science, our understanding of people, of humanity, of how we receive information, how we process it, how we make decisions and choices, let me put it that way, has completely changed. And so the thing that was really interesting in my research in the early days is that much of society, from government, to the private sector, to not-for-profits, has finally got to grips with the understanding that people often make decisions in certain circumstances without thinking. They don't cognitively think something through.
Actually, a lot of it is subconscious or unconscious responses to stimuli. And this is a result of how we've evolved as human beings. It served us really well thousands of years ago, when we were maybe on the savannah, when we were hunter-gatherer types. It served us very, very well, okay? But we've evolved, societies have evolved, and now we're making decisions using the same old tool, but having to make a lot more decisions a lot more quickly. And basically, part of how we've evolved is that the brain will say, "Look, I've got two sides. One side is for thinking. The other side is for doing," just getting on with it.
And the evidence is that we use this lizard side of our brain, the bit which is automatic, a lot more than we like to think. We like to think that we're all pretty good at logic and reasoning. If you ask anybody to rate themselves in terms of their driving ability, most people will rate themselves as above average, which can't be right. If everybody rates themselves above average, that doesn't work, okay? So yeah, one of the things that came across from my research was that people are using this side of the brain a lot more than we would like to think, and they're using it for all manner of things and decisions. And that could be applied to cybersecurity as well.
[00:10:23] KZ: So talk to me a little bit about some examples where we've been able to shift that thinking from that side of the brain into something that is more in line with where we want people to go perhaps with their decisions on a day-to-day basis.
[00:10:35] BH: One of the best examples that I've shared with people is around tax. So here in the UK we have a department called HMRC responsible for tax. And traditionally, once a year, I think it's the same in the United States, there's a certain date in the year by which you have to submit your tax return. And if you don't submit your tax return on time, there's usually a follow-up process of reminders, usually in the form of letters, and the language gets stronger, shall we say, okay? And a lot of that language is about how, by law, you have to submit your data, your information about this, and if you don't, this is the potential penalty, okay? Not too dissimilar to an information security policy. You've been trained on it. If you don't comply with it, here you go. There's a penalty for that.
So in the UK, there was a member of the civil service who was given the responsibility of getting as much of the unpaid tax as possible back to the Treasury. And I think the figures were in the 600-700 million pound mark, so it's a reasonable sum of money. And they had become aware of work which had been done from an academic research perspective in the US, with Kahneman, for example, and [inaudible 00:12:04], and they basically brought them over to the UK, got them to sit down with the Cabinet Office and explain this concept of behavioral economics and behavioral science. A couple of things came out of it, but basically, humans often make decisions using these pretty standard principles and rules. One of them is that we're incredibly influenced by something called social influence. So we're influenced by what we see going on around us, okay?
And by what other people in our peer groups are doing at any one time. A marketer will tell you, for example, that somebody who buys a BMW has probably got other friends or work colleagues who have BMWs. Same with Mercs and all the other brands. But anyway, on the understanding that people tend to be influenced by what other people do, this particular civil servant took the letters and changed small parts of them to emphasize the behaviors of groups that the person receiving the letter might be associated with. And they did it in really simple ways. One was to use the postcode. So in the US, you have a ZIP code, and your ZIP code roughly defines where you are geographically. In the UK we have the postcode.
So what they did is they actually said that X percent of people in your postcode have paid their tax on time. Now, a lot of people go, "What difference does that make?" Well, it made a difference. It actually led to an increase in the number of people paying their tax on time. Then they did the same with letters targeting, I think it was doctors. There was an example where they sent a tax reminder reminding doctors that other doctors had paid on time. Again, it led to an increase. And they did this over a couple of different examples, applying the same principles of social influence to a number of different letters. And then in the fourth letter, what they did is they accumulated all the different elements, put it all into one letter, and the increase in actual payments was really significant. It was around 14% or 15%, if I remember rightly.
Now, 15% doesn't seem like a lot to a lot of people, but 14% to 15% of 700 million is quite a reasonable return on making some changes to the words within your letter. And this isn't the only example. When I found this, I started to find lots more examples of this going on everywhere, of how people are using this insight to deliver better results in terms of influencing behavior. The likes of Facebook and LinkedIn use this as a core part of how they design their platforms and their services. Getting people to interact with their platform is a really important metric, okay? And it's a really important metric because it's something investors will look at and go, "Well, are people interacting with the platform?" Because if they're not interacting with the platform, then, advertising-wise, they've got lower rates.
And generating revenue and profit, that's what it's really about at the end of the day as far as the investors are concerned. There's all this stuff going on. And I looked and went, "Oh! Could we be using that when it comes to trying to influence behavior within the security context?" And you know what? Yeah. I mean, I've experienced that, having done the research and gone out and worked with organizations where we've applied it. And you can see the difference. People who were maybe resistant in the past are more willing to engage, because you're using behavioral insights, designing things with that in mind, increasing the chances that somebody will choose to engage with your content, for example.
[00:16:12] KZ: Absolutely. And I love that, because we're seeing that shift in some of our clients, for example, that are doing just that. Rather than, "Here's all this content, we're going to serve up training modules or phishing simulations every single month, and we're going to track how many people click or don't click," where it's really all about the content, serving up as much content and as much information as they can, it's just that subtle change to, "Hey, did you know that 87% of the individuals in your department have completed their training already? Or have you seen that the accounting department has completed all their training, or the engineering department has completed all their training, but we're seeing that the sales department hasn't yet?" So using that nudge theory to say, "Look what your peers are doing. Do you want to be like your peers?" is a very subtle yet effective way to kind of move that thought process from one side of the brain to the other, which I find fascinating.
[00:17:16] BH: Yeah. And what really makes it interesting is that the intervention, which is what it's all about, is low risk, low cost. If it doesn't produce the results you're expecting, then you can take it off and say it didn't come through. Now, there are unknowns. In security, we're constantly working with unknowns. And we're like, "Okay. Yeah, these are small interventions that you can make, low cost, low risk. If things don't quite work out, you're not going to find yourself with a major problem."
[00:17:52] KZ: Exactly.
[00:17:55] BH: And if you sit down with the board and they say, "Okay, can you show us where this stuff's being applied?" you can go, "Well, it's won two Nobel prizes." And you can ask the question, "Do you think Facebook's successful? Do you think LinkedIn is successful?" You can talk about these big brands, and they'll probably say yes, and you go, "Okay. So their strategy is based around an understanding of how behaviors are formed and influenced. How is that any different from any strategy, including one within information security?"
[00:18:29] KZ: Right. Right. And I think back even to security awareness or education in general. There are a lot of organizations that do their annual training program, and October tends to be a popular month for that, given that it's National Cyber Security Awareness Month, and they do a full-on blitz at that point, and a lot of really, really well-done programs, I will say that. But then it's kind of like, "We need to get this all completed at this time of the year," and throughout the rest of the year it wanes a bit. And that goes to your point of nudging people throughout the year, whether it's posters up in the break room or fun things, not the "you have to do this by X date," but more of that almost friendly competition, or letting people know how others are doing it successfully. Those are the kinds of concepts that you can continue to layer in. Like you said, they're low cost, they're low risk. If one particular poster or event doesn't work out, try another one, right?
[00:19:32] BH: Yeah. Trying to get the buy-in for doing that is really important, because I think you've just highlighted something I see a lot of, which is people getting drawn into October, or whenever their education and awareness month is. And there are arguments for that. But it's interesting with a lot of the work that I do. I've looked into how long we remember things for. If you do it all in October, the likelihood is that people will forget most of what you did in October by December.
[00:20:11] KZ: Correct. Yeah.
[00:20:12] BH: Okay. And so, like you say, you need to keep the pressure up. And most people struggle with that. They say, "Well, I can't get more time. The board won't give me more time. With employees, I get maybe, if I'm lucky, one hour in the year." And this is the great thing about that whole application of nudges. You can use nudges both in terms of how you do education, awareness and comms campaigns on a longer-term basis, but also in how you design security day-to-day, and you can increase the chances that people will buy into it that way. And it's a really powerful tool, which can be deployed throughout the whole year.
[00:20:59] KZ: Yeah. The importance and the impact of repetition and reinforcement can't be overstated.
[00:21:07] BH: Yeah.
[00:21:06] KZ: It's the same way we change our buying behaviors based on the repetition of ads that we see, things that we hear. It's that repetition and reinforcement. And then knowing that there are others in our peer group also using that type of solution or product or whatever.
[00:21:26] BH: And from a marketing perspective, we fully appreciate the fact that if I bought the product once and the experience was great, I'm likely to buy it again. But if you persuade me to buy the product once, to buy into security, and the experience is a nightmare, okay? What's the likelihood that I'm going to want to engage with your content again, give you my time, and make a choice which is in favor of your values more than mine?
[00:21:54] KZ: Right. I'm interested –
[00:21:55] BH: And that’s really important. Really important.
[00:22:01] KZ: My apologies for interrupting. I'm interested. I know we talked prior to this call about the fact that you have your organizational culture, right? And then you have your security culture. And I'm wondering if you can talk a little bit about how security culture relates to the overall organizational culture, and how you marry the two.
[00:22:32] BH: Well, some might say marriage is a very traditional institution. I think it's really interesting. I've just finished writing a couple of chapters for a book, and culture is a very hard area to deal with. We talk about organizational culture and then we talk about security culture. So I guess the thing that I push people to seriously think about is: why would you have one over the other? Why have a security culture? We do talk about the importance of security being everybody's responsibility. As soon as we create two, sort of in our minds, or on an organizational chart, or anything, as soon as we create that, we're sort of saying, "This is the organizational culture. This is the security culture." Some people go, "Well, that's separate." And as soon as you separate it, people feel like it's not their responsibility. That whole sense of responsibility is really important.
I guess the other thing about culture which really interests me is that organizational culture is actually the employees within an organization, the people, and the values that they associate with. But the majority of those values have already been embedded within them before they even started work. So one of the things that came from my research, as soon as I came to realize it, is that, depending on your definition of culture, and for me I have a very specific definition of culture, it's all around values, most of those values are actually developed and embedded from when you're born and through the early years of your life. You then go into a working environment, you take those values in with you, and you normally try to move towards organizations that share your values, like Google, for example. And it's great for Google, because they promise these values. It really pulls in some amazing people, totally amazing people doing totally amazing things, okay? But then it hits a problem, because then people within Google start to say, "Your values aren't the same as mine in some circumstances." And we've seen that. In the US, in the UK and in Europe, we're seeing Google employees leaving the offices, standing outside and saying, "I'm not happy." So I think it's a really interesting thing when you look at security culture. One, do we really want to have a separate culture? Don't we want security to just be the way things get done in the organization? Two, organizational culture is actually generally based upon the values that everybody brings to an organization.
So really, what, for me, is organizational culture? I think there's a crossroads there. And what security is about is trying to find how it's going to act as a linchpin through all of that. And that means understanding what values are prevalent in your group. The reason why this is really important is that it goes back to behavioral economics and understanding how behaviors are formed and influenced. I did an interview with a guy called Dan Ariely, one of the world's best-known behavioral psychologists, and he was pointing out that values are actually a really big driver of decisions. And that's why you've got to look at culture, okay? You've got to understand what values are prevalent within your organization and you've got to thread that through your strategy.
[00:26:21] KZ: Sure. So what you're essentially saying is to talk less about a security culture and more about security as part of the organizational culture.
[00:26:31] BH: Yeah. I mean, what does it deliver? And this is what a good CISO should be doing. They should be having a conversation with the board along the lines of, "This is how it contributes to the success and the resilience of our organization." And part of that is about reinforcing the values that the board has. If you're supporting them and their values, you're much more likely to get buy-in.
[00:26:54] KZ: Absolutely. We're running up on time. I knew our time together would go so quickly. I'm wondering if you could recap for our viewers today: if there were maybe two things you'd like them to take away, or if they're just getting started shifting their organization from this have-to to a want-to mindset, what are two things, or what's one big thing, that they should be thinking about? What's one thing you want them to take away from today?
[00:27:22] BH: I think the one thing, which has completely thrown me on probably the most interesting journey I've had in terms of information security, is recognizing that it was all around influencing behavior and culture. And I asked myself a serious question: what skills, experience and knowledge do I have around what it means to be human? Around how behaviors are actually formed and influenced? And I don't mean just frameworks. I have my SABC framework, but you've got to actually understand the principles of how behaviors are formed and influenced. You can even make your own framework around it. And so for me, the big thing to take away for the whole of our industry is: if we really want to influence behavior and culture, then do we know enough about how behaviors and cultures are formed? Because if we don't, we're really making it difficult for ourselves.
[00:28:24] KZ: So I guess that's maybe an action item for all of our viewers to think about. If you haven't thought about that side of how behaviors are formed or how decisions are made, there's plenty of literature and plenty of resources out there that will help you start moving your program in that direction. So I think it's a great opportunity to start moving beyond just the compliance side, the "I have to" or "my employees have to," to the "want to" side. And like I said, there are a lot of resources available to do that. Bruce, if folks want to reach you or catch up with you after today, where can they reach out to you?
[00:29:06] BH: Right. Okay. Well, the best way to do that is to listen to the podcast. Series four is coming up. We've got some really interesting people on the podcast, and we always mention how you can get in contact with us there. You can drop me an email at bruce.hallas@marmaladebox. Another way, a shortcut, is to pick up the book. It's a really thin book. It's designed to cost less than a Starbucks and your lunch.
[00:29:34] KZ: Fantastic.
[00:29:35] BH: And that's a good way to start engaging with the ideas over that.
[00:29:40] CS: Thanks for checking out Rethinking the Human Factor with Bruce Hallas and Kristin Zurovitch. Join us again tomorrow for our next episode, Security Awareness Behavior and Culture: Ask Us Anything, featuring today's guest, Bruce Hallas, as well as Jinan Budge, principal security and risk analyst at Forrester. Jinan and Bruce were featured in an open-mic Q&A conversation at our Infosec Inspire Conference and answered an array of questions from our virtual audience.
Cyber Work with Infosec is produced weekly by Infosec. The show is for cybersecurity professionals and those who wish to enter the cybersecurity field. New episodes of Cyber Work are released every Monday on our YouTube channel and at all the places where you like to get podcasts. To claim one free month of our Infosec Skills platform, please visit infosecinstitute.com/skills and enter the promo code cyberwork for a free month of security courses, hands-on cyber ranges, skill assessments and certification practice exams to try out.
Thanks for listening, and I'll see you back here tomorrow for more Cyber Work. Bye for now.