[00:00:00] Chris Sienko: Cyber Work listeners, I have important news before we dive into today’s episode. I want to make sure you all know that we have a lot more than weekly interviews about cybersecurity careers to offer you. You can actually learn cybersecurity for free on our InfoSec skills platform. If you go to infosecinstitute.com/free and create an account, you can start learning right now.
Today on Cyber Work, I’m joined by TEDx speaker, security researcher, host of the podcast MiC Club and all-around expert on security awareness and social engineering, Dr. Erik Huffman. Erik spoke at our 2021 InfoSec Inspire Virtual Conference, and for those of you who, like me, were captivated by his presentation, prepare for another hour of Dr. Huffman’s insights on why we need to teach security awareness from insight rather than fear or punishment, how positive name recognition in an email can short-circuit our common sense, and how to keep your extrovert family members from answering those questions online about your first pet and the street you lived on as a child. All that and more today on Cyber Work.
[00:01:43] CS: Welcome to this week’s episode of the Cyber Work with Infosec Podcast. Each week, we talk with a different industry thought leader about cybersecurity trends, the way those trends affect the work of Infosec professionals, and offer tips for breaking in or moving up the ladder in the cybersecurity industry.
Dr. Erik Huffman is a cybersecurity researcher, TEDx speaker, and an award-winning entrepreneur who has made a big impact in the industry, in part from his discussion about the human side of phishing attacks and the psychology of social engineering. He’s also the host of the newly created MiC Check podcast, and I always love to have a fellow podcaster on the show, because I know we’ll have a good conversation.
So, I first came into contact with Dr. Erik Huffman via our 2021 InfoSec Inspire Conference. Apart from being a captivating, engaging speaker, he has deep knowledge of and passion for the human side of security awareness, especially the idea that the human doing the work on the computer or mobile device isn’t a faulty machine component to be punished and replaced, but a person with an opportunity to learn more about their own psychology. And someone who has certain insights about their interaction with technical space can navigate it with less trepidation and uncertainty. Those are ideas that were very appealing and exciting to me.
So, this is something we at InfoSec are passionate about as well. I’m looking forward to learning more about Dr. Huffman’s security journey, his passion for cybersecurity awareness, and perhaps some more strategies we can implement in our own online lives to avoid some of the worst-case scenarios that all start with that one ill-advised click. Dr. Huffman, thank you for joining me today. Welcome to Cyber Work.
[00:03:04] Erik Huffman: Thank you for having me. It’s an honor, I appreciate it.
[00:03:08] CS: Very, very glad to have you. I guess I’m really looking forward to our discussion today. So, we always like to start out with the sort of superhero backstories. Where did you first get interested in computers and tech? And when did you first get excited about IT and information security as a calling? What was the initial draw?
[00:03:24] EH: Oh, it was as a kid, actually. A lot of credit goes to my dad, because my dad was a – well, he retired recently, so he was a software engineer. And he always had these computers, and I would always try to play with them, build one, take components out when I shouldn’t have, got in a lot of trouble during that time. Oh, yeah. But it resonated with me. The moment it clicked was when I had two computers and I was able to get one file to send from one computer to another.
[00:04:01] CS: What year would this have been? That seems common now. But what year would that have been?
[00:04:07] EH: Oh, gosh. That was probably early ‘90s. Pretty young at the time.
[00:04:16] CS: Yeah, you might as well have built a bridge across the Grand Canyon, because that’s what it was at that point.
[00:04:23] EH: You’re right. When I was able to get that to happen, it just blew my mind, and I was hooked. I’m one of the unique people that – I’ve never changed industries, really. I’ve either been in IT and cyber or teaching IT and cyber, and I don’t think I’m ever moving. I don’t think I’m moving on from this. It’s too much fun. There’s too much to learn, and it’s too impactful for all communities involved.
[00:04:56] CS: There’s always a new insight around the corner.
[00:04:58] EH: Oh, definitely.
[00:05:00] CS: Okay, so to that point, with so many of our guests, I find that a LinkedIn profile is worth a thousand words of biography, and yours is especially interesting. So, from 2007 through 2015, you worked as a technician and IT project manager at Walgreens, a natural starting point. But from there you spent several years in higher education, as a curriculum director and then as an Associate Dean for CollegeAmerica and Independence University, respectively. Was there a tech or cybersecurity component to those jobs? Or were you entering your career pivot at that point? Because after that, you became an instructor with cybersecurity education group SecureSet. So, did your time in higher education influence the education you picked up and provide to others now?
[00:05:39] EH: Oh, definitely. Being with Walgreens was the time of my life. It was fun, I was able to grow, meet a lot of cool people, do a lot of cool things, and definitely sharpen up my skills. But when I started to get into the education side, I wanted to impart a little bit of what I knew onto others as well. So, I spent time in the classroom teaching and ended up growing in that role more than I could ever imagine, because I started out as an adjunct. I just wanted to teach at night, and I fell in love with it. It was actually amazing: I was talking IT and cyber to people that wanted to listen to IT and cyber, rather than business-minded individuals that had no idea, that didn’t even care what I was saying. And so, I fell in love with that. I ended up growing in that role, becoming an Associate Dean.
So, it was a career pivot. But I picked up a lot from being able to teach. I would imagine everyone should teach at some point. Being able to teach helps you be able to train, because if you can teach someone, you can train someone. So, taking that time in education, and getting back into the industry now, has helped me tremendously. It has definitely helped me grow as I achieved my master’s degree and my doctorate, being in the education world. I wouldn’t change it for the world. I absolutely love it.
[00:07:13] CS: Fantastic. So, when you were curriculum director, was that specifically for the tech curriculum? Or was it for the entire curriculum of the school?
[00:07:22] EH: Tech curriculum.
[00:07:25] CS: That was the piece that I was missing. Because it sounded like you were going into general higher ed, like you were managing the English department, the History department.
[00:07:34] EH: No, no. Oh, gosh, I couldn’t do that.
[00:07:39] CS: Fantastic. So, that gives us a much smoother arc in terms of your job journey here. Basically, if you look at your resume again on LinkedIn, you have a number of jobs and work areas happening at all times these days, so you have a lot of things that say “to present.” Can you break that down and tell us about your current activities via teaching, speaking engagements, entrepreneurial activities, or research projects?
[00:08:04] EH: Yeah, I’m very fortunate to be giving to the community in a multitude of ways. So, one thing that I’m doing now actually is exciting. I’m a track coach, so I coach sprints at a local high school here in Colorado, have a ton of fun doing that. I’ll do that forever. But I’m also the IT director at BombBomb, which is a SaaS company, a software as a service company. So, I help lead the IT and InfoSec, information security, departments there, trying to grow those departments. When things happen across the platform, like recently, the Okta breach and all that, that impacts me on the daily, so we’re helping them pivot from those attacks.
I was fortunate, very fortunate, to be on the board of scholars for the US Army for the ASA, which is a research and development group, a bunch of scholars getting together to help the ASA, which is a presidentially appointed position for the Army, help him or her do their job more efficiently, and help him or her understand what’s going on now from a research perspective, so they can make better and sound decisions. To a small degree, I’m a small part of a very, very large team that is doing some absolutely phenomenal things. But I’m just fortunate to be elected onto the board.
I’m a Mayor’s Young Leader award winner, but I’m also a Colorado Springs Mayor’s Civic Leader Fellow, to help my local community. Just trying, with the mayor, to have some difficult conversations and make a positive impact. That’s what I’m all about.
[00:10:03] CS: I love that. Okay, without breaking any NDAs or anything, can you talk a little bit about your work with the Okta breach? What is your portion of the rebuilding here, or the recovery?
[00:10:17] EH: At BombBomb, fortunately, we were thinking about using Okta, but we did not. Nevertheless, anytime something like that happens, what we’re looking at is: are we going to be a third, fourth or fifth party to have some type of impact there? Who do we partner with? Which of our partners utilize Okta, and how could the software we create potentially be impacted? We do a risk assessment, a vulnerability assessment, and figure out, “Hey, is this something we really need to be concerned about?” If it is, let’s move forward. If it’s not, then, hey, we’re good.
[00:10:58] CS: Okay, so you are kind of manning the lines in case you start getting some nervous phone calls from your clients like, “Oh, god, does this affect me?”
[00:11:04] EH: Definitely. All the time. You got to be able to provide guidance at every turn you possibly can.
[00:11:14] CS: Okay. So, as I said in the intro, the main theme I wanted to talk about in today’s episode is security awareness and social engineering, and the way that phishers aren’t hacking machines, they’re hacking people.
So, you gave a TED talk on this topic in 2019. And again, you talked some about this at our Inspire Conference last year, thank you for that. It’s a standard thing to say these days that the human is the weakest part of the security of a computer network. I suppose that’s technically true. But it also kind of puts a premium on using fear and shame as drivers to keep users vigilant in their daily work routines, which might not be the best method. So, can you talk about some of the focus of your TED Talk and your own personal research into the psychology of why we click the link when our rational brain would otherwise tell us not to?
[00:11:57] EH: Yeah, the big portion of that, when I started doing this research: if you look at the trajectory of successful attacks, it just continuously rises. But the trajectory of innovation in technology, of innovation in information security, is also continuing to rise. So, any sane researcher, any sane person, would think what you are doing is not helping. What you’re doing is actually hurting, which we know is not totally accurate. What we’re doing is helping, but what’s the commonality there? The commonality is people. The commonality is us. Everything around us has changed: the environment, the digital environment, protocols, websites, even cables and standardizations. All of those have changed. We’ve remained the same, and so that problem has remained consistent.
So, I started researching into us, researching into people from a generalistic perspective. Not male versus female, black versus white, or anything like that. From a physiological, biological perspective, why do we all fall for this? We all seem to have this one common trait where, if I were to ask you for your credit card information right now, the answer is no 99.9% of the time, because there’s probably someone out there that would say yes, but essentially everyone would say no. However, if I reach you digitally, if I happened to send you a message or digitally try to socially engineer you, I would be far more successful. That caused me to barrel down into the neuroscience behind people and the psychology of people: how do these things continuously happen, and why aren’t we seeing the trend go the other way? Because we’ve shamed people, as you alluded to. We’ve shamed people so far as to say, “stupid user,” and we have to really understand who the users are. We’re the users as well, and we’re all falling into the same problem.
No matter what industry, no matter the size of your organization, we all share this one common thing. So, at a basic level, that’s the genesis of the research and why I continue to do what I do. Because I believe wholeheartedly that people are the weakest link, but I also believe wholeheartedly that people are our biggest strength. So, if we’re able to utilize that strength, our collective strength, to deductively reason our way through these attacks, we will be in a far better place than we are right now. So, with all of our users, we have to empower them. As an IT director, an information security director, there’s plenty of technology I can put in place. But there are also things that I can’t do anything about. If someone happens to open the front door to the house and allow someone to walk in, it doesn’t matter what security system I have, they’re going to be able to walk in.
So, I empower, and I encourage everyone else to empower, your users as you’re conducting security training and security awareness. Let them know that, hey, if they want to see us fail, they can do that. By all means, they can do that. If they want to see us succeed, though, it’s going to take their effort and their help to do that.
[00:15:44] CS: Yeah. So usually, this is the part of the show where I talk about the most persistent and upsetting threats to the cybersecurity landscape. But in the case of this talk, obviously, the topic isn’t going to be the sneakiest piece of ransomware, or the most evasive worm or virus, but the subtleties of social engineering, as you said. Based on your research, what are some of the most common types of deception or social engineering tricks that you find particularly pernicious, or that people are particularly susceptible to?
[00:16:08] EH: There’s a ton. There’s an absolute ton. But to make this clear, it’s about how the attacker impacts people. Psychologically, the goal for the attacker is to get your attention and to influence your behavior. And how do they do that? By utilizing a few principles of influence. Everyone has these principles of influence, and one of the strongest ones out there is the principle of liking. If you see someone that you like, they will influence your behavior. If you see your favorite pop star, your favorite movie star, you’ll probably run up to him or her, jump up and down, like, “Oh, my God,” shake their hand, hug them, whatever you do. However, if you see me walking down the street, you’re not doing that. You’re not going to react the same way.
[00:17:02] CS: I am. But now I know you’re a star. But that’s hypothetical, so please go ahead.
[00:17:08] EH: So, they utilize this principle of influence to impact your behavior. In a digital setting, how would this happen? This would happen by spoofing. With spoofing, you see the name of your CEO and it’ll influence your behavior. It could be the principle of liking: “Hey, I like the CEO.” Or it could be the principle of authority: “I’m scared of the CEO.”
[00:17:32] CS: Yeah, “My CEO never contacts me. This must be really important.”
[00:17:38] EH: Exactly. During 2020, mid-pandemic, we saw this at a higher scale, where people were fearful of the principle of authority, because so many people were being laid off. So, the CEO spoof held a significant amount of weight, and it began to really take off. And this goes further than, “Hey, read the email header” or something like that. When you see that name, biologically, you see the name, you read the name, you understand the name, and then you begin to read. And as you begin to read, the default voice you read in is your own voice.
So, you read in your own voice. You’re reading in a common voice, a light voice, which is your own, unless you intimately, very closely know that name; then you begin to read in their voice and their cadence. If you don’t believe me, if you’re like, “How would this even happen?” Have you ever read a book, then watched the movie, and hated the movie, because the person didn’t look or sound like they did in the book? Because as you read the book, you create this character in your head, and so as you see the movie, you’re like, “She doesn’t sound like that. That is not how that happened at all.”
[00:18:59] CS: Yeah, every time. And then counterpoint, I don’t watch the movie before I read the book, because then I can’t hear the voice in my head, anything other than the actor.
[00:19:09] EH: Definitely. That’s how it happens to every one of us, and the same biological principles hold weight as we read emails. And so, as attackers begin to break people down, it’s all a psychological compromise. We’ve all heard of the scam that’s out there: “Hey, I need you to go to the store, buy 10 iTunes gift cards, scratch off the back, take a picture of them, send them to me.” You realize how many steps that is? And I’m not calling those people stupid or illogical. It’s just that one of those principles of influence impacted them so hard that they could not see anything else. They just started to act, and they kept acting, and they kept doing it and sent that information over, and there’s a lot wrong with it. But it can get everybody, which is why, with open source intelligence, as we give so much of ourselves out there to the internet, it makes this a little bit easier. Attackers can right-click and save a picture of a loved one and pretend to be that person. If they really, really wanted to – not many people do this, but they can follow you on LinkedIn, or they can follow you on Facebook or Twitter, and understand how you write. If they’re going after a CEO – and if you’re a CEO listening to this and your company’s growing, it could happen to you – they can just follow you and see how you write, begin to write like you write, and send that information over to one of your coworkers, or people that work with you, and they’re going to believe that it’s you.
And with new browser-in-the-browser attacks, speaking of new technical attacks: the browser-in-the-browser attack allows attackers to spoof a domain to the point where it’s nearly undetectable.
[00:21:11] CS: Absolutely. Yeah, that’s frightening.
[00:21:14] EH: So, it’s hard to catch. But when it comes to people, it’s all a psychological attack, just as it’s a physiological attack in a personal, realistic, face-to-face setting. Imagine this. I was born and raised Southern Baptist. I love my mom to death, and what I was taught is you hold the door open for everybody. You hold the door, and you definitely hold the door open for a woman. However, in a security instance, you don’t want to hold the door open for anybody. It’s so hard for me. Even when I’m working with the government, it’s so hard for me to let that door close in front of a woman. Like, “Hi, you know what, you’ve got to badge in.”
[00:22:01] CS: Yeah. Especially when you kind of look them in the eye and you’re like, “Sorry.” Terrible feeling. You better get used to that uncomfortableness, because it’s important.
[00:22:11] EH: Definitely. So, as we embark deeper into this digital realm, security-wise, we have to let the door close. Also, from a social engineering perspective, from a digital perspective, there are times where you have to go against that helpful instinct that you constantly have in your heart. Because you’re like, “Hey, I just want to help this person out.” Yeah, they’re preying on that. They’re preying on what made you a good person; they want to use that against you. Everything can and will be used against you.
[00:22:50] CS: Yeah, the combination of “I want to help” and also, “What’s the harm? Just this one time.” And we know.
[00:22:58] EH: Definitely.
[00:22:59] CS: So, we’ve started talking about the problems here. Can you talk about changing security mindsets, or learning new habits, that can make these types of attacks easier to navigate?
[00:23:08] EH: Yeah, in a collective sense, unfortunately, I believe it’s going to take some time. We’re learning how to interact in this digital landscape. We’re really learning how to interact. Zoom or Teams or any kind of video, FaceTime, any type of video is helpful for us, because this is more like talking in front of one another. However, you take that away, it’s really difficult for us as people to understand how to interact. So, part of me feels that this is just going to take a little bit of time.
Another analogy for you: have you ever read a text message from a boyfriend, girlfriend, husband, wife, and misinterpreted it, and thought they were mad at you the entire day, only to find out they weren’t mad at you at all? That’s how far we need to come. We still can’t just read a message and understand the context. So, we’ve tried to assist ourselves by using emojis and emoticons: smiley face, I’m joking here, because otherwise you may not get that context.
So, we’re really learning as a community, as a digital society, how to interact with one another in this way. But on the security awareness side of things, if you’re a security professional thinking, “Well, it’s all doom and gloom, this is going to take time” – no, there are things we can do. If you run a phishing campaign, instead of just failing people, talk to them. Understand why they clicked. Why did they click on this particular phishing scam, but not on the other phishing scam? So, you can better understand the person. Because there are two types of appraisals you need to understand, not for everyone, but probably for the key players in your organization, the C-levels, the VPs, the senior directors and those alike. You need to understand their threat appraisal, how they are vulnerable, and then do a coping appraisal. If they were to click, if they were to do something, are they going to react in a way that aligns with policy or not? If they will not, then you’re missing it. For example, I do a ton of phishing campaigns, all day, every day. That’s just part of my research. So, as I phish a VP or something like that – let me go back to something we can all relate to: 2020.
[00:25:56] CS: We’re still there, honestly.
[00:25:58] EH: Definitely. The world is locked down. Everyone’s panicking. I’m panicking a little bit, but I’m like, “I can research.” So that’s what I’m doing. I phished a lot of people utilizing the current pandemic as leverage, because that’s how the attackers were attacking us. And in many cases, when I do a semi-structured interview, when I’m talking to the people that clicked, I hear things like, “You spoofed the CEO. The CEO scares me and I’m scared to lose my job. So, I’m going to do anything that the CEO wants me to do.”
In turn, you could just internalize things like that and think, “Well, stupid user, just don’t do it.” That doesn’t work. So, I then go talk to Mr. or Mrs. CEO, and I say, “Can you put something out to put your team at ease? Because they’re fearful for their jobs right now.” That is not a technology vulnerability. That’s a humanistic vulnerability. The only way we can “patch” that, the human patching, is by addressing it personally. So, the CEO would address the team and put a lot of people at ease: “Hey, financially we’re doing great. There are no layoffs coming up in the foreseeable future.” Now, that vulnerability doesn’t just get patched completely, because some people will be like, “Ah, you know what, they’re lying,” or whatever.
But most people will say, “Okay, I feel a little bit better about that. So the next time I see something like that, I know that most likely it’s not truthful.” Those are things we can do. If you’re conducting a phishing campaign, or if you’re conducting a social engineering campaign to see how you can break into a facility or something like that within your own organization: after people click, after people are fooled, talk to them. Understand the person and why they clicked on this one and not the other one, because I guarantee you they didn’t click on every single one of them. If they clicked on every single one of them, that’s a problem.
[00:28:06] CS: We need to talk.
[00:28:08] EH: Definitely. If they didn’t click every single one of them, ask them why they clicked this one. And then understand the person; try to patch, or try to help them understand, the dynamics and how they’re being compromised. Because what will compromise you will most likely not compromise me. I’m a big sports fan, so if someone sends me some breaking NFL trade or NBA trade or something like that, I’m more likely to click on it, because that’s what I’m interested in. Not everybody’s interested in that.
If we preach diversity, equity and inclusion in our organization, we understand we want a diverse population. If we preach that, why are we training our people all the same way, when we know that everyone’s different? Why are we treating them all the same and putting them in the same box for security, when we understand that each person is unique and different, and we’re inclusive to everyone because their uniqueness and their differences are why we want to hire them, because they make us better? If we firmly, wholeheartedly believe that, and I think we do, we need to stop treating people the same way and treating them as “the human vulnerability” – as if everyone falls for the same thing, and you’re stupid because you fell for it. No, we’re all more unique than that. So, we’ve got to take an extra step forward in understanding the person. Sorry for the long-winded answer.
[00:29:39] CS: No, I love it.
[00:29:39] EH: But that’s what I’m thinking.
[00:29:40] CS: One thing I wanted to add to that, if you don’t mind: I feel like I’ve seen credit card companies and others using this particular communication line of defense that I think would really work in corporate situations too, where you would lay out how you would be contacted. If you were to be contacted by your CEO, it would only be through this channel: “I’m only going to call you. I’m only going to visit your desk.” Obviously, that’s different with people working from home in large numbers right now. But it’s like when you see that note that says, “PayPal is never going to ask you for your vital information through email.” Amazon’s never going to do that. I think that’s a really good patch for some of this stuff. Because a lot of times no one wants to – you hear the stories about plane crashes where the copilot didn’t want to be disrespectful and say, “I think I see something wrong here.” So, there are so many little social cues that, as you said perfectly, can be tinkered with in ways so that people can be put at ease a little bit.
[00:30:38] EH: Definitely, it would help if there were a chosen avenue of communication, and that’s what’s to be expected. That would help tremendously. Hopefully, a lot of organizations follow suit with that. But if not, we’ve just got to understand how to communicate digitally a little bit better. I’m bad at it. It’s not like I’m perfect at it; I misread context all the time. We see misinformation all the time online, whether it’s something government or politically oriented, or even in the sports realm, people put out fake things all the time. We see that misinformation.
Unfortunately, when it comes to intelligence gathering or understanding information, as people, we take it one step further, because there’s a phenomenon that I coined, “human factor authentication”: when you see the name, you understand the name, you trust the name. Because that person is trusted in your life, the information from them becomes trusted to you.
So, this is why we see – I’ll go outlandish – why you see random memes out there that say, “Hey, on October 31, there are going to be four new moons in the sky for the first time in 10,000 years.” And some people just take it and run with it. Maybe not to that extreme, but possibly with something more lighthearted, of course. They see who that information came from, and they run with it. And if I were to tell you, “You know what, that is not true. That’s not even going to come close to happening,” it’s not that I’m telling you the information is false. I’m telling you that your mother, your father, your brother, your sister, whoever posted it, that they’re wrong. And in your mind: “You’re calling my mom wrong. The hell with that? She’s not wrong.”
So, that’s how we’ve started to internalize some of this information in this digital environment, without knowing it. Some people are like, “No, it’s got to be right. Mama said it.” It’s kind of like The Waterboy: “Mama was wrong again.” And it’s like, “No, she is not.” We have to get better at understanding where information is coming from and how we individually internalize this information, whether it be from a social scam, or from a phishing attack, or just from general intelligence gathering or information gathering. We’re kind of all over the place right now, as people. We’re really all over the place.
[00:33:21] CS: Erik, you inadvertently jumped perfectly into the next question that I wanted to ask you by bringing family into it. So, now that we’ve got our office cyber-savvy and ready for anything, do you have any advice for how to get our parents, family members, and friends on the same train, even if they’re not in tech-forward jobs? Because I think, for those of us who do tech and tech-adjacent things, who are on our laptops all day, you internalize a certain amount of this security awareness. And then you come home for visits or for the holidays or whatever and you become – I call Thanksgiving “Tech Amnesty Day,” where you help your parents update all the firmware on their computer and stuff. But if you’re trying to explain to them why that coupon to that doll shop isn’t real or whatever, do you have any advice for being the security awareness evangelist for your social and family circles?
[00:34:10] EH: First of all, it’s a horrible job to have. I have that same job.
[00:34:16] CS: Yeah, absolutely.
[00:34:17] EH: Good luck. But also, take the complex and make it human. We understand each other; we understand the psychological sides of things. If they’re falling victim, or you’re fearful of scams, take the instance and, rather than saying, “Hey, this is a phishing attack. Read the email header,” and things like that – some of that will resonate, but some of it will be a little bit more difficult for that person to really understand. So, take that and make it human. I talk to my mother very frequently. I’m an extrovert; she’s an extrovert. A Southern Baptist woman, beautiful, and she just wants to feed everybody.
So, I remember after I graduated with my PhD, we were at Kohl’s. She loves Kohl’s. So, we’re going shopping. I was going to buy some shoes or something like that, and she was going to buy me something, because that’s mom. As we’re walking, she’s like, “Hey, my baby just graduated from – just got his doctorate. He just did this.” I’m like, “Mom, stop talking to these people. You don’t know these people.” But online, that personality trait doesn’t turn off. And so, when she’s online, I talk to her about some of the scams and some of the people that are trying to reach her out there, and it’s like, they’re preying on this personality trait you have, that you want to feed everybody, you love everybody, you’re talking to everybody. This is why you feel that way. This is why you want to communicate with that person. Make sure you don’t. Make sure you read the information. If it’s spam, or if you think it’s fake, just delete it. Delete first, ask questions later. If it comes back in again, you might want to read it again, but delete first –
[00:36:23] CS: And then call your son. Ask him.
[00:36:28] EH: Exactly. Take some of the tech –
[00:36:30] CS: I’ve got nothing but time to have my mom read me emails she got that look suspicious, or a pop-up that came up. Yeah.
[00:36:38] EH: Definitely. Take the technology and try to make it human. The hard thing a lot of us try to do, even myself, I’ve tried to do it, is take the people and turn them into technologists. They’re going to be so far behind anyway. Because your family, my family, they may not read cyber news every day. So, every three months, I’ve got a whole list of new attacks to train them on, because they’re not going to keep up with that. You need to lower that expectation. But you’ve got to reach out to them and convey to them, like, this is how they’re trying to impact you. This is how they’re trying to reach you from a humanistic side. Because the attacks change every single day, every single month. But the attack methods, the psychological methods they use to reach you, those are the same.
[00:37:38] CS: That’s the dichotomy of book-smart versus horse sense. We have to be book-smart, because we’re in the industry, and you have to keep up on these things. But as long as you have a little horse sense: don’t click on that thing. And with a personality like your mom’s, I imagine one that’s especially susceptible would be the social engineering on social media, the “Tell me where you’re from. What was your first pet’s name?” kinds of things, because you’re extroverted, you want to tell people about it, and you want to read what they have to say and stuff.
Now, I’ve always wondered, because, obviously, that’s a huge red flag if you’re in security. But because they’re so diffuse, those kinds of social engineering attacks, where it’s like, I’m just going to send this chain-letter-style thing around Facebook, do you know if a lot of useful information gets harvested that way? Because not everyone’s going to have their first pet as their prompt question. But are there use cases where that’s really worked? Do you know?
[00:38:38] EH: Yeah, there are tons. So, a lot of that is just information gathering on the person. But as you state, sometimes you’re just casting a broad net, trying to reach as many people as possible. Just keep in mind the notion that distance is dead. There’s a death of distance. And so, there are use cases where people have collected as much information as they can and tried to change the password on someone’s account, and with enough of it, they can possibly do so. The information that’s collected in a lot of those, it’s not collected just for this one particular person. It’s collected for thousands of people, possibly millions of people, and they box that up.
[00:39:35] CS: They’re able to sort of match the answers to the names, I imagine?
[00:39:39] EH: Yeah, they box that up, and then they sell it. So, you can purchase a data breach for usernames and passwords, but then you can also purchase a data dump of possible prompt questions and answers. And so, if you have the data breach with this username or this email address, and then you have the potential answers to the prompt questions for this username, this email address, you might have the keys to the castle for someone’s bank account information or something like that.
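To make the mechanics Dr. Huffman describes concrete, here is a minimal sketch, using entirely synthetic data, of how a purchased credential dump and a separately purchased security-question dump could be correlated by email address. All names, fields, and records below are hypothetical illustrations, not real breach data or a real attacker tool.

```python
# Synthetic credential dump: email addresses paired with password hashes.
breach_dump = [
    {"email": "alice@example.com", "password_hash": "5f4dcc3b..."},
    {"email": "bob@example.com", "password_hash": "098f6bcd..."},
]

# Synthetic security-question dump, e.g. harvested from quiz-style posts.
question_dump = [
    {"email": "alice@example.com", "first_pet": "Rex", "childhood_street": "Elm St"},
]

def join_dumps(breach, questions):
    """Index the question dump by email, then merge matching records."""
    by_email = {rec["email"]: rec for rec in questions}
    merged = []
    for rec in breach:
        extra = by_email.get(rec["email"])
        if extra:  # emails present in both dumps are the high-value targets
            merged.append({**rec, **extra})
    return merged

profiles = join_dumps(breach_dump, question_dump)
print(profiles)  # only alice appears in both dumps
```

The join itself is trivial, which is the point: once both dumps exist, combining a password-reset prompt answer with a known email address takes a few lines, so the defensive lesson is to treat prompt answers like passwords.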
[00:40:18] CS: Now, speaking personally, do you feel like you’ve gotten through to your mother with these? I mean, it’s a constant process, but do you feel like she’s more savvy than she was when you first started telling her, “No, mom. Don’t do this”?
[00:40:32] EH: Yeah, definitely.
[00:40:33] CS: Good. It is sticking.
[00:40:35] EH: Definitely. She’s getting a little bit better every holiday. Every time I see her, we’re getting a little bit better. And I think most people in the organizations I’ve worked with, hopefully, they’re getting better as well. It comes down to understanding self and understanding why you feel the way you feel. Because if you’re comfortable, if you’re too comfortable, understand you’re hackable. If you’re alert and more engaged and your senses are heightened, you’re less hackable at that point in time. The more comfortable you are, the more hackable. The more under pressure you are, the more hackable you are. So, having situational awareness, understanding self, understanding situations, those are fine-tuned skills that we’ve proven we can get better at in a multitude of situations. I’m doing those same things, just in a digital environment, and we’re getting a little bit better. Still a ways to go, but we’re getting a little bit better.
[00:41:46] CS: At this point, I’ve pretty much conditioned my mom to not click on anything. So, I guess I’ll take that as a partial victory, even including actual important updates. Unfortunately, she’s like, “Malwarebytes wants an update. I didn’t click it.” I’m like, “One, good, but two, I’m on my way over.” Whatever works, right?
[00:42:06] EH: Yes, that’s progress, that’s progress.
[00:42:11] CS: So, as you know, the title of the show is Cyber Work, and so we want to talk about the work side of this. From a work standpoint, do you have any tips or advice for listeners who might be students or cybersecurity career aspirants who want to work in security awareness, preventing social engineering attacks, or even threat research? Where would you start in 2022 on that type of career journey?
[00:42:33] EH: Stay curious. That’s my biggest word of advice. Stay curious and don’t accept the norm as the norm. If you want to be a researcher, build a home lab and just start practicing. That’s the best thing you can do. There’s a ton of online resources out there, some through the InfoSec Institute. There’s also TryHackMe and a bunch of other websites. Familiarize yourself with those. And then just stay curious. Because if I was to say, study this one thing right now, two years from now, that thing’s obsolete. Understand, if you’re curious and you want to get in the industry, please keep in mind that you’re preparing yourself for an industry whose future landscape you don’t know. So yeah, it’s that constant learning. If you have a hard time consistently learning and reading new things, it’s going to be very hard to stay current in this industry. So, start now. I would say, just start now. If you want to get into this industry, yeah, dude, start now. The best part about this industry: it refreshes itself, damn near entirely, every five years.
[00:44:01] CS: I was just going to say, someone who started studying in 2017, how could they even imagine what 2022 would look like?
[00:44:07] EH: Exactly. Right now, I’m thinking of what I learned way back. Some of it is great foundational knowledge. I’m not saying don’t understand the OSI model, or TCP/IP, or the three-way handshake, or anything like that. You have to understand some of the foundational pieces there. But a lot of that top-level stuff, from a research perspective, I don’t know what I’m going to be researching in the next two years. So, I’m preparing myself for a job that may not even exist right now. My future job may not even exist right now. If you understand that, you stand a chance, because curiosity wins in this industry. Those that are curious, those that are motivated, they win.
[00:44:53] CS: Now, to that end, because I think soft skills are important as well. Laying your hands on everything, researching, and staying curious is amazing advice, but can you talk at all about the documentation process? Because, one, I think it helps you clarify what you actually learned. But also, by documenting it well, you show potential employers that you can communicate with anybody. Do you have any tips for the way you do reporting? Or are there particular report types out there that you can use as a blueprint?
[00:45:30] EH: Read a lot of white papers and start to write some. Start to write some documentation. But you’re right, documentation is everything. On the job, documentation is everything. Oh, my gosh. I’m going through it right now. Documentation is everything. Compliance, you’ve got to get through compliance to some degree. But yeah, documentation is absolutely everything. So, start reading, start writing. If you have a good logbook, not to talk about digital logs from Splunk or LogRhythm or something like that, but if you just keep a research journal or a log of what you’ve done and what you have been doing, man, it goes a long way. Because I’ll tell you, on the job now, if you can’t document, dude –
[00:46:25] CS: You’d ignore.
[00:46:26] EH: I wouldn’t hire you. Because it’s critical to be able to look back at historical information, or to look back at information that’s a couple months old, and understand what this process is supposed to be, what the policy is supposed to be, what the compliance framework is supposed to be, or what the findings were from this particular incident. If you’re just going through incidents and you’re not conducting a post-mortem or a post-assessment after the incident, dude, oh, my gosh. That’s horrific. That’s a horrific reality to be in.
[00:47:05] CS: I imagine a map of all these white papers almost gives you a history book of cybersecurity attacks. If you understand all these contexts, it’s like a law student reading case law from the 1850s or whatever. It’s still going to be useful 10 years down the road to know that this particular type of attack worked really well in 2022, during the pandemic, and stuff like that.
[00:47:29] EH: Oh, definitely, because some things come back. They come back, but they’ll look a little bit different. So, if you look at variants of ransomware, you’re like, “I’ve seen something like this before.” You start reading it. Like Emotet, you’re like, “Hey, I think I’ve seen this before.” It’s been around for a while, so you can understand, this is what we did then, and it might help now. Spectre and Meltdown, we’re starting to see things with different chips right now. I won’t say if it’s the red or the blue, or if it’s ARM or something like that. But we’ve seen different attacks on chips right now. And collectively in the industry, we’re like, “We’ve seen this before. Oh, Spectre and Meltdown, part two. What the hell did you guys find, and how can we fix it this time?” So yeah, understand that, and just do documentation. Without documentation, I would not be where I am right now, because we’ve documented a lot of things that help me out.
[00:48:39] CS: I hope that’s exciting to hear for potential students, because we’re asking you to climb a very big mountain of information here just to get, to use a metaphor, to the first Mount Everest rest station or whatever. But if you know that all the stuff that you’re learning is going to create a record of the modern history of cybersecurity, especially if you contribute to it, I would hope that would show you the importance of it, and that you’re not just doing it for a paycheck, and you’re not just doing it because your mom wants to kick you out if you don’t get a job or something like that.
[00:49:19] EH: Definitely. I’m looking behind my monitor, because I have a bunch of books back there that I’ve had previously and I continuously read. There’s plenty of times where something will happen and you’re like, “I think I’ve seen this.” It’ll help you resolve incidents faster, and that’s the best thing you can do. Incidents will happen. It may not be a full-on data breach, but incidents will happen. And the quicker you can resolve those, the better you are. If it’s taking you a month, or even two weeks, to resolve an incident, jeez, it’s going to be hard to show that security in that organization is holding a lot of value. The value in security is not just preventing breaches or preventing incidents from occurring. It’s resolving incidents and getting back to normal business structure and business behaviors quicker. The average time from data breach to detection is like 90 days right now, which is completely –
Imagine going into a shopping mall, and they say, “Hey, you’ve got 90 days to take what you want.” We would have cleaned that place out. That’s how long it’s taking us from breach to detection: 90 days. So, we need to shrink that down. With other incidents, my team, I’m very fortunate, my team’s absolutely star-studded. I love my team. It’s just a matter of hours, for the most part. Just a matter of hours, not from data breach to detection, but for incidents. Hey, we’ve got an incident this morning. Hopefully by tonight, it’s done, and we’re back to normal. That’s where a lot of the value is. And if you don’t have the historical information to come from, to even recognize what the hell you’re dealing with, good luck resolving that in a couple hours. It might take you a couple days, might take you a week, which is okay. But eventually that will become unacceptable.
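The timing metrics Dr. Huffman is describing here are commonly tracked as mean time to detect (MTTD) and mean time to resolve (MTTR). As a small sketch with synthetic timestamps and hypothetical field names, not any real incident-tracking schema, the calculation looks like this:

```python
from datetime import datetime

# Synthetic incident records: when each incident began, was detected,
# and was resolved. Field names are illustrative assumptions.
incidents = [
    {"start": datetime(2022, 1, 1), "detected": datetime(2022, 1, 4),
     "resolved": datetime(2022, 1, 5)},
    {"start": datetime(2022, 2, 1), "detected": datetime(2022, 2, 2),
     "resolved": datetime(2022, 2, 2, 12)},
]

def mean_hours(pairs):
    """Average elapsed hours across (earlier, later) timestamp pairs."""
    total = sum((later - earlier).total_seconds() / 3600
                for earlier, later in pairs)
    return total / len(pairs)

# Mean time to detect: start -> detected.
mttd = mean_hours([(i["start"], i["detected"]) for i in incidents])
# Mean time to resolve: detected -> resolved.
mttr = mean_hours([(i["detected"], i["resolved"]) for i in incidents])
print(f"MTTD: {mttd:.1f} h, MTTR: {mttr:.1f} h")  # MTTD: 48.0 h, MTTR: 18.0 h
```

Tracking these two numbers separately captures exactly the value argument made above: even when prevention fails, driving detection and resolution down from weeks to hours is where security shows its worth.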
[00:51:29] CS: Yes, for sure. This is amazing. So, thank you again for your time today. As we wrap up, Dr. Huffman, can you tell us about your new podcast, MiC Check? Who are your guests? Is there a theme to the show? Is there any episode that you’d recommend as a starting point?
[00:51:44] EH: I appreciate that. Actually, I apologize for the correction. But it’s the MiC Club.
[00:51:49] CS: MiC Club. Okay. Sure. Sorry.
[00:51:51] EH: It’s all good. Totally fine. The MiC Club, so MiC stands for minorities in cyber, but it’s not just for minorities in cyber. It’s minority-hosted. And at the end of every episode, we’ll talk about diversity and inclusion. But the theme, man, we’re just bringing it. We’re so fortunate right now. We had the CISO of HP, Joanna Burkey. A great episode to watch, where she talks about one-to-many attacks, like attacking one particular software or one particular hardware type that thousands of organizations would have. So, you do the same amount of work and you get so much more, from the attacker’s perspective. But we also had the former CTO of the US Army, William Cohen. And the latest episode is AJ Nash, who’s a former NSA analyst. He’s talking about intelligence and counterintelligence. What is intelligence and what is not, and how are we confusing those things?
The theme of it is mainly two of us – well, one amazing cyber professional and then myself. We’re just talking security on various topics, but in layman’s terms. I think if you’re a student, you could watch it.
[00:53:32] CS: This is very accessible. Okay.
[00:53:32] EH: Yeah. You could watch it. It’s just us talking. Just like, dude, we’re having these problems, and they’re like, I’m having these problems. This is why I see the future of security this way. A great episode to watch, if you’re on the offensive side, is the first episode. The very first episode is with Chris Roberts. Chris Roberts is absolutely incredible. He hacked the Mars Rover. That’s how insane he is. He talks about how he assesses organizations and how he approaches attacks, because he’s definitely on the red team side. But it’s a great thing. Thank you for the opportunity to share that.
[00:54:17] CS: Absolutely. That’s fascinating.
[00:54:18] EH: It’s YouTube only.
[00:54:21] CS: Okay. I was going to ask.
[00:54:23] EH: Yeah, it’s a video podcast on YouTube, just because, me being on the psychology side, I would prefer to watch video than anything. We have it on YouTube. So, by all means, just go ahead.
[00:54:37] CS: Just type in MiC Club, and you should be able to find it then?
[00:54:40] EH: Yeah, yeah. You’ll be able to find it. The logo looks like a microphone, but it’s a lowercase i, and the top of it is a circuit board. If you go there, by all means, please comment down below whatever you think, and then subscribe and just follow along. It’s pretty fun. I enjoy it.
[00:55:02] CS: Awesome. I wish you a lot of luck with that. So finally, last question for all the marbles: if our listeners want to learn more about Dr. Erik Huffman and your other activities, obviously, they should go to MiC Club and hit the like, notification, and subscribe buttons. But where should they go online to find out about your other activities?
[00:55:17] EH: Just follow me on LinkedIn. I do have a Twitter account, but maybe I’m too old because I don’t use it as much.
[00:55:32] CS: Your LinkedIn profile is very robust. It looks like you post a lot of articles there and a lot of insights and things like that. So that would be a good follow, I imagine.
[00:55:38] EH: Yeah, definitely. Just follow me on LinkedIn and that’s where you’ll find most of what I’m doing right now. Instagram, Twitter’s there, but good luck. I rarely post there.
[00:55:54] CS: Low information at this point. Okay. Well, Dr. Erik Huffman, thank you so much for joining me today and for taking all this time to get us caught up on security awareness.
[00:56:03] EH: No problem at all. Thank you for having me. I appreciate it.
[00:56:06] CS: And as always, I’d like to thank everyone listening to and supporting the show. New episodes of the Cyber Work podcast are available every Monday at 1 PM Central, both on video on our YouTube page and on audio wherever you get your podcasts.
Thank you so much once again to Dr. Erik Huffman, and thank you all so much for watching and listening. We’ll speak to you next week.