Saving lives with ICS and critical infrastructure security

Emily Miller, director of national security and critical infrastructure programs at Mocana, discusses her passion for critical infrastructure security and how securing industrial Internet-of-Things (IoT) devices is really about saving lives.

– Get your FREE cybersecurity training resources: https://www.infosecinstitute.com/free
– View Cyber Work Podcast transcripts and additional episodes: https://www.infosecinstitute.com/podcast

  • Transcript
    • Chris Sienko: Welcome to another episode of CyberSpeak with InfoSec, the weekly podcast where industry thought leaders share their knowledge and experiences in order to help us all stay one step ahead of the bad guys. As part of InfoSec’s effort to close the skills gap and empower people through security education, I’m happy to announce that we’re launching our first annual scholarship program this month. Visit infosecinstitute.com/scholarship for the full scholarship details. In line with that goal, over the next four weeks, we’ll be speaking with diverse and interesting women in the cybersecurity industry, including today’s guest.

      Emily Miller began her career focused on critical infrastructure and infrastructure protection during her tenure supporting the Department of Homeland Security and the Department of Health and Human Services. Expanding her experience to cyber, Emily moved to the Industrial Control Systems Cyber Emergency Response Team, ICS-CERT, in 2014. There she served as the chief of process management, measurement and exercise planning. At ICS-CERT, Emily learned firsthand the cybersecurity challenges facing control systems and what it will take to protect our nation’s infrastructure from cyber threats. Emily joined Mocana in 2017 to continue her personal mission of saving lives by reframing the approach to cybersecurity and working to develop new solutions to seemingly intractable problems. Her next step: helping the industry understand why industrial IoT security is really about saving lives. Emily, thank you very much for being here today.

      Emily Miller: Hello Chris, thank you so much for having me. I really appreciate it.

      Chris: My pleasure. So to start at the very beginning, how and when did you first get involved with computers and security? Were tech computers and security always in your interest area, or did you move down that avenue later in life?

      Emily: No, it’s interesting. Typical of every child who thinks that their parents do something deeply and unbelievably uncool, my dad is a CIO, and I thought that was the most awful thing I’d ever heard of. I did Help Desk through college for four years maybe, this is like late nineties level of Help Desk, so probably what third graders are learning these days. Then I ran away from it, and I got my master’s degree in international peace and conflict resolution with the focus in US foreign policy. Clearly leads you right to cybersecurity, right?

      Chris: Sure, sure.

      Emily: Shockingly, that doesn’t make a lot of money, and I had bills to pay, so I was looking for something else to be doing, and I still wanted to do something that had meaning and that I felt connected to, because when you’re doing conflict resolution and peace work and foreign policy, you’re really thinking about people and protecting the country and how to make people’s lives better, and I wanted to do that, but in a way that was potentially a little bit more profitable for those with student loans.

      So anyways, I ended up working for Homeland Security, landed with the critical infrastructure program area at the Office of Infrastructure Protection, and I was doing physical infrastructure security, and very quickly realized that this is something that really meant something to me. Learning about critical infrastructure and what that is in the country, starting to look at it from a physical security aspect, and then thinking, what’s next? What do I do with this? Then I looked at cybersecurity, and with cybersecurity, if you really haven’t done too much with it, you usually think of enterprise-level security.

      So maybe I’m worried about privacy, maybe I’m worried about financial impacts, which are very important, but that didn’t have the same resonance to me of what I was doing in physical infrastructure security, and I learned about control systems, which are the physical elements that are connected to the internet and that control processes, and that a control system is not just an industrial process, it’s in everything that we do. It’s your HVAC system, your pacemaker is a control system. Combines now are control systems, because they’re totally internet connected and automated. Lots of really interesting things, and that absolutely lit a passion, and working for ICS-CERT and working with the department and seeing as much as I did really ignited a fire in my belly. It’s not indigestion.

      Chris: Wow. So do you still work primarily in government infrastructure, or do you work with private companies as well, enterprise companies, or across the spectrum?

      Emily: Across the spectrum. So most critical infrastructure in the United States, and also that the government works with, is actually privately owned. So when we talk about critical infrastructure, we’re actually mostly talking about the private sector. Certainly there is government owned infrastructure, federally and state owned, as well, and municipalities, but what we’re talking about when we say critical infrastructure, I work the gamut. So talking to private sector, talking to trade associations, government entities, regulators, the whole breadth of folks who are every part of the apparatus, from original equipment manufacturers, so the people who are making the actual process controllers or making the things that go into where the owner operators are buying, all the way to the federal government regulators who are doing the, “Well, how do we manage this or make some rules about it if we’re not regulating it?” That sort of thing.

      Chris: So tell me a little bit about your company, Mocana. Your role at Mocana is director of national security and critical infrastructure programs. What types of services or products does Mocana provide to make our infrastructure national security system more safe and stable?

      Emily: Well, Mocana is a cybersecurity software company, but we do cybersecurity software for devices, not for the network. So back to where this kind of ties in, a lot of cybersecurity firms are very concerned about network connections, as they should be. There’s lots of really fantastic vendors in this space who do very interesting things, from asset identification to threat hunting. All of that’s very important, but what’s missing is the cybersecurity of the devices themselves.

      So that is what Mocana does. We secure devices from the inside out, and we do that through compiling source code with the original equipment manufacturers, and then we also have a service platform that’s for secure lifecycle management of those devices. So if you have a device, either a legacy device or maybe a new device that has Mocana on it, how do you make sure those devices are secure, that they are providing secure identity, that they are being updated and patched securely? There are lots of challenges where firmware is being corrupted before it is even sent to the device, so a bad actor will make the firmware bad and then deliver, unbeknownst to the OEM or the manufacturer, the product that has been corrupted.
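The firmware-corruption scenario Emily describes can be illustrated with a minimal sketch. This is not Mocana’s actual product or API, just an assumption-laden example using Python’s standard library: before accepting an update, a device compares the image’s cryptographic digest against a value the manufacturer published out-of-band, so an image tampered with in the supply chain fails verification.

```python
import hashlib
import hmac

def verify_firmware(image: bytes, expected_sha256: str) -> bool:
    """Check a firmware image against a digest published out-of-band.

    Returns True only if the image is bit-for-bit what the OEM signed off on.
    """
    actual = hashlib.sha256(image).hexdigest()
    # Constant-time comparison avoids leaking digest information via timing.
    return hmac.compare_digest(actual, expected_sha256)

# Hypothetical firmware payload and its known-good digest:
firmware = b"\x7fELF example device firmware payload"
good_digest = hashlib.sha256(firmware).hexdigest()

assert verify_firmware(firmware, good_digest)          # untouched image passes
assert not verify_firmware(firmware + b"X", good_digest)  # tampered image fails
```

In practice a real secure-update scheme would use an asymmetric signature (so the device never holds a secret that could forge updates) rather than a bare digest, but the verify-before-install step is the same idea.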

      Chris: So I’m trying to imagine in my head now, are there examples where, say, there are old devices as part of your company, you don’t know who the original person was, and there’s no way to update the firmware? Are you able to go into things that have these sort of outdated systems and retrofit this sort of Mocana [inaudible 00:07:13]?

      Emily: Sort of. You are 100% correct, and actually, I was hoping we would talk about some of this. Legacy devices are a huge thing the industry has to deal with, because especially in much more traditional control system environments, you have devices that were put in 20-plus years ago. Even 10 years ago is considered a brownfield or a legacy device that was not intended to have any sort of security feature. Some of those devices can be updated, and that’s when you can do more of a traditional kind of update process. Some of those devices cannot be updated. So in that case, if it’s a non-updatable, non-upgradable device, Mocana would not be able to do the update. What we would recommend is put a gateway in front of it, so looking at your network architecture, putting a gateway in front of it and being able to add a level of security to that device so that, via a number of mechanisms like a proxy and some other things, you could actually get some level of confidence about the device, or devices, behind the gateway.

      Chris: Okay. So there’s at least some remedy for just about every device?

      Emily: We hope so. Certainly there are a lot of things that you can’t put … They’re too lightweight, they’re really old, that it’s not practical or it doesn’t make a [inaudible 00:08:26] economic sense to put this type of software in, but when you start going up from either the level zero, if you’re looking at Purdue model sensors up in the stack, there are going to be elements there that you are going to want to have security on, either via a gateway or actually via the device itself.

      Chris: So let’s lead with one of the most interesting statements in your mission statement, namely that the need for secure IoT is, “A battle to save lives.” What does that mean, and where in your research have you found the most dangerous IoT vulnerabilities?

      Emily: So what I mean about industrial IoT and control systems is that it’s about saving lives. I think people automatically go right to, “Oh, she means things are going to blow up and kill people,” and I do mean that, but what I’m really talking about are cascading consequences and interdependent systems. Most people automatically, particularly after what’s been in the news, think about power, and water to an extent, about what happens with that power and what happens with that water. Yes, and also what happens if your food sources are potentially impacted. What happens? So not only the cascading consequences of water and power, but also the cascading consequences of component elements that no one at the national level has even recognized as critical infrastructure. It’s critical infrastructure, but it’s not rising to the level of national significance where, “Oh, there’s been a terrorist incident or there’s been some sort of giant weather event, and now we’re really concerned about these top facilities that are going to be impacting X, Y, and Z.”

      Component manufacturers that may be a crucial element to our normal way of life don’t necessarily think of themselves as life-saving or life-sustaining, but they are, because the cascading impacts across interdependent and interconnected systems are what sustain our way of life. The goods and services produced, be it an actual physical good or a service that is provided, are life-sustaining. They’re sustaining our way of life, and that’s what I mean: what this is really about is saving and sustaining lives.

      Chris: So what are some infrastructure aspects that you think don’t get necessarily emphasized in risk plans? Like you said, they’re always looking at the water supply or the electrical grid or whatever, but what are some other specific aspects of infrastructure that you think need more attention?

      Emily: Oh gosh, I would love to see more attention to food and agriculture. That’s an area where I know both FDA and USDA have done a lot of really great work on the physical infrastructure side, and in fact, that’s one of the areas that really ignited a passion for me about infrastructure security, for food and ag. I think when you think about the same sort of manufacturing processes that are happening in food processing, you can have contamination just on the physical side. They can “easily” … There are ways to introduce contaminants and that sort of thing. Think about that from a cyber impact.

      So if you were having a physical introduction of a contaminant, what if you’re having somebody introducing a data element that is telling you something is true when something is actually not true? Or if you are having … When a lot of people think of farming practices, they think mom and pop out on the farm. Combines are now automated, and they are connected, and a lot of these things are controlled by potentially even a smart device. So food and ag is being innovated, really interestingly, by mobile technology. When you’re looking at that particular sector, I think that’s one that nobody really thinks about as critical infrastructure, in that it would be something that is life-sustaining or life-saving, but its manufacturing processes are very similar to traditional manufacturing processes. Things that were never intended to be connected are being connected for good business reasons, and I think you can apply that to other sectors that are not necessarily on the radar. Healthcare is getting more press, but it’s the same sort of concept.

      Chris: Yeah, and I guess that was interesting. I think I’m hearing you right: you’re saying it’s not just a matter of a hacker taking out the “agriculture grid” so that no more food comes out, but that they can tamper with the food safety aspects of it.

      Emily: Absolutely.

      Chris: Or create false positives or false negatives.

      Emily: I don’t want any of the food safety guys to come after me saying, “Oh, it’s not possible.”

      Chris: Okay. It’s conceivably possible.

      Emily: Correct. So we’re not just talking about physical devices being impacted themselves, and we are talking about that and having devices do things that they’re not supposed to be doing, but we’re also talking about data manipulation. So the trick with industrial IoT is that you’re taking a device that may not be talking to a person. A person may eventually be making decisions, but there can be a device talking to a device talking to another device talking to the cloud, and maybe then there are some critical decisions being made based off of that data. Where in that data chain have you validated that the data from the original data point and data source is accurate, that the originating device was not in some way corrupted or manipulated, or that the data in transit was not somehow corrupted or manipulated? These are the things that keep me up at night.
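One common way to get the kind of data-chain confidence Emily describes, sketched here purely as an illustration (the device key and message format are invented for the example, not taken from any real product), is to have the originating device authenticate each reading so that a downstream consumer can detect in-transit manipulation:

```python
import hashlib
import hmac
import json

# Hypothetical per-device secret, provisioned at manufacture time.
DEVICE_KEY = b"example per-device secret"

def sign_reading(reading: dict) -> dict:
    """Attach an HMAC tag to a sensor reading at the originating device."""
    payload = json.dumps(reading, sort_keys=True)
    tag = hmac.new(DEVICE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def verify_reading(msg: dict) -> bool:
    """At the consumer, recompute the tag and reject anything altered in transit."""
    expected = hmac.new(DEVICE_KEY, msg["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, msg["tag"])

msg = sign_reading({"sensor": "pressure-7", "kpa": 101.3})
assert verify_reading(msg)  # untouched reading verifies

# A man-in-the-middle rewrites the value; the tag no longer matches.
msg["payload"] = msg["payload"].replace("101.3", "999.9")
assert not verify_reading(msg)
```

A shared-key HMAC keeps the example short; a production design would more likely give each device an asymmetric identity (certificates), so that compromising one consumer can’t forge readings for the whole fleet.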

      Chris: Okay. I’m glad they do, because they’re incredibly important. So I guess speaking to you, you said that health care gets sort of well covered in the news. What are your thoughts on the way media discusses these kinds of infrastructure security issues? Do you feel like they cover them enough, not enough, they could cover them differently, emphasize sort of other targets?

      Emily: Oh, that’s such a loaded question, Chris. I’m glad you’re asking it. Personally, I think so much of this is sensationalized. These are really important things. I know, right? But then you hear the, “Your pacemaker is going to be hacked and the grid’s going to go down,” and that sort of thing.

      Yes, these are very important things that we need to be talking about, and the vulnerabilities are real, I don’t want to diminish that, but that does not mean that everybody needs to go into a panic. What it does mean is that there is a duty to educate, and I think there is a debate or a question about whether media outlets actually feel they have a duty to educate versus just a duty to inform. But I would love to see, for example with healthcare, articles talking about the crazy vulnerabilities people are finding, because researchers are finding some really interesting stuff, that also say in the same breath, “And the federal government is really working to kick industry’s butt and get cybersecurity standards built into devices, so that security is built into the device from the outset, so security is part of the engineering process, and here’s what that is.”

      I think where we have the federal process and the news meeting the general public, what is the general public supposed to do with that information? Does somebody then go to their doctor and say, “What are you doing about this?”

      Chris: That was going to be my next question, yeah.

      Emily: Yeah, I don’t know, I really don’t, because then what’s the doctor’s responsibility? That is an ecosystem that the healthcare sector itself is much more familiar with and aware of, but as a practitioner, your concerns are going to be a little bit different than somebody who is potentially a manufacturer, even though it’s a highly interconnected system that one feeds the other and the requirements of the practitioner feeds down into the manufacturer.

      Chris: Yeah, and that’s the thing. If you’re watching this on the news and they say your pacemaker could kill you, all you have is that anxiety at the end of it. You don’t have a call to action of, “I need to make my pacemaker stop killing me.”

      Emily: Well, on top of that, yeah, you need to make your pacemaker stop killing you, but there’s a lot more sensitive systems and things that we should be concerned about that we’re not talking about.

      Chris: Can you give me some examples?

      Emily: Well, because I was a former fed, I try to not give specific examples, just because some of that goes into territory of things that I know about the …

      Chris: Got you. Okay.

      Emily: But think broadly. So healthcare, if we’re going to extrapolate, looking at a Purdue model, let’s say you have an embedded device, might be a sensor. What’s the next level up? The next level up might be a monitoring system that that device connects to. Does that monitoring system have security on it? Is somebody able to physically access that system? Is it connected via the internet? Now what’s the next level device up that that monitoring system is connected to? What type of environment is it in? Is it in a hospital environment? Is it in a home care environment? Is it in a nursing home environment? What are the protections in place there? All of these, when you go up the stream, it broadens the attack surface, and if I were a bad actor, sure, the pacemaker’s scary, but there’s a whole lot more impact that you can have looking at the network and connection of devices, and by network, I don’t mean the network, I mean the mesh of devices.

      Chris: So it sounds like the issue is not specifically the pacemaker or the this or the that, but creating more of a systemic security policy in terms of cross platforming, figuring out exactly where all the most vulnerable points are, and making sort of an across the board decision to make this a priority.

      Emily: Right, and that’s the device identification part: what’s going to cause you to have a bad day. What’s going to cause a manufacturing facility to have a bad day is going to be different than what’s going to cause a hospital to have a bad day. Then you’ve been doing your risk calculations. Of that, what risks can I accept versus … For example, with pacemakers, it may in fact be more risky to try to do an update in a patient than to do an update on a monitoring device that’s shut off most of the time, except when a patient’s around. That’s what I’m talking about with risk calculations, versus a manufacturing environment where this one device is going to have a $10 million an hour impact if that thing goes down. That may be your quantification of what’s going to cause you to have a bad day, and it may differ, and then you layer that on top of, “What does this do in cascading consequences? Is there somebody else depending on me? Is there somebody else depending on this? If my facility goes down, am I supplying a component that has a just-in-time delivery?” If you have just-in-time delivery, that probably means there’s no surplus, and those are the cascading kinds of consequences that we need to think about.

      Chris: So walk me through your everyday workday with Mocana, or if there is such a thing as an everyday … As director of national security and critical infrastructure programs, what are some job duties or tasks you perform every day, and what are some of your favorite aspects of the job?

      Emily: Well, you hit the nail on the head. I do a whole lot. So I do a lot of internal strategic work, but most of what I do is talking to folks. So talking to clients, talking to federal government, talking to OEMs and owner-operators. Partly, yes, it’s to educate about what I believe, but mostly it’s to educate me, and by that, I mean I want to make sure that the solutions we are providing, and that we identify as an industry as a whole, not just Mocana, are the ones that industry really needs, and finding those gap areas … For example, most of the industries where we’re seeing control systems and industrial IoT conversations about security taking place have varying levels of cybersecurity maturity.

      So the electric sector and oil and natural gas have, I think, a higher degree of cybersecurity maturity than, say, municipal water facilities. When you go and talk about what solution set X, Y and Z electric company or X, Y and Z oil and natural gas company may need, here’s your use case and that sort of thing, what are they interested in? Where are their concerns? What are they seeing? Versus when you go and talk to a water sector owner-operator who may have zero budget, and where the CEO is also the CTO and is the guy who makes the coffee in the morning, it’s going to be a very different conversation, and you want to make sure you’re applying a hammer to a nail and not a hammer to a kumquat.

      Chris: Yeah. That was the argument 10, 15 years ago in the healthcare [inaudible 00:21:55]. I worked at the Chicago Medical Society back in the day, and it was always that doctors don’t have time to be thinking about these data changeovers or updating their software system or whatever. It gradually got done, but how do you get people on that first rung?

      Emily: Yeah, and that’s really aligning … One of the things I love, I love educating and talking and learning, I love learning. I know a lot about a lot of different industries, but I’m not an expert in any of them. So it’s always an eyeopener when I say, “Well, hey, I thought this one thing about what you’re doing based on X, Y, and Z conversations,” and they go, “No, no, no,” but then getting to the a-ha moment, my favorite part is the a-ha, when I say, “Well, electric does it this way, and then a food and ag guy that I talked to over here is doing it this way, which sounds somewhat similar to your problem. Could we do it this way?” Both of us go, “Oh my god, well, maybe we had a breakthrough about what we might be able to do,” and then thinking about, “Okay, what are some of the barriers and challenges that we need to surmount?” Then if you’re talking to a federal official, is there regulation in place? Are you thinking of doing standards and best practices? How can I help amplify some of that messaging? That sort of thing.

      Chris: Yeah. So as I mentioned at the top of the show, for this month and well into the future, we’re looking to talk to women in the cybersecurity industry to get their stories and their experiences and backgrounds. So I wanted to start by asking what you think we can do in the tech and security fields to make tech careers more accessible to women.

      Emily: Well, I think we just need to change that to more accessible to diverse populations, full stop.

      Chris: Yes, absolutely.

      Emily: For thinking about tech, first of all, I think it seems really impenetrable. I’m not a software coder, I do not have any sort of computer science background myself. I do think that liberal arts educations and backgrounds are going to be some of the future of how we approach tech. We need engineers, we need women in STEM, we need more diversity in STEM, full stop, but we also need to make this field accessible to folks like myself who are coming at it from a very different perspective, and who are bringing a fresh or new or different way of looking at the challenges, and not just from a, “I’ve been educated in this way,” and that’s challenging, because there’s a lot of cultures that I’ve experienced built around the engineering and that sort of thing, and that’s awesome.

      That’s really interesting, and it’s good to learn, but think about all the policy pieces. What about all of the communication aspects? I don’t just mean marketing. I mean, how do we bring the conversation from the C-suite down to the operator and vice versa, and then all the pieces that you have to translate in between? Those types of conversations, I think, are what diversity in general helps bring to the table. Diversity of perspective, diversity of experience, diversity of education. Those are what’s going to really help transform, and I don’t know that there’s any one particular solution other than just looking beyond your like people, whether it’s white people, male people, whatever your like persons are; you need to look beyond that, intentionally so.

      Chris: Yeah, and as you say, because so much of it is based around not soft skills, but problem solving and communication and collaboration, I think a lot of the problem we’ve been hearing from previous guests is that HR departments have the stringent “must have five years of this experience and must have this certification” and so forth, but if we start to let go of some of that or emphasize other things, then you have more likely … There’s that statistic that women won’t apply for a job unless they qualify for 90% of [crosstalk 00:26:18].

      Emily: Oh yeah, I’ve done that.

      Chris: Yeah, and men, by comparison, if they have a couple of the skills, they’ll give it a shot. So there has to be that kind of worldwide change in terms of what we’re looking for, how we’re looking for it, and like you said, where we’re looking for it.

      Emily: Yeah, and I think to your point, I don’t know if you’ve heard of Liz Ryan, a columnist for Forbes who’s also an HR professional. She writes a fantastic number of perspectives about a human-based workplace, and I started following her, and that has just blown my mind about what that might do. It’s really about bringing your whole self, tearing down some of the old structures, about requiring “X, Y, the skill set” for an entry-level position when actually only somebody who’s got at least 10 years of experience could fulfill it, and actually asking for people to be people, to bring curiosity, to bring critical thought and a willingness to learn, and to stop treating us like automatons. Just in general, anybody who’s been in a professional corporate environment knows that circa-1980s type of corporate world … Particularly as my generation and the generations behind me come up, what we want in a workplace environment is completely different, and being able to establish that and break out of the hegemonic, “This is the way we do things,” type of approach, I think, is what’s really going to shake up this industry.

      Chris: As you start getting more diversity into sort of management and C-suite levels, then you can sort of make these changes more sweeping rather than departmental.

      Emily: I hope so. I certainly hope so.

      Chris: Me too. So what tips would you give to women or minority candidates entering the world of security? What are some of the common pitfalls that you’ve learned to sidestep over the years?

      Emily: So you touched on it when you said the women don’t apply to things. I would say when you feel like you’re an imposter, imposter syndrome is very real. Just lean right into it, and I don’t mean to quote Sheryl Sandberg, but when you start to feel the wall and it’s starting to press right here, “Ooh, I don’t do that. I can’t do that. I am not really qualified,” just push right past it and shove it out of the way, because that I think … And I feel this all the time, truly, that clench right about here where you’re like, “Oh my god, there’s no way. I’m not qualified. I can’t do this. These people I’m talking to are amazing. I’m not. Who am I?”, just push right past it, because everybody feels that, and nobody talks about it, and because we don’t talk about it, we don’t think anybody else is feeling it or experiencing it. So you have to really just take that leap and go for it, because you really have nothing to lose. If you’re about to lose your job, I don’t mean that, but as long as you’re okay, you don’t have anything to lose.

      Chris: So for companies trying to recruit more women and minority professionals, what should they not only do to find these candidates and hire them, but make themselves, say, more desirable to these professionals that they’re trying to recruit?

      Emily: So back to the concept of a human workplace, I think number one, you need to have an interview that is not the, “If you were two inches tall in a blender,” kind of questions, and you’re actually having a conversation and learning about the people, and if you can have a human conversation and remember that they’re interviewing you just as much as you’re interviewing them, to really understand who … We’re bringing ourselves to work, and who do you want to work with? Do you want to work with somebody who can ask questions and talk to you and engage you and fill in gaps that you might have? We have to do that from the outset, and then on the actual business element, we have to remember that these are people in our workplace.

      Having full parental leave policies, making sure you have support for your staff, offering work from home. So it’s not just the, “You’ve got to come to the office every day.” Certainly you need to have collaborative work environments, we have to be able to talk to each other, but as the cost of living goes up, and as people’s flexibility and what they want from the work environment change, you have to be much more about people. I think if we remember that and what the needs are, and trust our staff and trust the people that we have decided we want to make a part of our corporate culture, that’s going to really change the game, and that’s going to make them want to work for you. Let them do what they want to do, give them the resources that they want, make sure that they are supported and protected, and they’ll do awesome things for you.

      Chris: They’re going to find new and interesting sort of perspectives on problems you thought that were unsolvable.

      Emily: Yep. 100%.

      Chris: So as we wrap up today, what are some security issues pertaining to infrastructure and IoT that we should expect on the horizon for 2019?

      Emily: Well, we’ve talked about one of them, which is the legacy device question. That’s really going to start kicking us in the butt. There’s a lot of stuff, old, old Windows stuff that’s totally been end-of-life forever, that’s actually managing something really important, and we need to figure out how to protect it, or what to do about it. There are some network-level solutions you can apply, and certainly Mocana thinks that we have some level of solutions to provide to legacy devices as well, but that’s something that is going to really come to a head, not just in the next year, but over the next five years. That’s going to be something that we’re going to have to tackle as an industry.

      I think the next thing that I’ve been really thinking about is security butting up against digital transformation. So we talked about that a little bit earlier, this explosion of devices and how we’re using it to really make our businesses much more efficient, to do all sorts of really cool big data analytics in the cloud and stop putting people in front of things. You can actually have machines talking to machines, and then they can talk to the cloud, and then I can have my AI in the cloud. Okay. What about security of all of that?

      That’s just going to start … It already has, and that’s where you’re seeing a lot of the tension between business, IT, and then you throw OT into the mix of the operators down on the floor saying, “Whoa, whoa, whoa, security, don’t touch my stuff.” We’re really going to have to deal with that. How do we create trusted systems? So not just the networks themselves, but the devices and the whole system ecosystem of that, the devices, the network, the government, how does all that work together, and how do we continue to enable business and continue to make money, but also make sure we’re doing it in a safe and secure way?

      Chris: Emily Miller, thank you very much for joining us today.

      Emily: Oh, thank you, Chris.

      Chris: This was really fascinating.

      Emily: I’m glad you thought so.

      Chris: Absolutely.

      Emily: Again with the imposter syndrome, I just start waxing on and hope that I actually say something halfway intelligent.

      Chris: That was perfect. Thank you again for being here today, and thank you all for listening and watching. If you enjoyed today’s video, you can find many more on our YouTube page. Just go to YouTube and type in CyberSpeak with InfoSec to check out our collection of tutorials, interviews, and past webinars. If you’d rather have us in your ears during your work day, all of our videos, including this one, are also available as audio podcasts. Just search CyberSpeak with InfoSec in your favorite podcast app. To see the current promotional offers available for podcast listeners and to learn more about our InfoSec pro live bootcamps, InfoSec skills on demand training library, and InfoSec IQ security awareness and training platform, go to infosecinstitute.com/podcast, or click the link in the description.

      Thanks once again to Emily Miller, and thank you all for watching and listening. We will talk to you next week.

Free cybersecurity training resources!

Infosec recently developed 12 role-guided training plans — all backed by research into skills requested by employers and a panel of cybersecurity subject matter experts. Cyber Work listeners can get all 12 for free — plus free training courses and other resources.

Weekly career advice

Learn how to break into cybersecurity, build new skills and move up the career ladder. Each week on the Cyber Work Podcast, host Chris Sienko sits down with thought leaders from Booz Allen Hamilton, CompTIA, Google, IBM, Veracode and others to discuss the latest cybersecurity workforce trends.

Q&As with industry pros

Have a question about your cybersecurity career? Join our special Cyber Work Live episodes for a Q&A with industry leaders. Get your career questions answered, connect with other industry professionals and take your career to the next level.