Healthcare cybersecurity issues and legacy health systems

Dirk Schrader of New Net Technologies talks about healthcare security and legacy systems. We discuss the millions of pieces of health data left out in the open, the issues with closing these holes and the need for professional legacy system-whisperers.

– Get your FREE cybersecurity training resources: https://www.infosecinstitute.com/free
– View Cyber Work Podcast transcripts and additional episodes: https://www.infosecinstitute.com/podcast

  • 0:00 - Intro
  • 2:56 - What drew Dirk to security
  • 4:46 - Did your Dad's role inspire you?
  • 5:55 - Stepping stones to your current job
  • 9:35 - What is it like to be a security research manager
  • 14:38 - Unprotected healthcare records
  • 21:50 - Unprotected systems in the U.S.
  • 25:20 - Using better security in hospitals
  • 31:55 - Logistical issues of security for hospitals
  • 37:48 - Best solution for hospital cybersecurity
  • 39:30 - How to prepare for change
  • 42:32 - What skills do you need for this work?
  • 46:00 - Will people pursue these changes?
  • 49:40 - Projects Dirk's working on
  • 52:10 - Outro

Learn cybersecurity for free with our new hands-on Cyber Work Applied series. Whether you want to learn how cross-site scripting attacks work, set up a man-in-the-middle attack or walk through major breaches like Equifax, Infosec instructors will teach you these skills and show you how they apply to real-world scenarios. Best of all — it's free!

[00:00:00] CS: Today on Cyber Work, my guest is Dirk Schrader of New Net Technologies. Our conversation is around healthcare security and legacy systems. We discussed the millions of pieces of health data left out in the open, the issues with closing these holes and the need for professional legacy system whisperers. That’s all today on Cyber Work.

Also, I want to tell you about Cyber Work Applied, a new series from Cyber Work. Whether you want to learn about how cross-site scripting attacks work, set up a man in the middle attack, or get a blow-by-blow recap of the Equifax breach, expert Infosec instructors and industry practitioners will teach these cybersecurity skills and show you how these skills apply to real-world scenarios. Best of all, it's 100% free. Just go to infosecinstitute.com/learn, or check out the link in the description below and get started with fun hands-on training that keeps the cybersecurity skills you have relevant. That's infosecinstitute.com/learn. Now, let's begin the show.

[00:01:04] CS: Welcome to this week's episode of the Cyber Work with Infosec Podcast. Each week, we talk with a different industry thought leader about cybersecurity skills, the way industry trends affect the work of infosec professionals, and offer tips for breaking in, or moving up the ladder in the cybersecurity industry.

Dirk Schrader is the Global VP of New Net Technologies, or NNT. A native of Germany, Dirk's work focuses on advancing cyber resilience as a sophisticated new approach to tackle the cyber-attacks faced by governments and organizations of all sizes, with the handling of change and vulnerability as the two main issues to address in information security.

Dirk has worked on cybersecurity projects around the globe, including more than four years in Dubai. He has published numerous articles in German and English about the need to address change and vulnerability to achieve cyber resilience, drawing on his experience and certifications as a CISSP ((ISC)²) and CISM (ISACA). His recent work includes research in the area of medical devices, where he found hundreds of systems unprotected on the public Internet, allowing access to sensitive patient data.

This is going to be the topic of today's episode. We're going to talk about unprotected, or poorly protected legacy systems in general, and how we start to build some coverage over this vast swath of unprotected information. Dirk, welcome to Cyber Work.

[00:02:22] DS: Thank you, Chris. Thanks for having me.

[00:02:24] CS: My pleasure. We always like to start out informally with an origin story question first. What was your initial attraction to computers and security? I looked at your bio, and it seems that you've been a fan of – you’re in tech for a long time and that even your first job out of university was for Commodore, which was my first home computer.

[00:02:41] DS: Really? Which one?

[00:02:43] CS: Commodore 64. Yeah, way back in the day.

[00:02:45] DS: Oh, come on. Me too.

[00:02:46] CS: Yeah, yeah, yeah. I can always gauge an age range of guests when we all say, "Oh, yeah. Commodore 64. First computer." I'm like, "Okay, we're about the same age." What was it that drew you to security as a career, as a passion?

[00:03:00] DS: Well, to cut a long, boring story short, I guess, it actually started with my dad, bless him. He was an electrical engineer in a steel factory, working on the high-voltage systems there, anything higher than 15,000 volts. You don't touch it, at least only once. Then on the side, he was also a farmer. I grew up on a farm. I grew up with electricity systems, with electrical appliances and machinery.

He was always taking me along to help him fix things, or improve things which were about to break. From then on, it was a path, let's say. One of my previous bosses, a mentor, better to say, introduced me to networking. Then I was diving into security. I did my Cisco Certified Security Professional, which is not mentioned in my bio, because it has expired.

Finally, I got into what I'm doing right now, which is security as a process: the CISSP, the CISM. That's my main passion, I have to say. To be fair, it wasn't always as direct a way as it sounds now. I have tried, put it that way, other aspects of information technology as well. Anyway, for the last 20 years, yes, it was always about information security.

[00:04:45] CS: Okay. Yeah. I'm wondering if just the notion of electrical grids and home wiring and stuff, did that train your brain in general for the idea of computer networks and this idea of all these connected systems together, because it seems like, there's a real natural progression there in terms of this is connected to this and when this goes out, this whole thing goes out and all that.

[00:05:07] DS: Yeah. It definitely does. One of the earliest stories I have with my dad, in terms of what do you do with your professional life, was that he was taking me with him to his steel factory and he was asking me to join a group of apprentices they had there. The trainer for these apprentices gave me all of these old calculators. They were not that old. To dismantle them and to clean them. Then, I was looking at these chips and the board and these microprocessors and I'm like, "I want to know what that is."

[00:05:51] CS: Yeah. Yeah, that's a whole different maze. That's really interesting. You said that you've tried many things and you finally wound up in information security. I wanted to point out that I looked again at your LinkedIn bio and I noticed that you've worked in different aspects of the security industry, from sales to marketing, up to your current position in research. Can you talk about some of these stepping stone moments, figuring out what worked and what didn't, what you were passionate about, and what made you make these big jumps to higher levels? Because I think for a lot of people, it's easy to get stuck in a position and say, "Well, I'm good at this. I might as well keep doing it."

Were there moments where you got home from work one day, and you were just like, "I can't do this anymore. I've got to try this next thing"? If so, how did you get yourself unstuck into doing things that you really love?

[00:06:42] DS: Well, two things. First, there's nothing bad in saying, "I love it. I stay where I am." There's also nothing bad in realizing the opposite, although I never had that precise moment of saying, "No, I can't do that anymore." For me, next to the one thing I just mentioned, the mentor of mine who introduced me to networking, there were a few things, let's say. A few moments and individuals. One of the things is working in international teams, like in the Commodore European Support Council, where the colleagues there, among other things, took away my fear of talking in English. Yeah, come on.

If you see my school grades in English lessons, that's a different thing. In the same way, when I was living in Dubai, where I worked for Siemens, it was also that notion of improving yourself and getting to the point of, "Can I go deeper? Can I dig deeper? Can I have a better understanding of what I commit to in sales and marketing here? What I'm selling, what I'm talking about, what the marketing is, do I understand that well enough?"

Is my explanation about it, the way I'm telling it to customers, in terms of this is what it does, whether in a brochure, or when trying to convince them about the value proposition of a product, of a solution, of a service, really there? Sales and marketing are not bad. They were good positions. It was something where I realized that digging deeper, trying to understand, trying to get behind the inner workings of something, this is surely a mindset that I share not only with my colleagues here at NNT. It is also something where, how to say that, I have that opportunity in the current role, which is, when you look at cybersecurity from a connector's point of view, everybody is affected by cybersecurity, in one sense or the other.

That is the opportunity to work with people from various backgrounds, from various regions. That in itself is the biggest stepping stone moment for me every day. I can listen. I can learn. I can share thoughts. Like we do it right now. That's the best part in my job, I have to admit.

[00:09:30] CS: Yeah. Yeah. Your world is opening up constantly.

[00:09:33] DS: Yeah.

[00:09:34] CS: Yeah. Let's talk about your current job here. What is your workday like as a security research manager? How do you prepare for a day? Is that the beginning of the day? Does it go up in smoke around 10 a.m.?

[00:09:47] DS: Lots of coffee.

[00:09:48] CS: Lots of coffee. We're all living on that.

[00:09:52] DS: I think I'm not special in this. Anyone in cybersecurity has lots of things to read, things to digest to stay on top of current developments, to get familiar with new TTPs (tactics, techniques, procedures) and attack vectors. It's needless to say that a lot of reading takes up a good part of my day.

I'm not so much into emergency handling, I have to admit. I do admire the guys who do that, under that constant challenge, that constant notion of, if I make a mistake here, things really break. For me, it's something where I am probably at the earliest stages of that process, of those phases, where I'm talking about prevention, about being resilient. You mentioned it in the intro, cyber resilience is my favorite, and it's about being prepared for an attack, being able to maintain operation while being attacked, learning from the attack, and coming out of that incident, of that attack, of that dip, of that punch you get, as a more mature organization.

Cyber resilience in that part is different to business continuity, or disaster recovery, as it addresses the organization at a different, much more encompassing level. Then, how do organizations evolve to be cyber resilient? How to mature in that? How to maintain a high level? This notion of maintaining a high level, what is that? That forms a major part of my work. How to adjust to attacks like, to take an example, the SolarWinds thing? Or even to envisage, to foresee, new TTPs not yet seen. As an example of things in discussion right now in my work with the team: what is the consequence of digitalization for operational technology? How does that shift or change priorities away from availability, which is the major priority in the context they operate in when they are not fully digitalized, or out of, let's say, context in terms of what they have learned in the past 20 years.

When you think of that as paramount in their deterministic processes in production environments, what is the impact of that digitalization? That's some of the things we're working on. Not so fancy and very high-techie sometimes. I do consider it a good piece of groundwork, connecting between those hardcore, might be the wrong way of saying that, those well-versed hackers and the folks doing the business processes, the folks doing the operations. Because if you don't combine these two worlds, then you will never be resilient.

[00:13:36] CS: Yeah. I think that's really important too, to separate out the work of handling the immediate problems, versus the work of future-proofing. Or as you say, figuring out the next steps of the hardcore hackers of the future. Knowing that if your business is only dedicated to quashing the immediate fires, you're never going to be ready for the next one. Everything is always going to be on the defensive, if you don't have someone there planning the offense.

I think that's going to be useful for our listeners to hear in terms of thinking about, you're always trying to figure out, what do you want to do with your life? Do you want to be putting out the fires now, or do you want to be, as you say, getting up every day, reading about the new TTPs and figuring out new angles? You're almost playing a long chess game with the hackers in that regard, I guess. You're thinking about what they could possibly be doing in the future.

Okay, so a few months ago, we had on Emily Miller of Mocana, who spent her career defending nationwide infrastructure and IoT against hacks and invasions. We discussed the case in Oldsmar, Florida, which I know you've been following as well, where a hacker tried to poison the town's water supply by adjusting lye levels. Our topic today is legacy systems, in part, in healthcare, which covers a lot of ground. One place we are going to settle on is the topic of unprotected healthcare records, as it's an issue that's surprisingly open and which, to hear you tell it, is acknowledged and also ignored by many healthcare providers. I want to have you give us the top-down view of your report and your findings on the state of healthcare record security right now.

[00:15:23] DS: Well, the origins of that report go back almost two years. When I was reading a story about a researcher who had talked about a proof of concept that he was able to manipulate medical imagery, so that he could take cancer away from the images, or he could make cancer appear on the images. Which, if you think of medical imagery as a part of the medical process, you go for an X-ray, you go for a CT, and then you have your images taken, and then they are presented to the doctor to actually derive your treatment. Once that underlying source of information, the image itself is lost, is manipulated, then we are in danger.

I was like, "How big of an issue is that?" I mean, I was going in there with the assumption, okay, yeah, you need to be next to the device, you need to be within the network, you need to have overcome certain hurdles, like firewalls, whatever. I was like, "Okay, let me check that." It was actually even here in Germany. I went online, searched for some background, and was checking which parameters I have to look for.

Just about five minutes later, I had found a system unprotected, connected to the network. I was able to see the patient data. I was able to see the images. Basically, I could have downloaded the whole data set. There were about 18,000 studies on that device. There was no protection at all. There was no encryption, no firewall, no passwords, no multi-factor authentication. A plain server connected to the Internet, with everything on it. For me, it was shockingly simple to find one.

That initial system, which started the whole process, was a system in the UK. With the details of the data, a whois search of the IP address, and related SSL certs, I was able to reach out to the administrators of that system via LinkedIn. In a matter of days, that system went off the grid. You can imagine that I was curious. We started to search. Shodan is a big help there. Yes.

A few months later, at the end of the first round of searching and data collection, we had hundreds of systems in our data set that we had access to, unprotected access. They were from all over the place, all over the world, every continent except Antarctica; it would be interesting if there is one there. The number of studies we finally had in our data collection, in terms of our statistics, we didn't download any, was hundreds of millions. Hundreds of millions of studies from patients across the globe. Billions of images related to them, detailed 3D imagery. Scary thing.
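The discovery workflow Dirk describes starts with nothing more than checking whether a host answers on DICOM's default port (104). The sketch below is a minimal, stdlib-only Python illustration of that first step, not the tooling used in the research; the host name is a hypothetical placeholder, and you should only probe systems you are authorized to test.

```python
import socket

def service_port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP service accepts connections on host:port.

    A bare TCP connect on DICOM's default port (104) is roughly how
    internet-wide scanners such as Shodan first flag a medical archive
    as reachable: no credentials are needed just to see it is there.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical host; replace with a system you are authorized to test.
# if service_port_open("pacs.example.org", 104):
#     print("DICOM port reachable from the public internet")
```

An open port alone doesn't prove data is exposed; the alarming part in Dirk's findings was that the services behind such ports then answered queries without any authentication at all.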

I mean, yeah, think of my own head in a CT or an MRT, and then someone else is able to actually get your head out of a medical archive, put it into a 3D printer, and then you can do a passport photo or whatever. Might be true, but anyway.

[00:19:26] CS: You also mentioned that using graphics technologies, they'd be able to add or remove cancer as well. I mean, that's a horrendous thought, that you could change someone's diagnosis, either put a scare into them, or make them go through chemo for nothing.

[00:19:45] DS: There were additional ways of having access to these devices. Some of them were actually so negligent that they were exposing these devices not only via the DICOM protocol, which was my original research. They also had these devices connected using a web interface. Again, no password, no encryption, nothing, plus the option to upload files. Well, I left a note on some of the systems saying, "Hey, you are open."
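What "no password, no encryption" means for a device web interface is that a plain, anonymous HTTP GET succeeds outright: no login redirect, no 401/403 challenge. A stdlib-only Python sketch of that check follows; the URL is a hypothetical placeholder, and again, only probe systems you are authorized to test.

```python
import urllib.error
import urllib.request

def anonymously_readable(url: str, timeout: float = 5.0) -> bool:
    """Return True if `url` answers 200 OK to an unauthenticated GET.

    A device web console that behaves this way (no login page, no
    401/403 challenge) is effectively open to anyone on the internet.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError, ValueError):
        # HTTPError (401/403/404...) is a subclass of URLError,
        # so an auth challenge correctly yields False here.
        return False

# Hypothetical example:
# anonymously_readable("http://imaging-archive.example.org/")
```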

[00:20:15] CS: You’re open as open can be.

[00:20:19] DS: It is. When we published the report, the outcry was massive. I mean, it ended up, for example, with Malaysia's health minister actually summoning an urgent meeting to follow up on each similar system we found in that country, which was funny, in terms of reading the newspaper from Malaysia about these meetings.

[00:20:48] CS: Busy day.

[00:20:50] DS: I was scared by the massive amount of findings. At the same time we were publishing the report, we were also reaching out to the German Federal Office for Information Security. Because these guys are well-connected. They have their peers in the world: US-CERT, the NCSC in the UK, ANSSI as the French organization. For most of the countries, and there were some 50-plus countries we actually had in our findings, they had a connection, a peer they could pass our findings to for follow-up. I was hoping that these authorities around the world would be able to identify the system owners and get the systems away from public view. In the same way, I tried myself, as best I could, to reach out to them.

[00:21:51] CS: I mean, you noted in the report that you sent disclosure notices to the administrators of over 120 unprotected systems in the US, and that 69 administrators completely ignored the warnings. You said that the response from Europe and the UK has been positive, and data has been secured, while other unprotected systems exist in Australia, Canada, and one in France. You concluded that the figures you gave to SecurityWeek for the report relate entirely to the US, and that rather than exposed systems being removed, new systems are still being added without adequate, or any, authentication requirement. What's the reticence here? Why are administrators ignoring this warning? Is there a sense that it's just too difficult to make these changes, or that maybe making these changes implicates their prior negligence? Or what do you think? Do they just seem to think it's not that big of a deal?

[00:22:41] DS: In some cases, I would call it resistance.

[00:22:47] CS: Sort of, don't tell me what to do mindset.

[00:22:50] DS: Yeah. Yeah. Just a few weeks ago, I was talking to an admin of one of the systems identified in the first round, so 18 months ago, or even 24 months ago. That system, a quite large one, has just over half a million studies on it, Social Security numbers, COVID diagnoses on it. It was still online, unprotected, after 18 months. Let's say, from the time of the first round of reporting, of informing the authorities. 18 months later, it was still there. I tried again to reach out, to speak to someone. Finally, I got connected to an admin. He was like, "Yeah. I think I recall that message. Now I do understand why the FBI was also trying to contact me." Come on. Yeah. I think it is –

[00:23:49] CS: More coffee needed.

[00:23:52] DS: Yeah. Sometimes, or something stronger. For them, it was the way they communicate with their doctors: I can't put it behind a firewall. That notion of, there are operational obstacles for me. It is a reticence, a negligence on the one side, but also a felt need to do it that way.

[00:24:21] CS: My listeners are sick of hearing me say this, but one of my first jobs here in Chicago was working for the Chicago Medical Society. I've worked with doctors a lot. This was from about 1999 till about 2005 or so. This was right at that point where doctors were going from very, very old computer systems to slightly less old computer systems. I can tell you about that reticence. The way I compare it is, as you said, you are completely embedded at all times in researching future threats, future this, future hacker vectors and whatever.

Doctors are completely obsessed with their one thing, the vascular surgery, the veins, the brain. It's unfortunately to the detriment of everything else. When you would have a conference, you would almost have to put the pen in their hands to get them to sign in, because they're thinking about that one thing. I mean, it's a real problem, because obviously, this is incredibly sensitive data. As you said, there's this real problem with making the computer system too hard, because doctor so-and-so doesn't want to deal with two-factor and look at their phone, or their thumbprint, or whatever. I don't know. Do you have any thoughts on getting around that?

[00:25:39] DS: Well, not really. Because, on the one side, if I were sick and in need of a doctor who is really focused, I would love to have one.

[00:25:54] CS: Yeah, sure. Yeah.

[00:25:56] DS: On the other hand, it explains why we are seeing that again and again. I mean, anytime we generate a new base data set, we find fresh systems. That's the weird thing. They haven't been there before. Or even worse, they are old systems now with new IP addresses. When I look at the data itself, I can see I have seen these names before. Sometimes, it seems to me that they're trying to get away by moving to a new location.

[00:26:33] CS: Yeah. You're moving to a new address, but you still haven't taken your email, or the phonebook.

[00:26:38] DS: It's still an IP. Welcome to the world. But the story that was told to me by that admin is only a part of it. Most of the folks we contacted, when they didn't react, they were likely confused or overwhelmed.

They have no clue how the system works, how things are connected to the public Internet, how easily they get connected to the public Internet, and how easy it is to overlook default configurations. Then, in a later round of contact attempts, one even told me that they regarded the initial disclosure, the one I was sending in December 2019 and January 2020, as a phishing attempt.

There is uncertainty. There is fear. There is a lack of knowledge in that space. Overall, this is in great part driven by two things. A, lack of enforcement. Sorry, HIPAA: toothless. Simple as it is. Organizations have plenty of time to report a breach. If they do so, it is somewhat blurry how much they report. I mean, I had a system that had 1.2 million studies on it, if I do recall it correctly. On average, that would represent, let's say, one in six, about 200,000 citizens. If I'm not mistaken, they reported 10,000 people affected by it.

There is that not well-defined structure of when do you have to report, how do you have to report and how is it enforced and what happens to you if you don't do it. Maybe CCPA and the other privacy regulations coming up in the country will be more efficient here. Let's see. Here in Europe, GDPR can bite you.

[00:29:09] CS: Yeah. That's deep. That is deep. I mean, from what little I know about HIPAA, it seems like it's still focused on just physical issues of health, on actual physical health. There are regulations for sterilizing, or having your operating room just so. It doesn't seem like HIPAA ever really caught up to security as a key component of effective health care. I don't know. I could be wrong on that. I agree with you [inaudible 00:29:40].

[00:29:42] DS: Yeah. Basic.

[00:29:45] CS: Yeah. Bare minimum. Yeah. Right.

[00:29:47] DS: The second part for me, next to that lack of understanding of how the technology in itself works, is the lack of understanding of what the consequences are. What is the potential fallout? They don't get that someone can do very bad things with a name, a date of birth, a social security number and a location. If I have access to data from an elderly person who is residing in a nursing home, I have the date of birth, I have the social security number, then I just have to read, I think it was published yesterday, or the day before, the story by Brian Krebs about how easy it is to unlock someone's credit freeze, and put these things together.

I mean, there are systems where the archives have been operated for years and years. I mean, we found data going back to the early 90s, when I started my work at Commodore. That holds a massive amount of data. That system I mentioned, with the 1.2 million studies, included social security numbers. When you know the name of the system operator, you can go into a certain region, and then there is a lot of public knowledge you can use. I was actually doing that as my own side test of the research. I took a couple of names. I took the social security numbers.

Facebook is your best friend when you're trying to investigate someone. I was able to find phone numbers. I was able to find addresses. Then here I am, well prepared to do ID theft. It all started with having an unprotected medical archive.

[00:31:54] CS: Yup. Now, I'm trying to –

[00:31:57] DS: You probably have to cut me off before I go on a rant here.

[00:31:59] CS: No, no. I was going to say, I'm trying to give them the benefit of the doubt. I understand that money and budgets are always a factor. What are some of the logistical issues in implementing mass security upgrades for healthcare infrastructure? Is there an issue of massive upgrades taking systems offline for an extended period? Again, I keep coming back to what the reticence is, but apart from not believing the threat or not seeing it, is there that issue? Because I think, with most critical things, like health care, even taking your system offline for an hour can be a big problem if people are in surgery, or what have you. Is there a way of doing this that would be minimally disruptive? Or is that just definitely part of it?

[00:32:52] DS: A simplistic way of answering the questions would be to say, compliance stands in the way of compliance. You have your healthcare device. In order to be allowed to operate a healthcare device in a hospital environment, it has to pass certain certifications. Yeah, it's safe to operate things, it doesn't harm people when operated and all these things. That is a status freeze.

The device, as it is, is certified. If you change it, you lose your certification, your allowance to operate it. Okay. If I want to be compliant with certain cybersecurity regulations, I need to update it, to change it. There we go. It's a catch-22, so to say. It is also complicated by a cultural difference. I mean, I think Emily indicated that. In industrial environments, in healthcare environments, there is that machinery put in. It is there for 15, 20, 25 years.

Working is the important notion. It is working. It is still working. If you compare that to the mindsets in IT, where the idea of software and hardware life cycles is in the range of 36 months, or even the sprint, scrum, whatever, I guess that can be called a clash of cultures. With the change in mindset when a digitalization effort comes into play, that availability, in that physical notion of does it still operate, will, I guess, be augmented by the confidentiality of process data, or the integrity of control data, as in the Oldsmar case.

In the same way, coming back to that compliance-versus-compliance thing, this whole redoing of certifications, it is a point where you have that notion of safety and security in the operational world. The fun part, from a language perspective, is that safety and security are both translated into German as Sicherheit, which makes it easier for us to overcome that cultural difference.

Safety, I have to say, is, "Okay, if I operate a laptop, will I get an electric shock, or something like that, so is it safe to operate?" Is it secure? That's a different question. For the operational technology, for industrial systems, you have those signs of "no incidents, no accidents for 999 days", all these signs and billboards we have seen in the movies, where some guy comes in and breaks his leg, and then we are –

[00:36:11] CS: Back to zero again.

[00:36:13] DS: Yeah. Those things. Changing this, changing the overall cooperation between safety and security, so that they are not conflicting with each other, that is probably a logistical hurdle, I guess. It's also one of those things where the mindsets, you see that I'm talking a lot about mindsets, have to come together. Because if we don't understand the other mindset, if we don't put ourselves in the other's shoes, on the other side of the table, we're not going to overcome that hurdle.

The other issue, as you mentioned: production systems run 24/7, so you have a single planned maintenance window in a year. How do I do that? Even if I am able to update a system, if I can only do it once a year because of other requirements, that's difficult. Yeah, lots of individual difficulties. Overall, it is that notion of, yeah, getting the two concepts together. I mean, we all know that OT is sensitive to the usual approach of cybersecurity. Scan a water treatment facility with a vulnerability scanner, and I would be cautious about the outcome.

[00:37:48] CS: Let's work backwards from a place of an ideal solution. We're never going to get exactly what we want, but it can be helpful to think from a best-case scenario. Fast forward a year or two, 10, whatever, and somehow the country's healthcare security network is the envy of the world, as impenetrable as could possibly be, up to date and compliant. How did we get there? What massive changes had to happen to get us to that point? You mentioned mindset change, obviously. What kind of resource spending and human collaboration had to materialize to get us to the place that we want to be?

[00:38:24] DS: My role model is cyber resilience. I mean, that is starting from a collaboration point of view between the stakeholders, the two mindsets, the two sides of safety and security. It might also be about spending and resources, but we're not talking about a complete revamp of things. Over time, things have to be renewed, yes. I mean, sometimes 20 years are gone, so there is new machinery coming in.

At the onset, if I know what is on the other side, what is in those mindsets, how things are operated from their perspective and what kind of flexibility is there, then I can change my approach to cybersecurity a bit here, a bit there, and then we overcome these hurdles where, most of the time, people see that contradiction between safety and security.

[00:39:29] CS: Yeah. I'm almost imagining a job role in my mind. Maybe it doesn't exist, but some sort of a gentle cyber intermediary, someone who's very adept at holding the hands of these administrators without making them feel like a dum-dum, or that they've been sitting on a problem for 20 years without thinking about it. We always talk about how in cybersecurity, the hidden superpower is that you need soft skills, you need communication skills and those abilities.

It seems like there would be a special ability to have here, if you could have someone who could just go from hospital to hospital, network to network, and without judgment, explain to them, “This has to happen. We need to do this. You need to make even the smallest changes. How are we going to do this, and how do we get you prepared for this big, scary changeover?” I mean, what do you think?

[00:40:33] DS: Well, I agree with that, 100%. I mean, yeah, I've written about it in recent weeks, that there is a good likelihood that we will have these two breeds of cybersecurity pros. One is the person who is able to talk about it, who is able to mediate between the worlds. He or she has a good understanding of both worlds and is able to familiarize themselves with the other side of the table, with how the other side's processes are seen. The other breed is the folks who go deep into a system, who actually try to figure out how the bits and pieces fall together, who do the forensic investigations and stuff like that.

On the other hand, yeah, it is a situation where you will always start by listening and not by educating. If someone is coming to me, I'm first listening: how do you do it now, and why? There might be good reasons for them to do it the way they do it now. Then you're asking, what would happen if you changed that? Where are your flexibilities, so that you can adjust things, so that you can do things in a different way while maintaining operability and increasing security?

[00:42:22] CS: Yeah. Now, I want to use that as a pivot point to talk about the work itself, the cyber work, if you will. We have a lot of cybersecurity students and aspirants and novice professionals who listen to the show. If they are interested in steering their skill sets towards the goal of these sorts of things, protecting ICS, or healthcare systems, or patching legacy systems, or other crucial networks that are currently underserved, what are some skills they should be learning, or tasks they should be working on on their own, to show that they have what it takes to do this work? Are there types of information they should be passionate about learning or exploring to become good at this?

[00:43:02] DS: It comes back to what I stated about myself in the first place: know your trade. I mean, yes, you have to have a good understanding, a sound knowledge of cybersecurity, of information technology. Let's say you're the perfect guy in artificial intelligence and machine learning, and you come in with, “I can program a system to digest tons of security events.” That doesn't help you understand the data from a human point of view if you are not familiar with what is behind the data. What is that actuator doing with a setting? What is the sensor delivering to my industrial control system, to my SCADA system, to my HMI? How do I make sense of these things from a process point of view, from a business process point of view, but also from an operational point of view, so that it's not only a massive amount of payment data being digested by an AI system?

All of a sudden, the AI system tells you you're making rubber boats and boots instead of doing some chemical stuff. I don't know. Just kidding here. It is that notion of going to the other side. If you're interested in cybersecurity, have your general knowledge ready. Have it ready in terms of knowing networking, knowing TCP/IP, knowing how to differentiate between simple terms: IDS, firewall, AV, vulnerability management, change control, and all these things. You also should be able to provide context.

If providing context, this mediation between the worlds, and the willingness to learn new worlds is not the path for you, which is okay, don't get me wrong, the other part is to dig deep. Go into the analytics: how do I get into attack vectors? How do I do my forensics? How can I analyze and reverse engineer attacks? How can I find out what is going on on a specific device? Two sides, two far ends of the whole spectrum of cybersecurity. That is the range you can navigate in between.
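[Editor's note] The "provide context" idea Dirk describes can be made concrete with a small sketch. This is purely illustrative; the asset inventory, field names, and IP addresses below are hypothetical, not an NNT or hospital data model. A raw alert is joined against an asset inventory so an analyst sees the device's role and business criticality, not just an IP address:

```python
# Illustrative sketch: enrich a raw security event with asset context.
# All hosts, roles, and fields here are hypothetical examples.

# A tiny asset inventory mapping source IPs to their operational role.
ASSET_CONTEXT = {
    "10.0.5.12": {"role": "HMI workstation", "process": "water treatment", "criticality": "high"},
    "10.0.9.40": {"role": "office laptop", "process": "back office", "criticality": "low"},
}

UNKNOWN = {"role": "unknown", "process": "unknown", "criticality": "unknown"}

def enrich_event(event: dict) -> dict:
    """Attach device role, business process, and criticality to a raw event."""
    context = ASSET_CONTEXT.get(event.get("src_ip"), UNKNOWN)
    return {**event, **context}

raw = {"src_ip": "10.0.5.12", "signature": "port scan detected"}
enriched = enrich_event(raw)
print(enriched["role"], enriched["criticality"])  # the alert now carries business meaning
```

The point of the sketch is the mediation Dirk describes: the same alert means something very different on a high-criticality OT workstation than on a back-office laptop, and only domain context reveals that.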

[00:45:59] CS: This has been a great talk. Although pretty scary, it's also, I think, very hopeful in certain ways, as we've shown our listeners ways they can actively contribute to, I would say, almost the betterment of society, if they want to get involved with this work. Can you talk a little bit about – you mentioned in your report that a number of countries took the lessons to heart and secured their systems. I mean, do you have any predictions?

Sometimes you have a family member. It's like, “I don't want to do this thing.” Then you beg them, and they say no, no, no. Then after they get over their initial embarrassment, or whatever, they do the thing. Do you have a sense of whether or not some of these people who maybe resisted before are secretly taking care of this business? Or do you see this resistance continuing until a worst-case scenario happens? Where do you see this going in the next five years or so?

[00:47:02] DS: I think, in all fairness, it's a matter of geography. On the one hand, we do see more and more regulation, more and more understanding of the issue. In broader terms, stories are highlighted not only in our space, in podcasts like the one we're doing, or by Brian Krebs, or whoever is talking about it. We also see it in the headline news that there is an issue and there is something to know about it. Once we move away from that sensationalism and go deeper into asking, “Why does it happen? How do we change that?”, being more solution-oriented than the “Hey, there was a fire burning, we have to report about it” way of doing it, I think that will help to change things. That will help to take away the fear that anything I do, I do wrong, sometimes.

Never touch a running system, so to speak. On the other hand, speaking about geography, I think knowledge and education are also a big deal here. It's hard to say, as I would love to change it, but if I'm talking about countries like the US, Canada, and the main European countries, in comparison to other countries where we have seen these massive amounts of systems being unprotected, I'm not sure whether we have that same amount of knowledge and understanding to start from. Simple as this. Yeah. It feels sad to say that, I have to admit.

[00:49:17] CS: You have to start from the painful truth, though.

[00:49:21] DS: In all fairness, I'm happy to be proven wrong here.

[00:49:24] CS: Yes. Sure. Yeah. We'll wrap up today on a triumphant note here. This has been a great conversation. I think it's given us a lot to think about. Thank you very much for your time. As we wrap up, can you tell us a bit about your work with New Net Technologies? What types of projects are you working on this year, and what are some upcoming things you're excited about?

[00:49:49] DS: Sure. With pleasure. At NNT, we tackle one of the basic elements, the essentials: what is the root cause of a cybersecurity incident? That is, something is changing. It's a configuration setting. It is a file being dropped. Yeah.

Someone sends you an email, you click on it, and something happens, something changes on your device. That's the simple office explanation of it. We can already monitor these changes, but the ability to contextualize them, to put them in relation to what is going on on the machine, that is the next step. Where is that machine? What function does the machine provide in a business context? These are the things we're working on.

Whether it is in the cloud, in virtualized environments, or whether we're talking about ladder logic on a PLC, everywhere there can be a change that can be good, that can be expected, that can be initiated internally due to normal operation, but that can also be malicious. For us, it's about being able to distinguish between good and bad changes, to highlight those that are bad, and to make sure the information is readily available: What is that change? Who initiated it? Where does it come from? What does it mean? What is the context of it? What is the role of the device? That is what I'm doing in my daily work now with NNT, next to talking to nice hosts.
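[Editor's note] As a rough illustration of the change-monitoring idea Dirk describes, here is a toy sketch, not NNT's implementation: baseline the hashes of a set of files, then report any file whose current hash no longer matches. Real change-control products add the context he mentions (who made the change, whether it was expected, what the device's role is); this sketch only detects that something changed.

```python
# Toy file-change monitor: record a hash baseline, then diff against it.
import hashlib
from pathlib import Path

def snapshot(paths):
    """Record a SHA-256 baseline digest for each monitored file."""
    return {str(p): hashlib.sha256(Path(p).read_bytes()).hexdigest() for p in paths}

def detect_changes(baseline, paths):
    """Return the files whose current hash differs from the baseline."""
    current = snapshot(paths)
    return [p for p, digest in current.items() if baseline.get(p) != digest]
```

Usage would be: take a snapshot of critical configuration files after a known-good deployment, then run detect_changes periodically; any hit is a change that still needs to be classified as expected or malicious, which is exactly the contextualization step the tooling has to supply.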

[00:52:08] CS: Oh, you're very kind. Let's wrap up on that. Last question, for all the marbles: if our listeners want to learn more about Dirk Schrader or NNT, where can they go online?

[00:52:18] DS: www.newnettechnologies.com, or use the short version of it, nntws.com.

[00:52:26] CS: All right. Thank you, Dirk. Thank you so much for sharing your time and insights today. This was incredibly illuminating and a lot of fun.

[00:52:33] DS: Thank you. See you.

[00:52:35] CS: All right. Thank you as always to everyone at home, or at work, for listening and watching. New episodes of the Cyber Work podcast are available every Monday at 1 p.m. Central, both on video on our YouTube page and in audio wherever podcasts are downloaded. You can also find it at www.infosecinstitute.com/podcasts. Check out our past episodes.

Also, don't forget to check out our hands-on training series, Cyber Work Applied. Tune in as expert Infosec instructors teach you a new cybersecurity skill and show you how that skill applies to real-world scenarios. Go to infosecinstitute.com/learn to stay up to date on all things cyber work.

Thank you once again to Dirk Schrader, and thank you as always to everyone for listening and watching. We will speak to you next week.

Free cybersecurity training resources!

Infosec recently developed 12 role-guided training plans — all backed by research into skills requested by employers and a panel of cybersecurity subject matter experts. Cyber Work listeners can get all 12 for free — plus free training courses and other resources.

Weekly career advice

Learn how to break into cybersecurity, build new skills and move up the career ladder. Each week on the Cyber Work Podcast, host Chris Sienko sits down with thought leaders from Booz Allen Hamilton, CompTIA, Google, IBM, Veracode and others to discuss the latest cybersecurity workforce trends.

Q&As with industry pros

Have a question about your cybersecurity career? Join our special Cyber Work Live episodes for a Q&A with industry leaders. Get your career questions answered, connect with other industry professionals and take your career to the next level.

Level up your skills

Hack your way to success with career tips from cybersecurity experts. Get concise, actionable advice in each episode — from acing your first certification exam to building a world-class enterprise cybersecurity culture.