K-12 cybersecurity: Protecting schools from cyber threats

Michael Wilkinson leads the digital forensics and incident response team at Avertium. The team is dedicated to helping clients investigate and recover from IT security incidents daily. Wilkinson talks about threat research, the threat of Vice Society, how K-12 cybersecurity can improve and much more.

– Get your FREE cybersecurity training resources: https://www.infosecinstitute.com/free
– View Cyber Work Podcast transcripts and additional episodes: https://www.infosecinstitute.com/podcast

0:00 - Digital forensics and incident response
3:12 - Getting interested in computers
6:00 - How has digital forensics changed over the years
9:03 - Handling overwhelming amounts of data
12:53 - The threat of Vice Society
17:20 - Why is Vice Society targeting K-12?
19:55 - How to minimize damage from data leaks
24:25 - How schools can improve cybersecurity
25:54 - What schools should do if cyberattacked
31:36 - How to work in threat research and intelligence
34:42 - Learn more about Avertium
36:40 - Learn more about Mike Wilkinson
37:08 - Outro

[00:00:00] Chris Sienko: Every week on Cyber Work, listeners ask us the same question: What cybersecurity skills should I learn? Well, try this: go to infosecinstitute.com/free to get your free cybersecurity talent development eBook. It's got in-depth training plans for the 12 most common roles, including SOC analyst, penetration tester, cloud security engineer, information risk analyst, privacy manager, secure coder and more. We took notes from employers and a team of subject matter experts to build training plans that align with the most in-demand skills. You can use the plans as is or customize them to create a unique training plan that aligns with your own unique career goals. One more time, just go to infosecinstitute.com/free or click the link in the description to get your free training plans plus many more free resources for Cyber Work listeners. Do it. infosecinstitute.com/free. Now, on with the show.

Today on Cyber Work, Mike Wilkinson, leader of the digital forensics and incident response team at Avertium, joins me to talk about Vice Society, a fast-rising threat group that has chosen K-12 school districts for ransomware attacks. Mike walks us through the details of the LA Unified School District ransomware attack, maps Vice Society within the threat group space, and I can't resist asking him about his days doing digital forensics back in the early aughts. Keep yours on, because it's Cyber Work time.

[00:01:30] CS: Welcome to this week's episode of the Cyber Work with InfoSec podcast. Each week, we talk with a different industry thought leader about cybersecurity trends, the way those trends affect the work of InfoSec professionals while offering tips for breaking in or moving up the ladder in the cybersecurity industry.

Today's guest, Michael Wilkinson, leads the digital forensics and incident response team at Avertium. The team is dedicated to helping clients investigate and recover from IT security incidents on a daily basis. Michael has over 20 years' experience in the IT industry and has been conducting digital investigations since joining the New South Wales Police State Electronic Evidence Branch in 2003, where he led a team of civilians in one of the world's largest digital forensics labs. Wow! Since moving on from law enforcement, Michael has led DFIR teams in Asia, Europe, and the Americas. He has been a regular speaker at conferences including CEIC, DFRWS, PFIC, HTCIA, and AusSearch. Michael has been involved in the development and teaching of several graduate programs in digital forensics and incident response at the University of South Australia, the University of New South Wales, Macquarie University, and Champlain College in the USA, and has also taught at Black Hat.

Today's topic, we're going to be talking about Vice Society, a currently rising threat group that has made an unfortunately large name for itself in a short time. Looking forward to hearing that. Mike, thanks for joining me today. Welcome to Cyber Work.

[00:03:00] Mike Wilkinson: Oh, you're welcome. Thanks for having me.

[00:03:02] CS: Yeah. To get started, I like to get to know our guests by tracing their interests a little bit. Obviously, you have a long history with incident response and digital forensics. Where did you first get interested in computers, tech, security, all that kind of thing? Is this something that you've been into since you were a kid, or was there a defining moment in your adult life?

[00:03:21] MW: Yeah, it was something I was interested in as a kid, and I actually started programming in probably sixth grade, I think it was. The school got a bunch of Apple IIs, I guess, at that stage. I remember we had the Logo program, where you got to drive what they called a turtle around the screen and create pictures. I figured out, if I wrote a program that drew the picture, and then deleted it, I could sort of have a plane flying across the screen.

[00:03:53] CS: Oh, interesting. Yeah.

[00:03:55] MW: Pretty cool at that time. But yeah, and I'm trying not to get too long-winded here, but after high school I eventually ended up doing a degree in textile technology, something totally different. I worked in a factory for a bit, and ended up designing and making rock climbing harnesses. Then I went and did a business degree. Part of the business degree included IT, and I sort of got into programming from that. Then I went and did a master's, and then started working for the New South Wales Police Force on digital forensics, and that sort of started, yeah, the journey.

[00:04:36] CS: Okay. Yeah, so that was kind of a natural direction for you that you got into digital forensics through a work opportunity. Was that something that had interested you up to that point, or were you kind of finding out about it as you were getting the position?

[00:04:52] MW: I mean, the security side of things. I was actually teaching at that point, and security was sort of one of the subjects I was teaching, and something I'd always been interested in, and some programming as well. The police job, that was kind of a bizarre thing. They advertised it and I applied, and then it was like seven months later that they reached out and said, "Hey! Would you like to come in for an interview?" Which government departments do, every now and then.

[00:05:28] CS: Yeah, I know. Yeah. We've been working on getting our GI Bill certification for a long time. I know how paperwork works. Okay. Yeah, I'm very excited to talk to someone in digital forensics. We're going to be doing a big series on that next year, especially sort of debunking the way that digital forensics is portrayed on TV shows like CSI versus how it actually works. You've kind of been doing this since digital forensics was a thing. Can you talk about how the technology, the process, everything has changed, and also how the perception of it has changed, since people thought we had limitless zoom-ins, and rotate the image, and all the high-tech stuff that doesn't exist?

[00:06:16] MW: Well, that was always the classic one, the NASA technology that lets you recover the image from the reflection of sunglasses.

[00:06:23] CS: Yes, of course.

[00:06:24] MW: On the pull handle or something like that.

[00:06:28] CS: That's him, all right.

[00:06:29] MW: Yeah, absolutely. I mean, especially back in the early days, and this is actually before I started, but only a couple of years before I started, which was 2003. Just to give you an idea of the volume of data we were dealing with: it used to be standard to actually print out a listing of every file on a computer that you examined. You'd write up your report, you've got your findings, and then just include in an appendix a list of every file that was on the file system. I just can't imagine the reams of paper you'd go through if you took that sort of approach now, but back then it really was possible. I mean, my training, in terms of learning file systems, and everyone else teaching file systems, we were still working on floppy disks, and looking at how FAT12 worked. Having to manually decode FAT12 is always a fun thing that just twists your mind around; it's a particularly painful 12-bit numbering system. When viewed in a hex editor, everything gets twisted up a bit, it's just quite painful.
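
For readers who want to see why hand-decoding FAT12 twists the mind, here is a minimal sketch in Python of the 12-bit packing Wilkinson describes: two table entries share every three bytes, so odd-numbered entries have to be reassembled from split nibbles. The toy FAT table below is invented purely for illustration.

    def fat12_entry(fat: bytes, n: int) -> int:
        """Return the 12-bit FAT entry for cluster n.

        Two entries are packed into every three bytes (little-endian),
        which is why the values look scrambled in a hex editor.
        """
        offset = (n * 3) // 2
        if n % 2 == 0:
            # Even entry: the whole low byte plus the low nibble of the next byte
            return fat[offset] | ((fat[offset + 1] & 0x0F) << 8)
        # Odd entry: the high nibble of this byte plus the whole next byte
        return (fat[offset] >> 4) | (fat[offset + 1] << 4)

    def cluster_chain(fat: bytes, start: int) -> list[int]:
        """Follow a file's cluster chain until an end-of-chain marker (>= 0xFF8)."""
        chain, cluster = [], start
        while cluster < 0xFF8:
            chain.append(cluster)
            cluster = fat12_entry(fat, cluster)
        return chain

    if __name__ == "__main__":
        # Toy table: cluster 2 -> 3 -> 4 -> end of chain
        fat = bytes([0xF8, 0xFF, 0xFF,   # entries 0 and 1 (reserved)
                     0x03, 0x40, 0x00,   # entry 2 = 0x003, entry 3 = 0x004
                     0xFF, 0x0F, 0x00])  # entry 4 = 0xFFF (end of chain)
        print(cluster_chain(fat, 2))     # prints [2, 3, 4]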

I guess back in those days, a lot of the focus was primarily around just recovering deleted files and things, which was relatively straightforward when you were using spinning disks, and floppies, and things. [Inaudible 00:07:58] media, you have a lot less opportunity to recover stuff. You've got all the internal cleaning activity going on. Back then, files and information were getting recorded, but it was really pretty minimal at the operating system level and at the application level. These days, there's so much tracking going on on computers. I'm probably going to make people a little paranoid now, which is a good thing, you should be. But there's so much being recorded about everything you're doing on your system now that it's really moved well beyond file system type analysis to really focusing on all the different artifacts and pulling those artifacts out of the operating system, and using that to actually put together a much stronger picture. Certainly in criminal cases, it's building a picture of what the computer has been used for, and being able to demonstrate, yeah, this person had the intent to be doing whatever it was that they were doing with that system.

[00:09:04] CS: Now, would you say that – it sounds like, obviously, the amount of data has exploded. I mean, is it harder to sort of get your arms around all of that? Is there an overwhelm to just the sheer amount of data, or is there better technology, or tools, in place now to sort and order things and make a more robust picture?

[00:09:28] MW: Yeah. I would say that the tools now are far more consolidated. Back in the old days, it really was – your forensic tools would pretty much give you access to the file system, give you access to deleted files, and do a pretty minimal amount of artifact processing. A lot of times, you'd be using multiple tools; you'd be extracting stuff out, and then processing it with some sort of open-source tool pretty often. These days, there's a few different commercial tools out there. They tend to have a lot of artifact processors just built in, so you can get away with not having to memorize – I'm not going to say memorize, but not having to have a database of, "Okay, I'm looking at this artifact. This is the script I have to go and use to process it and understand it." That comes with a risk as well. I think there's a lot of potential for misunderstanding when you're using one tool, and you sort of do what's called push-button forensics.

[00:10:39] CS: Yeah. You do the thing that it says for you to do in the instruction booklet, and then you take no other steps otherwise.

[00:10:45] MW: That's right.

[00:10:45] CS: Well, it said no, so there must not be anything there. Do you think there is a little bit of a discouraging of sort of creative workarounds by just the sheer ease of use of the tools these days?

[00:10:58] MW: I would say, the problem is, one of the challenges certainly in my world, in the corporate world, is that we're billing the customer by the hour. We're always conscious of minimizing costs for them, so there's always a certain amount of pressure to keep things moving, get things done as quickly as possible. You've always got to balance that against the process of verifying your findings, and [inaudible 00:11:26] to make sure that the tool is working correctly, that you understand how to use the tool in the first place. A lot of the time, you use a tool, and it's giving you results. The way it's wording those results may not be entirely accurate. So you've got to understand, "Okay. This is the artifact I'm looking at; this is the information that's available within that artifact." Depending on how the system has been used, it's going to change the way that information behaves.

Just running the tool, and getting an output, and getting a spreadsheet with a bunch of timestamps and things, may appear to be telling you one thing. But unless you really understand how that artifact was created, how that artifact was recording the data, it could be something totally different. I think that's something I see newer people getting into the field falling into traps with every now and then. Certainly, when I'm teaching, it's something that I really focus on: be aware of the limitations of your tools. If you're doing the job properly, you're not relying on a tool, you're understanding the artifact. You're understanding how the artifact behaves, you're understanding what that artifact is actually telling you, and the tool is just a shortcut to –
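
As a concrete example of the kind of artifact-level understanding Wilkinson describes, here is a minimal sketch in Python of one recurring timestamp trap: many Windows artifacts store times as FILETIME values, 100-nanosecond ticks since January 1, 1601 (UTC), and a tool or analyst that assumes a different epoch or resolution will report a confidently wrong date. The raw value used here is hypothetical, chosen only for illustration.

    from datetime import datetime, timedelta, timezone

    WINDOWS_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

    def filetime_to_utc(filetime: int) -> datetime:
        """Convert a Windows FILETIME (100-ns ticks since 1601-01-01 UTC) to a datetime."""
        return WINDOWS_EPOCH + timedelta(microseconds=filetime // 10)

    # The same 64-bit number means very different things depending on the
    # epoch and resolution you assume -- exactly the kind of detail a
    # push-button spreadsheet of timestamps can hide.
    raw = 133_500_000_000_000_000  # hypothetical value pulled from an artifact

    print("Interpreted as FILETIME:    ", filetime_to_utc(raw))
    print("Misread as Unix nanoseconds:", datetime.fromtimestamp(raw / 1e9, tz=timezone.utc))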

[00:12:54] CS: I'm glad we took that little tangent there, because ultimately, our listeners are sort of getting their feet wet in cybersecurity and looking for their career path. I think that's a really good insight for digital forensics-curious people out there. I want to move from there to, obviously, threat intelligence. Today's topic is a new threat group calling itself Vice Society. In September 2022, at the start of the academic year, Vice Society targeted and successfully breached the Los Angeles Unified School District, the second largest school district in the US, with 640,000 students and over a thousand schools. We're going to break down the attack and the response a little bit. But from your perspective, tell me about Vice Society. What do we know about this group, where it's located, and its common targets and methods? I feel like I'm hearing about it as though it's a new kid on the block. Is that a reasonable assessment?

[00:13:46] MW: Yeah. The group itself is certainly a new kid on the block, or relatively so. I think they've been around for about a year or so now. Time seems to fly way too fast these days. But the interesting thing about them: a lot of the groups we hear about, and the way ransomware operates, is you generally have your ransomware-writing group that writes the actual encryptor and manages the payment transaction, and then you have the affiliates. The affiliates take the encryptor that's been developed by that sort of core business, gain access to the victim, and ignite the ransomware, and then all the encryption keys or decryption keys are actually managed by that ransomware-writing group. Those are the ones that will have the name associated with them: Conti, and REvil, and so on.

[00:14:46] CS: Exactly.

[00:14:48] MW: The interesting thing about Vice Society is that this is basically an affiliate group. They're using different encryptors and targeting these various school districts, but they're not actually writing the ransomware themselves. There's a lot of affiliates out there. There's a very large number of groups and individuals out there doing this sort of thing. But for the most part, we're generally identifying them by, and linking them back to, the actual encryptor that they're using, rather than the actual group. That's a little different from that aspect.

[00:15:26] CS: Yeah. It would be easy enough to sort of think of these as kind of warring mafia factions or something, when in fact, you're talking more – it'd be like calling them the .45 society or the sawed-off shotgun society or something like that. It has been made sort of clear that their primary targets are K-12 schools, or is that just some of the high-profile things that have happened so far? Have they come out and said, "This is our target," or is that just sort of what we've seen according to the data?

[00:15:57] MW: They haven't specifically come out and said that they're targeting that. There was an advisory from CISA that, yes, this group was targeting education. Certainly, some affiliates do tend to target specific verticals or industries based on – I mean, it all comes down to money at the end of the day. These groups are there to make money. With the more mature groups, they tend to develop expertise in certain areas. Schools, unfortunately, tend to be softer targets, and the same goes for the healthcare industry as well, unfortunately. Based on my experience working with schools and healthcare, these are people whose business is helping others. At the end of the day, they're fairly trusting; their focus is on giving access and making life as easy for their customers, or students, or patients as possible. They don't necessarily have that same level of security. That's changing.

[00:17:19] CS: Yeah. That's going to be my next question. Obviously, higher education orgs have been a target for a long time, but is it the fact that maybe higher ed has deeper pockets to pay ransoms, or well-funded IT departments that can do regular pen testing, and red teams, and stuff? Do you think the move toward sort of K-12 schools is because it's just getting a little too hard to get into higher ed targets anymore?

[00:17:50] MW: Yeah, I think that's probably it. The deep-pockets side, I mean, obviously, as long as people are paying the ransom, we're going to keep seeing ransomware. I think emotionally, if you're a school principal, or a superintendent, or whatever, and you're in this position of making the decision – do we pay the ransom or not? – they've got all the details on all these kids. The instinct is going to be, "I want to do everything I can to protect the students," which then gives them that extra pressure to get that ransom paid. That's part of the whole, I guess, maturation of the way these ransom groups are operating. In the beginning, they were just encrypting everything; they weren't exfiltrating data. For the most part, that was enough to get people to pay the ransom. Then they found, okay, people identified the threat, and they're actually building more robust backup systems, or backup systems in the first place in many cases. So then, okay, if we exfiltrate data, then we've got more leverage, we've got that additional pressure to apply.

I've worked with school groups that have made the decision to pay the ransom to prevent the release of data, which is a tough decision to make. I mean, obviously, these are criminals; you can't really trust them. At the same time, in terms of value, a bunch of student data is not of huge financial value, or something that they can turn around and sell as easily as if you've got a bunch of credit cards, or a bunch of insurance details for patients, health insurance details or something like that. Monetizing student data is harder, and it's worth less.

[00:19:55] CS: This might be sort of out of your area of experience, but I kind of want to talk about the differences in response. I mean, we know Vice Society exfiltrated 500 gigs of personal information from students and threatened to leak it publicly. But LAUSD, according to your report, said the district was not going to pay the ransom to prevent data from being leaked, because there was no guarantee that the hackers wouldn't end up leaking the data anyway. The district believes that the money could be put to better use, such as funding different needs for students and their education. Now, of course, by comparison, Cedar Rapids Community School District in Iowa made the decision to pay the ransom to minimize disruption from the loss of the data. Without trying to use a crystal ball or whatever, do you have a sense of what some of the metrics are in terms of thinking, "Okay. We know they're going to leak the data. What do we do to kind of minimize the damage of that?" Have you worked with organizations that have made that calculated decision? Can you talk about sort of non-tech measures that can be put out there to make sure that this doesn't lead to anything from cyberbullying to discrimination or any other sorts of things?

[00:21:11] MW: Yeah. I mean, there's a bunch of different considerations here. Making that decision of, "Do we pay or do we not pay in order to protect the data?" is, as I said before, a tough decision. It's an interesting thing I've seen over the past two years or so. We've helped a lot of organizations that have been hit with ransomware, helped them with the negotiations and things, and provided whatever guidance we can around the decisions, the risks, and everything else. There's been a trend, I would say over the past year or so, where people are less and less likely to pay in order to prevent the release of data. When we first started seeing the shame sites, it was really common to pay. Organizations would go and look at, back in those days, say the Conti or the REvil shame sites, and see all the different logos of companies that hadn't paid; they're up on the site, they're clearly visible. It'd be, yeah, we don't want our name up there. We'll come to some agreement.

I think these days, it's so common to hear about this happening that it doesn't have that same impact anymore. Then in terms of protecting people, credit monitoring is a really standard thing that's provided, generally. It changes state by state. I'm certainly not a lawyer; I don't know the requirements for every state.

[00:22:47] CS: Of course, yeah.

[00:22:49] MW: But credit monitoring is pretty standard to provide to any victims that have had their data leaked. In terms of the cyberbullying, and in terms of actually accessing the data, I mean, it's interesting: most of the groups, I guess the well-organized groups that do have the shame sites, are publishing the data and making it freely available for anyone to download. Now that you mention it, in terms of bullying and things, that's not something I've seen or am aware of. But at the same time, it's not something I've specifically looked for and investigated.

[00:23:30] CS: Yeah. I guess I'm sort of hard – I'm trying to imagine what kind of personal information about students is making it out there, whether it is just home addresses, or financial data, or other.

[00:23:44] MW: Certainly. I mean, schools have, or potentially have, healthcare data and stuff. There is potentially sensitive information that could be getting out. That's probably the most disturbing aspect. The other problem we have is that, even if you do pay the ransom, there's still no guarantee that the data is not going to get out there. You are dealing with criminals that have already broken into the systems. In terms of morality, they don't have a whole lot.

[00:24:20] CS: Hindsight, obviously, is 20/20, and everyone knows to watch for breaches after they've been breached. But what about school districts, especially lower-level education, who might not be watching as closely, or who are just now finding out that they have a target on their back more than in the past? What tips do you have to help them strengthen their security posture and protect infrastructure from attacks like these, especially any low- or no-cost improvements, since so many schools are underfunded?

[00:24:47] MW: Yeah. I mean, okay, full disclosure here: I work for a company that is also an MSSP. We provide managed security services. I think outsourcing really as much as possible, and not even so much from the MSSP perspective – Chromebooks, Office, Microsoft, Google, all provide hosted environments that have pretty robust security infrastructure around them. At the same time, most breaches these days are happening through compromised credentials, whether that's through phishing or otherwise. Phishing is definitely the most common way.

[00:25:37] CS: It rules the roost, yeah.

[00:25:39] MW: Yeah. So multifactor authentication is the other real one – if you're not doing it – there's just no excuse not to be doing multifactor these days.
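
For anyone wondering what "doing multifactor" looks like under the hood, here is a minimal sketch of a time-based one-time password (TOTP, RFC 6238), the six-digit code most authenticator apps generate. This is illustration only, not a suggestion to roll your own MFA; the base32 secret shown is a throwaway demo value, since real secrets come from your identity provider's enrollment process.

    import base64
    import hmac
    import struct
    import time

    def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
        """Generate a TOTP code (RFC 6238, HMAC-SHA1) for the current time."""
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int(time.time()) // period              # 30-second time step
        msg = struct.pack(">Q", counter)                  # counter as an 8-byte big-endian integer
        digest = hmac.new(key, msg, "sha1").digest()
        offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    if __name__ == "__main__":
        # Throwaway demo secret; both the server and the authenticator app hold
        # this shared key, so a phished password alone is not enough to log in.
        print(totp("JBSWY3DPEHPK3PXP"))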

[00:25:48] CS: Yeah, bare minimum. Right. Okay. I want to talk in terms of in-the-moment response, for school districts who might be listening who don't have an extended team of trained cybersecurity staff or a clutch of lawyers on speed dial. I want to ask, if you do get breached and hit with ransomware, can you sort of walk through the flowchart of what steps you should take first? I mean, do you think that LA Unified School District did exactly the right thing? Is there anything they could have done better? Did they wait too long? Are there any considerations you should be taking when weighing the cost of ransom versus the cost of exposed data?

[00:26:27] MW: Yeah. I mean, look, I don't have enough insight into the LA school district specifically to comment. The standard approach, and what we recommend: as soon as the incident has been identified, look to containment. You want to minimize things getting any worse. I've certainly had situations where clients have identified there's someone inside their network, and then they have a reluctance to maybe shut systems down or restrict internet access. Then 24 hours later, suddenly, everything's encrypted. Hindsight is always 20/20, but it's a pretty common occurrence that we see situations where things are detected, and people just think it's all cleaned up. That's one of the, I guess, other issues I have with a lot of security vendors. You get your antivirus product, and it detects something, and then you check the details on it: "Yes, we protect you against this threat." Okay, great. You've detected something, but really, what exactly is it that you've detected? Is it some malicious attachment? Great, you've detected it, you've blocked it. No problem there. Or is it something like Mimikatz that's running on a desktop? In order for that to be running, [inaudible 00:27:50] is able to execute commands on that system, so you're actually detecting something at the point at which a threat already has access to the system. You need to be able to look at those alerts, understand exactly what it is that's been detected, and know, "Yes, this has been cleaned up, and I have nothing further to worry about," or "This is actually an indication of a bigger problem, and we need to go in and fix it." Sorry. I got a little off topic there.

[00:28:16] CS: No, I totally agree.

[00:28:19] MW: Then in terms of handling it, once something goes wrong, or an incident is confirmed, there are a couple of key things. Ideally, you have an incident response plan in place already, so you are prepared in advance. That incident response plan should have some sort of playbook on how you handle the most common incidents. Some of the tougher decisions are, "Okay. Do we pay? Do we not pay? Do we contact or do we not contact?" Having decision trees and guidance in the plan that can help you in those stressful situations is super valuable. The other thing is making sure you're actually preserving evidence. The instinctive reaction of a lot of IT guys is, "Oh, they've taken out this system. I'm just going to wipe it, restore it, and we can move on." By doing that, you've just lost, or potentially lost, a bunch of critical evidence.

[00:29:20] CS: Yeah, good point.

[00:29:22] MW: For us, when it comes to the exfiltration: normally, in a ransomware-type scenario, initial access will happen sort of two weeks to two months prior to encryption taking place. Exfiltration will happen before encryption happens. If you're just running around cleaning up, wiping systems, rebuilding them, and everything else, then you're potentially losing that evidence around the exfiltration.

[00:29:47] CS: Interesting, yeah.

[00:29:49] MW: Which then becomes super important. I mean, if you're then facing litigation, or you have concerns, you want to figure out exactly what's been taken from a forensics perspective. If the systems haven't been wiped, most of the time we can figure out what's been taken. There's enough evidence left behind on the system about what the threat actor has been doing. We can say, yes, they were browsing to your finance team's file share, they zipped everything up, they saved it to this computer, then they uploaded it to Mega or something. We can trace all of that and figure out, yeah, this is exactly what they've taken. But if systems are getting wiped, then we can't do that.

[00:29:49] CS: Yeah. I mean, I wonder if some of the wiping of the system is for expediency, and some of it is just like a child who breaks a vase, and wants to cover it up by throwing it away. Is there kind of a shame factor of like, "I don't want to think about it. Let's just wipe it, and start over, then no one will have to know that I –"

[00:30:45] MW: Yeah. I mean, there is certainly a bit of a shame factor. I think that's a bit of a pity, because really, they're the victim. Someone else has just committed a criminal action on their systems. It happens to so many organizations. If you look at the list of some of the organizations that have been compromised, there are a lot of different victims out there, a lot of well-funded, well-organized companies that have still ended up compromised. From that aspect, admitting upfront that you have a problem is the best way to go about resolving it and addressing it.

[00:31:35] CS: Yeah. Well, that's great, because you moved from response plan to sort of the threat research angle, which is obviously your expertise in digital forensics. I want to turn this back to the career perspective. For any listeners who are trying to break into cybersecurity or move into a different facet of the industry, and might like to do the kind of work that you do with threat research, threat intelligence, and sort of being on the frontlines: can you talk about some solid skills, experiences, or innate talents that make someone an especially good threat researcher? What would you need to see on a resume to be excited about a candidate?

[00:32:14] MW: Yeah. Look, I like to see a broad background. Really, when I'm hiring, I'm actually looking for experiences beyond just this field. Whether you're talking about sort of digital forensics, incident response, or threat intelligence, I like people to be bringing something else to, I guess, the job. They have a good understanding of – I guess, for me, it's all about the context. You need to be able to put whatever it is you're looking at into context. Coming from law enforcement, or coming from a data science type background, or just working the frontlines, having been a sysadmin and helped schools, or whatever type of organization – it doesn't really matter – having managed systems, worked in programming. Programming is awesome, because on the threat intel side, you're potentially crunching a lot of data, you'll get into malware reversing, all that sort of thing.

[00:33:23] CS: You know where all the nuts and bolts are in the machine when going to the source of the problem.

[00:33:28] MW: Exactly. The other thing is, and I think this applies to any security-type job: you've got to have a real passion for learning. If you want a job where you're just sort of working nine to five, and you can walk away at the end of the day and forget about work, this isn't it. You're going to be spending a lot of time just reading, learning, teaching yourself.

[00:33:59] CS: Staying on top of things. Yeah.

[00:34:00] MW: Yeah. You've got a copy of VMware, or VirtualBox, or something, and you've got a decent lab set up so that you can test out different things and play around with them. You've really got to enjoy doing that. I've got guys that will spend their weekend playing around with malware, and then come to work on Monday and continue playing around with malware.

[00:34:24] CS: Right. Give a report on what they did over the weekend. Yeah.

[00:34:28] MW: Yeah, exactly. Really, if you're that passionate about it, and enjoy it, it isn't really work in the first place, which is a really awesome position to be in.

[00:34:41] CS: Absolutely. As we wrap up today, you mentioned it before, but I'd love to have you tell us about your company, Avertium, and some of the projects and developments you're eager to talk about, your MSSP services, or any other new things happening going into 2023.

[00:34:57] MW: Yeah, sure. Avertium is a security services company. We have a managed services side, really focused around the MDR side of things, sort of Microsoft and SentinelOne. I'm biased, okay – from my perspective, coming from incident response, EDR is super valuable. [Inaudible 00:35:22] things are good, but EDR is better. I just think everyone should have EDR. We focus on that, on the managed EDR, but we also do a bunch of SIEM and log management stuff as well. Then also professional services, and those range from sort of basic compliance assessments and security assessments up to virtual CISOs. We can really come in and help an organization figure out where the security gaps are, what they need to fix, and then provide them with that roadmap to get it all fixed up. Then there's my team, which I think is the best team in the company – I may be slightly biased there. Yeah, we're the ones that go in when things go bad, and get to do the investigations, and figure out what went wrong, and help clients deal with that situation. It does get super stressful, but it's also really interesting, and really satisfying, because it generally –

[00:36:29] CS: I can't imagine, yeah. To feel like you're the Magnificent Seven or whatever that gets to come in and take care of business. That's always going to be exciting. One last question, the most important question of the entire episode: if our listeners want to connect and learn more about Michael Wilkinson and Avertium, where should they go online?

[00:36:47] MW: Yeah, so avertium.com. I guess, dub, dub, dub, avertium.com

[00:36:53] CS: A-V-E-R-T-I-U-M?

[00:36:54] MW: Yes, that's correct. Then, if you want to look me up, you can look me up on LinkedIn. I am on Twitter, but I've got to confess that I haven't used it for a few years.

[00:37:07] CS: No one I've talked to who I enjoyed talking to has said, "Yeah, I'm on Twitter all the time." Like everyone, they're all like, "I can't be near that sewer," so I understand. That's great. Find Michael on LinkedIn. Find Avertium at avertium.com. Mike, thanks for joining me today.

[00:37:26] MW: You're welcome. It's a pleasure.

[00:37:28] CS: As always, I'd like to thank everyone at home for listening to and watching the Cyber Work podcast on an unprecedented scale for 2022. We have nearly tripled our numbers since the middle of the year, and we're delighted to have you all along for the ride. I'm just going to say, go to infosecinstitute.com/free to get your free cybersecurity talent development eBook. It's got in-depth training plans for the 12 most common roles including SOC analyst, penetration tester, cloud security engineer, information risk analyst, privacy manager, secure coder and more. We took notes from employers and a team of subject matter experts to build training plans that align with the most in-demand skills. You can use these plans as is or customize them to create a unique training plan that aligns with your unique career goals. Once again, go to infosecinstitute.com/free, or click the link probably down there to get your free training plans.

Thank you once again to Mike Wilkinson and Avertium. Thank you all so much for watching and listening. As always, we'll see you next week. Take care.

Free cybersecurity training resources!

Infosec recently developed 12 role-guided training plans — all backed by research into skills requested by employers and a panel of cybersecurity subject matter experts. Cyber Work listeners can get all 12 for free — plus free training courses and other resources.


Weekly career advice

Learn how to break into cybersecurity, build new skills and move up the career ladder. Each week on the Cyber Work Podcast, host Chris Sienko sits down with thought leaders from Booz Allen Hamilton, CompTIA, Google, IBM, Veracode and others to discuss the latest cybersecurity workforce trends.


Q&As with industry pros

Have a question about your cybersecurity career? Join our special Cyber Work Live episodes for a Q&A with industry leaders. Get your career questions answered, connect with other industry professionals and take your career to the next level.


Level up your skills

Hack your way to success with career tips from cybersecurity experts. Get concise, actionable advice in each episode — from acing your first certification exam to building a world-class enterprise cybersecurity culture.