Digital forensics careers: Facts versus fiction

Forget what the crime scene TV shows have told you — digital forensics is not done on an overhead projector while the whole department watches! Learn about the day-to-day work of a digital forensics professional from a team of experts who have been putting in the work for decades!

In this episode of Cyber Work Live, you will learn:

- The types of tools you’ll use to help bring criminals to justice
- Why a lack of technical experience isn’t a barrier to entry
- How to get real-world forensics practice in your own home
- Where a career in digital forensics can take you 

- Get your FREE cybersecurity training resources: https://www.infosecinstitute.com/free
- View Cyber Work Podcast transcripts and additional episodes: https://www.infosecinstitute.com/podcast

0:00 - Digital forensics careers
4:28 - Limits of going off the grid 
12:28 - What do SIM cards actually do?
33:12 - Gathering evidence in digital forensics
44:08 - Digital forensics and the cloud
51:44 - Working as a digital forensics professional
54:42 - Digital forensics certifications
59:50 - How to pursue a digital forensics career
1:02:24 - Outro 

[00:00:00] CS: Is Cinderella a social engineer? Is that terrifying monster trying to break into the office, or did he just forget his badge again? Find out with Work Bytes, a new security awareness training series from Infosec. This series features a colorful array of fantastical characters, including vampires, pirates, aliens and zombies, as they interact in the workplace and encounter today's most common cybersecurity threats.

 

Infosec created Work Bytes to help organizations empower employees by delivering short, entertaining and impactful training to teach them how to recognize and keep the company secure from cyber threats. Compelling stories and likable characters mean that the lessons will stick.

 

So, go to infosecinstitute.com/free to learn more about the series and explore a number of other free cybersecurity training resources we assembled for Cyber Work listeners just like you. Again, go to infosecinstitute.com/free and grab all of your free cybersecurity training and resources today.

 

[00:00:59] CS: Our first guest, Amber Schroader, is the CEO and founder of Paraben Corporation. She has spent the last three decades as a driving force for innovation in digital forensics. Amber has developed over two dozen software programs designed for the purpose of recovering digital data from mobile phones, computer hard drives, email and live monitoring services.

 

In addition to designing technology for digital forensics, she has also spearheaded the procedures for mobile and smartphone devices as well as the emerging field of IoT devices. Amber is the patent holder on the EMI shielding container, otherwise known as the Faraday bag, as well as the inventor of many other shielding products. Amber has written and taught numerous classes for this specialized field and founded multiple certifications in the field.

 

Next, I'd like to introduce you to Tyler Hatch. After spending some time with a Vancouver-based digital forensics firm, Tyler founded DFI Forensics in July 2018. Tyler is a Certified Computer Forensics Examiner (CCFE) and a Certified Mobile Forensics Examiner (CMFE), and is always training and receiving education to further his knowledge and understanding of computer forensics, IT forensics, digital forensics, cybersecurity and incident response. He is a frequent contributor of written articles to various legal and digital forensics publications, including advocatedaily.com, lawyersdaily.ca, eForensics Magazine and Digital Forensics Magazine, as well as the host of the Digital Forensics Files podcast.

 

Next, please join me in welcoming Donald Wochna. Don is one of the few attorneys in the United States who is also a certified computer forensics examiner and a certified mobile device forensics examiner. He is an Ohio resident, a Vietnam veteran and a former associate at Thompson Hine and partner at Baker & Hostetler, where he practiced after graduating from the law school at the University of Chicago in 1983.

 

His legal practice strategies and advice in criminal and civil matters combine advanced technical and legal analysis to identify and achieve legal objectives within electronic environments. He currently provides expert services through Digital Safety Group LLC, which provides forensics and data services that help clients and their attorneys achieve objectives with or without confrontation with third parties, including divorce, custody, internal corporate investigations, privacy, stalking and harassment prevention.

 

That's quite a lineup. And I hope you're looking forward to a great conversation because we're going to have it. Amber, Tyler, Don, welcome to Cyber Work Live.

 

[00:03:33] DW: Thank you, Chris.

 

[00:03:33] TH: Thank you.

 

[00:03:35] AS: Thanks.

 

[00:03:34] CS: We're glad to have you also. I will be taking questions from the audience as they come in throughout the event. But our presentation this time is uniquely structured around a set of video clips that Amber has selected, depicting various examples of digital forensics, mobile forensics and related disciplines that, to put it generously, just really don't pass the smell test for anyone who has the most basic amount of security background.

 

Our panelists will discuss these misrepresentations and strangeness in each clip while also talking about the real work that they do around mobile, cloud and digital forensics.

 

I guess if we want to begin, then let us begin. Our first clip here, carrier pigeons. How do they work? You're going to love this one. This is from the movie John Wick. Here is the clip. Let's take a gander.

 

[00:04:37] BK: Welcome to my Mission Control. Brain stem of my operation. The information super flyway. From whence I control the word on the street, the way of the world.

 

[00:04:47] A: With pigeons.

 

[00:04:49] BK: Yes. You see rats with wings, but I see the internet. No IP addresses. No digital footprint. Can't track it, can't hack it, can't trace it.

 

[00:05:02] A: Can you get disease from it?

 

[00:05:04] BK: Well, I wouldn't recommend that you eat them.

 

[00:05:09] CS: All right. This first clip, to get us warmed up, is from, as I say, the John Wick film franchise. We see that Laurence Fishburne really likes pigeons and that he thinks that using pigeons for communications will keep him out of the way of the surveillance state.

 

I'm going to start with Amber, if you're available right now. Apart from the rather decorative way that words like IP addresses and hacking are tossed around, this clip lines up nicely with things I see on Twitter from various sort of tech blowhards who are getting dunked on via Twitter mobs and they respond by saying things like, "Oh, just keep posting. I've got your IP address. So I know where you live." You know?

 

So let's start there. Why did you want to use this moment from John Wick to illustrate the concepts we'd be talking about today?

 

[00:05:53] AS: I think one of the reasons I picked it is also a perception. I always joke in digital forensics that the only valuable way to explain things is to make sure it's something your mother could understand. I actually played this for my mom and she's like, "Is that really IP addresses? So, they work like pigeons carrying one message to the next one?" And I was like, "Um, not exactly, but I guess we're kind of adjacent to it." And then we kept explaining it and going through it. But it was all about that explanation and also, I guess, the perception of how people look at it. We always have that CSI effect, and I think that's what this is all about today, about how the CSI effect has really changed what digital forensics is.

 

[00:06:35] CS: Yeah. Yeah. No. Absolutely. And I think if you sort of live in this space, it's hard to sort of necessarily see without some concrete example that if you don't at all – this is the only information that a lot of people have about this kind of stuff. Even if it seems absurd, you're like, "Well, I don't know. I guess maybe that's what they do."

 

Yeah. Considering that we have three panelists with extreme knowledge of all the ways and places that one's information can get into other hands, let me jump to Tyler next here and ask what you think about Mr. Fishburne's plan to eliminate "IP addresses and hackers" by using carrier pigeons.

 

If he's thinking completely off the grid for doing this, can we brainstorm some other ways where his operations might be spied on by other means than computer networks?

 

[00:07:25] TH: Well, there's obviously a huge physical component to his activities and use of these birds and what have you. And as is noted on the slide, pigeons have their own natural hackers in the form of a hawk. Certainly not foolproof by any means. But I think probably what drives me more crazy than anything else about this clip is just the whole misunderstanding, the misconceptions around IP addresses. Because we get a lot of inquiries. People want IP addresses to locate a person behind stalking and harassment, particularly with social media.

 

I always find those kinds of things interesting. Because, A, I always tell people there's no IP address that I can get from an Instagram message or something like that. Instagram may have that on their back end. But even if you can get it, which is highly unlikely or difficult and expensive to do, it's going to give you most likely a broad geographical area that isn't going to pinpoint anybody in particular. Are there ways that law enforcement and people like that can then get further information? Yes. But it's not easy to just sort of get an IP address and figure out who's on the other end.
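A small illustration of Tyler's point, for anyone following along at home: a bare IP address usually resolves back to carrier or ISP infrastructure, not to a person's front door. The sketch below uses only Python's standard library reverse DNS lookup; the 8.8.8.8 address is simply a well-known public DNS resolver used as a harmless demo target, not anything drawn from a real case.

```python
import socket

def describe_ip(ip: str) -> str:
    """Show what a reverse DNS lookup actually tells you about an IP address."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
        # The hostname is typically ISP or carrier infrastructure, not a person.
        return f"{ip} reverse-resolves to {hostname} (usually an ISP node, not a person)"
    except socket.herror:
        return f"{ip} has no reverse DNS record at all"

if __name__ == "__main__":
    # Demo target only: Google's public DNS resolver.
    print(describe_ip("8.8.8.8"))
```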

 

[00:08:39] CS: Yeah. And also, it makes me think of the way that certain, let's say, members of our extended family at family gatherings say things like, "I don't use a credit card because I don't want to be tracked," while their phone's buzzing in their pocket. I think of this particular person and it's somehow – whether it's purchases or whether it's, like you said, geographic location, the idea of, "Well, they can't hack this." But there's plenty of other input points that I think are not being thought about here.

 

[00:09:09] TH: Very well said.

 

[00:09:10] CS: Don is an attorney who's also a certified digital forensics expert. Can you tell us about any examples you've seen in real life where a criminal tried pulling something like what's in the clip above, someone who thought they were completely invisible online but had all their information hanging out there for the world to see?

 

[00:09:25] DW: Well, that's almost – yeah. That's almost every criminal that's out there. They're doing something and thinking, and many of them think, that they're invisible. Some video what they're doing and put it up on Facebook. I have no idea what they're thinking about that.

 

[00:09:39] CS: It blows my mind every time. Yeah.

 

[00:09:40] DW: One of the things that I found kind of interesting about this clip is it does explain an anomaly that I talk about a lot in a lot of seminars, et cetera. And that is that there is a perception, and an almost understanding, that the devices we interface with every day, cellphones, cars, CCTV cameras, and the data those devices produce to do their job, can also be used to surveil us. So the normal function of the device has a counter function. And I loosely call that surveillance. Because, basically, it depends on your intent when you're watching something.

 

And so, in this clip, here we see people all in agreement: we live in a surveillance society. Everything's looking at us. And here's poor Laurence trying hard to figure out a way outside of that environment. And he comes up with pigeons. And, of course, then he's in a different environment. Hawks or whatever.

 

The problem is that life is moving from the computer to the cellphone, et cetera, and then everybody is being put in the 5G box. We're now kind of stuck in a box, an environment that there is no way to get out of very easily, if at all. And that means that the surveillance capabilities increase.

 

But the good news is two things. People have no idea how these devices work, but they love them so much. They don't care. And so, they just go down the road and they use them to do stupid things. And they can be tracked in certain ways. Certainly, law enforcement has some powers.

 

But on the civil side, if you can get certain information, you can convince the judge to get you a subpoena or whatever, you can go after stuff to the degree you can get to it. But it's kind of interesting.

 

[00:11:27] CS: Yeah. Combining that with what Tyler said, I think there's – if you think of IP addresses versus the apps that you're using, I think there's like a layer missing for a lot of people who, even if they laughed at it, sort of internalized the whole "the internet is a series of tubes" argument or whatever. And so, they're seeing IP addresses as just this sort of sewer network that can be – but like you said, if you're using IPs through Instagram or through other things, then your sewer system might be 20 feet deep or something like that. You're not just going to see it. You're not just going to utilize it. Yeah, a lot of nuance. A lot of layers in there. But I think that's a great place to start, that we're already on sort of technically shaky ground here. And we're just getting warmed up because this next clip is my absolute favorite here.

 

This is from a TV show called The Rookie: Feds, I think. In this clip we're about to watch, a supposed master criminal, currently in a shadowy chase through a shadowy warehouse, realizes that the law is right on his tail. He pops open the side of his phone and does what any good supervillain would do. He has an impromptu snack. So, see for yourself.

 

[00:13:04] Speaker: You're trapped, Luke.

 

[00:13:06] Luke: That's close enough.

 

[00:13:09] CS: All right. Yeah. For those who weren't sure, I think he swallowed his SIM card thinking that that would keep them from seeing what was on his phone. Amber, if you're available, I want to start with you again here. While it's unlikely that any show would garner Nielsen-style ratings showing the actual complex and data-intensive work of mobile forensics, I can't believe I need to ask this, Amber, but will swallowing your phone's SIM card actually prevent mobile forensics experts from accessing your phone? Yeah. I'll take that as a no. Tyler, Don, what do you think?

 

[00:13:44] TH: Amber must be attending her court procedure here. Yeah. First of all, I'm impressed with this clip at the speed at which he can open his SIM tray and get that thing out of there. Because I have about a thousand of those little pins, but I can never find one when I need one. I always resort to the paper clip, which is always just a slight –

 

[00:14:05] CS: An angry 10-minute search through the house for a paper clip. Yeah.

 

[00:14:08] TH: Yeah. But, no. This is a great example of something where it's depicted on TV and people form a genuine misconception about what is actually stored on a SIM card. It used to be, before cellphones actually stored data on an internal memory chip, that that information would be stored on the SIM card. But that hasn't been the case for quite a number of years. It really only stores information about your carrier, allows your device to receive signal, and may have some contacts. But, no. That's not going to stop anybody from doing some mobile forensics on the actual device.

 

But on the other – the counterpoint to that as well is that somebody can actually shut you down just by virtue of the fact that phones now have a passcode and things like that. If they don't want to give you their passcode, you're going to have a very difficult time unlocking that device. Law enforcement has tools to get around that. But our firm is a civilian firm. So we don't have access to those tools to the degree that law enforcement does. Yeah.

 

[00:15:11] CS: Right. Yeah. Yeah, that's great. Because that is actually where I kind of wanted to go with this. Okay. Let me start with Don on this. As a lawyer who's also a forensics expert, how do you do the work of taking this type of forensic evidence that you might find on a phone like this and sort of making the implications of it understandable to a jury or a judge? And equally important, what are some ways that you've had to learn how to make sure this type of evidence is presented in a way where everything was acquired legally and above board? Since not doing so might result in valuable evidence being thrown out on a technicality.

 

[00:15:46] DW: That's a long question.

 

[00:15:48] CS: Sorry. Yeah. Yeah.

 

[00:15:49] DW: That's a lot. The fun part about this clip, and the partial answer to your question, is that when I first saw this, the first thing I thought about is this is going to be an interesting chain of custody. Because you've got to get that chip from inside this guy's belly into a courtroom in a fashion so that when the officer sits on the stand and is presented with the chip inside a little plastic bag with some numbers on it or whatever, he or she can look at the chip and say, "Yep, I can identify this as the chip that came out of somebody's body," or it came out of somewhere else or whatever.

 

And so, the chain of custody here would be kind of fun to know something about. Surprisingly, if you're really interested in that, NIST, the National Institute of Standards and Technology, has a guidebook on how to handle biological evidence to preserve chains of custody, so that you can get to the place where you can get this evidence in.

 

Getting it admitted, the first step in getting anything admitted, is you've got to identify that this is what I say it is. And the prosecutor is going to say that's the chip that was in this phone. Well, it's a two-stepper. First, that's the chip that was in this guy, and it's a chip that contains evidence that we can then link to a particular phone that was also in his hand. And therefore, we can then draw the inference that that was the chip that came out of the phone. On and on and on.

 

And so, procedurally, when I look at these types of things, because I don't deal with law enforcement except when I help cross-examine our brethren at the law enforcement side, I start with the procedure. Because if you can't get the evidence in, then you might as well not even be horsing around with it.

 

And in this instance, as Tyler said, working backwards then, many times it's helpful for everybody to understand why we are going through all of these steps if at the end of the day that chip doesn't contain the information that we're looking for. Or are we just doing it because we want to be both comprehensive and complete? Because our criminal defense attorneys are always going to ask, "Did you examine the chip?" And they ask questions like, "Isn't it possible the chip could have information on it that showed it belonged to somebody else?" And of course, you can say no. But the jury hears that question. And that's how, during cross-examination, alternative theories sometimes get put into the case through the attorney that's asking the cross-examination questions. This is a delight because it's wrong in lots of areas. But I could see somebody going to the trouble to figure out what's on the SIM card.

 

[00:18:28] CS: Yeah. And also, it presents not so much a security workaround as it does just a series of extra paperwork for y'all to do. You're like, "I have to deal with chain of custody now."

 

[00:18:43] DW: Well, yeah. And I don't know about Tyler. He may want to comment on this too. But normally, I mean, in all deference to my good friends at BCII and some of the other places, it seems like the government caseload is either really, really high or the environment they work in is a little different. But I never have gotten the amount of time that I know the government takes many times to process something. I'm always under the gun trying to get done in seven to ten days something that I know other people are taking months sometimes to do, or sometimes even years. I don't know if that is meaningful or not.

 

But certainly, on the civil side, if you're going into the digital forensic field on the civil side, I move at a different pace many times than do the examiners, my good friends on the criminal – the law enforcement side. And Tyler, I don't know if you have that same experience or not. I don't know.

 

[00:19:37] TH: Very much so. I'm in Canada, for those of you who don't know. Vancouver, British Columbia, Canada. And our law enforcement are understaffed and overwhelmed, primarily with criminal cases involving child pornography material and things like that. Yeah, it takes them an extraordinarily long time to do any of this kind of work. Whereas we in the civilian field are expected to do it much, much faster.

 

[00:20:04] AS: The same thing happens with us, too. Yeah. Law enforcement is nine months behind.

 

[00:20:10] TH: Easily. Yeah. Yeah. It's just I think their resources are literally overwhelmed.

 

[00:20:17] CS: Yeah. Can you talk a little bit about that sort of processing gap in terms of what's needed for the case versus how long it's going to take to get the evidence? Versus whether you have to sort of try to wrangle maybe a new date for the court case because the evidence isn't going to be ready in time? I mean, that sounds like a dramatic enough story that could have been a TV show rather than –

 

[00:20:38] TH: Yeah. Well, this clip does do a good job of illustrating a mobile forensics case. Because I think, obviously, the police are after the suspect. He's doing something that's got their attention. And if he's planning and speaking to other criminals and they're planning to do something, there's probably really good evidence of that on the phone.

 

And I saw somebody earlier asking about encrypted communication apps, like Telegram and things like that. Criminals are always – in my experience, they're always looking for the latest and greatest hidden unavailable communication app to plan a variety of criminal acts.

 

I think there would be a lot of information on this phone. And depending on how they're communicating, it could take a very, very long time to get all of that data. And then trying to get it off the phone is another story with a lot of these newer encrypted apps. It's not as easy to extract that data from the phone as maybe people think.

 

[00:21:42] CS: Yeah. I want to tee this back to Tyler a little bit because you mentioned this as well and maybe I misheard you. But more so than swallowing a SIM card, it seems like a good way to keep them off your back is to have a very complex unlock code or something like that. Now in a case like this, is there a large amount of sort of legal wrangling that you need to do to get permission to, as you say, sort of break into this phone? I mean, I guess I'm trying to understand, if this person hasn't been charged with a crime, then I'm assuming that you can't really just kind of say, "Well, we assume he might be in on it. So let's get into this phone by any means necessary."

 

[00:22:30] TH: Yeah. Well, I think my comment on that would be that I think law enforcement has tools available to them to bypass passcodes and encryption that are not available to firms like mine who are just civilian firms.

 

[00:22:46] CS: Got it.

 

[00:22:49] TH: The police would – if there was an arrest, they would probably take this phone as a piece of evidence and then work on it to unlock it and examine it as part of their investigation, whether the suspect cooperates by giving up the passcode or not.

 

[00:23:02] CS: Gotcha. I want to jump back to Amber here since I'm sort of allocating each of you sort of a different aspect of – I know that you all have expertise in all these different areas. But can you talk about some of the tools that are frequently used in making the work of accessing phone data or making a mobile clone or making available material usable for research tasks in cases like this?

 

[00:23:25] AS: Absolutely. The interesting side about technology in the digital forensic space is there really aren't as many players as you would think out there. There's usually a couple dozen. As Tyler mentioned, there's tools that are available only to the law enforcement government side of things that will go through and try to brute force a phone open, different methods like that. But they're still limited, obviously, by how the technology is designed.

 

And then there's ones that are available to all sides of it. Because in theory, and I'm sure Don will agree with this, technology is really supposed to be neutral. We're really telling you, "Is it a one? Is it a zero? Is the data there or is it not?" It's not like we're magically producing things with technology. It's just some choose to be on one side and not on the other.

 

And in fact, we should be available to all sides of it. That's kind of what the Daubert principles are in the first place. But you use a lot of different extraction techniques. There's a lot of open source tools out there that a lot of times inspire some of the digital forensic tools to kind of almost tweak them so that they're a forensic-grade piece of technology.

 

If I'm working on an Android, a common method would be to do an Android debug bridge backup kind of method. If I were working on an iPhone, I could actually just do an iPhone backup, which creates a PLIST file. And that's the exact same as doing a style of forensic imaging. And people are doing that all the time to their iCloud. And they used to be doing it to their desktop systems as well. But that data can be used in court. The data just has to be produced and then it has to be verified in that process.
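As a rough sketch of what the "Android debug bridge backup kind of method" Amber mentions can look like in practice, here's a minimal Python wrapper around the stock adb backup command. It assumes the Android platform tools are installed, a test device is connected with USB debugging authorized, and the on-screen backup prompt is confirmed; adb backup is deprecated on recent Android versions and produces a logical backup, not a forensic image, so real casework would rely on validated tooling and documented procedures rather than an ad hoc script.

```python
import subprocess
from pathlib import Path

def adb_logical_backup(out_file: Path) -> None:
    """Pull a logical backup from a connected, authorized Android test device.

    Uses the stock `adb backup` command; the user must confirm the prompt
    on the device screen, and the result is a triage-level logical backup,
    not a full physical image.
    """
    subprocess.run(
        ["adb", "backup", "-apk", "-shared", "-all", "-f", str(out_file)],
        check=True,
    )
    print(f"Backup written to {out_file} ({out_file.stat().st_size} bytes)")

if __name__ == "__main__":
    adb_logical_backup(Path("evidence_android.ab"))
```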

 

In the case of our disappearing SIM card, I love the fact that there's a different chain of custody with biological evidence. Didn't think about that. I don't know if I ever needed to think about that – I was like –

 

[00:25:05] CS: Or ever will again. Yeah.

 

[00:25:06] AS: I'm going to erase that part out of my brain. But the fact is that that SIM card can actually – I'm going to use an iPhone as an example. An iPhone actually stores the last three SIM cards that were used with it. And it's a unique piece of evidence that is exclusive to iPhones that you might not think of. So we would know not only that that SIM card was used in that swallowed phone, but also whether it was used in another phone at a different time. We could actually follow that evidence through to other types of mobile devices, which could be beneficial in an investigation.

 

[00:25:43] CS: What I'm hearing is this criminal wasn't as stupid as we thought he was.

 

[00:25:46] AS: No. But he definitely had digestive challenges.

 

[00:25:48] CS: Yeah. Oh, yeah. Yeah. Yeah. That is inorganic material, if ever there was. We're getting some questions in from the audience. And Tyler, you sort of touched on this before. But to the whole group, Yolanda Gilliard said, "Do you recommend apps like Telegram, Signal and WhatsApp for secure communications? Or can location identity information still be captured?" I think this is more just as a sort of –

 

[00:26:14] AS: As a person.

 

[00:26:14] CS: As a life – yeah, as a life question. What do you think about things like Signal?

 

[00:26:20] TH: It depends on your level of Laurence Fishburne-ness, I would say. I think they're out there. I have them all on my phone and I use them, frankly. I think I still use WhatsApp most out of all of them. Some of my colleagues and friends who are more concerned about security and privacy prefer some of the other ones, like Signal and Telegram and things like that.

 

But I'm not particularly concerned with somebody hacking into my communication app. For me, it's not really that big of a deal. I guess whichever one you want to use depends on your level of privacy and security concerns. But, yeah, I think they're all – they all have their own differences.

 

[00:27:08] CS: Yeah. Yeah. They do what they do, but they're not necessarily like a miracle cure or whatever.

 

[00:27:14] AS: I don't think any app is perfect. It's an app. It can't be perfect.

 

[00:27:17] CS: Mm-hmm.

 

[00:27:19] DW: And let me just add one thing to the discussion concerning law enforcement and civil only because if people are considering a career, I personally think that on the civil side that we have a lot more freedom and a lot more ways in which I can get information for the objective that I'm trying to pursue than does law enforcement. I don't have the Fourth Amendment. I don't have the Fifth Amendment. I don't have those. I have privacy issues and I can usually get around those.

 

And so, if we work backwards from what is my objective. For example, I'm trying to find out whether an employee took something when they left. If I know what the objective is, I may find that objective can be achieved because the data has been disseminated into the cloud areas, et cetera, or other social media. The data may or may not be on a device. But depending on how we can stylize the case, I can usually get a court order to get into a device or get close enough to the device that the other side – many times, that drives the other side to give up or to want to settle a matter.

 

On the civil side, I'll just tell you, I think civil and corporate investigations are fun because we have privacy issues. But that's about it. And Amber, are you going to say something? I apologize.

 

[00:28:35] AS: Oh, I thought the thunder where I'm at right now just kind of backed you up a little bit.

 

[00:28:39] DW: Oh, okay.

 

[00:28:40] AS: But, yeah. No. I completely agree. There are a lot of different flexibilities that you get from the private side versus the law enforcement side if you're looking for this as a career. I know there's been some chat over the imaging of an iPhone. All of forensic technology is expensive in a way. And I think that's an important aspect when people start either their own firm or they're doing on the private side versus the law enforcement side because it's very unique. There's a lot of research and development that goes into it to get to what that forensic grade is where we want to recover data that's happened on the phone that maybe a consumer wouldn't be interested in. That's a lot of research that goes into it. So, it's all relatively expensive.

 

But the question on whether or not your data can be protected. What if I encrypt my iPhone backup? Different things like that. Realize that there's a lot of tools developed for the space that makes it so – there's brute force mechanisms. There's methodology to be able to get around some of those passwords that would be blocking a normal person from accessing your data. But look at it as a digital forensic superpower that they're going to probably bypass that.

 

And I'm going to say in 60% of the cases, you have to get pretty advanced in your passwording to prevent it, and then hope your phone manufacturer doesn't implement something that produces a flaw, which most of them do, and that gets exploited in the digital forensic process.

 

[00:30:00] CS: Yeah. It looks like you're sort of answering a couple questions we got in here from Robert Wilk and Dirk Bell. You say if you can image an iPhone by doing a backup, if the suspects [inaudible 00:30:11] back up, then isn't that forensic basically useless?

 

Just to sort of untangle what you said, or to make sure that I'm clarifying it, it sounds like encrypting an iPhone in that way will keep sort of the average person on the street, or the average script-kiddie hacker or whatever, out of your phone. But within the digital forensics realm, it's maybe like the tool – your tools are bigger than theirs. Yeah.

 

[00:30:37] AS: They're going to be – yeah. It's going to be I have more capabilities than the average person does. And actually, part of the forensic process is typically you back up. For an iPhone specifically, you'll do an encrypted backup with a known password as your methodology because it produces more data on your iPhone than if you did a normal backup. You get keychain data. You get a lot more location information. That is part of the forensic procedure for it.
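To make the "encrypted backup with a known password" idea a bit more concrete, here is a minimal sketch that peeks at the metadata plist inside a local iPhone backup folder to see whether encryption was on. The path and the Manifest.plist key names (Lockdown, IsEncrypted, Date) reflect commonly documented backup layouts and should be treated as assumptions to verify against your own test backup; commercial forensic suites do the same kind of parsing with logging and hash verification on top.

```python
import plistlib
from pathlib import Path

def summarize_backup(backup_dir: Path) -> None:
    """Print basic metadata from a local iPhone backup's Manifest.plist."""
    manifest = backup_dir / "Manifest.plist"
    with manifest.open("rb") as f:
        data = plistlib.load(f)

    # Key names are assumptions based on commonly documented backup layouts;
    # .get() with defaults keeps this safe if a given backup differs.
    lockdown = data.get("Lockdown", {})
    print("Device name :", lockdown.get("DeviceName", "unknown"))
    print("iOS version :", lockdown.get("ProductVersion", "unknown"))
    print("Encrypted   :", data.get("IsEncrypted", "unknown"))
    print("Backup date :", data.get("Date", "unknown"))

if __name__ == "__main__":
    # Hypothetical path; point this at a real per-device backup directory.
    summarize_backup(
        Path("~/Library/Application Support/MobileSync/Backup/EXAMPLE-UDID").expanduser()
    )
```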

 

[00:31:00] CS: Yeah. And Dirk asks, if unencrypting an iPhone is still prohibitively expensive for most law enforcement, I mean, can you speak to the sort of cost of these kinds of tools?

 

[00:31:08] AS: For actually breaking a PIN code on an iPhone, it is very expensive. Only people with larger department budgets are going to be able to afford that. The smaller departments, a lot of times, are part of a regional task force. So they might have that capability in there. But it is expensive. It's like investing in a brand-new Kia. It's about that cost to be able to break into the device with this software.

 

[00:31:32] DW: And Chris, let me point out to attendees that this capability of trying to break into mobile devices through the operating system, as if there were a forensic operating system, was the subject of the FBI's lawsuit against Apple dealing with the San Bernardino shooter and the attempt to force Apple to create a kind of forensic OS. And the Justice Department's Inspector General has written a very good report about whether or not the FBI had the capability, or claimed it didn't have the capability, in order to enhance the prospects in litigation.

 

And it's a great read because the environment includes a real desire on the part of law enforcement across the world to have a way in. And so, while we talk about cellphones and a way in that bypasses everything, that will lead you ultimately to take a look at the 5G lawful intercept subcommittee. The technical rules for building the 5G network allow, with lawful intercept, law enforcement to plug right into the 5G network in real time.

 

There's all kinds of ways in which the device itself, and the protections the device gives, are being circumvented many times by the environment in which the device has to work.

 

[00:32:56] AS: I think that actually leads, Chris, to the perfect next clip because it kind of addresses that thing.

 

[00:33:00] CS: Yes. I was just going to say someone beat us to the punch in the question asking about this very thing. Let's move on to the next slide and let's move on to the next clip here. I'm just going to say that this is also from The Rookie. And let you see what you think.

 

[00:33:21] Speaker: Show me your hands. We got a body.

 

[00:33:38] Speaker: I say this guy died of a heart attack.

 

[00:33:40] Speaker: How can you tell?

 

[00:33:40] Speaker: His search says signs of a heart attack.

 

[00:33:47] AS: I love that.

 

[00:33:48] CS: Oh, there you have it. Boy. Yeah. Some of these, I'm like I don't know if I have the tech skills to understand what's wrong with this. But, yeah. From an ethical standpoint, this one is wild.

 

In the second clip from The Rookie, our investigators enter a suspect's home, guns drawn, to find the common setup in these types of cops-and-criminals films. The room is eerily lit, usually bright lights coming through gaps in heavy blackout curtains. Some sort of loud music is playing to induce disorientation or panic in the viewer and indicate that something hedonistic was happening just recently.

 

And in this case, our officers discover the person they're searching for in a chair, at which point they use a – let's be generous and say inadvisable – method of accessing the phone. All right. Amber, I'm curious to ask you about this one. Because when I showed this clip to other folks on my team, the question we all had was, "Wait, would this work?" Is this clip illustrating the writers' lack of technical understanding about how biometrics work, or is it more that this has procedural or ethical issues when submitting phones that might potentially compromise the submission of evidence on ethical grounds?

 

[00:34:50] AS: Well, I think depending on how they had it locked, it obviously could work. If it's facial recognition, the face is there. But the other side of it really comes down to: is that dead person offering their consent or not? And this is where a lot of that legal side of it comes in and some of that unlocking side really starts coming into question.

 

I also love the fact that the other side of this clip where he immediately has his internet history up and it's like, "Oh, he died of a heart attack because he Googled whether or not he was dying of a heart attack."

 

In my 30-year career, I have to tell you, the only time the internet history has led directly to me doing work on the phone is that their internet history actually looked up who Paraben Corporation was, and then they – and I was like, "Okay, that's pretty funny." That's only happened once in my entire career of the 5,000-plus phones I've processed in that lifespan. But there's definitely some ethical questions in here that fall more on the legal side of it.

 

[00:35:48] CS: Yep. Yeah. All right. Well, yeah, let's start here with Tyler. How often does locked cellphone data play a part in forensics investigations for you? And when you're accessing the evidence, I'm guessing that you and your team have to have more than a little background in case law to ensure that material is being accessed correctly.

 

From a practical standpoint, how much of these considerations are baked into some of the standard digital, mobile and computer forensics certifications? Do they mostly talk about the procedure of digital forensics, or does it get into the sort of ethics –

 

[00:36:25] TH: Well, for us, most people hire us to assist with their legal case. We generally have the permission of the person who's giving us their phone or their computer voluntarily so that we can find evidence in support of their legal factual position.

 

This kind of scenario wouldn't really come up with a lot of things in what we do. But we do occasionally assist defense lawyers in criminal cases where the police have seized evidence. And certainly, browsing history has been very relevant in conversations and things like that.

 

As far as unlocking the phone, it's a believable clip. I think it probably would work, assuming that the deceased person hasn't been deceased for so long that their phone has drained of battery power. Because then, if you charge it and power it up, you would have to put in a PIN code. For most phones, I believe. It's a relatively believable clip.

 

[00:37:27] CS: Yeah. Yeah. I mean, it hasn't been that long because the music is still playing in the room.

 

[00:37:31] AS: Exactly.

 

[00:37:34] CS: Yeah. Sort of going back to that though – and I'll open this to everybody. Since a lot of people in our questions are asking about what certifications they should take to get into digital forensics, and where they should start, and experience and so forth, can you speak a little bit about the certification side of things and how much of the legal, procedural and ethical aspects are in the certifications? Do they just tell you how to do it, or do they also tell you when to and when not to? Or is that something you kind of learn once you get into the job?

 

[00:38:16] AS: Sorry. Chris, can we just – can you just repeat that a little bit more?

 

[00:38:21] CS: Oh, I'm sorry. Yeah. Yeah. Yeah. I guess I'm asking about the sort of the makeup of various digital forensic certifications and whether they talk about these ethical – whether you learn the sort of ethical aspects of, let's say, unlocking a phone with a dead person's face? And how to present evidence properly? Or whether the digital forensic certifications really just lock in on here's how to crack a phone. Here's how to make a clone. Here's how to –

 

[00:38:49] AS: Yeah. I think, yeah, most of the training that I've done touches on the ethical components and the overall legal component and that traditional forensics process. But most of the training programs are very technical in my experience. They're teaching you how to use tools to do all the technical aspects of the job. Yeah.

 

[00:39:13] CS: Yeah. Got it. I want to move to Don here. This might not even ever be a consideration, but are there any special obstacles that you've had in submitting phone data to a jury to make it sort of "stick"? Considering how close an attachment we all have to our phones nowadays, I'm curious if you've ever had jury members balk a little about the idea of like stealing evidence from a phone in a court case even when the defendant wasn't especially sympathetic. I know we all sort of think of our phones as our extensions of ourselves. It would feel really weird if someone was like, "Oh, I found this thing on your phone." You're like, "Hey."

 

[00:39:48] DW: Well, that's an interesting question. Because when I first started doing this about – I've been practicing law 41 years. I've been doing the forensic stuff about 21 years. When I first started doing the forensic stuff, there was a real resentment and a real pushback from lots of people, "How dare you think that you're going to get into my computer or my life?" And that was reflected in cases. That was reflected in attorneys that were on the other side.

 

Even when I – if the attorney were representing an employee and we were saying we wanted to look at the employee's computer or phone because we want to make sure he didn't take anything. The first pushback was, "You can't do that because you're in effect coming into our house and stealing stuff and you're rifling through all the drawers. And we wouldn't let you do that in the home going through all the closets. Why the heck would we let you do that on a computer?"

 

A lot has changed. And now, I think almost everybody knows, good grief, there's all kind – we live on these phones. We live on these devices. We want them a whole lot. And I think the more common reaction is not, "Gee. The evidence – I'm going to reject the evidence. Or I'm worried about the evidence because I am angry that you got it." It's more of a I had no idea this stuff was on the phone this way.

 

And if you are presenting it as part of an expert testimony in court, of course, you're doing that piecemeal. You're doing that slowly. You're doing that with a lot of analogies so that a jury understands what it is you've done. Where data resides, why you were able to get to it, and what it means.

 

And that process generally comforts them in a sense understanding, "Okay, this was on Bob, the bad guy's phone." They may go home and go, "Good grief. I hope my stuff's not that visible." But everybody knows, your browser history is your browser history. And they now know, too, that you can go in and maybe try to delete it and do other stuff with it. That's fine.

 

[00:41:44] CS: Well, that leads into a question we got here. Katie Miley asks, "A lot of internet-enabled assistants, Google Home, Alexa, et cetera, listen for keywords and conversation. Is this vocal data recording, et cetera, more recently being used in digital forensics in court or otherwise?"

 

That's a good question. And again, it speaks to that feeling of sort of worry that people have that innocuous things that we do at home are now suddenly turning into instant evidence.

 

[00:42:11] AS: I think it's absolutely the case. When you think about Alexa, she listens for three minutes after she's given a command. And no one thinks about that when you ask her to set a kitchen timer or what's the weather today, that she's still listening to some of that background noise after. And that's really how IoT is emerging as digital forensic evidence.

 

There's really three sources when you think of digital evidence. There's consent-level evidence, which is you give me your credentials, we might log into the cloud, and we're going to gather that data. Then there's physical evidence that we're going to capture from computers, phones, any of those types of devices. And then the last one really comes from the open source side. Because OSINT, which is another field, is really starting to get into that forensic verification stage, where they really have to show the data has more credibility and kind of a better foundation behind it than just random stuff I found.

 

Because, again, AI and all the fake news and all of that has really kind of made our data sources, when it comes to the open side, a little weaker than they used to be. We have to really verify that. But we use all of those pieces to come together to really have a good digital investigation at the end of the day.

 

[00:43:29] DW: And that verification a lot of times can be done through corroboration of what you're seeing on one device with other things, pulling all that together. In fact, that's really – in my experience has been that's really fundamental and almost necessary as part of the testimony. You got to tie it all together. Otherwise, you're going to show up and in effect say, "I got this black box. And when I connected it to the phone, this is what it told me." And that never works.

 

[00:43:55] CS: All right. Well, we're getting a tidal wave of great questions in here and we'll try to get to as many of them as possible. But I wanted to jump to our fourth and final clip here. We're going to talk about the cloud now. We've talked about phones. We've talked about biometrics. We've talked about SIM cards. But before we go on, I just want to quickly tell a little story, personal story.

 

At a previous job I worked at, I worked with a woman named May. She was in her late 80s and she had been with the organization for most of her life. And her job role at that point was to scan the obituaries for names of the members of our organization who had passed away and then add them to a document at our computer terminal. This was in 1998, mind you, and the computer she was using was probably at least a decade and a half old by then.

 

May would save the obituary file without really understanding file structure. And so, sometimes when someone would ask her where the month's obituary files were, she would just sigh and jab a finger at the screen and say, "It's in the computer."

 

I tell that story only because I think the idea of "it's in the cloud" in popular entertainment seems similarly uncertain of what's actually happening in this black box of plot design. We're going to jump into a scene from a TV show called In the Dark. And we're going to see that it is in the cloud.

 

[00:45:11] Speaker: I have my ways. So?

 

[00:45:15] Speaker: I'll give it to forensics. See if we can get anything from the cloud.

 

[00:45:20] Speaker: Thank you.

 

[00:45:21] Speaker: Seriously, though, how did you find it?

 

[00:45:26] CS: See what we can get from the cloud. When Rich Sommer, as Dean Riley, says, "We'll see what we can get off of the cloud," I think folks with even a cursory understanding of what cloud-based systems are, how their security and access works, and what they have to do with a recovered phone are probably rubbing their temples pretty strenuously now.

 

Amber, tell us about cloud-based forensics. What set of tools and steps would actually go into accessing a cloud-based piece of evidence? And how does understanding cloud security and cloud infrastructure maybe help professionals who want to work in cloud forensics?

 

[00:45:57] AS: We'll see if this creates some controversy. Because I personally believe that the cloud is really our largest growing data source for digital evidence period. So many of the apps that people mentioned earlier, from WhatsApp, to Telegram, to Signal, they're really becoming more cloud-based because the devices themselves change firmware version so often, it's illogical to want to program it based on that local firmware and its hardware.

 

A lot of them have started migrating their data into the cloud, which is why I found this clip so funny. But it was also that nonchalant way they're like, "Oh, we have a magical way to talk to every cloud on the planet," which is so not the case. Every one of those is so unique and individual on how you access it through the keys. Whether or not that key structure is going to change.

 

I think Twitter is a perfect example of that. As Elon goes through and changes his mind every week, it actually messes up the digital investigation process because all of our key synchronization systems break as soon as he decides, "Oh, we're not going to do this anymore." Or, "We're going to do this now." Or, "Oh, we're going to put this back." All of that affects that digital investigation process and our ability to capture that evidence.

 

I think the biggest part of cloud is that there isn't a legal precedent as well on whether or not the credentials I can capture from your devices, whether it be a computer or a phone, can be used to capture your data. And then the other side of it is, of course, consent.

 

We actually do a lot of live consent with people. Chris, if I were doing an investigation with you, I'd say, "Okay, you want me to look at your Facebook data." I would send you a consent form via your phone. You could sign it, type in your credentials and they come back to me in my lab and I can capture all of that information.

 

And that style of trend is really happening a lot more often than me even looking at your smartphone. Why do I need to look at that if I can get it all from the cloud? Especially when TV tells me to.

 

[00:47:47] CS: Right. Don, I'm going to pass this question over to you now. What are some unique challenges in working with cloud forensics in terms of accessing and working with data, but also with submitting it as evidence in a case?

 

[00:48:00] DW: Actually – and if anybody contacts me, I'll be more than happy to give them a sanitized copy of this brief. But this is a bit of a pet peeve of mine. When you get a search warrant, you need to identify with particularity and specificity the things you're looking for and the places to look. And that's pretty straightforward because the person who gets the search warrant many times gives the search warrant to like a local sheriff or a marshal and they can actually execute the search warrant by looking at the four corners of the piece of paper and they can see that they're looking for drugs or they're looking for a grand piano.

 

And in fact, because you have to state what you're looking for with particularity and you think to yourself, "Gee. I want to make sure they can look at every nook and cranny," there's an art to writing search warrants. But we also have to state the place to look. The location. And if I say go search apartment 21A, then you're allowed to search apartment 21A. Now if you're in apartment 21A and there's a door and it's connected to apartment 21B, you're not allowed to go into 21B.

 

Take a look at a search warrant for somebody's Google – or rather Apple information. Apple's got a whole process and a procedure. You take the search warrant. I want all the information from John Doe that Apple's got. Don't know where it is, but I want it. Send it to an Apple email address and they go out and look for all the information in whatever data farms they've got, whatever servers they've got. And they have their own data centers. They also use some third-party data centers. And they bring it all back and then give it to law enforcement.

 

I have argued, and it proves a little too much. But I have argued that because we don't know with specificity the location in the cloud or which data farm has got the data we're looking for, because Apple doesn't know it until they look it up, how the heck can this search warrant be valid under a constitutional test for particularity and specificity? And I don't think it is valid.

 

Having said that, no judge is going to – no judge in an average case is going to kick something out based on that. But it'll be an interesting legal challenge if it gets to the right place. Because stuff in the cloud necessarily is known for purposes of storage, but we don't know where the heck it may be even on the globe. And so, we have some interesting legal issues I think coming down the pike.

 

[00:50:19] CS: I want to open them to Tyler or anyone who wants to talk about it. But when you're selecting a digital forensics team for your case or project, do you have respective team members that are skilled in certain very specific aspects of digital forensics?

 

If you have a case that's – are you picking and choosing team members based on who has cloud background or who's a whiz with mobile forensics for that case? I guess, which is to say, is there a benefit of becoming hyper-focused on one skill in your area if you're going into digital forensics?

 

[00:50:51] TH: First of all, yeah. When I do have a certain case that requires a certain specialty, of course, I assign it to the person with the most experience and specialized knowledge in that area. I think cyber attack cases come to mind, incident response cases. I have very particular members of my staff that are very highly trained and skilled in that area. But overall, I think everybody's very trained on computer forensics, mobile forensics and then incident response on my team. It just depends on the case. But, yeah.

 

[00:51:25] CS: Yeah. All right. Well, we're pushing in on an hour here. I mean, I could go another hour. And this is so much fun. But I want to just kind of wrap a little bit of this up by talking about the actual work of digital forensics professionals. We hope that for the listeners who have seen all these absurd takes on digital forensics, it doesn't dissuade them from pursuing the profession or the study or the experience needed to get started.

 

But just in case these type of over-inflated cloak-and-dagger scenarios make it look more fun than it actually is, I'd like for each of you to take turns talking about the engaging and real work of digital forensics professionals and the aspect of the work that you love that might not show up in a John Wick or In the Dark.

 

[00:52:11] AS: Who do you want to go first? Is Don going first?

 

[00:52:14] DW: I'll do it quickly then. I love what I do because I get a chance to teach attorneys, judges and juries. I get a chance to explain to them some of the inner workings of the devices that they love, cellphones, et cetera. And they're always shocked to see what's on them. And when you can do that and at the same time achieve a legal objective and make a living doing that, that's a pretty good life.

 

[00:52:38] CS: Yeah. Anything to add, Amber, Tyler?

 

[00:52:42] TH: Yeah. I love what we do because I love technology and I love facts. My background is as a lawyer as well. I'm not currently practicing like Don. But being able to determine facts with the certainty that you can using forensics nowadays I think is just incredibly rewarding. And every once in a while, you solve a puzzle. You get a really big satisfying win because of our particular skill set. And I find that very rewarding.

 

[00:53:14] CS: Nice. Amber?

 

[00:53:17] AS: I love the reward side as well. I love that I do something good every day. If that makes any sense. I also love the black-and-white nature of digital forensics versus some of the other areas that could be impactful in that way.

 

But my day-to-day is I'm actually developing the technology used in digital forensics. And so, that inquisitive and creative side of me that took apart my mom's toaster and did all those things really gets utilized there. Because when someone talks about WhatsApp, I have to go through and research it and figure out what data is actually available. Whether or not it's going to be consistent. What are my methods for capturing of that data? What type of key structure do they have?

 

I shared some of my frustration with Twitter because it keeps changing everything. But on my desk, I'm researching the Ring camera system because it keeps a lot of its data. It's had a recent lawsuit and everything else. But it has really valuable digital forensic impacts. And that's something I get to share with the community in hopes of doing good, because they're finding that truth or innocence in that data.

 

[00:54:23] CS: Yeah. All right. I had one more question about whether you could even really show digital forensics in TV and movies or if it's like trying to depict the life of a writer or a poet or a painter or something like that. But I think rather than sort of go into that, I want to just jump in because we still have a ton of great questions that have come through. And a lot of them are practical and job-based and certification-based.

 

I'm going to just kind of wrap up the hour with that. Howard Raven asked for more details on certifications: What certifications are the best for digital forensics? Do you guys have any thoughts on where to start with digital forensics certifications, or whether you should just be doing hands-on stuff first? And then when you want to make it professional, what's the entry point?

 

[00:55:10] AS: I think you need to get a large variety. It's one of those fields where I don't think any one of us is certified in just one space. There are actually Infosec classes that have a digital forensics background. There are classes out there that are done for certified [inaudible 00:55:26] investigating or forensic investigator. I'd have to think of all the acronyms for it. Yeah, there are so many.

 

Obviously, each individual company that has technology also has certifications, and those are really about using and utilizing the tools. Because think of it like getting a really high-powered microscope if you're a scientist in a lab. In this case, you get a high-powered forensic suite that touches such a large variety of data. You need specialized training for that.

 

I think companies need to see that you're proficient. If you can't turn a screwdriver for this tool, then it's going to be like, "I don't know if you're a good choice," because they don't want to waste the time training you. Those are things people can explore if they really want to get into this space.

 

[00:56:09] TH: Yeah. It's a tough question. It's probably one of the most commonly asked questions that I think we all get. There's a digital forensics group on Facebook. Sorry, guys. My dog is barking.

 

[00:56:20] DW: That's all right.

 

[00:56:24] TH: Yeah. It's a very, very commonly asked question. And it goes hand-in-hand with how do I get experience when I can only get hired when I have experience? It's a very, very difficult field to get into.

 

Yeah. I mean, I always tell people that I cheated because I started my own company and just sort of did training as –

 

[00:56:24] CS: Gave myself a job.

 

[00:56:47] TH: Yeah. And so, I feel for people who are legitimately talented and skilled and looking for a way to get their toe in the door. It's not easy. I would say focus less on which certifications are the best and just focus on the knowledge. And I think Amber's a true testament to that. Because Amber has understood the data and the science behind all of the applications since very, very early on in the development of this field. It's not because she got a whole bunch of certifications, you know? Yeah, props to Amber for being a pioneer.

 

[00:57:24] CS: Rakesh L. asks about setting up a digital forensics lab at their university: "Your opinion on basic tools and equipment?"

 

[00:57:33] AS: Just as a note for everyone, there is an open-source platform that's designed for digital forensics. It's called Autopsy. And it is free. It has a lot of great plugins. That's a great way to kind of start playing. NIST and some of the others in the community also share data sets.
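A minimal sketch of what that first bit of "playing" could look like, using pytsk3, the Python bindings for The Sleuth Kit (the open-source engine that underlies Autopsy). The image filename here is a placeholder; a free practice image, such as one of the NIST CFReDS reference data sets, would stand in for real evidence:

```python
# Hedged sketch: list the files in the root directory of a raw disk image.
# "practice_image.dd" is a placeholder path -- point it at a practice image,
# not real evidence.
import pytsk3

img = pytsk3.Img_Info("practice_image.dd")   # open the raw image
fs = pytsk3.FS_Info(img, offset=0)           # open the filesystem at offset 0

for entry in fs.open_dir(path="/"):          # walk the root directory
    name = entry.info.name.name.decode("utf-8", errors="replace")
    size = entry.info.meta.size if entry.info.meta else 0
    print(f"{size:>10}  {name}")
```

Autopsy itself wraps this kind of work in a GUI with ingest plugins, which is the easier place to start; the script is just a peek at the layer underneath.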

 

You have to love this. I know Don and Tyler and I haven't said this yet: there's a lot of waiting time in digital forensics. It's not instantaneous. It's like, I've got to image an iPhone. I've got to wait eight hours for it to image before I can even get into the data. And then I'm parsing through 500,000 text messages. That is not a joy. That is a lot of someone's life that you're going through. And there is that side of it as well. Don't think it's all hot and sexy and like, "Oh, look at all this cool stuff." It's a process. It's a large puzzle you're putting together.
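To give a feel for that triage step, here is a small, hedged illustration: assuming a hypothetical CSV export of messages with timestamp, sender and body columns, a first pass might narrow half a million messages down to the ones worth a human's attention by date range and keyword. The filename, columns and keywords are all made up for illustration:

```python
# Hedged illustration only: triage a hypothetical "messages.csv" export
# (columns: timestamp, sender, body) to the rows worth reading first.
import csv
from datetime import datetime

KEYWORDS = {"meeting", "transfer", "password"}        # made-up terms of interest
START, END = datetime(2023, 1, 1), datetime(2023, 3, 31)

hits = []
with open("messages.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        ts = datetime.fromisoformat(row["timestamp"])  # assumes ISO timestamps
        if START <= ts <= END and any(k in row["body"].lower() for k in KEYWORDS):
            hits.append(row)

print(f"{len(hits)} of the exported messages matched the triage filter")
```

Real forensic suites do far more than this (deleted-record recovery, attachment parsing, timeline building), but the basic shape of the work is the same: narrow, read, verify.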

 

And the biggest impact, and I take this very personally, is that I always feel I have a burden of proof. Even on the civil side, I still have a burden to go through. I know legally I don't. But I always have a burden to make sure I'm not missing something. That's always been my biggest fear, is that I missed something. And it would have made an impact. I don't know about you guys.

 

[00:58:43] TH: Always.

 

[00:58:44] DW: Well, I'll just add on the hardware side. I know that Amber and I have spoken about this a number of times. No matter what you do, you want to know enough about the tools you're looking at and enough about the hardware so that you get a match – so that the tool is delivering data in a way the hardware is built to process as fast as possible. A mismatch there just produces a choke point.

 

[00:59:08] TH: Yeah.

 

[00:59:07] AS: And I think it's also a testament that the three of us are all entrepreneurs. And I know for a lot of people trying to get into the space, that might be the scariest side of it. But looking at and exploring your own business practices, or what you can offer out there, whatever it may be – that's also a career opportunity that I don't think as many people are exploring. There's a lot of reward, and a lot of stress and lack of sleep that happens. But it's a great way to explore as well.

 

[00:59:34] CS: Yeah. Okay. We're seriously at time here. But I want to ask one last question, because I think it's a frustration that's only been asked once but is probably on a lot of people's minds. Carl Smith asked, "Do entry-level digital forensics jobs really exist? If you were starting off today, what approach would you take? There appear to be a lot of training/certification opportunities, but many are cost-prohibitive, and the jobs seem relegated to law enforcement/military backgrounds or require extensive experience. Very frustrating."

 

Tyler, you kind of poked that particular bear of a topic as well. But I wanted to get each of your opinions on the "how do I get experience when I need experience to get the job" problem. How do we break the snafu?

 

[01:00:21] TH: I don't have the answer. I don't know. I don't know. Maybe some sort of government funding for new employees or something, I guess. I'm not sure.

 

[01:00:30] CS: Yeah. I mean, it's – go ahead.

 

[01:00:34] AS: I volunteer as well. I know that sounds really silly – like, why do you need a digital forensics volunteer? And it's going to sound a little odd, but I offer it to some of my local – no one laugh – my local funeral homes, when they need me to process phones for someone who passed away. Different things like that. I volunteer to do that because it's practical experience. It's getting me more things that I would want to put on my resume. I also help non-profits.

 

Because believe it or not, lots of places have digital forensic – I'm going to call them incidents. And that's a great way to step in and say, "Look, I've worked on this type of case," and give yourself that entry-level experience – in addition to the classes you've taken, the practice you've done, all of those. Look for the places that maybe don't have someone on staff but are willing to let you help them. Because a lot of people need the help in this space. It's a lot of data recovery, too. You can get into all of that. Because it is hard to find entry-level positions anymore.

 

[01:01:36] CS: Yeah. Okay. One last lightning question here. Amber mentioned a free learning resource aside from NIST. Could you mention that again?

 

[01:01:47] AS: The tools resource is actually called Autopsy. If you Google Autopsy Digital Forensics, you'll find the tool. It also has links to lab images. I have free lab images I give out as well. Because I think the more you practice, the more you're going to decide if you like this, first off. But I think a lot of companies would even take you as a volunteer who says, "Hey, I'd just like to shadow you. Let me shadow on a couple of investigations." You're going to have to sign a lot of paperwork, so on and so forth. And you're not going to be able to do hands-on work. But you're going to see how it works. How does a team like this work?

 

[01:02:23] CS: Yeah. All right. Well, I'm going to cut it off there. We still probably have almost a dozen questions. Thank you for all your great questions, everybody. I think what we're going to do is we'll try and answer them in a future blog post that'll be the unanswered questions from Cyber Work Live. So keep an eye out for that.

 

And, of course, drop any of us a line if you want to keep this conversation going. But it is time to go. With that, I would just like to say thank you to everyone for joining us for today's episode of Cyber Work Live.

 

If you enjoyed the event, I hope you'll keep watching for future installments of our media myths and cyber reality series, which will likely include episodes on hacking, red teaming and physical breaches, confidence tricksters and social engineering, film depictions of the dark web – I'm excited about that one – and many more.

 

For anyone new to our program, I'll also point out that new episodes of the Cyber Work podcast are available every Monday at 1 p.m. Central, both on video and as an audio podcast. Go to infosecinstitute.com/podcast to check out past episodes. There are also episodes featuring Amber and Tyler in the resources section of the presentation. And hopefully, we'll get Don on the show sometime soon. It'll be fun.

 

Also, keep today's fun going by checking out infosecinstitute.com/free to see all of our free resources for Cyber Work listeners. Let's start with Work Bytes, our new security awareness series that features a cast of colorful characters as they work together, make security mistakes and hopefully learn from them along with your employees.

 

There are some great free posters featuring these characters, including Volkov Volkovic, the vampire; Dr. Lydia Lightningclack, bone slicer; Captain Rufus Rafael; Ed, the zombie, from accounting; Melody Moonbeam; and more. You can also download our free Cybersecurity Talent Development eBook. It's got in-depth training plans for the 12 most common roles, including SOC analyst, penetration tester, cloud security engineer, information risk analyst, privacy manager, secure coder and, yes, digital forensics analyst. Just go to infosecinstitute.com/free.

 

And finally, I just want to thank again our wonderful and hilarious panelists, Amber Schroader, Tyler Hatch and Donald Wochna, for joining us today. And thank you to all of our guests for attending and submitting so many great questions and so much feedback.

 

As I close off, I will just say that as we end the presentation, a very quick survey will appear. If you could take just a few moments to share your thoughts, it would be appreciated and will help us produce more great content in the future. Thank you again for coming today. Have a great day. And please, please, don't eat your SIM card.

 
