Focusing on users is the only way to improve cybersecurity

Ben Johnson, CTO and co-founder of Obsidian Security, discusses a variety of topics under the umbrella theme of shifting cybersecurity priorities in the face of an evolving threat landscape.

– Get your FREE cybersecurity training resources: https://www.infosecinstitute.com/free
– View Cyber Work Podcast transcripts and additional episodes: https://www.infosecinstitute.com/podcast

Chris Sienko: Hello and welcome to another episode of Infosec CyberSpeak, the weekly podcast where industry thought leaders share their knowledge and experiences in order to help us all stay one step ahead of the bad guys. Today's guest is Ben Johnson, CTO and co-founder of Obsidian Security. We're going to be covering a variety of topics under the general umbrella theme of shifting cybersecurity priorities in the face of an evolving threat landscape. Ben Johnson is CTO and co-founder of Obsidian Security, a rapidly growing cybersecurity startup that provides identity intelligence. Johnson also serves as a technical amicus on the United States Foreign Intelligence Surveillance Court (FISA Court). Prior to founding Obsidian, he co-founded Carbon Black, where he led efforts to create the powerful capabilities that helped define the next-generation endpoint security space. Prior to Carbon Black, Ben was an NSA computer scientist and later worked as a cyber engineer in an advanced intrusion operations division for the intelligence community. He has extensive experience building complex systems for enterprise environments where speed and reliability are paramount. Ben, thank you very much for being here today.

Ben: Thanks for having me and quite the intro. I appreciate it.

Chris: Okay. Very good. Yeah. Want to make sure everyone knows. So we start out every episode by asking a little bit about your security journey. Where did you first get interested in computers and tech and how did that move from general computers maybe to cybersecurity?

Ben: So, I'll try to not be long winded, but I get pretty excited talking about, well really anything. So it's actually a little bit amusing. So first of all, got a Commodore 64 when I was like three. My dad-

Chris: You are the seventh person to say a Commodore 64 was their first computer, so that's awesome.

Ben: Yep. My dad was helping me learn BASIC and all that. He's not a hardcore computer guy or anything, but I think he saw the future. But really I grew up in Vermont, and I grew up on top of a mountain. And really that meant we had no TV channels, but we had a phone line and AOL, and the sort of beginnings of internet providers and things like that were just starting to come around. So I really just grasped onto that, and fast forward a little bit, I really got into programming and just technology and computer science. And then security: I saw the movie Enemy of the State, and I was already interested in security, but it wasn't really like today where there's lots of online literature and courses. Back then it was really just that sometimes people would leave Telnet open, or you could forge emails from other people, stuff like that.

Ben: So you start to think about the attacker versus defender side. But then I saw Enemy of the State and I was like, "Whoa, this NSA thing is really cool." And at the time NSA was much more private. It was called No Such Agency and they didn't have cameras, they didn't have it in the news all the time, they didn't have the director of NSA giving talks and things, all this stuff that's happened since. And so I applied and I fell in love and I'll happily talk about my journey more, but that was really the start of it. And then I've been hooked ever since.

Chris: Wow. So it's funny, I was actually going to next ask how you got involved with the NSA, but it literally comes from one watching of Enemy of the State.

Ben: And then I flew out there and interviewed, and it's a massive organization, lots of different offices and parts of the mission you can work for or work on. And I really just enjoyed that work, and I stayed in the intelligence community until 2007. Then I wanted to take a break. I was in DC and I wanted to move back to Chicago, which is where I went to undergrad, and my wife went there too, and we never really got to experience downtown. I worked in the financial trading industry for a couple of years writing code for traders, which was a really cool experience, but my heart's in security. And so in 2009 I went back to security, and I've been there ever since.

Chris: Wow. And so from there, what sort of moved you towards being with Carbon Black and then founding Obsidian, what did you learn at the NSA that sort of progressed your path that way?

Ben: Yeah, I mean we had to learn a lot about both offense and defense in cyber. And it was just this, thrown into the fire, you have to learn as much as you can and a lot of the problems you're working on, you couldn't just Google a solution. So it really created this philosophy among all of us that you just have to figure stuff out and figure out how things work and build something if it doesn't exist right now. And so I think that fits with the entrepreneurial side of the house. And then on the Carbon Black side, we were doing incident response in 2009, 2010 around Operation Aurora when the Chinese were hitting a bunch of sites and we were saying, "Man, incident response, this is like copying hard drives and forensics. This is so inefficient, so ineffective." If you look to the physical world, you have surveillance cameras where you can rewind the tape. If you look to airplanes, you have the black box or the flight recorders kind of thing.

So, we immediately started saying, "What if we collected all this data? Would that let us see enough of the attack, be able to rewind the tape and help incident response?" And so the journey went from there and kind of exploded in a good way, and the rest is history. Happy to talk more, but essentially at the end of 2016, I was really getting that itch again to do something new and to really start something from the ground up. And we were 800 people at CB, I was doing a hundred flights a year globally, it was awesome, but I just wanted to do something new.

And so I actually ran into Glenn and Matt, who were my competitors. They were at Cylance. We had strong friendships: Matt and I were both at NSA, and Glenn and I had a lot of conversations around potential partnerships early on before we became very competitive. So we had friendships and mutual respect. And I wanted to move out of the weather. I was in Chicago and I grew up in Vermont, so I sort of served my time, you could say. And these guys are in SoCal, in Southern California, and that sort of fit. And plus our personalities fit and everything. So here we are: we started up in early 2017, and now we're about two years in, the gas is down, and we're moving forward as fast as we can.

Chris: Okay. And so was there an opportunity to do something on a more personal level with Obsidian because you're saying that Carbon Black, it seemed like the scale was maybe too large. Were you able to work on a different scale this way? Or...

Ben: Yeah, I mean Carbon Black continues to be great and build some really cool tech. And I was fortunate that I essentially became the field facing CTO where I got to interface with kind of the best of the best security teams and all this stuff and learn from them. And that was really cool. But I kind of missed just the building and the initial blank whiteboard, let's build something new. And it had been seven years and you sort of get that itch, right? And so yeah, it was like, "Let's get back to building, let's start something fresh, really impact the trajectory of a new organization and see where I could go."

Chris: Okay. So we have a variety of topics to discuss with you today, but let's start at the top level and move our way down. So in your introduction to me, you said that to defend our environments in this changing threat landscape, we have to focus on the user, that it's the only way the state of security will actually improve. So what does it mean to focus on the user when crafting your cybersecurity strategy?

Ben: Yeah. So I think there's a few different angles there that all complement each other. So I think no matter what, you can't ignore culture. You have to think about culture, you have to think about education. You could add a whole bunch of security staff, but if you can get 100,000 employees or 10,000 or 1,000 or whatever you have just acting a little bit better, more intelligently, being a little bit more careful with what they do, that's way more impactful than adding 10 awesome headcount, right? Simply because of numbers and statistics. So you have that angle, the whole people and culture angle.

From a more trend and technical angle, what we're seeing, and this is why we decided to focus on what we now call intelligent identity protection, is that with this whole move to the cloud and a lot of end users and employees becoming more mobile and roaming, whether it's coffee shops or personal devices or whatever, you're losing control of the network. And a lot of times you're losing control of the device, the client device, the endpoint device, and even the backend, because the backend is now cloud. So really the only thing you can do is try to manage that access: who is accessing what, is that appropriate access, and then what are they doing with it?

So you then have to start to think about, "Okay, how would I analyze things like behavior? Does this look like this person? Does it look malicious from an insider perspective?" And then the other angle I want to talk about which fits into both of those is, if I'm an attacker, I want access. Access is king, I'm coming after your credentials. If I can get credentials and login as you, I'm not installing anything, I'm not doing anything crazy from like a system perspective. I'm just logging in using things the appropriate way, just pulling your documents, your database. Right?

And I don't mean to trivialize it, it's hard. Bad guys have to steal credentials and try to blend in and stuff like that. But the problem is it's still too easy. We see all the time that people lose passwords, and then the adversary logs in and the rest is history. So I think those are the reasons: it's cultural, and that's where you get a lot of improvement; you need to understand access and what the users are doing, because that's really the attack vector; and credential theft, credential stuffing, that kind of stuff has been a pretty sore spot.

Chris: So yeah. In your intro, again you asserted that in any organization it's highly likely that some user's credentials have already been compromised. So why do you think credential compromises like these are so commonplace?

Ben: Well, first of all, I think it's still a pretty sad state around password reuse and some of those things we just treat as basic no-nos, but they still happen. And then there are these massive credential dumps. And again, if I'm an adversary and I'm trying to get in, the first thing I'm trying to do is get as many credentials as possible so that if I get booted out, or if I install an implant and that gets detected, I can still try to get in through other mechanisms. Right? So credential loss is huge, and a lot of it, again, comes back to the humans reusing passwords or just not being careful with that access.

Chris: So it sounds like education is probably a big part of what you're advocating for. Do you believe in things like password managers, or is it more about learning to write memorable but unique passwords, things like that? What are your strategies in general for fixing bad credential hygiene?

Ben: Yeah, I think you nailed it. I would say if there's only one thing you could do, because of time or cost or whatever, it's get a password manager, because whether it's your consumer and personal identity or your professional identity, so much comes down to that access and that identity. And if someone can become you, then they all of a sudden can pretend to be you, like we see in business email compromise and getting people to wire money, or they access your data, things like that.
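One concrete step, beyond adopting a password manager, is checking whether a password already appears in known breach dumps, which is exactly the raw material for the credential stuffing Ben describes later. Here is a minimal Python sketch using the public Have I Been Pwned "Pwned Passwords" range API; its k-anonymity design means only the first five characters of the password's SHA-1 hash ever leave your machine (the function name and usage are illustrative, not from the episode):

```python
import hashlib
import requests

def times_pwned(password: str) -> int:
    """Return how many times a password appears in known breach corpora (0 = never seen)."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    # k-anonymity: only the 5-character hash prefix is sent to the API.
    resp = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}", timeout=10)
    resp.raise_for_status()
    for line in resp.text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

if __name__ == "__main__":
    # Any password with a nonzero count has been exposed and should never be reused.
    print(times_pwned("correct horse battery staple"))
```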

Chris: Okay. So what cybersecurity policies do you think are being prioritized too much at the expense of more useful strategies? Basically, if users are the real hotspot, maybe people are spending too much on tech to work around them. Are there things that you think are not getting the best return on the investment, things people are spending too much money on that aren't really doing the job?

Ben: Yeah. So, of course I'll preface this with every organization is different, but I think in general there's a couple of different phrases we could use. The first, and this is one I like to use, is defenders defend infrastructure, attackers attack humans. And so those don't really line up. The defenders are still thinking about IP addresses and host names and things like that, whereas the adversary is often thinking about, "Hey, how do I get this person's credentials?" Right? So there's that.

The other thing, and I can't remember who said it, but I like it, is defenders think in lists, attackers think in graphs, which is kind of the same idea: relationships, and how would I move around and get to where I ultimately want to be? So a couple of those are more philosophical; they're about approach, about training. I think a lot of people did grow up in the deep packet inspection and NetFlow world, and there's still a lot of value there, but as the world becomes more and more cloud and fragmented, then what network are you collecting traffic on? But coming back to your question, I'm very outspoken about this: all of us buy security tooling, and I'm just making numbers up here, I'll be honest, but my estimate is you're probably getting a maximum of like 50% of the value from what you're buying.

And that's maximum because there's so much these tools could actually do, or the data that the tools collect you could actually use, and it's maybe not even just security. Maybe there's other aspects of IT that could benefit from the visibility the security tool provides or whatever. But the point is, people don't usually have time to continually tune and optimize what they've bought. And so they never quite get the full value that they pictured when they saw the first demo or when they've made that PO. So I think there's a lot there.

And then the last thing I'll say is making the tools work more together. So I do think over the last, let's say four years, there's been a good surge around integrations and orchestration and automation, all that stuff, but people still haven't taken it to the next level to truly gain all the efficiencies they could. It's sort of like, "Hey, let's plug these things together and sort of integrate." But then, from a vendor perspective, let's go do a whole marketing campaign that we now integrate, and we never actually achieve that deep integration that really saves people a ton of time. Right? It saves some time, but it could be one plus one equals three, and instead it gets to like one plus one equals 2.2, and then every vendor moves on to the next integration. So it helps, it's worth it, but it's never quite where it needs to be. So, I could rant all day about all sorts of stuff we could do better, but those are some of the things that I think about often.

Chris: Well, those are two really systemic things that I think are worth bringing out into the public, though. So what do you recommend for security departments that are already stretched thin or short-staffed? They get their new toy out of the box and immediately install it, and then, "Oh my God, this deadline is coming up, we don't really have time to optimize it." Do you have any tips for finding a way to slow down and optimize? Are there go-to things that all security packages could do better, that you can look for in whatever you bought?

Ben: Yeah. So I mean I think we could sit here for four hours, so I've got to be careful, because I don't want to take up all the time. But I've started giving a talk called Lean Hunting, and I think this year I'm going to start giving a talk at different conferences on something I'm just calling Lean Security. It's again about applying an entrepreneurial mindset: how do we get more with less? How do we squeeze more value with fewer headcount or with fewer tools? Now, it seems like most organizations can get enough budget to get some tooling and some headcount, sometimes quite a bit, but still, whether you don't have the tooling or you don't have the headcount, usually you're missing something. Right? So how do we get more out of that?

So I think to start with it's, "Okay, what do I have in my possession now, where if I just squeezed a little bit more value out of the logs going into a SIEM or ELK or something like that, what could I do with it? Or if I just write this extra little script that automates the de-provisioning of an account through PowerShell or something, it actually saves time, because we can do that much more quickly than the way we do it now through IAM or whatever." There's all these different ways. So I think it starts first with a mindset and approach of this hungry, "I'm going to build, I'm not just going to analyze data. I'm going to be an engineer in security." By the way, I've talked to probably 800 organizations now through Carbon Black and Obsidian, and the number one difference between the best and the, we'll say, not best is: are you approaching everything from a more engineering perspective, or are you just approaching it as an analyst that's looking at data and reports?

So if you can start to shift your team or your organization to more of an engineering mindset, that's going to give you a large return. The other thing is blend open source and commercial. So maybe in certain areas commercial is way better or it's just way easier to manage or whatever, okay, cool, go do that. But there are other areas where, for example, you throw data into ELK or you use osquery or other tools that have actually made a lot of progress, where you can still improve your team or your environment without a lot of spend. Now, when I talk to other new vendors, the problem is a lot of vendors only think about cost in terms of financial cost. But you, and I'm sure all the listeners, realize there's the time that as a practitioner you're working with your procurement team to get that tool in house, the time you're trying to work with IT to get a server to install something, the time you're actually going through training. There are so many costs, right? So you have to figure out how all that works together.

And then the last thing I'll say, and again I could keep talking about this, is coming back to something we just talked about a few minutes ago, which is, do you have existing tools or does someone maybe in another team have existing tools that would help you do something better? IR, hunting, policy management, whatever, that maybe wasn't designed exactly for your problem but you could use that and squeeze that value out of it? So, I think there's a lot of opportunities to win here but mostly it starts with that philosophy and approach around, how can I get more with what I already have?
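To make the open-source angle a little more concrete: osquery, which Ben mentions, exposes endpoint state as SQL tables, so even a short script can squeeze extra hunting value out of machines you already manage. A rough Python sketch that shells out to the local osqueryi binary (it assumes osquery is installed and on the PATH; the specific query is just an illustrative hunt, not a recommendation from the episode):

```python
import json
import subprocess

def osquery(sql: str) -> list[dict]:
    """Run a query through the local osqueryi binary and return rows as dictionaries."""
    result = subprocess.run(
        ["osqueryi", "--json", sql],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

# Example hunt: which processes are listening on network ports right now?
rows = osquery(
    "SELECT DISTINCT p.name, p.path, l.port "
    "FROM listening_ports l JOIN processes p USING (pid) "
    "WHERE l.port != 0;"
)
for row in rows:
    print(f"{row['port']:>6}  {row['name']:<24} {row['path']}")
```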

Chris: Yeah, engineering mindset but also collaboration with other departments and stuff.

Ben: Oh, sorry. There's one other thing I want to say. Here's one more thing, which is a little bit different because you said collaboration: if some team just went through months of retooling and building out a nice tech stack and building some playbooks and whatever, can you, at another company, learn from that team? And conversely, if you just went through that, can you share it? We've focused so much on sharing threat intel and threat feeds and IOCs and whatever else you want to call them, intelligence, because there was such a need. Can we actually share what's working and what's not from a tech stack perspective, from a tooling perspective, from just an approach perspective, how we operate, how we build out our security program? Because someone else just spent months figuring that out, and it might not be perfect, but you can learn from it. Their months of effort, you talk to them for an hour, and you might have absorbed a lot of that. So we've got to start sharing more of the approach, the guiding principles, the tech stack, and not just focus on things like threat intel.

Chris: So to move from the best practices to, as you said before, the not-best practices, you mentioned previously that one topic you like to speak about is the data privacy and security practices of the world's largest tech companies and where they're falling short. Since security companies should ideally be the most knowledgeable about the best practices to take, where do you believe they can stand to improve their data privacy or security postures?

Ben: Yeah, so security versus privacy is a never-ending pendulum and battle. I was just on a panel last night and we were talking about everyone's favorite, GDPR. It's challenging because there are a lot of rights and a lot of privacy that I really do think we should expect as consumers, but it gets much fuzzier when you become an employee, right? Because you're using the employer's information systems, you're working on employer IP, whatever, right? How much of a right to privacy does the employee deserve, versus the ability for the environment to be monitored for cyber attacks and threats, insider threats, et cetera? So I don't want to go too much further down that, because we could talk about it for the rest of the day, but there is this challenge where a lot of these laws are starting to pop up that are geared more toward consumers, where it's really more about educating people that, "Hey, you are the product, because you're using this service or social media or whatever for free, so your data is going everywhere." And then how does that apply to enterprises and corporations?

From a data privacy and corporation perspective from like a social media provider or things like that, I think on the positive side, most of the big guys have great cybersecurity teams. So, they've invested heavily in preventing unwanted compromise kind of thing. Where it starts to get a little less good we'll say is they're essentially in the business of selling the data or chopping it up and finding different ways to utilize it. And every copy of data is a liability and so when you start to have these large datasets and it's very easy to click a button and copy an S3 bucket or shift some data from one provider to another, move from one cloud to another, whatever, very quickly you can lose track of the data.

And I think that's where a lot of work still needs to be done of, "Okay. Yeah, sure you can hold my data, you can hold my patient data, I don't know, cell phone location data if you're my cell phone provider, whatever, stuff that you need to operate. But then if that starts to flow into third parties or go other places, do you essentially have some sort of chain of custody or some sort of tracking to at least understand where it's going?" And I don't think anyone's doing that very well right now.

Chris: So to move on to another topic, and to get our listeners caught up if they don't know about it, what are credential stuffing attacks, why are they on the rise, and how can enterprises protect themselves against them?

Ben: Well, basically, there are lots of credential dumps, right? So there are lots of sites, databases, et cetera, where you can just go harvest: I'm going to go collect a million logins to some service or whatever. And then I'm going to go try your Gmail username and password against all these sites, because you probably use your email as your username, because that's usually the default, and then see if you didn't change your password. Right? Or even things like work emails: if I collect your work email credentials, maybe I can use those to sign into a bank or something like that. So it's all about trying different credentials that have been seen and known to work somewhere. At some point in time, this username and this password worked for you, so I, the adversary, am going to try that against these various sites, thinking that you probably have Facebook, Gmail, Bank of America, JP Morgan, whatever, all these different sites.

And so back to your password manager question, a lot of times the person that's outside of the cybersecurity world is like, "Well, how are they guessing my password to Gmail or whatever?" It's like, "No, they didn't. They compromised that tiny little yoga studio or the pizza place or whatever where you signed up for an account, and then they used that to get into the major places that actually have really good security." So I think it really comes down to a couple of things. The first is, if we can just make password reuse go away through whatever means, that's one huge thing. Number two, though, is better monitoring. Are we seeing strange logins, or logins from devices that we don't expect, things like that? I think you're starting to see some of that from Google and others, where they say, "Hey, this device just logged in. We've never seen it before. Just wanted you to know," kind of thing. Right? It's not perfect, there are some ways adversaries can maybe get around that. But in general we're starting to move in the right direction around analyzing where people are accessing things from and what they're doing. And that's what we're trying to do more of on the enterprise side.

But it's this kind of combination approach: A, reducing the likelihood that stuffing would even work, and that includes things like multifactor authentication; and B, making sure that there's some analysis of, "Hey, this login happened and then this activity started happening. We've never seen you log in and just download every file on the file share. That's kind of weird. Maybe it's okay, but let's at least check." So I think it's a combination of those things.
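Credential stuffing also has a recognizable shape in authentication logs, which is why the monitoring half of that combination is tractable: one source hammering many different accounts in a short window, mostly failing. A minimal sketch of that heuristic in Python; the event fields and the thresholds are illustrative assumptions, not taken from any particular product:

```python
from collections import defaultdict
from datetime import timedelta

WINDOW = timedelta(minutes=10)
MAX_DISTINCT_ACCOUNTS = 20  # illustrative threshold; tune per environment

def flag_stuffing_sources(events: list[dict]) -> set[str]:
    """Flag source IPs that fail logins against many distinct accounts in a short window.

    Each event is assumed to look like:
    {"ts": datetime, "src_ip": str, "user": str, "success": bool}
    """
    failures = defaultdict(list)  # src_ip -> [(ts, user), ...] for failed logins only
    for e in sorted(events, key=lambda e: e["ts"]):
        if not e["success"]:
            failures[e["src_ip"]].append((e["ts"], e["user"]))

    suspicious = set()
    for src_ip, attempts in failures.items():
        for i, (start, _) in enumerate(attempts):
            users = {u for ts, u in attempts[i:] if ts - start <= WINDOW}
            if len(users) >= MAX_DISTINCT_ACCOUNTS:
                suspicious.add(src_ip)
                break
    return suspicious
```

Real detections layer in more signal (success-after-failures, impossible travel, known proxy ranges), but the core idea is this simple.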

Chris: So I imagine that, theoretically, we're probably a long way down the line, but it might be possible to implement something for your login and password the way we currently do with banks, where they see you suddenly taking out $5,000 worth of iTunes credits or whatever, and they call you and say, "Wait a minute, is this actually you?" Is that a possibility, where your email provider or whatever can say, "Hey, your username and password have been used in these 17 different places within five minutes"?

Ben: Yeah, I think everything's luckily moving in that direction, which is positive, with analytics or machine learning or other techniques. It's usually a combination of them, where it's basically comparing the current behavior against what you've typically done, and checking if it deviates enough, which is basically how your credit card fraud detection works, right? It's like, "Hey, this is very unlike you. It might still be you, but I'm going to check."
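That "compare against what you've typically done" idea can start out very simply. A hedged sketch: score today's behavior, say files downloaded per day, against the user's own history with a z-score and flag large deviations, the same basic shape as credit card fraud scoring. All numbers and thresholds here are made up for illustration:

```python
from statistics import mean, stdev

def deviation_score(history: list[float], today: float) -> float:
    """How many standard deviations today's value sits above this user's own baseline."""
    if len(history) < 2:
        return 0.0  # not enough history to judge yet
    sigma = stdev(history)
    if sigma == 0:
        return float("inf") if today > history[0] else 0.0
    return (today - mean(history)) / sigma

# Example: a user who normally downloads ~20 files a day suddenly pulls 400.
baseline = [18, 22, 25, 19, 21, 17, 23]
score = deviation_score(baseline, 400)
if score > 3:  # illustrative alert threshold
    print(f"Unusual activity: {score:.1f} standard deviations above baseline")
```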

Chris: Mm-hmm (affirmative). Yeah. So, at the time we're recording this, about two days ago we had an election here in Chicago for the mayoral runoff. I previously spoke with John Dickson at the Denim Group about the hacking that happened to voting machines in the 2016 elections and the possibility for the same in 2018. Did we actually see a measurable level of voting machine fraud during the midterms? I don't remember seeing much reporting on it compared with 2016, so I'm curious if anything happened.

Ben: Well, I think it was even just this morning or yesterday, one of the senators, and it's been a busy day, so I blank on the name, I apologize. But one of the senators sent a letter saying, "Hey, we can't say there's no evidence of election machine tampering because we haven't really looked for evidence. So we haven't done forensics on the machines." It gets very crazy because there's very... And I'm not a lawyer, but there's definitely challenges because some stuff has to be done more at the federal level, but a lot of voting really is still empowered by the states or the states are empowered to conduct voting without sort of federal interference too much. It's a very kind of tricky line. I don't personally know. I mean what I would say is, I don't think things have improved very much. So, I think that's a major concern.

Back in 2016, one of the concerns, or two major concerns, were machines that don't have a paper trail. And literally, so I was living in Chicago at the time, Chicago, I can't remember, I apologize, I can't remember if it was Cook County or maybe one of the counties in Indiana right there around Lake Michigan, was literally using digital voting machines that had already been banned in California for being too insecure. So, to the earlier discussion around threat intel sharing or best practice sharing, but from an election perspective, it's like, "Hey, we spent all this time determining that these machines are not good enough. Why are you, who have maybe millions of people in your precincts or whatever, still using them?" So, I can't say there's been any evidence or anything like that, but I don't think we've moved forward enough in terms of just improving the standard practice.

Chris: So in addition to firmware updates and more up to date tech and stuff, what do you feel nationally is the current state of election security and what can be done to improve the nation's cybersecurity hygiene across the whole ecosystem, not just Chicago?

Ben: So a couple of minor questions there, right?

Chris: Yeah. Nothing that affects every single thing going forward or anything.

Ben: Well, I think elections... So on the one hand, you sort of get something almost like a stock portfolio, where you diversify so that if something has a big problem, you're still okay, and it's kind of the same thing with tech. But you don't really want security through obscurity, and you don't really want security just through diversification. You need to have a reason. And so I think there needs to be more standardization, actually, around, "Hey, these are very well-built machines and there are audit trails and things like that." So I think sharing and spending money is required. I think a lot of the time the people that are working the machines aren't the most sophisticated because, let's be honest, usually it's people that maybe have a little bit more time and therefore aren't in the corporate rat race and maybe just haven't been as technical-

Chris: Or the volunteers who have another job and don't have time to...

Ben: Right.

Chris: Yeah. Right.

Ben: Right. And they're not supposed to be experts. And then, kind of zooming out, I think from a cybersecurity perspective, things are getting better. Things like the iPhone and some of these other technologies that have become very popular have just raised the bar tremendously. The fact that your phone is usually encrypted by default and maybe requires biometrics and things like that, for the average person, has improved things. You start to find ways to make multifactor authentication or two-factor authentication more on by default or easier to use. I think that raises the bar. From a company perspective, the challenge is still that it's like someone's throwing missiles at you, but the government is saying, "Hey, it's your problem," because the federal government is not trying to defend all the corporations and enterprises.

So, I think there it's around, as we refresh our tech and we start to move things to cloud and just renew our laptops and things, making sure that this time around we've thought about security and privacy more from the beginning. And then finally it comes down, again, to people. The awareness, the culture, the approach. Like, "Hey, stop clicking on stupid stuff, or stop reusing passwords." If we just did those two things, whether it's on the work computer or the home computer, everything's going to get better. It's not going to be perfect. We're still going to have attacks. But it's things like that, where if we can just get more of the population or more of your employees to do those things, you get better.

Chris: I'm sort of springing this one on you, this was not on the list, but I was thinking, in terms of election security, and I don't know if this is opening a new can of worms or making things worse, do you think there's still a benefit to having polling locations with secure computers that you have to go to? A lot of people say it seems like we should be able to vote on our own computers or our phones by now, or something like that. Would that be a further security risk, or is having these insecure places to do voting actually more insecure? Is that even a possibility in the future, do you think?

Ben: Yeah, so first of all, I'm not really, I don't have data to support-

Chris: Yeah, this is pure philosophy, so feel free to...

Ben: Yeah. So we're just sort of brainstorming here. I think we still need to go to a physical location for the foreseeable future, more so because we need to really think about all the ramifications and the attack vectors and the possibility of tampering or compromise. So, in my head, it's not because of some exact reason why we need to hold off a little bit, other than that we need to think about it, really discuss it, and really flesh out the attack vectors and things like that.

Chris: It has to be an ironclad security strategy before you start thinking about-

Ben: Yeah. And look, the reason the internet's a combat zone is because there's very little attribution, right? If everyone has to have attribution, like Estonia did, where you have to use a card tied to your person that says, "I am Ben and I'm getting online with this IP address and this email address and stuff," cybercrime goes way down, because everything can be tracked back to an exact human, in theory. But the flip side is, "Okay, how do I prove that I am the one that submitted that vote? Or how do I prevent someone else from voting on my behalf just because they figured out my token or my credentials or whatever it is?" So those are the ways where I think we need to think more about just the ramifications and the attack vectors. And I do think having to go to a polling place and being on a pre-generated list, those aren't perfect things, but they're deterrents. It's hard to do it at scale, hard to attack that at scale without people noticing.

So, I think my preference is, let's get a little bit more paper involved in the election process until we truly have a better technical solution, and then maybe we go kind of hardcore into the technical solution.

Chris: Polling place as old-school two-factor authentication.

Ben: Yeah.

Chris: So as we start to wrap up here, what other credential based cybersecurity emergencies do you think are on the horizon? Where do you think the next wave of cyber crime is going to come from? Is it from either a tool or a technique point of view?

Ben: So, well, going back to the credential stuffing thing for a second, or just account compromise, what's interesting is the adversaries iterate and innovate, and they're not always super sophisticated, but there are enough sophisticated ones that sort of raise the adversary's game. Think about tools like Mint or some of these other luxuries or nice things that help you manage your finances, or where you can log into one place and then it can log into your Twitter and Facebook and all these other things. The reason I bring that up is the adversaries use those, because what they do is they log into something like Mint or some of these other tools, or even a bank account that allows you to connect to another bank account, and then they can test credentials through Mint. Then all Bank of America or JP Morgan or Citibank or whoever sees is that they keep getting failed logins from Mint. So it's not like it's coming from Russia or China or wherever.

So that's just an example, but it's an example of people getting a bit more sophisticated in using our own tools against us. So there's that. I think the other thing is, and this is quite frankly a massive problem, the cloud providers are not responsible for all of your security. They're responsible for security at the foundational level, the infrastructure level; they say they're responsible for security of the cloud, but you're still responsible for security in the cloud. The reason I bring that up is, we talk to organizations over and over again who race to put their application in the cloud or use some SaaS-based service or whatever, it could be Slack, G Suite, Office 365, Salesforce, whatever. And people just think that this cloud provider is doing all the security for them, which I understand why they might think, but it's not true.

The truth is that cloud provider might be securing the underlying routers or patching the underlying Linux or whatever the operating system is, but you're still responsible for making sure the right people have the right access, they're doing the right things, they're not leaking data, that kind of stuff. And that's actually a huge problem because these teams can whip out a credit card, sign up for AWS in a minute, spin up clusters that are costing thousands of dollars and doing compute, storing data, whatever. And really you just have to hope that the default security policy is good enough or someone didn't click the wrong thing. So I think that's a huge problem.
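As a small illustration of that "security in the cloud is still your job" point: even with AWS securing the undercarriage, whether your S3 buckets block public access is entirely on you, and it is checkable in a few lines. A rough sketch using boto3 (it assumes AWS credentials are already configured locally; treat it as an illustration, not a complete audit):

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def public_access_fully_blocked(bucket: str) -> bool:
    """True only if all four S3 public-access-block settings are enabled for the bucket."""
    try:
        cfg = s3.get_public_access_block(Bucket=bucket)["PublicAccessBlockConfiguration"]
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            return False  # nothing configured at all
        raise
    return all(cfg.values())

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    if public_access_fully_blocked(name):
        print(f"{name}: ok")
    else:
        print(f"{name}: REVIEW - public access is not fully blocked")
```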

And then the other thing is just, how do we allow people to be productive and have a good experience as an employee or consumer, knowing that a lot of what they want to do is on their personal device? So, your phone or your tablet or whatever essentially has parts of your professional identity and your personal identity, and I think we're going to have to figure out how those unify but also stay separate. And I don't think anyone's really solved that yet.

Chris: All right. So for a bonus round here, could you tell our listeners a little bit about the project that the Obsidian team has been working on? I've heard it leverages data science, AI, and machine learning to drive identity intelligence.

Ben: Yeah. So, we collect activity logs and other information from all these different applications and cloud providers. We unify that, and then we really try to surface to the security team and the IT team where access creep or identity creep has occurred. So, basically, in every organization you probably have accounts that you don't use. You probably were given an account when you started and then you never logged in. So, where do things like that occur? Where do you have extra surface area that really shouldn't exist? It might cost you money, but it certainly adds risk. And then we're also looking to detect threats and then help respond. So we're really trying to understand how the employees are utilizing these different applications, whether they should be doing what they're doing, whether they should have that access, and then how you can right-size all of that.
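The access-creep idea can be illustrated with something as basic as flagging accounts that exist but never get used. A minimal sketch (the data shape and the 90-day cutoff are assumptions for illustration; this is not a description of how Obsidian's product actually works):

```python
from datetime import datetime, timedelta, timezone

DORMANT_AFTER = timedelta(days=90)  # illustrative cutoff

def dormant_accounts(accounts: list[dict], now: datetime | None = None) -> list[str]:
    """Return names of accounts that have never logged in, or not within the cutoff.

    Each account is assumed to look like: {"name": str, "last_login": datetime | None}
    """
    now = now or datetime.now(timezone.utc)
    flagged = []
    for acct in accounts:
        last = acct["last_login"]
        if last is None or now - last > DORMANT_AFTER:
            flagged.append(acct["name"])
    return flagged

accounts = [
    {"name": "alice", "last_login": datetime.now(timezone.utc) - timedelta(days=3)},
    {"name": "old-contractor", "last_login": datetime.now(timezone.utc) - timedelta(days=400)},
    {"name": "never-used-svc", "last_login": None},
]
print(dormant_accounts(accounts))  # ['old-contractor', 'never-used-svc']
```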

Chris: And if our listeners want to know more about Obsidian, where can they reach you?

Ben: So obsidiansecurity.com or @obsidiansec. And I'm ben@obsidiansecurity.com.

Chris: Great. Ben, thank you very much for being here today. This is a lot to think about.

Ben: My pleasure. Thanks for having me.

Chris: And thank you all for listening and watching. If you enjoyed today's video, you can find many more on our YouTube page. Just go to YouTube and type in Infosec CyberSpeak to check out our collection of tutorials, interviews, and past webinars. If you'd rather have us in your ears during your work day, all of our videos are also available as audio podcasts. Just search Infosec CyberSpeak in your favorite podcast app. To see the current promotional offers available for podcast listeners and to learn more about our Infosec Pro Live boot camps, Infosec skills on demand training library, and Infosec security awareness and training program, go to infosecinstitute.com/podcast or click the link in the description. Thanks once again to Ben Johnson and thank you all for watching and listening. We'll speak to you next week.

 

Join the cybersecurity workforce

Are you a cybersecurity beginner looking to transform your career? With our new Cybersecurity Foundations Immersive Boot Camp, you can be prepared for your first cybersecurity job in as little as 26 weeks.


Weekly career advice

Learn how to break into cybersecurity, build new skills and move up the career ladder. Each week on the Cyber Work Podcast, host Chris Sienko sits down with thought leaders from Booz Allen Hamilton, CompTIA, Google, IBM, Veracode and others to discuss the latest cybersecurity workforce trends.


Q&As with industry pros

Have a question about your cybersecurity career? Join our special Cyber Work Live episodes for a Q&A with industry leaders. Get your career questions answered, connect with other industry professionals and take your career to the next level.


Level up your skills

Hack your way to success with career tips from cybersecurity experts. Get concise, actionable advice in each episode — from acing your first certification exam to building a world-class enterprise cybersecurity culture.