How to get started with bug bounties and finding vulnerabilities

On this week's Cyber Work Podcast, Bugcrowd and disclose.io founder Casey Ellis discusses how to think like a cybercriminal, the crucial need for transparent vulnerability disclosure, the origins of Bugcrowd, and why mentorship is a gift that goes in both directions.

– Get your FREE cybersecurity training resources: https://www.infosecinstitute.com/free
– View Cyber Work Podcast transcripts and additional episodes: https://www.infosecinstitute.com/podcast

  • 0:00 - Intro
  • 3:15 - Getting into cybersecurity
  • 4:30 - Criminal mindset in cybersecurity
  • 5:49 - Ellis's career to date
  • 9:10 - Healthcare cybersecurity
  • 11:47 - Mentoring others
  • 13:52 - Mentorship as a two-way street
  • 16:12 - Bugcrowd and bug bounty
  • 19:18 - Vulnerability disclosure project
  • 21:30 - Bug bounty popularity
  • 24:52 - U.S. sanctions on hacking groups
  • 26:52 - Hiring hackers
  • 31:52 - Pursue specialization
  • 33:51 - Cyber threats flying under the radar
  • 39:17 - Working from home safely
  • 40:48 - How to get into bug bounties
  • 42:18 - How to report vulnerabilities
  • 44:04 - Advice to begin ethical hacking
  • 45:23 - Learn more about Ellis
  • 45:56 - Outro

[00:00:00] CS: Today on Cyber Work, I got to speak with Casey Ellis, founder of Bugcrowd and disclose.io. Casey discusses white hats learning to think like black hats, the crucial need for transparent vulnerability disclosure, the origins of Bugcrowd, and why mentorship is a gift that goes in both directions. All that and more today on Cyber Work.

[00:00:26] CS: Welcome to this week's episode of the Cyber Work with Infosec Podcast. Each week, we talk with a different industry thought leader about cybersecurity trends, the way those trends affect the work of infosec professionals, and offer tips for breaking in or moving up the ladder in the cybersecurity industry.

Casey Ellis is the Chairman/Founder and CTO of Bugcrowd. He is a 20-plus-year career veteran in information security and has been inventing stuff and generally getting technology to do things it isn't supposed to since childhood. Casey has worn a variety of professional hats, working as a pen tester, security risk consultant, solutions architect, chief security officer, and most recently as a career entrepreneur and company leader. Casey pioneered the crowdsourced security-as-a-service model, launching the first bug bounty program on the Bugcrowd platform in 2012, and co-founded the disclose.io vulnerability disclosure standardization project in 2014.

Casey is a sought-after industry visionary, media commentator and keynote speaker, and has advised the U.S. Department of Defense, the Australian and UK intelligence communities, and U.S. House and Senate legislative initiatives, including preemptive protection of cyberspace ahead of the 2020 presidential elections. A proud native of Sydney, Australia, Casey lives with his wife and two kids between Sydney and the San Francisco Bay Area. He is happy as long as he is pursuing potential.

So a lot of what I want to talk to Casey about is in that intro there. I'm very excited to talk about Bugcrowd and disclose.io. And Casey also asked to be on the show because he wanted to talk about hiring an army of hackers to hack back, or to counteract hacking in an organizational sense. So I'm looking forward to hearing more about that.

Casey, thank you for joining us. Welcome to Cyber Work.

[00:02:11] CE: Thanks for having me.

[00:02:13] CS: So you mentioned it in the bio, and I always like to start here. You got interested in computers and tech pretty early it sounds like. So what was your first attraction to it? And what got you excited specifically about cybersecurity? What was the initial draw?

[00:02:27] CE: Yeah. My father was a science teacher, so I think what that netted out to is I just always had technology around. He'd bring stuff home – first computers coming in, all those different things, but also stuff like radios, and lasers, and whatnot. I was pretty fascinated by physics growing up. And it gave me a lot of opportunity to get my hands dirty and see how things work.

I think, combined with that, I started to recognize this appreciation, I guess, for criminal creativity. Just looking at it all, understanding how criminal behavior worked, how criminal enterprise worked, and being fascinated by it, but having absolutely no desire to be a bad guy myself. When it all came together and I was able to form a career out of it was pretty much straight out of high school. I actually picked up a network engineering role, started hacking stuff at the behest of clients in that role, and ended up realizing that pen testing was a profession I could get into. That was pretty much all my Christmases coming at once, and that's where it all started out.

[00:03:27] CS: That's fantastic. So can you talk to me a little more about the criminal mindset that you learned about early on? You said, obviously, that you weren't interested in going down that path. But can you walk me through some of the facets of the criminal mindset that especially excited you?

[00:03:42] CE: Yeah. Just the idea of almost solution engineering without rules. And in some ways, you can see the transference of that off into entrepreneurship as well. But yeah, for criminals, basically they've got their own business model. They've got their own job that they need to get done. And they do it in a way that has to make its way around all sorts of defenses, but also is kind of blind to the rules and laws that might say, "No. Don't do that in the first place."

And I think what that does is create this environment for solution development, almost with pure pursuit of possibility as its primary objective. That's the part that I just get intrigued by. And again, it's an interesting kind of duality that that sets up, because "do no harm," like an ethical kind of – that's not even something that I had to be necessarily taught or reminded of. It's just the kind of person that I am. So trying to reconcile those two things, and trying to figure out how I could make a career out of this when I realized that was actually possible, was pretty cool.

[00:04:46] CS: Yeah. That's awesome. So moving on to your career background. You talked a little bit about where you got started, but I like to research our guests by looking through their LinkedIn profiles, because they almost always tell a story. So you entered networking and information security pretty quickly out of your schooling, which you just mentioned. In 2009 you founded White Label Security, which provided information security services on behalf of IT organizations. And in 2012, of course, you founded Bugcrowd, the crowdsourced organization responsible for creating the first bug bounty programs. In between all of this, you also have ongoing mentor, advisory, and board roles for companies like Tall Poppy Group, CyRise, Flirtey, Startmate, and the CTI League. So can you untangle some of these many organizations and what you do for each of them? And also, tell me how you manage to find 12 extra hours in the day, because that's quite a stack.

[00:05:36] CE: Yeah. I don't. I don't. And there's definitely – I think, just on the second part around time management: it excites me. I love what I do. I love the opportunity to learn new things, to take what I've learned and be able to help other people, whether it's all in the interest of trying to get the good things out and create solutions that improve the state of the world in general, but also have the opportunity to generate wealth and be viable businesses and all that kind of stuff. It's just something that I get a kick out of.

So I kind of make time for it in that sense. But also, it is a lot. So I think time management and prioritization is really how I try to squeeze all of that into a 24-hour day. Hopefully not all of which I'm actually spending working, because I've got a family and all those other things to consider as well.

But yeah. So Tall Poppy Group is actually the consulting company that I started when I first basically cut the leash from salaried employment and made the decision to become an entrepreneur. That was 2006 – no, 2008, from memory.

White Label Security is actually the precursor to Bugcrowd, because we were basically white-labeling security services to sell in Australia, bringing them in, finding the best talent, and doing that all by hand. I think we're coming back to that in a later question, so I won't go too much further into that.

CyRise and Startmate are both technology accelerators. CyRise has a focus on cybersecurity. Startmate is actually where Bugcrowd got its start. Plus, it has the most Australian name I think possible, Startmate, with the weird way it's spelled and you're saying "mate" as well. So that's kind of fun.

[00:07:20] CS: Yeah, phenomenal.

[00:07:22] CE: CTI League is a cyber threat intelligence group. That was really formed to basically get as many people together to help the healthcare sector fight off adversaries in the cyber domain as Covid started to break out. So that was really – It continues on today. But particularly in those early phases as everyone was trying to figure out what's what, that was a very busy time.

[00:07:46] CS: Yeah, I can imagine.

[00:07:46] CE: It's been a phenomenal group. And then Flirtey is a drone delivery company. My interaction with them is really about the cybersecurity aspect of that. I'm a part of that because I actually see it being a thing that's eventually just going to be real in terms of how we all live. And I wanted to see around the corner on that one and figure out how it's all shaping up.

[00:08:07] CS: So I'm curious about the CTI League. You said this was started during Covid, and it's sort of related to healthcare security. Is it about people being phished? Because there's so much stuff about phishing, like, "Secure for Covid. Click here," or whatever. Is it related to that, or is it related to healthcare data?

[00:08:31] CE: All of the above.

[00:08:32] CS: All of the above. Okay.

[00:08:32] CE: Yeah. Essentially, the core problem that the founders of that group saw that they needed to solve quickly is this very diverse array of basically evolving cyber threats that would be triggered by Covid, which is pretty much exactly how it played out. So yeah, the spammy, scammy kind of end-user stuff that you just mentioned – there's definitely an aspect that speaks to that – disinformation, information warfare, just literally healthcare organizations that are pivoting to work from home so they end up with exposed infrastructure on the Internet. There's just a whole laundry list of things, and pretty much people from each of those domains got brought in. And it's almost like a tasking system, in a sense, that gets the right people with their eyes on the right subject.

It's a fascinating thing to be involved in, because of this idea of, at that point, almost crowdsourcing an ISAC, in a sense – an information sharing and analysis center – to get the right people onto a task as quickly as possible. With healthcare, it's urgent, right? So it's one of those ones where they needed a lot of help. And that's essentially what it is.

[00:09:51] CS: Yeah. You said that, obviously, it was a huge time crunch at the very beginning there. But I imagine there's still probably an awful lot of things on the to-do list, right?

[00:10:00] CE: There's still a lot going on, yeah. The time crunch at the beginning of Covid – I've got an expression that I throw around, that speed is the natural enemy of security. And I've modified that somewhat to say that haste is the natural enemy of security. If you're doing things quickly because you feel like you have to, then doing those things in a manner that's secure and that manages risk well generally gets de-prioritized, because you're just trying to get the thing done. And that creates a lot of unintended consequences that people can take advantage of. So that was the early – probably the first six months of Covid, a lot of that. And it's settled into more business as usual now.

[00:10:39] CS: More ongoing counter information and so forth.

[00:10:42] CE: Exactly. Yup.

[00:10:43] CS: You mentioned, or I mentioned, the mentoring groups and roles that you've had in your past. Can you talk about the role of mentoring others in your cybersecurity career? Did you have mentors in your life, and do you have mentees that you pay it forward with now?

[00:10:59] CE: Yeah. I mean, that's effectively it, and that also kind of captures why as well. I’m a firm believer in paying it forward. Particularly, I’ve had mentors and actually understood and deliberately accessed the power of mentorship since pretty early on in my career. I think I sat down, had a conversation with someone at some point and it just clicked that this was a really important thing to do. So it's been a feature of my own personal growth.

And ultimately, it just feels right to pay it forward. Like at this point in time, going through, like creating a venture-back startup, creating a category, seeing those things grow, going from CEO to Chairman. Like all these different kind of aspects of my career. They're not things that everyone necessarily gets the opportunity to do. And I’m humbled by that. So the logical kind of thing to do at that point is to find ways to pay it forward to other people that might want to do the same thing.

[00:11:57] CS: Right. Now, do you have a formal mentoring group at your company? Or do you literally just contact people, or someone contacts you, and then you take them on? Is that how you did it?

[00:12:10] CE: Usually. Yeah, usually people reach out. So in terms of me being the mentor, people will usually reach out and we figure out how it works and go from there. Within the company, we do have an established mentor-matching program. And that's for all of the executives, not just myself, and for the team as well. And it actually works. I think mentorship works best in both directions, right? Because, yeah, there's folk that I'll end up talking to 10 or 20 years younger than me. I'm learning as much from them and their connection with how things have evolved and changed as they're learning from my experience. So if you can set it up in ways where it's bidirectional and not just the old guy telling stories –

[00:12:49] CS: And so I was going to ask, can you give some advice in terms of – maybe not etiquette, but a way of making it a two-way street, especially for younger people? I think there's that feeling of either, "Oh, I don't feel like I could necessarily – what can I tell them that they don't already know?" Or you're still young and inexperienced, so you're like, "Yeah, thanks for the advice," and then you just don't pay it back in return and so forth.

[00:13:15] CE: I think everyone – I think this is kind of a core principle of mine: you can learn something from anyone. Anyone who's not you knows something that you don't know that's potentially helpful. And I also pretty deeply understand the imposter syndrome that can come with going out and asking people to have these sorts of conversations.

Yeah, probably the piece of advice I'd give for folk looking at it from the younger end of things is just: what's the worst that can happen? People say no, right? That was a really good piece of advice given to me by a mentor, on how to pitch to mentors – so now we've got this recursive loop happening here. But it was really this idea of just being bold enough to go out and actually say, "Hey, can we grab a cup of coffee? I've got some really dumb questions that I want to ask you, but I'd love to be able to get a little bit of your time. Here's how I think I might be able to help you in return. If you're not interested in that, that's fine. But you can see that I'm trying to give value back, not just take it from you." I think that's a really important aspect of it.

But also to recognize the fact that people that have done stuff, people that have achieved things – they actually want to pay it forward. What I've observed is there's a pretty common sense of that desire to give back and help people grow their own thing across most folks that have achieved crazy outcomes. It's worth just knowing that. And in the process, they're going to ask you things that help them stay relevant and help them stay connected to the state of the art from your perspective.

[00:14:55] CS: Yeah. Yeah, I could keep talking about mentorship.

[00:15:00] CE: It's a dense subject. But I think it's worth [inaudible 00:15:01].

[00:15:01] CS: It's a dense subject. Yeah. Yeah. Yeah, I appreciate that. And I hope folks are taking notes, because I think those are some really great points. But I'm going to jump on to the next point here. I teased this in the intro, but it has to be noted that your company, Bugcrowd, is kind of the origin place for the idea of the bug bounty program. And we've talked on this show dozens and dozens of times about bug bounty programs as a great way in for people who are looking to get experience but haven't been in the workforce yet: start with a bug bounty program, show what you can do, report everything.

I mean, the idea is so ingrained and ubiquitous now, especially amongst people who are learning cybersecurity, that it's hard to remember that this idea only happened 10 years ago. So how did you come to the concept of the bug bounty program, the crowdsourced security-as-a-service model? And how did you grow it into this piece of essential security protocol?

[00:15:54] CE: For sure. It's so true, too. Some of the founding moments for Bugcrowd feel like a million years ago and yesterday at the same time. So I completely agree. Bugcrowd didn't invent, per se, the concept of vulnerability disclosure or the bug bounty program. But we did create the category of intermediating it. So actually coming into the middle and saying, "Okay, we're going to help the hunters connect to the things that they need to connect to. And we're going to help the organizations that need broader access to talent and access to knowledge from a cybersecurity standpoint actually do that whole thing."

So really, at the beginning, coming into this out of a pen test market, the problem that I was kind of irrationally annoyed by, that I thought I might be able to solve, is this fundamental imbalance when you think about the fight that we've got in terms of defending our stuff. You've got lots of hackers with lots of different skill sets, lots of different motivations, and a real incentive to create a result on one side. And then you've got Bob and Jane, the pen testers, or the security people, on the other side – it doesn't matter. Bob and Jane could be the best pen testers on the planet. They're still going to eventually lose, because the math is wrong. So that's the thing that I wanted to try to find a way to solve. How do you balance that equation out to basically level the playing field from a defense standpoint?

And people at the time were starting to talk about the Facebook and Google bug bounty programs. This is when I'm running a pen test company. I start asking them, "Why aren't you doing it?" And they all basically said the same things: "We don't know how to trust hackers. We don't know how to pay people in other parts of the world." There was this fairly short but fairly consistent list of objections they had to being able to fundamentally plug this latent potential that exists in the hacker community into the unmet demands that they had as defenders. And it was literally on a flight home from Melbourne, after I'd had a series of these conversations, that the light bulb went off. Like, "Wait, if we could solve those objections in the form of a platform and a service offering, then this is actually more about the future of work than it is about bug bounty, or vuln disclosure, or any of the particular expressions of crowdsourcing that are out there at the time."

[00:18:16] CS: Okay. So moving on from that to disclose.io, the vulnerability disclosure standardization project. It says it's a set of open-source tools aimed at making disclosing vulnerabilities a straightforward thing to do, and easily shareable and comparable with standardized tools. So what was going on at the time in vulnerability disclosure that made this service such a necessity? It sounds like it was just kind of catch-as-catch-can.

[00:18:43] CE: A little bit. Like I alluded to before, I think the legal backdrop of hacking in general was written before the concept of someone doing that type of thing in good faith existed, right? So it's all about: if you're circumventing protections in the software of a device, or if you're exceeding authorized access in a system that's out there on the Internet, you must be a criminal, because why else would you do that?

And at the time, I think that sort of made sense, because the Internet was only just beginning to exist, and there were a lot of different things in play. But that's the umbrella that hackers operating in good faith live under when they're doing this stuff. And that chills things. So what you end up with is – I kind of characterize it as an autoimmune deficiency. If hackers are the Internet's immune system, then this is like an autoimmune problem that we've got. How do you deal with that? How do you make it easier for companies to create legal language in their VDP – vulnerability disclosure program – brief that keeps them safe and allows them to prosecute an actual criminal, but clearly distinguishes that from people that are trying to help? And then get it out as much as you possibly can.

So it's been a collaborative project that actually pre-existed my starting to work on it. We sort of slipstreamed a lot of things in together, and now it exists under that disclose.io masthead, mostly to attract people that have the ability to contribute to it and take it forward as well.

[00:20:27] CS: That's great, because that leads perfectly into my next question, and a little parable, actually. One of our previous guests was a man named Connor Greig, who was briefly in the news in the UK because he won a McDonald's online game. And in sending the prize, McDonald's accidentally sent him all of their database access passwords and keys. And he said that the scariest part of the whole story was that he spent 24 hours over the weekend trying to find someone to report it to. They didn't have a bug bounty program in the UK, so he eventually had to contact the US branch and get a contact email from them to go back to UK corporate.

So why do you think bug bounties aren't more ubiquitous? Is it because there's that element of maybe embarrassment? I mean, it's only recently that Apple started its bug bounty program, and there are ample examples on the Internet of people making jokes about huge companies with bug bounties that basically amount to sending out some corporate swag rather than cash.

[00:21:20] CE: Yeah. There's a lot to unpack in that question. Probably the first thing would be that a vulnerability disclosure program and a bug bounty program are different, in the sense that a bug bounty is almost like a subset of a vulnerability disclosure program. So vulnerability disclosure is like neighborhood watch, right? As an organization, I put something out there and say, "If you find something, here's how to tell me. Here's the channel. Here's what you can expect me to do in response to that." And that's where some of the safe-harbor pieces that we just talked about with respect to disclose.io come in. Bug bounty is when you add cash to that as a way to thank people for doing it, which in turn incentivizes them to do certain things. So that's a pretty important thing to distinguish.

Why doesn't it happen? I think it's a marketing problem in some ways. And it kind of goes back to this traditionally always being seen as a bad thing. There's a lot of inertia behind that. We spent the first two or three years of Bugcrowd basically trying to turn that ship around across the board. And I feel like us and others have succeeded in doing that at this point. But not everyone knows it yet. So there's an awareness aspect to it.

I think the other side of it is that, more fundamentally, companies aren't necessarily operating on this idea that cybersecurity is a human problem. And if you've got humans building your stuff, they're going to make mistakes. That's not because they're terrible or bad. It's because they're human. So assuming that's true, then it becomes a question of, "Well, how are you going to deal with that? And how are you going to receive information from the outside?"

Not every organization has actually crossed that threshold of maturity in thinking, I think, because it is an awkward thing to have to go from not believing that to believing that. You've basically got to get comfortable with people calling your baby ugly, and there's some friction associated with that. That's, I think, why not everyone's done it.

[00:23:19] CS: Yeah, I mentioned that only because it feels like there's a similar energy between an employee at work who clicked the wrong link, got the company breached, and is embarrassed to report it, versus the company did something wrong, or the company had an error that intentionally or unintentionally was found. And it's like, "Well, if we have to admit that, then the PR guys have to step in, and we've got to make a press release, and everyone's going to feel terrible." Yeah, that's a big hurdle to jump, unfortunately.

So can you speak about the recent sanctioning by the US government of groups like NSO Group and Candiru that sell hacking tools and spyware? Considering that NSO paid a million dollars in 2015 for an iOS zero-day, and something like that is probably worth quite a bit more now, it seems like a no-brainer to pay fairly and play nice with people who find your exploits, right?

[00:24:10] CE: Yeah. I think those sorts of conversations – so I refer to that as offensive procurement. The whole white hat/black hat, good guy/bad guy, positive/negative characterization of different organizations – I try to steer away from that, because it gets very vague very quickly. I think about it in terms of offensive versus defensive procurement. So are you buying this vulnerability because you want to kill it? Or are you buying it because you want to keep it alive? NSO and Candiru are examples of the latter.

I think the more we understand about how the offensive market works and how these exploits get used – good, bad and otherwise – the better the economics and people's understanding of what a bug is actually worth become, at which point buying them for defensive use becomes more of a no-brainer, right? So this idea of what a bug is actually worth – that's a really hard question to answer, because it's a marketplace, and the answer is oftentimes "it depends."

But if you can get a better narrative around, "Okay, here's what an adversary is going to do with this thing if they get their hands on it. Here's the potential damage. Here's the potential value to them that they'll pay you for it" – all of those different things – then all of a sudden it becomes a bit easier to work out what you should offer in a public bug bounty program. And again, it's still fairly early, I think, in the normalization of the pricing and different things like that. That's one of the reasons why it's not a no-brainer just yet.

[00:25:47] CS: Okay. So I want to move from that to the topic that we had planned before the show. We decided today to work under the umbrella of this idea: hiring a team of hackers can help organizations stay one step ahead of the fray by identifying weaknesses in an organization's defenses before a breach occurs. After all, what hackers do best is wait and learn their mark's behavior, leveraging tried-and-true brute-force attacks and exploiting vulnerabilities.

So on first read, that sounded to me like saying you need to make sure to have a blue team, or even a purple team, on staff at your company. But the last line makes it sound like something a bit different. Is there a distinction in your mind between pen testing and hiring, to use your words, an army of hackers?

[00:26:33] CE: Not in my mind. And I think that's partly a product-marketing conversation around what the distinctions are. When I talk about hiring an army of hackers, what I'm actually talking about is hiring from an army of hackers. So what you're doing is thinking about it in this context: the broader the pool of talent I've got to select from, the more likely I'm able to connect my question with the right person to give me an answer out of that pool, right?

It's not necessarily – in the case of a bug bounty program that's public, you are going out to the open Internet and saying, "All right, everyone can play now. Let's go." But probably 80% at this point of what Bugcrowd does doesn't actually look like that. It looks like the same sort of crowdsourcing, or even an outsourced model, where instead of going out to the open Internet, there's privacy around it and different things, and you've actually put work into selecting the right people to do the right jobs.

We've seen that from a skill standpoint – people with difficult-to-find skills – and also from a trust perspective as well. If you're the Department of Defense and you're looking at testing of something that's private and meant to be behind the wire, then you want to make sure that you've got people that are trustworthy, and have an established track record of being trustworthy, actually working on those types of things. So that's what I'm talking about in terms of the overall pool to be able to select from.

[00:27:57] CS: Okay. And there also seems to be an underlying assumption of creative collaboration within this team in terms of idea sharing. And, as you said with mentees and mentors, keeping relevant and keeping one step ahead in terms of what tools, techniques, and vulnerabilities are at the cutting edge right now.

[00:28:18] CE: Most definitely. There's a really strong community-learning component to hackers in general, but particularly to the bug bounty community and different subsets of it. So you've got folks that are purely focused on cars. You've got folks that focus on medical devices, on mainframes, on cloud, on mobile, on whatever else. And they're sharing what they learn as they go along. Usually it's technique-based: "Here's a new set of things that are coming out that we actually think is relevant to everyone."

That collaboration piece is critical, because I think it's how the hacker community grows, how it self-educates and learns over time. Because there's that curiosity, combined with this sort of desire to teach, that a lot of hackers have. If you can grab a hold of that and harness it, it's very powerful.

[00:29:12] CS: When you hire a team like that, do you think in terms of matching disparate specialties together? Say, someone's specialty is healthcare, someone else's is Windows 11, someone else's is connected cars and things like that. Do you try to pick from all the different buckets?

[00:29:32] CE: Yeah, depending on what the objective is. When we do it, because this is stuff we've actually built into the platform, we've got a team supervising from a data standpoint and working out exactly what it is the customer needs for a particular thing. So the answer is, to some degree, it depends. But I do think one of the biggest trends that COVID has kicked over is this concept of technology convergence. Where it used to be, I'm a web app person, or an API person, or a radio person, or a binary person or whatever else, at this point in time so many different technology groups have converged into a single product that you actually need folks from everywhere.

And when we do bug bashes with automotive manufacturers, it's a really good example of when you see this happen, because you'll have the person who grew up building cars and gets computers doing the car bit. Then you've got someone who is really strong on mobile applications, because, lo and behold, there's unlock or telemetry for that vehicle. Then you get someone who does the infrastructure, and backend, or AWS, or whatever else. It's like the Voltron robot, we'll assemble to –

[00:30:45] CS: Yeah, coming in like Voltron. Yeah.

[00:30:48] CE: Yeah, exactly.

[00:30:49] CS: Yeah. And that's worth putting a thick underline on, because when we do live webinars and stuff, we'll get questions like, "What should I specialize in? Should I study everything, the vulnerabilities in Windows 10? Or is that going to be useless later?" And it feels like any kind of specialization, if you're obsessed with it, go do it. You're going to be able to find an application for it.

[00:31:14] CE: It's where you can be most effective. That's always been the advice I've given to people who are entering the field and want to get into offense, or even just security in general. Go dip your finger into as many different aspects of security as you possibly can, and watch your own reaction. Figure out what the things are that start to draw the interest out of you, that you can create the most inertia around. Once you've found those things, then double down, because they're always going to be relevant.

If I'm talking to folks now about what language they should code in and things like that, I do think there's some stuff that's counterintuitive, in that the old stuff is handy: .NET, Java, COBOL. If you're new to security research or coding, you're probably going to run towards Go, or Rust, or some other newer language. But the reality is that old stuff is going to outlive Earth, probably.

[00:32:13] CS: Yeah, yeah.

[00:32:13] CE: [inaudible 00:32:13] starting to age out. So that creates an opportunity for newcomers to fill those gaps as an example.

[00:32:19] CS: Yeah. Wasn't there a rush of people they needed who could do COBOL or Fortran because of welfare databases, military databases?

[00:32:27] CE: A whole lot of people got caught off guard in 2020 with the digital transformation, because they had to move stuff around quickly. And, "Oh, wait a sec. All of the COBOL people have retired. What are we going to do?"

[00:32:38] CS: Yeah. I actually knew someone they attempted to call back into service from retirement. So I guess that was probably not uncommon, huh? So as you noted in our conversations before, ransomware attacks suck up a lot of air in the current cybersecurity news cycle. But you said that other types of cyber threats are just as damaging, and often to organizations that have other things to lose than just money: places like hospitals, utilities, schools and individuals. So what are some other types of cyber threats that you believe are flying under the radar and need to be more thoroughly addressed by the cybersecurity community?

[00:33:14] CE: Yeah. I always love this question, because I'm mentally picking from about 20 things right now. I would say the one that I've really seen storm forward over the past year or two is the convergence of information warfare with cybersecurity. This idea of misinformation, disinformation and mistrust actually being enabled by software vulnerabilities or by misconfiguration in systems, where the cybersecurity side of the house makes information warfare easier for whoever might be doing it.

And I think that's a pretty important factor to address, almost as an overlay for cybersecurity. We're in the business of making sure that the data, and the systems it transits across, are kept confidential and available for the users. If we don't do that, or if there's any issue around that, then you can start to create stories around why that is, and those create much larger problems. We see that in corporate espionage. We've seen it at a nation level. And I don't think it's going to go away.

Really, what it begs is this question of to what degree you can make cybersecurity as transparent as possible to the public, to make them more confident in what you're doing. It's a pretty abstract trend compared to ransomware, or malware, or SQL injection in websites or whatever else. But as a general theme, I do think it's going to dominate a lot of the conversation over the next five or ten years.

[00:35:03] CS: To that end, it sounds like a prescription is some degree of cyber savvy or cybersecurity education on a blanket level. What are some things that you think the average person doesn't know, but should, that would sort of lift all boats in this regard?

[00:35:20] CE: Yeah. I mean, one of them is that it's really hard. Technology is hard to begin with. Making it do the thing it's meant to do is difficult oftentimes. Making it not do all the things that it shouldn't is a very hard goal. So there's this idea of risk management 101 for the average consumer: having them understand that to err is human, and that this is a thing that is constantly evolving, that the defenders actually need to be put on the hook to be chasing. Because ultimately, if there wasn't an adversary, we wouldn't be having this conversation right now. We'd just be going off and building cool stuff, right?

So I think that's an aspect of it for the average consumer. And for companies, it's really about finding ways to be open. This is why I'm such a big proponent of disclosure and VDPs, and allowing for things to be published after they've been fixed, because it reinforces this idea of: yep, that was something that we let out. We got the report. We fixed it quickly. And now we're actually using it to teach others to avoid similar things going forward.

[00:36:25] CS: Yeah. They can't possibly be the only company that had that error in there. So if you let people know and publish the research, someone else will go, "Oh, boy. That's us too." Tie that up real quick here.

[00:36:39] CE: Just to give you a real quick technical one on top of that.

[00:36:41] CS: Please. Please. By all means.

[00:36:44] CE: Cloud configuration. This pivot to work from home has been really actively exploited since about June of last year. And I think security posture, security policy, all those different things are still pretty much a work in progress in most organizations. Some are further along than others, but we all kind of got caught off guard, so that's a logical thing. So focus on where there's potential for exploitation in your work-from-home setup, but also on whatever you've moved out to the cloud to make it more accessible to your employee base. Make sure that the configurations and the different things that are set up out there are resilient.

Oftentimes, people treat the cloud like it's a data center, just not theirs. And I think testing that assumption, and making sure that infrastructure as code is doing the right thing and all that kind of stuff, matters. There's a lot of focus going into trying to break that type of thing right now. So having another look at it to make sure it's secure is a good idea.
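That "test your assumptions" idea can be sketched as a tiny configuration audit. This is an illustrative sketch only, not Bugcrowd tooling: the resource schema and field names below are invented for the example and don't correspond to any real cloud provider's API.

```python
# Minimal sketch: scan a parsed infrastructure config for a couple of
# common exposure patterns (public storage, admin ports open to the
# internet). Schema and field names are hypothetical.

def audit_config(resources):
    """Return a list of findings for obviously risky settings."""
    findings = []
    for res in resources:
        name = res.get("name", "<unnamed>")
        # Storage buckets readable by the open internet
        if res.get("type") == "bucket" and res.get("public_read", False):
            findings.append(f"{name}: bucket allows public read access")
        # Firewall rules open to 0.0.0.0/0 on sensitive admin ports
        if res.get("type") == "firewall_rule":
            if res.get("source") == "0.0.0.0/0" and res.get("port") in (22, 3389):
                findings.append(f"{name}: admin port {res['port']} open to the internet")
    return findings

resources = [
    {"type": "bucket", "name": "hr-exports", "public_read": True},
    {"type": "firewall_rule", "name": "allow-ssh", "source": "0.0.0.0/0", "port": 22},
    {"type": "bucket", "name": "site-assets", "public_read": False},
]

for finding in audit_config(resources):
    print(finding)
```

In practice this kind of check is what policy-as-code tools run continuously against real cloud state; the point here is just that "the cloud is configured safely" is an assumption you can test mechanically rather than trust.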

[00:37:53] CS: For sure. There was so much exploitation happening with the big transition. But I’m also just amazed at how much was staved off. I mean, who would have ever imagined this quantity of the workforce would transfer online so quickly and so thoroughly and possibly so irrevocably. So can you –

[00:38:15] CE: Yeah. But "possibly" is where the devil in the details is. So it's returning back to a hybrid model and working out what the security models are to support that. I think that's actually going to be more difficult collectively than the sudden shift to work from home. That was inconvenient, but we were all doing the same thing, so I think that element made it a bit easier. This is a little bit more nuanced, yeah.

[00:38:36] CS: Yeah. I mean, if you were to scrap the protocols that aren't working now, what would you – Maybe there's nothing across the board. But just for work-from-home people in general, what sort of implementations would you say have to be almost standard operating procedure that aren't right now?

[00:38:52] CE: Yeah. I think treating your family like they're the open internet is one that I recommend a lot. I love my family. They're awesome. But I don't trust them from a cybersecurity standpoint, because my assumption is that they're going to think about security decisions, as they use their things, differently to how I would. We talk about this stuff all the time, but it's not quite the same, right?

And also, to get to me, they're potentially a logical place to start. That same thing's true for anyone who's a target within an organization. So that whole idea of segmenting off what you're doing from everything else that might be happening in your home: I think that's often pretty easy to implement and a fairly quick win.

[00:39:36] CS: So there's a hard stop between your child's iPad and your work computer is what you're basically saying. Yeah. Yeah.

[00:39:44] CE: Effectively, yeah.

[00:39:45] CS: So for ethical hackers, or someone who's trying to get into the field, do you have any suggestions for getting involved in bug bounty programs to build your experience roster? Do you have any good places to start? Or just kind of start anywhere?

[00:39:58] CE: Well, I mean, "start anywhere" is not very good advice, but it is partly true. Go off and just learn. In terms of where to learn from, there are a lot of really good content creators out there that speak through the bug bounty lens, like Hakluke, InsiderPhD, STÖK and a bunch of others. There's Bugcrowd University as well, where we create content and put it out for the purpose of educating researchers.

Between that, just reading, watching videos, doing all those kinds of things and absorbing as much as you can, so that you can figure out which things pique your interest most and double down on them, that's a good thing. And then joining communities. There are Discords, there are Slacks. If you look at the DEF CON villages, for example, from a vertical technology standpoint: there's the aviation village, there's the aerospace village, there's the biohacking village, which is all around healthcare. There's machine learning, all sorts of other stuff. So just join in and start talking to people. It goes back to the mentorship stuff we were talking about before, but I think it works in very similar ways on a peer level as well.

[00:41:15] CS: Okay. So as we mentioned before, a lot of organizations either don't have bug bounties or vulnerability disclosure programs that are easy to find and go through, or they even actively resist them. So if you manage, for whatever reason, to find a vulnerability on a site that doesn't have a reporting method, is there a way to safely contact them and broach the issue without getting backlash?

[00:41:38] CE: Yeah. The safest way to do it, with I think a bit of a tradeoff for visibility into how the conversation goes and whether or not it's been fixed, is actually to go to the local CERT. If it's an American company, you go to CERT/CC, or someone like that. If it's Australian, you go to [inaudible 00:41:55]. It's the computer emergency response team, and basically every country has one. It sort of acts as a highway from the government. If you've got a problem, you can generally get in touch with people a lot more easily if you come in through that lens. But it's also a layer of abstraction [inaudible 00:42:15] reporting if they feel concerned about blowback or whatever else.

The other tip is actually community.disclose.io. That's effectively a forum that crowdsources answers to these types of questions, which myself and probably three or four dozen other people around the place put together, folks who are known for knowing the right person and being able to get in touch. I'll often get DMs saying, "Oh, someone who isn't me found a thing, and I'm trying to work out what to do." Which is fine, but scaling that out, to try to make those conversations easier to have on both sides, is part of the intent of that forum.

[00:42:58] CS: Okay. So this has been great. I've really enjoyed this. And we're getting close to the hour, so I want to get you on your way. But just as we wrap up, what advice do you have for listeners who are actively trying to get started in ethical hacking? You've already given a lot of good advice, but what would you need to see, for example, on a resume to consider hiring someone as a white hat hacker?

[00:43:22] CE: Yeah. The resume part throws me, because oftentimes it's not so much that. What I’m looking for in –

[00:43:30] CS: In terms of like an experience stack, I guess? Yeah.

[00:43:33] CE: Yeah. It's really hunting. On the ethical hacking side, how much experience do you have hunting? Are you someone who's just picking up tools and using those tools to scan things? Or are you trying to go deeper and do your own research on how things could be done better, either with the tools, inferring things from the output, or actually creating some of your own stuff in the process? That's the kind of thing I look for. I look for that sort of creativity, going back to where we started: the people with the spark to come up with their own creative approaches that end up netting out to research-driven assurance, as opposed to just assurance. That's what I look for. I've got a particular bias in that area, but I think that's how you make yourself most unique.

[00:44:20] CS: Okay. Well, that's awesome advice. So one last question. If our listeners want to know more about Casey Ellis or Bugcrowd, where can they go online?

[00:44:29] CE: Certainly. I'm @caseyjohnellis, J-O-H-N Ellis, on Twitter and LinkedIn and just about everywhere else. My personal blog is cje.io. Bugcrowd is at bugcrowd.com, and if you visit that site, you can check out what we do, actually sign up for the platform, and look at different offerings from both the customer point of view and the researcher point of view.

[00:44:54] CS: Casey, thanks so much for joining me today. This was a real blast to talk to you.

[00:44:57] CE: It was a pleasure. Cheers.

[00:44:58] CS: And as always, thank you to everyone for listening and watching this week. New episodes of the Cyber Work podcast are available every Monday at 1 p.m. Central, both on video at our YouTube page and on audio wherever fine podcasts are downloaded. I'm excited to announce that our Infosec Skills platform will be releasing a new challenge every month, with three hands-on labs to put your cyber skills to the test. Each month, you'll build new skills ranging from secure coding, to penetration testing, to advanced persistent threats and everything in between. Plus, we're giving away more than a thousand dollars' worth of prizes every month. Go to infosecinstitute.com/challenge and get started right away.

Thank you once again to Casey Ellis. And thank you all so much for watching and listening. We will speak to you next week.

Free cybersecurity training resources!

Infosec recently developed 12 role-guided training plans — all backed by research into skills requested by employers and a panel of cybersecurity subject matter experts. Cyber Work listeners can get all 12 for free — plus free training courses and other resources.


Weekly career advice

Learn how to break into cybersecurity, build new skills and move up the career ladder. Each week on the Cyber Work Podcast, host Chris Sienko sits down with thought leaders from Booz Allen Hamilton, CompTIA, Google, IBM, Veracode and others to discuss the latest cybersecurity workforce trends.


Q&As with industry pros

Have a question about your cybersecurity career? Join our special Cyber Work Live episodes for a Q&A with industry leaders. Get your career questions answered, connect with other industry professionals and take your career to the next level.


Level up your skills

Hack your way to success with career tips from cybersecurity experts. Get concise, actionable advice in each episode — from acing your first certification exam to building a world-class enterprise cybersecurity culture.