[00:00] Chris Sienko: Cyber Work is celebrating its next major milestone. As of July 2020, Cyber Work has had over a quarter of a million listeners. We're so grateful to all of you who have watched the videos on our YouTube page, commented on live release feeds, left ratings and reviews on your favorite podcast platform, redeemed bonus offers, or just listened in the comfort of your own home. Thank you to all of you.
Because our listenership is growing so quickly and because Cyber Work has big plans for the second half of 2020 and beyond, we want to make sure that we're giving you what you want to hear. That's right. We want to hear specifically from you. So please go to www.infosecinstitute.com/survey. That's www.infosecinstitute.com/survey. The survey is just a few questions and it won't take you long, but it would really help us to know where you are in your cybersecurity career and what topics and types of information you enjoy hearing on this podcast. Again, that's www.infosecinstitute.com/survey. Please respond today and you could be entered to win a $100 Amazon gift card. That's www.infosecinstitute.com/survey.
Thanks once again for listening, and now on with the show.
[01:19] CS: Welcome to this week's episode of the Cyber Work with Infosec podcast. Each week, I sit down with a different industry thought leader and we discuss the latest cybersecurity trends, how those trends are affecting the work of infosec professionals, and offer tips for those trying to break in or move up the ladder in the cybersecurity industry.
Our guest today, Gabe Gumbs, is the Chief Innovation Officer at Spirion. He came to the program with some very intriguing discussion topics. One in particular relates to a common theme on the show: Gabe wanted to tell us about "the skills gap that wasn't," as well as some updates on data privacy in the wake of GDPR and CCPA, and some ways that you can make data privacy your profession.
Gabe Gumbs has a deep-rooted passion for technology, information security and problem solving. As Chief Innovation Officer at Spirion, a leader in rapid identification and protection of sensitive data, he’s channeling that passion to make the digital world a safer place by spearheading Spirion’s vision for data privacy in the next decade and beyond. He’s leading the way to a more secure and private future for all. Gabe, thank you for joining us today.
[02:23] Gabe Gumbs: Thanks for having me, Chris. A pleasure.
[02:24] CS: Okay. I just realized I may have mispronounced it. Is it Spirion or Spiron?
[02:29] GG: Spirion.
[02:30] CS: Spirion. My apologies.
[02:32] GG: No worries.
[02:32] CS: Check out Spirion. Okay. Before we talk about data privacy, we always like to start up by finding a little bit about our guest. How did you first get interested in this field? Have computers and tech always been part of your background or did you move into it later in your career?
[02:46] GG: No, it's always been part of my background. Early on, in my pre-security days, I was dabbling in many different types of technology. I got involved with my local Linux user group meetup in New York many moons ago, and things of that nature. It was also around the same time the 2600 scene was growing up a bit more in the New York City area. So I've always been involved and interested, or at least around it to some degree. The early part of my career was actually in networking, and before I took on my first infosec position, I had already been working in technology for a while. So it's been a long love affair with technology and security.
[03:43] CS: Okay. So was there a particular sort of defining event or something where you were doing networking and you’re like, “Oh, I like security better. This is more interesting to me.” Was there some particular thing where you’re like, “Oh, this is what I want to do.”
[03:54] GG: Yeah. Well, it was experimenting with things in my own personal time as a student back then. Just testing things out, breaking things and building things, etc. So that interest was there. It wasn't until an actual opportunity presented itself in the workplace where I was at the time, as a network engineer, that I was able to move into it in a bit more of a professional capacity.
[04:17] CS: Okay. So, tell me about your job at Spirion. What exactly does the average workday of a Chief Innovation Officer look like?
[04:25] GG: Well, it means you spend a lot of time in the problem space, talking to other people about their problems. Luckily, they're not lie-on-the-couch type problems. Although for some of them, it can feel that way.
[04:36] CS: Yeah, one leads to the other.
[04:37] GG: Indeed. But I spend a lot of time trying to understand what the challenges are that organizations face around data privacy and data security, and what technology Spirion builds that we can leverage to alleviate those problems, and eliminate them in some cases. A large part of my day is spent building the overall product strategy for the larger portfolio. What we're building, why we're building it, who we're building it for, things of that nature.
A lot of it is spent talking to actual practitioners on the ground, again, about their problems. And then with my own internal teams: we've got a product management team, an engineering team and a research and development team. We take all these ideas that we come up with, do some market research as well, test things, prototype some ideas, get them into the hands of customers and see how they actually solve problems in the real world. And then we turn those things into product.
[05:32] CS: Okay. Do you have sort of direct reports? Do you sort of actively manage your various teams?
[05:41] GG: Indeed. Yeah. There's an entire innovation team that I actively manage. There are kind of two arms to it, if you will. There's more of the academic research side, and then there's the very hands-on technical side of the research as well.
[05:55] CS: Okay. We're trying to get a sense of how security managers are taking care of their teams at a point where everything is so distant and work-from-home, and there's not a lot of face-to-face collaboration with COVID-19 and so forth. Have your management strategies had to change at all now that everyone is off in their own individual cone of silence?
[06:22] GG: It's a little from column A and a little from column B. I'm an old, grizzled work-from-home veteran.
[06:27] CS: Oh, yes.
[06:28] GG: Even prior to taking my current position, actually, I had been working from home for the last 15 or so years. So for me personally, working remotely and managing remote teams did not pose the same type of challenge. In my current role at the head of the innovation strategy table, we had a very office-centric environment pre-COVID, and a lot of that was just rapidly coming up with ideas, testing them and so forth. A lot of those things happened very organically in person: a lot of whiteboarding and things of that nature.
I'd say that's really the only thing that was heavily impacted, the whiteboarding a bit. That exchange of ideas still happens. It really meant that we had to leverage more technology platforms to be able to do that. Where in the past we would get up and maybe go talk to someone really quickly, we now use some chat technologies to do the same. Again, for myself and for those within my direct orbit, not a massive change. Not a massive change at all. Yeah.
[07:37] CS: Okay. The main slant of Cyber Work is that our listeners are working out what types of careers they want to enter, so I wanted to break down some of the career steps that you took to get to the position you're at now. What types of positions, experiences, skills and learning do you need to become a chief innovation officer? What are some of the signposts along the path?
[07:59] GG: Yeah. Well, it certainly was a circuitous route. That is to say, I didn't wake up one morning any number of years ago and say that's where I wanted to be, specifically, although it was in the general arena, if you will. But in terms of getting to that place, a large part of my career path, three-quarters of it, really, was very much on the practitioner side of the house. That is to say, I was actively putting together security programs and getting security solutions to solve problems directly for the business.
In this capacity, I took a lot of the skills learned there and blew them out to a larger scale for numerous organizations: hundreds of them, thousands, really, at that scale. And so one of the things that really helped me along the way was a very early understanding of technology and its interconnection points. I don't know that everyone needs to know the different layers of the OSI model, but it's helpful. Not that everyone needs to know how to program, but I certainly advocate for it.
Picking up those types of deep technology skillsets along the way, along with more of the managerial skillsets by the time I reached my existing position, has been very helpful. But I still spend a lot of time learning, a lot of time learning. And I think it's not so much about the steps to get here as it is about the steps to be good at what one does once they're there. That does require constant learning.
Programming, for example. I've been getting my hands dirty learning Go, and I'm actually really enjoying that. I also spend a lot of time working with the data scientists we have on the team, and there are some new concepts and theories there that I've spent the last two years getting very deep into: understanding how they operate, how adversarial networks are created, how those different types of ML models are built, etc.
I think the short answer is, regardless of where you want to end up, it has to be a passion. So much so that you have to enjoy getting really deep into the study of it, as opposed to just the practice of it. But there does need to be a healthy balance of both the study and the practice. Yeah.
[10:38] CS: Okay. Can you talk a little bit about ongoing learning? You said you're working on some new languages and so forth, but tell me about your preferred learning methods. Do you use books? Do you do labs online? Do you take active courses of study? Do you just comb through things after dinner? How do you keep your skillset fresh?
[11:09] GG: That is yes.
[11:10] CS: Yes to all of those? Okay. All of the above.
[11:12] GG: Yeah. I do still enjoy some dead trees once in a while. Recently, for example, I was on vacation and I took my rather thick, 400 or so page, Go book with me. I also leverage things like Coursera and other online learning platforms, and I probably spend a few hours in those every week. I do a lot of reading of academic papers as well, and a lot of interacting with others in and around my field: not just those on the product strategy side of the house, but those directly in the depths of security and privacy.
I also spend a lot of time in some of the community Slack channels and things of that nature. For me, those are some of the more important places to learn from, because there are other human beings discussing the challenges they have and the solutions they're exploring for those problems.
[12:20] CS: Okay. Do you have any certifications in your background? Do you have any sort of thoughts on getting certs? Do you have any particular ones that you sort of require from your team and so forth?
[12:34] GG: Required, not necessarily. It kind of depends. My approach to building teams out is usually balancing out the overall skillsets across the team, not necessarily an everyone-must-have-a-CISSP kind of thing. That said, interestingly, the entire team this quarter has been challenged to pick a cert of their choice and explore actually getting it.
For myself, again, I like hands-on type certs. The last active cert I had was the GWAPT, which is GIAC's web application penetration testing certification. It's a very hands-on certification that tests one's skillset in application security. Currently, I'm studying for a couple of the IAPP certifications. Those are more privacy-oriented.
And the team is working towards their CEH, the Certified Ethical Hacker certification. I still focus more on what their needs are from a professional standpoint. If you're looking to enter into security, then as a minimum bar to entry, a lot of folks are going to be looking for a CISSP or something similar, right? That's just your barrier to entry at a lot of organizations.
But from there, I think one should pursue certs that can actually demonstrate mastery of the topic. I'm not going to call any out, but there are some certs where you study the book for a few weeks, maybe a couple of months, and you've got yourself a cert, versus being able to demonstrate, "Ah! I understand this topic well enough that I can apply it."
[14:28] CS: Okay. Yeah. I mean, that’s good. It’s important to not think of certs as trading cards or things to collect, but as things, tools that can solve problems and so forth.
[14:40] GG: Yeah. Quick anecdotal story. I'm sitting in someone's office, this is easily seven or eight years ago, and there's a wall of certs. And when I say a wall of certs, without exaggeration, there are probably 30 of them, easily.
[14:54] CS: All framed and everything?
[14:55] GG: All framed and everything. I was more fascinated than impressed by it. One of them caught my eye, so I asked the question, "How'd you get that cert?" And the response, with a very deadpan look on their face, was, "Well, I studied for it." I was like, "Oh! Well, of course."
[15:15] CS: Sure. Yeah.
[15:17] GG: And it was at that moment it occurred to me: so that's the whole wall. "I picked up the book, I read the material, I got the certificate."
[15:26] CS: Proof of concept. I study. I took the test. I now have the skill of taking that test.
[15:31] GG: Right. I realized I should probably not inquire any further about the other two dozen plus certs.
[15:38] CS: Yeah. I guess I mostly ask because we get a lot of different types of guests on the show, and some will say, "Yeah, certs are really important. I recommend this one or this one." And other people will say, "Certs are completely unimportant. As long as you can do the task, we don't really care what your resume looks like," and so forth. So it's always interesting to hear where different people stand on the use, application or necessity of them.
[16:00] GG: And I’m somewhere down the middle on that one. It’s a big fat, “Oh, it depends.”
[16:05] CS: Sure. Yeah. Yeah. Yeah. Again, it’s a tool. And if you need the tool in your toolbox, you better have it.
[16:10] GG: There it is.
[16:12] CS: In our talks before the program, we came up with a nice combo of topics to discuss today, so we're going to move around a little bit throughout the show. It's not just one thing. We've had a couple of guests on here talk about GDPR and CCPA, and it's a topic that certainly bears repeating. The area you specifically wanted to discuss was the right to be forgotten, in which organizations that collect data as part of their regular transactions with clients or customers must have a strong system in place to safely remove the data after it's served its purpose. What are your thoughts on the difficulty or newsworthiness of this provision?
[16:47] GG: Well, I think you touched on it in that last sentence: after it's served its purpose. The word purpose is explicitly defined in GDPR, and it is defined as the reason for which you collected that information in the first place. You're only allowed to process that information based on the purpose that you expressed to the data subject when you got it, which basically says, "So, Chris, when you went to www.amazon.com and you provided me with your home address, and your phone number, and your credit card, etc., you provided that to me for the purpose of becoming a customer so I can fulfill your order. And so I'm only allowed to process the information in that way."
There are also some provisions once you sign up for the platform that say explicitly, "Hey, we're going to use this information, for example, to understand how people like Chris shop. What their purchasing habits are. What their likes and dislikes are." So that's another purpose. Now, the second you stop being a customer of Amazon, under these provisions you ostensibly have the right to say, "I no longer want you using that information that you collected on me for anything outside of those purposes." But Amazon, of course, still wants to be able to market to people like Chris. And so that's one of the many challenges of the right to be forgotten from the business aspect. That data is extremely valuable. That data is necessary for the business to process in order to exist and to grow. I don't exist to process that data. I process that data to exist and to grow.
How do I forget about you while still being able to learn about people like you? That's one challenge. And there are some answers to that challenge; there are some ways it can be accomplished. Differential privacy methods come to mind. The first is that you pseudonymize or anonymize the datasets so that I can extract knowledge about the person without retaining any directly or indirectly identifiable knowledge about the person.
[18:54] CS: You’re scrubbing my name and identifiers, but you’re just sort of keeping like the demographic data of what I bought.
[19:02] GG: Yes. But that too still has its own challenges. There are re-identification attacks against datasets where I can re-identify individuals, right? If the dataset is small enough, limited enough or not diverse enough, it becomes easy for me to narrow people down. For example, say you live within a metropolitan area, in a condominium building of 100 people, and I retain information about your sex, male or female. Well, that already eliminates some percentage of the individuals within that building, so I start narrowing it down. Then if I retain, say, your age, not just a range but your exact age, I've narrowed it down maybe even further at that point.
And as you start looking at the individual identifiers of any subject, it is very difficult to apply the appropriate queries to data while also still forgetting about that individual. It's not wholly impossible by any stretch of the imagination, but it is challenging. Then there's also the kind of paradox of: how do I ensure that this information, anywhere else within the organization, was all found and remediated? Did I find every single instance of all of Chris's information and scrub or delete all of it? And if I didn't, how do I know, when it resurfaces and I finally do find it, that I was supposed to have forgotten about you, right?
That second half is really the bigger problem, because if I don't retain some information about you, then I wouldn't know that it was still around for me to have violated it. It means I have to first find all of it, right? There's no shortage of challenges with the right to be forgotten.
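The narrowing-down attack Gabe describes can be made concrete with a small sketch. The dataset and field names below are hypothetical; the point is that two quasi-identifiers, sex and exact age, can single out one resident of a building even after names are removed.

```python
# Hypothetical "anonymized" dataset: names removed, but sex and exact
# age retained. With few residents, these quasi-identifiers alone can
# single out an individual, re-identifying a supposedly scrubbed record.
residents = [
    {"sex": "F", "age": 34}, {"sex": "M", "age": 34},
    {"sex": "M", "age": 61}, {"sex": "F", "age": 29},
    {"sex": "M", "age": 34}, {"sex": "F", "age": 61},
]

def matches(dataset, **quasi_identifiers):
    """Return the records consistent with the given quasi-identifiers."""
    return [r for r in dataset
            if all(r.get(k) == v for k, v in quasi_identifiers.items())]

# Knowing only "male" leaves several candidates...
print(len(matches(residents, sex="M")))          # 3 candidates
# ...but adding an exact age narrows it to one: re-identified.
print(len(matches(residents, sex="M", age=61)))  # 1 candidate
```

Generalizing exact ages into ranges (the idea behind k-anonymity) widens each matching group, which is one reason de-identification standards such as HIPAA's Safe Harbor rule call for generalizing or removing fields like these.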
[20:51] CS: Okay. You’ve laid out the problem pretty well here. Do you have sort of a similarly laid out solution that’s not being implemented right now that you think would take care of this problem?
[21:06] GG: I don't know about not being implemented. I think it's more about how it's being implemented, right? It's the larger implementation throughout the entire data lifecycle. Again, a lot of what I try to do is solve for the entirety of the problem, and put into place not just one point of a solution or one point of an answer, but look at it throughout that entire lifecycle. From the time that information is first captured, am I gathering enough information to know what type of information it was and for what purpose I was capturing it?
And then as that information is used, shared, processed and analyzed, do I also have the appropriate controls in place to respect consent, compliance and security while it's being used? Then, finally, once I get to archiving and destruction, do I have the right policies, procedures and controls in place to do those things as well?
The well-laid-out answer is this: you have to look at each step within the lifecycle of data, from the time it's created through being used, shared, archived and destroyed, and apply the appropriate controls throughout each of those points of the lifecycle. What I see today is a lot of controls being applied to maybe one point in that lifecycle. Some folks may take additional measures when they first gather that data, when they first start processing it, and then maybe not take the same level of care in the middle stages as it's being used and shared, which then starts doing things like violating consent under GDPR and things of that nature. And it isn't just GDPR. We've had HIPAA in place since '96, right?
[22:46] CS: That sounds right.
[22:46] GG: Yeah. HIPAA has had a similar notion for decades too, right? We share a lot of health information for the purpose of understanding how to treat different ailments and things of that nature. In this COVID environment, we’re doing a whole lot of health sharing right now as well for research purposes.
[23:08] CS: Yeah, fast and sort of desperate sharing.
[23:11] GG: Fast and desperate sharing. Are the right measures in place, right? A lot of those things aren't even new concepts, much less new calls for remediating that data. In fact, HIPAA explicitly defined the proper way to de-identify data many moons ago. And its definition of de-identifying data does differ from, say, FERPA's, for example, right? So the answer to your question is: the well-laid-out way to implement it today is to ensure that you're looking at data throughout every stage of the lifecycle and applying the appropriate controls.
[23:50] CS: Okay. I mean, I guess what I’m also trying to get to is based on laws like GDPR and CCPA, is the sort of language and the law sufficient to sort of get us to that? And is the reason that it’s not being done more a matter of people either intentionally or unintentionally sort of skirting the regulations such as they are? I guess I’m trying to get a sense of what the point of friction is here.
[24:18] GG: Well, there are several points of friction, and you asked whether the letter of the law addresses them as well. CCPA certainly does not explicitly state what a good mechanism, or even the minimum bar to entry, for de-identification, anonymization, data scrubbing, shredding, etc. would be. It doesn't explicitly state what you need to do to make sure that subject data is well de-identified the way, say, HIPAA does. GDPR does go a little bit further and does so. However, GDPR equally takes a very wide approach to what subject data is; it's defined as directly or indirectly identifiable.
So the IMEI number of the phone in your pocket is something that's indirectly identifiable to Chris. Even for something like that, you have to figure out how to make sure that the data you collected, shared, etc. is being properly handled. And then another friction point is really just a big knowledge gap. We've moved fast and hard on CCPA, for example.
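One common way to handle an indirectly identifying value like an IMEI, sketched below as a general illustration rather than as any specific product's method, is keyed pseudonymization: a keyed hash yields a stable token for internal correlation, and destroying or rotating the key severs the link back to the device and its owner.

```python
import hashlib
import hmac

# Hypothetical pseudonymization key; in practice this would live in a
# key-management system so it can be rotated or destroyed on request.
SECRET_KEY = b"example-only-rotate-me"

def pseudonymize(identifier: str, key: bytes = SECRET_KEY) -> str:
    """Replace a direct or indirect identifier (e.g. an IMEI) with a
    keyed hash. Same input and same key give the same pseudonym, so
    records can still be joined; without the key, reversal is infeasible."""
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()

imei = "490154203237518"  # example IMEI-format value
token = pseudonymize(imei)
# The pseudonym is stable enough for analytics, but destroying
# SECRET_KEY severs the link back to the device and person.
print(token[:16])
```

Note that a plain, unkeyed hash is not enough for low-entropy identifiers like IMEIs, since an attacker can simply enumerate the input space and rebuild the mapping.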
[25:19] CS: Yeah. That’s a fast rollout.
[25:22] GG: Very fast. Folks are still trying to understand the intricacies, the nuances. Then, of course, very few of the provisions have been challenged in a legal setting, so there's no precedent for a lot of those things. Yeah, we've got a ways to go before it all becomes prescriptive enough for anyone to just wake up one morning and go, "Ah! I know how to do this."
[25:44] CS: Right. Okay. So we’re all still learning as we go here at this point.
[25:48] GG: There is certainly a lot of learning still left to be done. Although there's a lot we do know, and you should certainly take all of those measures right now. As I mentioned, make sure you're handling data based on its purpose. But we do have a ways to go.
[26:03] CS: Do you see general improvements based on the rollouts of these things? Obviously, there are still problems, problems of implementation and so on. But it seems like we had a pretty lawless non-system for years there. Do you feel like there's some sort of order forming around all that chaos?
[26:23] GG: I feel like it is getting better every day. Yeah, it’s getting better every day. It’s certainly a whole lot of one foot in front of the other.
[26:32] CS: Right. Right.
[26:35] GG: But I don’t see us going backwards. At least not right now.
[26:38] CS: Right. Yeah. I mean, it’s important to sort of understand the distinction between the sort of like conniving like, “Ooh! I’m going to take this data and do something nefarious with it.” Versus people like, “It’s my first day on the job. I didn’t even know.”
[26:50] GG: Right. Like, "I had no idea that was unacceptable."
[26:52] CS: Right. Speaking of the job aspect of it, with these new laws taking effect and potentially opening new responsibilities for enterprises of all shapes and sizes, are there any new types of careers or positions that might be on the increase due to regulations like GDPR and CCPA?
[27:10] GG: There's certainly an opportunity for those that understand both the law and technology to make really strong impacts in our world. There are not many of them, certainly not many that I've met with a firm grasp of both the technology and the law. So those opportunities very much exist, and some of them come in the shape and form of data privacy officers and titles such as those. Those things exist.
In both my professional and personal opinion, there are going to be a lot more opportunities for analyst positions. That is to say, today in the security world we've got a lot of SOC positions, right? We've got SOC analysts: level one, level two, level three. The nature of their job around understanding security risks is now also coupled with being able to understand privacy risks. What's the difference? Well, let's say you have an alert that some data has left the company, right? You see it crossing one of your data loss prevention technologies. You see it leaving the egress point, etc. And you just don't want anything to leave. That certainly is problematic from the security perspective.
You also now have the privacy challenge of, "Gabe has explicitly requested that you no longer share his data with a third party." Even though you may have a legitimate connection to a third party where you share this information digitally, and no security violation may have occurred, there certainly is a privacy violation when you share my data and you weren't supposed to. A very real one, where you will also be fined and subject to lawsuits, etc., for doing so. So we need to be able to automate and orchestrate and understand when an alert like that triggers, right? What does that mean? It means we need privacy operations as part of the larger functions within an organization. And privacy operations I see equally as another opportunity for new types of roles within organizations.
And maybe what we do is grow and expand the security operations role into privacy operations, or we combine them. We take privacy and we put it right into the middle of our security operations center, and we go from security operations centers to security and privacy operations centers, right? From a SOC to a SPOC, if you will. Long live the data, right? I see a lot of opportunities on that front.
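A combined security-and-privacy triage rule of the kind Gabe describes might look like the following sketch. The registries and field names are hypothetical; the point is that a transfer can pass every security check and still be a privacy violation once consent has been revoked.

```python
# Hypothetical opt-out registry and approved third-party list; real
# systems would source these from a consent-management platform and
# a DLP/egress policy store.
OPTED_OUT_SUBJECTS = {"gabe@example.com"}
APPROVED_THIRD_PARTIES = {"partner.example.net"}

def triage_transfer(subject: str, destination: str) -> str:
    """Classify an outbound data transfer for a SOC/privacy-ops queue."""
    if destination not in APPROVED_THIRD_PARTIES:
        return "security-alert"        # unexpected egress destination
    if subject in OPTED_OUT_SUBJECTS:
        return "privacy-violation"     # approved channel, revoked consent
    return "allowed"

print(triage_transfer("gabe@example.com", "partner.example.net"))
```

Even this toy rule shows why the two alert types need different playbooks: the security alert is about the destination, while the privacy violation is about the data subject's consent state.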
[29:54] CS: Okay. Sorry, I just got a weird Zoom message here. We talk a lot on this podcast about the skills gap in cybersecurity: basically, that there's this great disparity between the number of available cybersecurity positions open, which is a lot, and the number of qualified people to fill them, which is not a lot. In our discussion, you mentioned "the security skills shortage that wasn't," suggesting that you might have some views on the topic that run counter to popular opinion. What, in your opinion, is the future of cybersecurity jobs versus the available workforce?
[30:28] GG: Oh! Let's take those privacy operations that I just mentioned, because you can't have privacy without security. So they're certainly going to go hand-in-hand, if not completely morph into one. If you can't fill the security roles today, how do we ever plan on filling the privacy roles that we're now faced with? The answer is: you're not.
But from my perspective, I don't think there was ever really a shortage of human beings. I think we had two major problems. The first is that, for far too long, my fellow practitioners have made security way too esoteric a topic, and that just scared too many people off, right? Like, "Oh my God! It's the dark arts. I do not know how I'm supposed to enter this." We're all walking around with the mark of the dark one on our forearms.
The second is that the technologies did not allow us to scale what resources we did have. We see some of that getting better with security orchestration, automation and remediation solutions, right? We're able to rapidly not just alert on things, but orchestrate responses to things, which puts us in a better place for triaging. I think there's a lot of opportunity there to close this perceived skills shortage. Because, again, I don't think it's an actual skills shortage. I think it's a technology issue. I think we haven't built systems that allow us to orchestrate, automate, respond and scale out to our needs nearly well enough, because there's just no way we would ever have been able to put enough warm bodies in the seats.
[32:05] CS: Right.
[32:06] GG: Yeah. So I think we first start by removing this veil, this cloak of security being some big esoteric thing. And then we start building better technologies, especially at the entry levels, right? Better entry-level tooling allows far more folks to simply enter into those roles. And that's largely a technology problem.
[32:27] CS: Right. Okay. Yeah, that's a really good point. That is definitely the hard point of the funnel: getting the beginner-beginner people in there and so forth. Do you have any more thoughts on that? Obviously, that's the problem, but where do we change things?
[32:48] GG: Maybe we could start by lowering some of the requirements? Don’t ask someone to have 10 years of Kubernetes experience when it’s only been around for six.
[32:55] CS: Yeah. Right. It seems like the HR requirements are one of the big chokepoints.
[33:03] GG: It's certainly a chokepoint, right? Again, back to the certification question: is it really mandatory for someone to have the CISSP? Is it really? I don't think so. You certainly can take a first-year mechanical engineering student and teach that person a lot of the skills they need. Or that person can teach themselves, or they could take some secondary courses to get what they need, things that would be applicable to the positions they're applying for. That's one. The other is by orchestrating and automating way more of the tasks. We've done a better job of automating; we've got a ways to go on the orchestration of our technologies in general.
[33:46] CS: Okay. I guess as we sort of wrap up today, where do you see data privacy going in the next 5 to 10 years, especially with regulations coming on board and more options for enforcing these things? What do you see the landscape looking like in the next decade?
[34:04] GG: Well, the technology landscape, I think, the trajectory of that one will have to start leveling out a bit. So we’ll hopefully see a lot of consolidation in the privacy technology stack. That’s one area I certainly see it going. Today, a lot of folks try to solve for just one problem that they’ve identified, and so a lot of solutions have sprung up to solve that one problem. So we’ll start seeing more consolidation of that.
But I also see it – again, I see the convergence of security and privacy for the same reasons I mentioned earlier in this show, which is you can’t have privacy without security, but not the other way around. And so I see more of a convergence of the two, both in the business functions as well as in the technology functions. Those things will be inextricably linked. And I see all of this still starting where even security has, which is in just understanding the data. We will be incapable of protecting it if we don’t know what it is, and we’ll be even less capable of preserving its privacy if we don’t know what it is, even if we’ve managed to protect it, whereby “protect” means I’ve put it under lock and key. But if the people with the keys are people who should not have access to it, then I’ve also violated privacy. We need to understand what it is as well.
[35:26] CS: Okay. So as we wrap up today, tell us more about Spirion and some of the projects you currently have in the works.
[35:32] GG: Yeah. Spirion is a data security and privacy company. We’ve been around for the better part of a decade and a half. We’re headquartered down in sunny St. Petersburg, Florida. Yeah.
[35:43] CS: Yeah. Yeah. Nice.
[35:45] GG: A number of the things we’ve got on the rise are very similar to some of the things we’ve talked about today, which is helping organizations protect their data. So protecting the security and the privacy of it. Helping them be able to respond to data subject access requests, and then be able to discover, classify, and apply appropriate controls to their datasets. We’ve got a number of things in the works, including being able to offer some additional analytics and governance solutions on top of those larger data privacy and security products.
So we rolled out two of those things earlier this year. We’ve got a few more that we’ll be announcing in the next couple of months. So, certainly looking forward to those things being released and folks checking them out. You can head on over to www.spirion.com and take a poke around. We’ve also –
[36:34] CS: S-P-I-R-I-O-N. Is that right?
[36:36] GG: Yup. S-P-I-R-I-O-N, yeah. We also have a little podcast that we do also called Privacy Please Podcast.
[36:43] CS: Privacy Please. Okay.
[36:44] GG: Yeah. Privacy Please Podcast. It’s very focused on data security and privacy. I’m active on Twitter @GabrielGumbs if anyone wants to give me a shout. Yeah, you can find us in all those locations.
[36:56] CS: Okay. Gabe, thank you for joining us today on Cyber Work. This was really interesting, and I really enjoyed hearing about all these things that I had only a vague knowledge of and that are changing every day. So, thank you.
[37:07] GG: I thank you, Chris. I appreciate it.
[37:09] CS: Okay. And thank you all for listening and watching. If you enjoyed today’s video, you can find many more on our YouTube page. Just go to youtube.com and type in Cyber Work with Infosec to check out our collection of tutorials, interviews and past webinars. If you’d rather have us in your ears during your workday, all of our videos are also available as audio podcasts. Just search Cyber Work with Infosec in your podcatcher of choice, and please rate and review us if you have a moment. For a free month of the Infosec Skills platform that you saw in the little video at the start of today’s show, go to infosecinstitute.com/skills, sign up for an account, and in the coupon line type the word cyberwork, all one word, all small letters, no spaces, and get your free month.
Thank you once again to Gabe Gumbs and Spirion, and thank you all for watching and listening. We will speak to you next week.