The Wild West era of data collection is over | Guest Sean Falconer

Today on Cyber Work, Sean Falconer of Skyflow, host of the Partially Redacted and Software Huddle podcasts, joins me to talk about the present and future of consumer and user data privacy, the pros and cons of putting more privacy regulations into place, and his journey from software development and engineering to his current work at the forefront of API-based data encryption and privacy. And stick around, because Falconer will share the best career advice he ever received!

0:00 - Consumer and user data privacy
2:02 - When did Falconer get into tech?
6:40 - Three degrees in computer science
12:40 - Current issues around data privacy
19:25 - The end of "Wild West" data privacy laws
24:00 - External factors on data privacy
28:03 - Why am I accepting cookies on websites?
34:45 - Experiences and learning for data privacy careers
41:44 - Learn more about Skyflow and Falconer
42:26 - Outro

– Get your FREE cybersecurity training resources: https://www.infosecinstitute.com/free
– View Cyber Work Podcast transcripts and additional episodes: https://www.infosecinstitute.com/podcast 

Chris Sienko: 

Today on Cyber Work, Sean Falconer of Skyflow, host of the Partially Redacted and Software Huddle podcasts, joins me to talk about the present and future of consumer and user data privacy, the pros and cons of putting more privacy regulations into place, and his journey from software development and engineering to his current work at the forefront of API-based data encryption and privacy. And stick around, because Sean's going to share some of the best career advice he's ever received. All that and more today on Cyber Work. Welcome to this week's episode of the Cyber Work with Infosec podcast. Each week, we talk with a different industry thought leader about cybersecurity trends and the way those trends affect the work of infosec professionals, while offering tips for breaking in or moving up the ladder in the cybersecurity industry. My guest today, Sean Falconer, has a lofty goal, to make the digital economy safer, and he's doing so one API at a time. A former competitive programmer, entrepreneur and expert storyteller, his many accomplishments include designing the software used to create ICD-11 at the World Health Organization, founding Proven.com and leading developer relations teams at Google. When he's not interviewing industry insiders as one of the hosts of the popular Software Engineering Daily podcast, or diving into all things data security and privacy on Partially Redacted, you can find this world-class tech talent serving as head of marketing and developer relations at Skyflow, the world's first and only data privacy vault delivered as an API. So today we are going to be talking about a topic that Sean suggested, and to break it down into a single phrase, he was suggesting that we might be getting close to, quote, the end of the Wild West era of data collection and data privacy. So we're going to see if that's true, we're going to see what's on the horizon, and I look forward to hearing about all of it.
So, Sean, thank you for joining me today and welcome to Cyber Work.

Sean Falconer: 

Yeah, thank you so much for having me and thank you for that lovely introduction.

Chris Sienko: 

My pleasure. So, to help our listeners get to know you (maybe some of them are already listening to Partially Redacted or Software Engineering Daily, but for those who aren't): when did you first get interested in cybersecurity and tech? What was the initial draw, and how far back were you when you realized you were excited about it? Do you remember how you got started?

Sean Falconer: 

Yeah, I mean, tech, I would say, started when I was very young, early high school. I was fortunate, or at least I consider myself fortunate, to have been in high school in the 90s, when the internet was being introduced to the home. I grew up in a small place in Canada, so the idea of unlocking the world, where I could suddenly talk to different people, connect online, learn anything I wanted, was absolutely amazing, and I just dove in full force. And in terms of security, I would say that I dabbled a little bit at different parts of my career. Initially, when I was doing my master's degree, I worked as a software engineer for a company called Bulletproof Solutions, which was a security company, and we were actually working on their first-ever software project at that time, which was to try to connect the siloed databases of the different police forces and the RCMP that exist in Canada. There's this problem, and it might be the same in the United States as well, and this was a while ago, but essentially, if you got a speeding ticket in one city, and then two hours later you got a speeding ticket in another city, and then again two hours later, those police officers would not necessarily know that you were just getting speeding ticket after speeding ticket after speeding ticket in these different places. It takes a while for things to bubble up, and it also has to be at a certain level. So we were working on this project to be able to do a federated search across all these different siloed databases and bring that all together. And this was before things like Elasticsearch and all these different technologies existed, so it was quite a hard problem to solve.
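The federated-search pattern Falconer describes, fanning one query out to several siloed databases and merging the hits, can be sketched roughly like this. Everything here is hypothetical illustration (the data, the in-memory "silos", the function names), not the actual Bulletproof Solutions system:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for two siloed police databases.
CITY_A = [{"licence": "ABC123", "offence": "speeding", "city": "Fredericton"}]
CITY_B = [{"licence": "ABC123", "offence": "speeding", "city": "Moncton"},
          {"licence": "XYZ789", "offence": "parking", "city": "Moncton"}]

def search_source(records, licence):
    """Query one silo; in a real system this would be a network call."""
    return [r for r in records if r["licence"] == licence]

def federated_search(sources, licence):
    """Fan the query out to every silo in parallel and merge the results."""
    with ThreadPoolExecutor() as pool:
        batches = pool.map(lambda src: search_source(src, licence), sources)
    return [hit for batch in batches for hit in batch]

hits = federated_search([CITY_A, CITY_B], "ABC123")
# An officer now sees both speeding tickets, across both silos.
```

The hard parts in practice, which this sketch skips, are schema differences between silos, deduplication, and doing this before off-the-shelf search engines existed.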
I was working on my master's degree at the time and doing this part-time, so I don't actually know what became of that project. But from there I was the CTO and co-founder of a tech company, and there, as a byproduct of that, you have to be knowledgeable about security. We were in the HR tech space, so we had a lot of data about job applicants, resumes, plus employer information including payment methods. That was my first foray into things like PCI compliance, and as part of my responsibilities I needed to make sure that we were keeping things locked down and that I was staying up to date in terms of the software we were running. But really I didn't fully lean into this area until I joined Skyflow about two years ago. And while we don't describe ourselves as a cybersecurity product, we are a data privacy platform, and you can't really have privacy without security; that's a big part of it. So I've become more and more interested in security, to the point that I host the security- and privacy-focused podcast Partially Redacted, like you mentioned in the introduction. And there are also so many amazing experts at Skyflow to learn from, who have spent 20-plus years in the space, that you'd have to actively be trying not to learn this stuff, wearing blinders the whole time, to not just pick it up through osmosis and conversations.

Chris Sienko: 

Exactly, yeah. You know, we have a pretty big tent in what we describe as security around here, whether it's security awareness for employees or security engineering or things like risk or privacy or compliance. Those are all pieces of the puzzle; you don't have to be working in a SOC or designing the architecture to be a part of security. So, yeah, you mentioned that your career touched on security in only a few places. Your working career background is primarily focused around different aspects of software development, but within that, there's a lot of variety in your work. So, like you said, for four years you worked with a team on Google's business communication suite. With Purgati's research, you worked with visualization techniques for mapping and reuse of ontological descriptions, XML schemas and database schemas. And now, with Skyflow, your work is closely aligned with a truly of-the-moment concept, which is data privacy and Skyflow's data privacy vault as a service. So I would love it if you could tell me about your work journey along some of these different points, because software development and engineering ties them all together. Can you walk us through from one career role to the next, and tell us what appealed to you about each one, what changed, and what new opportunities caused you to try the next thing?

Sean Falconer: 

Absolutely. So I've had, I guess, what you could describe as a diverse set of roles, jobs and projects over the years, but there's definitely some connective tissue between it all, at least from my perspective. I did three degrees in computer science, as well as a postdoc in bioinformatics, and along that journey to what I thought my career was going to be, which was to become a professor, I always kept a foot in industry as a software engineer, which is when I worked for Bulletproof Solutions and a bunch of these other companies. My motivation for that was, one, the money helped fund my education: it was much more fun to work as a software engineer than grading papers as a teaching assistant, and you got paid a lot more. I also enjoyed it, but I also didn't want to lose my technical skills. I saw a lot of people who were professional researchers, whether professors or people doing their PhDs, who couldn't code a real-world application because they'd been too far from the technology for a long time. And I was a little bit nervous about that idea, because even though I wanted to be a professor, I still loved the industry, still loved building stuff, and I had always thought of myself as potentially being an entrepreneur, so I wanted to keep those doors open to me. And my job at Purgati actually came from my PhD work. I was working on large-scale data integration problems, exploring ways that we could use automation or AI along with human-in-the-loop experts, specifically in the medical research space.
I gave a talk at a conference showcasing some of the work that I was doing, and the founder of Purgati happened to be there in the audience, came up to me afterwards and basically offered me an internship on the spot, because they were working on something very similar at NASA Ames Research Center in Mountain View, California. They were essentially in the world of investigative search, working with three-letter-acronym government agencies, and they were doing similar things: you have this really complicated set of data, you don't really know how everything comes together, it's coming from lots of different sources, so how do you bring those things together in a way that an analyst or domain expert could start to explore it visually and comprehensively, dig in and ask different questions? That was what we were working on there. And I remember, because I'm a Canadian citizen (I've been living in the United States now for 14 years, but at that time I was actually living in Canada), I couldn't even go to any of our customer sites, because I wasn't a US citizen. So I'd have to get on a phone call and talk through how to actually install this thing and use it, and so forth.
And then I eventually left academics to start my own company, and I did that for about seven years. Then I joined Google, and I eventually became their head of developer experience for business communication APIs. There I was working at the intersection of conversational AI and commerce, and this was all pre-LLM-hype era, but we wanted to be able to use channels like Rich Communication Services, which is a carrier communication service that's kind of like an upgraded SMS, or even places like Google Search, to bring people into a conversational experience, sometimes powered by AI, sometimes human-in-the-loop. So a lot of the projects I've worked on have tried to combine AI and automation on really complicated issues related to data that require expertise or a human in the loop as well, because these are such complicated problems that you can't necessarily fully automate them. That's been a lot of the consistency between the various jobs I've had. And then, two years ago, I joined Skyflow, originally as their head of developer relations, and then I expanded my scope to product marketing, and now I lead all of marketing for the company. In many ways I've done pretty much every job that you can have in a tech company: engineering, product management, product marketing. I've taught at university, been a researcher, built teams, been an IC. But all of that has been in this data space, across different verticals and different types of applications; it's really been centered around the data. And I think that's one of the consistent themes amongst the executive team at Skyflow: we're all kind of data people.
We came from Salesforce or Microsoft or places dealing with really big data problems, and that's been our approach to trying to solve the problems around data privacy: how do you approach this as a first-principles data problem, not as something that you bolt onto an existing product or architecture?

Chris Sienko: 

Yeah, the thing that popped out in hearing you talk about that is that it seems like you're always drawn to projects that involve big data sets and these very large but precise problems that need to be solved around them. And that seems to drive you from project to project: you find this one big thing, you solve it, and it's like, okay, now where do we go next?

Sean Falconer: 

Yeah, I mean, I think I've tried to optimize my career around educational opportunities: what am I going to learn here? How does this set me up for the next thing? And I like solving problems. Going back to the days when I was in my undergrad doing competitive programming, it was all about problem solving: how do you take these stories that they tell in the problem descriptions, figure out the algorithmic approach to solving the problem, and how fast can you write the code to solve it? I just really, really enjoyed that. And a big motivation for my career has been: what's the biggest, gnarliest problem that I can jump into and have an impact on?

Chris Sienko: 

That definitely popped out as you went from job to job there. That's really cool. So, Sean, as I mentioned at the top of the show, we'll be discussing data privacy and some of the big changes that might be happening in 2024. But first, could you tell me about the current issues around data privacy that helped bring Skyflow's data privacy vault as a service into being? What are the top two or three biggest issues around data privacy that you see right now, and how do something like Skyflow's data privacy vault API and products like it work to address some of those issues?

Sean Falconer: 

Yeah. So I think there are two big problems that we see, and they're big historical problems. One is that, historically, it's been very, very hard to keep data secure but still usable. There's always this trade-off, right? We can apply techniques like encryption, or techniques like tokenization and so forth, and they work for certain use cases, but they break a lot of the things that we actually need to do with the data. If we encrypt the data, it breaks things like search or analytics, so then we need to decrypt the data to actually make it usable, and then essentially we're exposing it to risk. So that's been a really hard problem to figure out: how do you keep data protected while keeping it usable for various workflows? And then the other big problem that I see is what I call the PII sprawl problem, or sensitive data sprawl problem. What ends up happening, especially in any modern system (and it doesn't even need to be that big), is that we end up with a bunch of copies of the data. If we think about the collection of someone's information during an account signup form, it's not like that data goes from the signup form, arrives in the database, and only ever lives in the database. It actually touches all these different systems along the way: it's passed through the API gateway, eventually it ends up in the database, we probably end up with a copy in a warehouse or analytics store of some sort, along the way we might log it in various log files, and then there are the backups of all these different systems. So instead of having one copy of that data, we might actually have hundreds or thousands of copies of the data.
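The secure-but-usable trade-off Falconer describes can be illustrated with a toy deterministic-tokenization scheme: because the same input always maps to the same token, equality operations like exact-match search still work on the protected data. This is a sketch of the general idea only, not Skyflow's actual technique (a real system would use a vault-side mapping or format-preserving encryption rather than a bare keyed hash, and deterministic schemes leak equality by design):

```python
import hashlib
import hmac

SECRET = b"vault-side-secret-key"  # hypothetical key, held only by the vault

def tokenize(value: str) -> str:
    """Deterministic token: same input -> same token, so equality checks
    (exact-match search, joins) still work without exposing the plaintext."""
    return hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()[:16]

# The application database stores only tokens, never raw emails.
db = [{"user": 1, "email": tokenize("alice@example.com")},
      {"user": 2, "email": tokenize("bob@example.com")}]

def find_by_email(email: str):
    """Search without decrypting anything: tokenize the query and compare."""
    needle = tokenize(email)
    return [row for row in db if row["email"] == needle]

matches = find_by_email("alice@example.com")
# One match found, yet no row in `db` ever contained the raw address.
```

Contrast this with strong randomized encryption, where every ciphertext differs and this kind of lookup breaks unless you first decrypt, which is exactly the exposure problem described above.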
It's a little bit like if you took your passport and, instead of keeping one copy in a secure location in your home, you created 10,000 copies, put them all over the place, and then tried to protect all of those places. That's a much harder problem to solve, but that's basically the problem we've created for ourselves in most modern systems, and it's arguably even worse in the world of cloud, where we can just pick our pieces and plug them together. So what we've ended up doing there, historically, is one of two things. Either we build a security fence around everything, and everybody within the fence we just trust, hopefully, to be there legitimately and to do the right things. That's not a very effective way to lock down these systems: you're going to end up accidentally exposing someone, or a bad actor penetrates the fence and then has access to everything. Or we end up trying to plug holes in this leaking dam in all these different places, so you end up with one tool that provides security and governance and a bunch of other things for your database, but then you need another tool over here and another tool over there, and that becomes a huge operational nightmare. So those have been the two fundamental problems that I see organizations really struggle with. The way we've approached that at Skyflow comes from, like I mentioned, a lot of our background in data: our CEO and co-founder came from Salesforce and Oracle, and he saw companies struggle with this over and over again.
He thought about it for a long time, and one of the key innovations we came up with was a way to keep data secure while still usable. We created a technology that we call polymorphic data encryption, which has essentially the characteristics you would want from something like homomorphic encryption. Homomorphic encryption is like the holy grail of encryption, where you can keep everything encrypted and do whatever operations you want on it, but it's just not performant. With polymorphic data encryption, we've been able to give you the same value that you get from homomorphic encryption, keeping everything encrypted but still allowing you to do operations and use the data; by restricting the problem space to essentially PII, it becomes a solvable problem. That's a big part of our secret sauce, and it lets us resolve this false dichotomy between usability and privacy. And then the other thing is the sprawl problem: how do you solve that? That's really where the heart of this idea of isolation, protection and governance, which a data privacy vault gives you, comes in. Other companies have created or innovated similar technology; Google, Netflix and Apple are some of the pioneers of this. Essentially, what they recognized is that, the same way people have recognized that encryption keys, secrets and identity don't belong in your database or source code, because these are special and need specialized technology, the spirit of the data privacy vault is recognizing that PII is also a special type of data. The rules of engagement for it are different in terms of who needs access, what they need access to it for, how they perform operations on it, and how often they need to touch it.
So we need specialized technology developed for the isolation, protection and governance of that data, while de-scoping other systems from touching it. Instead of storing this information in your database or your data warehouse, what we're doing is transforming the sensitive data into a non-sensitive version of it that, because of polymorphic data encryption, isn't going to break your workflows or the types of things that you need your data for. Essentially, what we're trying to give you is all the things that you want to do with sensitive customer data without you actually being exposed to the underlying data. That's really been the key thing.

Chris Sienko: 

It's almost like a teleporter. Yeah, that's interesting. And I might be getting this all wrong, but it seems like a lot of the problems you've discussed here are kind of baked into the last 20 years of the way the internet has dealt with data and commerce. We're going to talk about the Wild West early parts of it, but it seems like so much of what we know about this now was learned on the fly as we were trying to find more and more things to do with data. I don't think in the 90s or early 2000s people even knew, oh, let's take all this data and crunch it and use it on marketing things and use it on this and spin it around over here. And once you start doing that, then you're like, oh, we have to protect all these little fiefdoms, which is, of course, why it's going to be very hard to undo it. But when we spoke before the show, you described 2024 as the year where we might see the beginning of the end of what you described as, quote, the Wild West era of data privacy, in the US and other countries that are not currently protected, and how they might begin to make moves toward passing sweeping GDPR- or CCPA-style data privacy laws. But before we talk about what that might look like, or speculate on it at all, can you give us the current state of data privacy as it stands? You were talking about that a little bit, but what's the 10,000-foot view of the way data privacy is done right now? And a quick sketch, especially, of the early Wild West era when it was at its most woolly, versus how we're trying to rein it in now?

Sean Falconer: 

Yeah, so I mean, I think there are a few things going on. I think things are trending in a very positive direction overall, if you look back 15 or 20 years. But with some of the problems we were just talking about, like the PII sprawl problem, I think one of the reasons we've gotten ourselves into that place is partly a fundamental misunderstanding of the nature of data, where we've taken customer data and treated it like all other data. And that really goes back to 1980s thinking, when we were first designing systems to be used in the workplace, where we were collecting customer data and employee data, and back then your data was siloed within a box underneath your computer. It's not that someone couldn't steal it, but they had to physically get entry to your office and walk out with a hard drive, which was big and heavy, and hopefully they hadn't skipped leg day, because they'd have to lift a hundred pounds. And then what we've done is we've taken what worked, from a data and security perspective, back then, and scaled it to the cloud, to millions and millions of users, and essentially we've engineered our way into an almost intractable problem, where we have this major sprawl issue and the data is all over the place. But on the positive side, I think, if you look at where we were 15 years ago, when social media and so forth first burst onto the scene, there was at the time this sentiment that privacy is dead. I remember people talking about that when I was in graduate school: who cares? We'll share everything online.
And what I think ended up happening was, one, companies like Meta eventually got in a lot of trouble, but also the consumer woke up to the reality of the situation, which is that if you can use something for free but they're collecting lots of information about you, you are essentially the product. They are monetizing you based on your data, and you should have some say about what they're doing with it. On top of that, people learned that the information they're giving is not all that well protected, because it seems like every week in the news there's some major data breach happening. So I think consumers have become more and more aware of this problem, and since 2018, when GDPR was first introduced, there are now over a hundred privacy regulations in the world. So things are changing. And I don't think thinking about data privacy purely from a regulatory standpoint is the best way to think about it. Privacy is a fundamental human right: you should be developing systems that are private and secure because it's the right thing to do, not just because some regulation tells you to. But that being said, regulations do become a forcing function for companies to do the right thing when they otherwise might not. It's not that they have bad intentions, that they're twiddling their thumbs saying, oh, I hope there's a data breach this week; they don't want those things to happen. But it's easy to push it down the stack of priorities when there isn't the risk of fines or other penalties. So privacy is now, I think, more than ever before, a C-level boardroom conversation that's actually happening at many, many companies.

Chris Sienko: 

Yeah, and I appreciate your intrinsic optimism about companies believing that ensuring privacy is the right thing to do. But regulations are also for those who say, well, we could cut this line out of our budget and maximize our profit. In those cases, yeah, sometimes you do have to crack the whip a little bit, but ideally they all have our best interests in mind as well. So, fingers crossed. But if these changes in data privacy are indeed coming, Sean, like you said, it's likely because of external factors influencing it, people realizing they are the product, as you say. No system voluntarily adds friction or complexity into the mechanics unless it's really necessary. So, and I guess you've sort of already answered this, but along with people waking up, what are some of the external factors at work here? Because it's definitely changed a lot in the last two or three years; it feels like it's hit kind of a breaking point. What do you think were some of the external factors that got people to start thinking about this really seriously?

Sean Falconer: 

Well, I think one is some of the big fines that have been issued to big tech companies. They do make big headlines, when you see a company being fined billions of dollars and so forth, and that, of course, creates more consumer awareness, and there's more and more news coverage of the issues. So I think there are two forms of pressure happening: one is consumer expectation, the other is all the regulations coming into place. But I think another thing we haven't really talked about is that some companies are also doing a really good job of using privacy as a differentiator. Apple is a really, really good example of this: they have been really effective at using privacy as a marketing tactic, essentially, and they've been able to turn their focus on privacy into a differentiator and probably a way to actually drive revenue.

Chris Sienko: 

Which essentially offsets the extra costs of investing that heavily into privacy.

Sean Falconer: 

Yeah, exactly right. So I think they've been really smart about that approach, and I think more and more companies are going in that direction. We talk to a lot of startups at Skyflow, and we're seeing more and more startups that want their culture and marketing to be privacy-first and to make that a core part of their identity as a company.

Chris Sienko: 

Yeah, I think that's a great answer. I'm sure a lot of other companies are already looking at that model, but I guess it's worth putting out there that this is something you can do both as a revenue generator, as you say, and because it's the right thing to do. As you said, we should all want to do right by our customers.

Sean Falconer: 

Yeah. And if you compare it to something like the car industry, where there are certain regulations that cars have to meet from a safety perspective, there are also cars that use safety as a marketing tactic. And my take, at least (I'm not in the car industry, so maybe I'm wrong on this), is that I don't see a lot of news and buzz about how this has become a barrier to innovation, about how the fact that we need to have seatbelts in cars is limiting our ability to innovate in the car space. I think the opposite actually happens: it creates innovation. There was a time when you could drive and talk on your cell phone at the same time, when cell phones first came into play, and eventually we realized that was dangerous, so we developed or innovated new technologies that still allow us to use a phone in the car, but in a safer way. So I think regulation can actually be an inflection point for innovation and technology improvements that lead not only to safer products but probably to more revenue for a company.

Chris Sienko: 

That leads perfectly into my next question, because I wanted to talk about the user experience around all this stuff. So, to split the experience of data privacy into the insider view, like what we're learning about here (data privacy, GDPR and all that), and the average user view: what does this more closely watched era of data privacy look like from a UX perspective? Because, and I know this is not exactly your area, at this point every website you go to, you're customizing your cookies: I accept, I don't accept, I customize. I think there's a user experience element where people don't know why they're doing it, they don't know why it has to happen every single time, it's exhausting to them, and it gets them a little frustrated. So do you think this type of customize-your-experience-before-you-use-the-site-or-service will continue and get more intrusive as these regulations progress? Is there a possibility of some kind of universal key, like a keychain, where we could tell each site: okay, use only these, or no cookies, I'm not hungry, or whatever? And, apologies for the glare, because I'm about to don my small tinfoil hat here, but it almost seems like some of this is done as a way to exhaust people around the idea of data safety, so that everyone just says, okay, I don't care anymore, take my data. Do you think something like your APIs, or other regulations that are coming up, will streamline the user experience? Do you see a path forward that way? Or is this just going to be part of the new normal, in an attempt to make sure that everyone's being ethical in the way that they track and so forth?

Sean Falconer: 

I certainly hope that things don't get worse from a cookie pop-up standpoint, and, to be honest, I don't know exactly what the solution is, but I don't think what's there today is sustainable. It's just something that feels like it has to change; it's hard to imagine the web five years from now still being in this situation. It's kind of like the old browser wars era, where you would come to a site and it would say, choose your experience: are you on Internet Explorer, Firefox, Netscape or something like that, and you would get a different experience depending on which browser you were using. It's just not sustainable. Systems that have a lot of friction and frustration, and that are actually not effective, are not ones that win out in the long run. Also, a lot of these pop-ups are designed to be confusing, so that people do the thing the company wants them to do, and that is not in the best interest of the consumer. So I just don't think these things are going to be around for the long run, but I don't know exactly what's going to happen.
I could certainly see there being some level of trust involved. Essentially, you're giving informed consent to the company or the website that they can track you, and you should maybe be able to give blanket informed consent to, I don't know, a block of types of companies that you trust, or that meet some certain standard, versus having to agree every single time. In a lot of ways, that's kind of lazy user experience, always having to prompt the user. Ideally, a great software system knows what defaults someone actually wants ahead of time, and potentially that's something we can move toward in the future. But I don't know exactly, and I also don't know all the limitations in the regulations that might be preventing someone from doing something better.
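The blanket consent Falconer describes has an early real-world cousin in the Global Privacy Control (GPC) specification, where a browser sends a `Sec-GPC: 1` header once, instead of the user clicking through every banner. A minimal server-side sketch of honoring that signal might look like the following; the `honor_gpc` function and the policy dictionary shape are hypothetical illustrations, not any particular framework's API:

```python
# Minimal sketch: honoring the Global Privacy Control (GPC) signal,
# a browser-set header expressing a blanket opt-out preference so
# users don't have to re-answer every site's cookie prompt.

def honor_gpc(headers: dict) -> dict:
    """Derive a tracking policy from the incoming request headers."""
    gpc = headers.get("Sec-GPC", "").strip()
    if gpc == "1":
        # The user has opted out globally: disable non-essential tracking.
        return {"analytics": False, "ads": False, "essential": True}
    # No signal present: fall back to whatever consent the site collected
    # (None here means "ask, or use the stored preference").
    return {"analytics": None, "ads": None, "essential": True}

print(honor_gpc({"Sec-GPC": "1"}))
```

A real deployment would merge this with stored per-site consent, but the point is that a single upstream signal can replace many repetitive prompts.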

Chris Sienko: 

Yeah, okay, yep, just thought I wanted to bounce it off the wall. I mean, it blows my mind that I have to literally you know enable cookies every single time I go to the same website, even though it's like. It's just wild. So I want to turn this into the you know, the career space. I want to get you some new you know, junior professionals, possibly, you know, in your field here. So people currently working in data privacy or trying to get into it what will the changes coming in the next few years mean for their skill sets? Do you think that there are any new skills or learning areas that you recommend that people who already do work in the space need to get into so they don't get left behind?

Sean Falconer: 

I think the big, obvious one is AI. Privacy is one of the biggest question marks and biggest concerns surrounding LLMs and other forms of generative AI right now, and it's what's stopping a lot of larger investment in them. Lots of companies are interested and learning, but we're not seeing a ton of them move beyond a demo or proof of concept because of some of the challenges around privacy and security. There's also a lot of misleading information out there, and people creating products for AI that don't really solve the fundamental issues. So I think this is something that privacy professionals, whether you're entering the space today or already operating in it, need to get up to speed on. A big part of your job as a privacy engineer is often bridging the gap between the engineering organization and legal and compliance. You need to be able to speak to the engineer in the language they understand and the way they like to talk about things, and then translate that to legal and compliance in a way they can understand, so that everyone can do their job in a way that works for everybody. And if AI, or LLMs, is an investment your company is making, then you need to speak that language and have enough knowledge about it: what are the actual problems that exist there, how does the technology work, and why is this different than some of the problems we've seen in the past with, say, a traditional database?

Chris Sienko: 

Yeah, yeah. So many security roles, and roles in privacy and related areas, seem to involve being an intermediary between different parts of the company: understanding both sides of the wall and being able to explain over it, to make sure everyone's on board, which I don't think was necessarily the case 10 years ago. Yeah, absolutely. So for listeners who are choosing, or maybe re-choosing, their careers after another role, and are considering areas like data collection, data privacy, privacy regulation, data security and others, what are some must-have skills, the kind you sort of learned along the way? What would you recommend someone getting into this space do now in terms of experiences, certs and learning, either at home or in a professional setting? What are the absolute must-haves for people working in this area?

Sean Falconer: 

Yeah, I think it depends a little bit on the role you want to move into. Let's say you're, I don't know, a salesperson, or you're in product marketing or something like that, and you're interested in security or privacy and want to take your role into that space; that's different than being an engineer who wants to pivot into being a privacy engineer. If you're in an existing role and want to move into security and privacy, a big part of it is educating yourself, at least broadly, about the tools and technologies in the space. What are the problems, what are the challenges that different companies face, how do people solve them today, what does the future look like, and so forth. There are tons and tons of resources available for that today: podcasts, YouTube, conferences, and lots and lots of articles. A lot of it is just getting up to speed so you have some sense of what's actually happening and can talk about the product or space you're interested in in an authentic way, and that's important regardless of the role you're in. If you're in engineering and want to move into privacy engineering, I think that's great, because it's a very fast-growing space and companies are always looking for people with technical backgrounds to move into privacy engineering. And that's changing: if you had searched LinkedIn for privacy engineering jobs even five years ago, probably the first 50 pages would have been Google, Meta and Microsoft, or something like that.
Now there's a lot more diversity in the companies hiring for that role. We already talked about how a lot of the job is being that translator, or storyteller, between two areas. I also think this is an exciting time to get into the space because of a lot of the things we've been talking about: there's growing momentum to make privacy a day-one priority for companies, and there are a lot of really exciting things happening in the world of privacy-enhancing technologies, things like secure enclaves, homomorphic encryption and secure multi-party computation. So there's a lot you can do in the space that doesn't necessarily have to be hardcore technical, and lots of opportunities for people who can bridge the gap between the super technical and the not-so-technical.
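The privacy-enhancing technologies Falconer lists can sound abstract, but the core idea behind one of them, secure multi-party computation, fits in a few lines. This is a simplified sketch of additive secret sharing (the variable names and the three-party salary scenario are illustrative only): each party splits its private value into random shares, and only the sum is ever reconstructed.

```python
import secrets

# Additive secret sharing over a prime field: a value is split into
# random shares that sum to the value mod P, so no single share
# reveals anything about the original input.
P = 2**61 - 1  # a Mersenne prime, comfortably larger than the inputs

def share(value, n_parties=3):
    """Split `value` into n_parties random shares summing to value mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

# Three parties each hold a private salary they don't want to disclose.
salaries = [95_000, 120_000, 87_000]
all_shares = [share(s) for s in salaries]

# Party i receives the i-th share of every input and sums locally;
# each partial sum still looks like random noise on its own.
partial_sums = [sum(col) % P for col in zip(*all_shares)]

# Combining only the partial sums yields the total, never the inputs.
total = reconstruct(partial_sums)
print(total)  # 302000
```

Real MPC protocols add malicious-party protections and support multiplication as well, but this additive trick is the building block that lets companies compute over data they never see in the clear.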

Chris Sienko: 

Nice, yeah, that's great. So as we wrap up today, Sean, I'm trying something new here for 2024. Could you tell our listeners the best piece of career advice you ever received, and is that advice still something you would give to professionals preparing to enter the field now?

Sean Falconer: 

Sure. So I've had a lot of probably both good and bad career advice over the years, but one thing that always jumps to mind for me was a kind of accidental piece of advice from my master's thesis supervisor, who was this crazy Russian expat to Canada who never really adopted the Canadian lifestyle. He left basically during the communist-era USSR, so Russia has changed a lot since he lived there, Canada has changed a lot, and he was kind of a fragment of the old world. But one of the things he said to me once was that the problem with people with PhDs is that they think they know something. And essentially what I think he was saying, or at least how I interpreted it, is that they reach a certain level of credibility by achieving something like a PhD, and then they stop learning because they think they're an expert, and they stop being curious.

Chris Sienko: 

They're at the top of the mountain, yeah.

Sean Falconer: 

Yeah, exactly. You always have to keep learning, and I think it's easier than ever to do that. As I mentioned, there are podcasts, YouTube, conferences, all this sort of stuff. That's really where I've tried to optimize my career: how am I going to grow and learn? And because I've been able to skill-stack across these different types of roles, it's given me a lot of flexibility in where I take my career. Then it really becomes an alignment question: do I believe in the problem this company is solving and cares about, and do I like the company, and so forth, and I'll figure out a role that's going to be a good fit for me there.

Chris Sienko: 

Yeah, that's great advice. Thank you very much. So, speaking of your ties to the outside world, including podcasts, I'd like to have you tell our listeners about your multiple podcasts, Software Engineering Daily and Partially Redacted. What's the focus of each show? What kind of guests do you get, and, if you can think of one, what's an ideal episode to introduce new listeners to either show?

Sean Falconer: 

Sure. So I actually host three podcasts, but we'll focus on Software Huddle and Partially Redacted. Partially Redacted is focused on privacy and security engineering topics, and the guests there span everything from regular engineers who want to talk about a problem they had to solve related to security and privacy, to privacy professionals sharing their thoughts and experiences with building privacy organizations, working with engineering teams, or building a privacy program from scratch, to security professionals talking about some of the challenges they face. We've had people from Snowflake, Google, Microsoft, lots of companies like that, all the way to Dr. Lorrie Cranor, who runs the CMU privacy engineering program. So professors, all the way to people working in big tech. On the Software Huddle side, that podcast is primarily focused on software engineering, tech news and tech investment. Guests are primarily engineers and product people, sometimes venture capitalists and founders. And I think a great episode for people to check out is the interview I did with Bob Muglia back in December, the former CEO of Snowflake. He was an executive at Microsoft for 23 years, and he's essentially seen the entire life of structured data, of SQL-based databases: he was the person who ran the SQL Server group at Microsoft, and then went on to build Snowflake, the modern data platform. He wrote a great book that we talked about, and he's a really smart guy with a huge breadth of stories and experience working directly with Bill Gates and Steve Ballmer. It's just one of the most fun, most interesting interviews I've ever done.

Chris Sienko: 

Oh, that's cool. Yeah, I'm going to check that out right after this, thank you. Okay, well, I hope everyone will check out Software Huddle and Partially Redacted. So one final question: if our listeners want to learn more about Sean Falconer and Skyflow, where should they look online?

Sean Falconer: 

Skyflow is pretty easy: you can find us at skyflow.com, or search and we'll come up. And for me, I'm pretty easy to find as well. There's a lot of stuff on the internet that I've produced over the years, but if you just look me up on Twitter or LinkedIn under my name, Sean Falconer, you'll find me, and I'm always happy to connect with people and be helpful if I can.

Chris Sienko: 

Thanks. Yeah, you'll almost certainly get some connects. We have a very engaged listenership here, so look forward to that, and if you all decide you want to connect with Sean, let him know you heard about him on Cyber Work. So thanks so much for your time and insights today, Sean. This was a real pleasure. I really appreciate it. Thanks so much for having me, Chris. And thank you to our 80,000-plus Cyber Work viewers and subscribers. Your input and enthusiasm make this a joy to do each week. If you have any topics you'd like us to cover or guests you'd like to have on the show, please feel free to drop them in the comments below. I always like to hear what you all are interested in as well. So before I let you go, I hope you'll remember to visit infosecinstitute.com/free to get a whole bunch of free and exclusive stuff for Cyber Work listeners, for instance, our new security awareness training series, Work Bytes, which is just awesome, and I encourage you to go and watch the trailer. Do you have better security awareness skills than your coworkers? What if those coworkers were a pirate, a vampire, an alien, a zombie and a fairy princess? Infosecinstitute.com/free is also the place to go for your free cybersecurity talent development ebook, where you'll find our in-depth training plans for the 12 most common security roles, including SOC analyst, penetration tester, cloud security engineer, information risk analyst, privacy manager, secure coder and more. Once again, that's infosecinstitute.com/free, and yes, the link is in the description below. Thank you once again to Sean Falconer and Skyflow, and thank you all so much for watching and listening. Until next week, happy learning.

Free cybersecurity training resources!

Infosec recently developed 12 role-guided training plans — all backed by research into skills requested by employers and a panel of cybersecurity subject matter experts. Cyber Work listeners can get all 12 for free — plus free training courses and other resources.


Weekly career advice

Learn how to break into cybersecurity, build new skills and move up the career ladder. Each week on the Cyber Work Podcast, host Chris Sienko sits down with thought leaders from Booz Allen Hamilton, CompTIA, Google, IBM, Veracode and others to discuss the latest cybersecurity workforce trends.


Q&As with industry pros

Have a question about your cybersecurity career? Join our special Cyber Work Live episodes for a Q&A with industry leaders. Get your career questions answered, connect with other industry professionals and take your career to the next level.


Level up your skills

Hack your way to success with career tips from cybersecurity experts. Get concise, actionable advice in each episode — from acing your first certification exam to building a world-class enterprise cybersecurity culture.