California Consumer Privacy Act: Are you prepared for 2020?

Watch our on-demand webinar on how to best prepare for the California Consumer Privacy Act with Jay Rodne, Privacy Director at Sentinel and former Washington State Representative, and Aaron Weller, VP of Strategy at Sentinel and Fellow of Information Privacy.

– Get your FREE cybersecurity training resources: https://www.infosecinstitute.com/free

– View Cyber Work Podcast transcripts and additional episodes: https://www.infosecinstitute.com/podcast

Chris Sienko: Hello, and welcome to today's episode of the CyberSpeak with Infosec Institute podcast. This is an audio rebroadcast of a recent InfoSec webinar entitled California Consumer Privacy Act: Are You Prepared For 2020? In the manner of the European Union's GDPR mandate, California, which in recent years has become the fifth largest economy in the world, has created a privacy law that is expected to have a wide-reaching impact.

Our webinar guests are Jay Rodne, Privacy Director at Sentinel LLC and former Washington State Representative, and Aaron Weller, VP of Strategy at Sentinel and Fellow of Information Privacy. During this webinar, we discuss how privacy laws may continue to evolve due to the influence of the CCPA, and steps to help you prepare for it. We end the webinar by taking questions from listeners about the CCPA and data privacy.

Just as a reminder, if you'd like to watch this webinar as it unfolds, including presentation slides, you can find this podcast on our YouTube page by searching InfoSec Institute and visiting our channel. Without further ado, here with moderator Camille DuPuis, are Sentinel's Jay Rodne and Aaron Weller.

Camille DuPuis: We have today Jay Rodne, and Jay is the Privacy Director at Sentinel LLC, which is a data privacy and security consulting firm. He previously served as a State Representative for Washington's fifth legislative district from 2004 to 2018. While in the Washington House of Representatives, Jay did serve in a leadership position as Ranking Member on the House Judiciary Committee from 2007 to 2018. Prior to his role with Sentinel, Jay served as the Deputy General Counsel for Astria Health, which is the largest integrated healthcare delivery system in central Washington. Jay, really thankful that you made time to join us today.

And also thankful that we have Aaron Weller. Aaron is the VP of Strategy at Sentinel, and he has over 20 years of global consulting and industry experience. After five years leading PWC's privacy practice for the West Coast, he spent a year helping a leading technology company with their GDPR implementation efforts. And he now provides strategic privacy advice to companies looking to innovate their use of personal information. Prior to joining PWC, Aaron co-founded and ran an information security and privacy strategy consulting firm, and he's also held other roles including the Chief Information Security and Privacy Officer for two multinational retailers. Aaron is recognized as a thought leader in the field of privacy, and he has been accepted as a Fellow of Information Privacy by the IAPP.

We're just really excited to have these two guests here, who are really going to be able to provide us with some great information that we can use to prepare for the upcoming Privacy Act. So thanks again, Jay and Aaron, for joining us. We appreciate your time and appreciate you sharing your knowledge with us.

Aaron Weller: That's great. Thanks so much for the introduction, and welcome to everyone on the phone or who's listening to the recording of this later on. We're going to talk today about the California Consumer Privacy Act, and really we wanted to start off with setting the stage for those of you who may not have had to deal with some of the other privacy requirements in countries outside the United States. And you may be thinking, "Well, why are we talking so much about California?" And there's a couple of reasons here. One is that California, if it were its own country, would actually have the fifth largest economy in the world. It's the most populous state in the US, and not only is it home to a lot of people, but it's also home to a lot of businesses. California's economy really drives things, certainly in the use of technology and the use of personal information. The companies in Silicon Valley and in other places in California really drive a lot of the way that the world sees how personal data can be used in new and innovative ways.

One of the other things with California, from a privacy perspective in particular, is that California tends to be very forward-thinking when it comes to privacy protections for its residents. California was the first state, several years ago now, to introduce a law that actually had penalties if personal information was breached or lost. All of the other states in the Union followed; Alabama was the last one to implement a law a couple of years ago, and now every single state in the Union has a law similar to the original California law. So, what happens in California tends to move across the rest of the country in some of these areas, particularly where we've seen it before with respect to privacy.

The history with the CCPA is actually pretty interesting in that this started out with a ballot initiative that was put on the November ballot. There was a group of interested parties and privacy advocates who said, well, we've seen the law come into Europe, the General Data Protection Regulation or GDPR, which has got some pretty significant penalties, but also gives individuals a lot more control over what they can find out about information that they give to companies, and what those companies do with it. They were looking towards Europe and GDPR and saying, well, maybe not all of that will fit in the US legislative context, but some of those pieces we can try and get into a law at the state level in California. And then based on historical precedent, what's probably going to happen is that that may then lead to other states looking at similar laws. It may even potentially lead to a federal law down the track as well.

This ballot initiative was proposed, and pretty quickly a lot of people who are interested in this space looked through the text of the bill and said, "Gosh, there's a lot of stuff in here that could really be problematic if it goes into effect." Some of the issues were around just the level of complexity and the level of effort it would take to comply with the law. But there were also some areas where the language of the bill as drafted would have caused some unintended consequences. One example of language in the original ballot initiative was that if anyone made a complaint about how a company was using data and there was a lawsuit, they could potentially get up to a double-digit percentage of the settlement value just for making the complaint. A lot of people in the technology industry in particular felt that would lead to an endless merry-go-round of people making complaints about pretty much every aspect of what they did, because there was just such a huge incentive.

A lot of people rallied around and said, well, okay, how do we get this into something where we get rid of the things that are really going to be problematic, or that are very ambiguous? Like, some of the definitions, which we'll talk about in a minute, could be interpreted in ways that we don't believe the drafters originally intended. There was a very intensive process of about three weeks last summer where we went through several drafts of the bill before the backers pulled it from the ballot as part of a deal to say, "Well, we're going to pull it from the ballot in exchange for a law actually being passed by the State Legislature that's going to make it a law and put it on the books. And we won't have to go through the ballot process." So that's what happened.

I mentioned GDPR-lite. And you've probably seen articles in the press about some of the first fines coming out under the European General Data Protection Regulation in the last week or so. GDPR is a very complex law, but it applies to anyone that does business with Europeans. That means that a lot of domestically owned, operated and run businesses in the States, who do not have customers in other countries, may not actually have had to comply with GDPR. So CCPA does have some similarities to GDPR, and we'll go over those. But there are some important differences as well.

I mentioned the drafting process. To fix some of the drafting errors and ambiguities, the bill was actually amended within a couple of months by a clean-up bill, SB 1121, which was signed into law at the end of September last year. This actually doesn't change a lot of the CCPA, but it does clean up some of the most significant issues. Because of some of the conflicts between what's written in the CCPA today and other California laws, there is a continuing process regarding further amendments. There are also, and we'll talk about this a little bit later on, efforts in other individual states and also at a federal level to have effectively one law that covers a lot of the same ground but gives businesses some consistency and certainty, rather than a repeat of what happened with the data breach laws, where we have 50 different variations that cover similar provisions but with differences between every state, which makes it complex to comply with.

CCPA, as I said, is going to have national impact. Even if you don't do business in California, it's likely that CCPA will have some impact on you. It does introduce a right of private action and class action lawsuits for some kinds of breaches and misuse of personal information. A lot of the enforcement of civil penalties lies with the Attorney General, and we're still waiting for the AG's office at the moment to come back with their guidance on exactly what they're thinking about enforcement, how they're planning on doing that, and the areas where they are most concerned about compliance.

Again, this is a dynamic environment where we don't know all of the answers yet. While CCPA in some ways could serve as a template for other states, draft legislation that I've seen, and that Jay has seen, for other states has sometimes hewed more closely to the language of GDPR than CCPA. We've now got this situation where, again, I fear that we are going to have lots of variations around some similar topics. And I do believe, because I'm involved in some of the backroom discussions, that parts of CCPA may well form the basis for future federal privacy legislation if that is considered a priority by Congress.

What are the risks of non-compliance? From a regulatory perspective, there are some specific clauses that talk about damages: up to $750 in damages per consumer per incident, or actual damages, in the event of a breach, and potentially civil penalties that rise to $7,500 per intentional violation if the offense is not cured within 30 days. And that's an important point: there's actually language that says that if there is a complaint the AG is going to take action on, a company will have 30 days to correct its activities before that penalty is handed down. As you're going through and thinking about how you're going to deal with CCPA, it's worth thinking through how you would manage those kinds of scenarios.

There's obvious reputational risk. Privacy is in the news a lot at the moment, and brand damage can happen depending on the nature of an incident. And then financial risk: not just fines and penalties, but also the cost of defending against even unsuccessful lawsuits, the cost associated with understanding and responding to complaints that come in even if they turn out to be unfounded under the requirements of the law, and then potentially increased cyber security insurance costs, depending on the situations that could occur.

So, who is affected? The definition in the law is that CCPA applies to California residents. The name of the law is actually the Consumer Privacy Act, but the way that the law is written at the moment, it's ambiguous when it comes to things like, are employees included? Is the household included, when some of the members of the household may not be a consumer of a particular business? The definition is intended to focus on consumers, and it does focus on California residents. But if you read through the law, as Jay will talk through in a second, some of those definitions are a bit ambiguous.

In terms of businesses, it's any business that collects information of California residents. However, the definition of business, as it says on the slide, includes some threshold tests, so that the intent is that small businesses with revenue under $25 million will be excluded from having to comply with the law. And then there's the test of buying, selling or sharing personal information of more than 50,000 consumers, which sounds like a fairly high bar until you think that if you run a blog that only has 150 visitors a day, over a year that's going to add up to more than 50,000 people. And if you're collecting and selling information, that could put you in the scope of CCPA.
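
To make those threshold tests concrete, here is a rough sketch in Python. It's illustrative only, not legal advice; the function and parameter names are made up for the example. It covers the revenue and 50,000-consumer prongs described above, plus the statute's third prong for businesses earning half or more of their revenue from selling personal information, and works through the 150-visitors-a-day arithmetic.

```python
# Illustrative sketch (not legal advice): the CCPA's business threshold tests.
def ccpa_thresholds_met(annual_revenue_usd: float,
                        consumers_with_pi_per_year: int,
                        share_of_revenue_from_selling_pi: float) -> bool:
    # Any one of the three prongs brings a for-profit business into scope.
    return (annual_revenue_usd > 25_000_000
            or consumers_with_pi_per_year >= 50_000
            or share_of_revenue_from_selling_pi >= 0.5)

# The blog example: 150 unique visitors a day adds up fast.
visitors_per_year = 150 * 365  # 54,750 -- already over the 50,000 bar
print(ccpa_thresholds_met(200_000, visitors_per_year, 0.0))  # True
```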

I'm going to turn it over to Jay to talk about some of the definitions in a bit more detail.

Jay Rodne: Thanks, Aaron. And as was mentioned in the introduction, I've served in the Washington House of Representatives for about 15 years, and I can say that the states really are the policy laboratory for the nation. As states continue to grapple with the issue of data privacy and trying to balance that with competing constituencies in terms of Internet freedom and so forth, we're going to see lots of experimentation at the state level for a lot of these policy initiatives. In fact, just this past week, we saw a bill introduced in the Washington State Senate that really is similar to the CCPA. And as the session continues to evolve, many other states have also introduced or implemented similar legislation. This is all going to be a very exciting time over the next year, and it will undoubtedly provide the impetus for, as Aaron mentioned, federal legislation down the road to deal with a more unified, national standard.

What I'd like to do in this portion of the presentation is talk about some key definitions under the CCPA. We'll talk a little bit about some of the important distinctions between CCPA and GDPR. And then we'll delve a little bit more into the specific rights afforded under the CCPA, and some of those compliance initiatives that we think are important to get in motion now so everyone's prepared for CCPA's effective date, which is January 1, 2020.

Key definitions. Aaron talked a little bit about the ambiguity of the definition of consumer. As the statute defines consumer, it simply states, "A natural person who is a California resident." Personal information is very broadly defined as information that "identifies, relates to, describes, or is capable of being associated with, directly or indirectly, a particular consumer or household." And again, that term household is ambiguous. It's not defined under the statute, and so that will perhaps be an area of amendatory action this year in the California Legislature.

Some of the differences between the CCPA and GDPR. I think the theme you'll see is that many of the definitions under the CCPA are broader. If we look at individuals as defined under GDPR versus the CCPA, individuals under the CCPA include households and families. The personal information definition is likewise more broadly defined, and includes inferences, probabilistic identifiers, and even browsing history. Data use is treated a little differently too: if you're going to use data for a different purpose under the CCPA, you have to provide notice at or before collection, which is a little different than under the GDPR. The right to delete under the CCPA covers more types of information, but only data collected from the consumer. And the opt-out of the selling of data is again broader; it's a broader opt-out right under the CCPA as opposed to GDPR.

Interestingly enough, the private right of action is more broadly defined under GDPR as opposed to CCPA. CCPA only allows private rights of action for the failure to implement and maintain reasonable security practices. And then the fines under GDPR are much more expansive, including up to 4% of global revenue. Whereas under CCPA, they're much more narrowly defined.

Aaron: And I think why we care about this slide and why we're showing the differences here is that if you've gone through an effort around GDPR, a lot of the things that you've done can be adapted and reused for CCPA. But I think, and Jay's going to walk through here, when you get into a lot of the different definitions, the variations in the definitions from GDPR and from even some of the other existing US laws mean that you're then going to have to go back and say, "Well, okay. We went through and thought about what information we need to provide to someone if they ask us to tell them what information we have about them." But this information under CCPA could be slightly different. There's a lot of rework, or at least rethinking. Maybe not necessarily significant tweaks to scope, but there's a lot of going back and saying, "How much of what we've done already can be reused?"

For those of you out there who are like, "That's great, but I didn't actually have to go through GDPR the first time," the good news is that all of the consulting firms and a lot of other companies have already been through it. There's a lot of good guidance out there around how do you set up a program to respond to customer inquiries? How do you deal with some of these other things? Even if you didn't have to go through GDPR yourselves, as you think about CCPA, a lot of that same path that a lot of companies have trod in the last couple of years, they've laid the groundwork for you being able to deal with some of these issues fast.

Jay: Thanks, Aaron. Now as we discussed on the last slide, personal information is more broadly defined under the CCPA as compared to the personal data definition under GDPR. The components of personal information and personal data are listed there on the screen. One thing I'll note is that personal information under the CCPA explicitly includes education information. That's not necessarily called out in the personal data definition under GDPR.

Now, let's talk a little bit about the specific rights afforded to consumers under the CCPA. There's a right to access, a right to have PI deleted, right of disclosure, right to opt out and opt in, and the right to non-discrimination. And we will talk briefly about each of these rights and some of the compliance initiatives that we've identified. Before we do that, I want to just talk a little bit about specific rights and compare and contrast those with the data subject rights afforded under the GDPR.

From the screen there, there are five rights listed under the CCPA. A lot of those rights are included in the data subject rights list to the right of the screen, although the data subject rights under GDPR include several additional, specific rights, like the right of rectification and the right of objection to automated individual decision-making, which we'll talk a little bit more about in subsequent slides.

First and foremost, the right to access. The quote there is directly from the statute: "Consumers have the right to request that a business that collects a consumer's personal information disclose the categories and specific pieces of information that the business collected." In looking through some of the compliance strategies, we've identified four specific ones. And you'll see in these compliance initiatives a lot of consistent themes around training employees and staff, understanding how data flows into and is stored in an organization, transparency in providing appropriate notices, and ensuring that the notices required under the law are given in a transparent and conspicuous manner.

Specifically, with right to access. Again, data mapping is really critical in understanding or being able to comply with the consumers' right of access. Businesses really have to implement policies and standards and training around how the organization collects, stores and protects consumer personal information.

I want to talk a bit about intake channels, because it comes directly from the statute that businesses must provide consumers with ways to exercise their rights under the CCPA. At a minimum, that includes providing a toll-free number and a website notification that is conspicuous and includes that number, so that consumers have a reliable method to exercise their rights under the CCPA.

The next right is the right to deletion. And again, the important compliance initiatives here are really focused on training employees, so that they're trained not only in how to authenticate and verify what is an appropriate and legitimate consumer deletion request, but in how to carry it out. That has to be set forth in policies and procedures that can be reliably tested and authenticated. And then there are some technical considerations as well, in terms of ensuring that you've got systems in place that will promptly and accurately implement a deletion request from a consumer.

Aaron: I mean if you think about it, although it's not explicitly stated here, how do you delete personal information if you don't know where it is? A lot of the work that you're actually going to have to do to be able to allow, to exercise these rights ... I mean, think of it like an iceberg. Yes, you're going to have to train employees. Yes, you're going to need to have a process on the front-end, but a lot of this really has that underlying ... You need to know where the data is. You need to know how to propagate a deletion signal through multiple systems and out to third parties you may have shared it with. When we talk a little bit about what are the things you need to do to be able to comply with these rights, you need to be thinking about not just what the requirement says on the face of it, but also all the steps leading up to that that are going to allow you to be able to do these things.
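
As a rough illustration of that point, here is a minimal sketch of how a verified deletion request might be fanned out across the internal systems and third parties recorded in a data map. The system and vendor names are hypothetical, and a real implementation would need authenticated APIs, logging and confirmation back to the consumer.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class DeletionRequest:
    consumer_id: str
    verified: bool  # identity must be verified before acting

# Hypothetical data map: which internal systems and third parties hold PI.
DATA_MAP: Dict[str, List[str]] = {
    "internal_systems": ["crm", "billing", "web_analytics"],
    "third_parties": ["email_vendor", "ad_partner"],
}

def process_deletion(req: DeletionRequest) -> List[str]:
    if not req.verified:
        raise ValueError("Verify the requester's identity before deleting anything")
    actions = []
    for system in DATA_MAP["internal_systems"]:  # delete from every system holding the data
        actions.append(f"delete {req.consumer_id} from {system}")
    for partner in DATA_MAP["third_parties"]:    # and notify downstream recipients
        actions.append(f"send deletion notice for {req.consumer_id} to {partner}")
    return actions

print(process_deletion(DeletionRequest(consumer_id="c-123", verified=True)))
```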

Jay: The next right is the right of disclosure. This provides consumers with the right to be informed of the categories of their PI collected, the sources from which that PI was collected, the business purpose, the categories of third parties to whom the PI was shared or sold, and the specific pieces of PI that were collected or sold. That really is an important demarcation point. A lot of the states right now are exploring how to empower consumers, how to empower individuals with control over their data, and this certainly is a very important step in that regard.

Compliance initiatives, again, center around training employees to understand, when they've received a request for disclosure from a consumer, how to verify and authenticate that it is a legitimate request and that the consumer is who he or she presents themselves to be. And ensuring that you've got the appropriate notices, both externally and internally within the organization, in terms of what the rights are and how to afford consumers the ability to exercise those rights.

The next one is an important right, and that's the right to opt in and opt out. The legal standard is that consumers have the right to direct a business that sells personal information to not sell their information to third parties without that consumer's consent.

For consumers aged 13 to 16, you've got to provide those consumers in that age bracket with an opt-in. It's not an opt-out: they specifically have to be opted in to the sale of their personal information, which implies that there will need to be significant policies and processes in place to identify the ages of consumers and to properly implement those opt-in rights. Consumers under the age of 13 need affirmative parental or guardian consent, which again implies the ability to understand and differentiate consumers who are aged 14, say, as opposed to aged under 13. Opt-out requests have to be honored for at least one year. And a caution here, from the statute: "A business that willfully disregards the consumer's age shall be deemed to have had actual knowledge of the consumer's age." A business's mere negligence in not having processes in place to accurately assess and validate a consumer's age will effectively mean it is deemed to have actual knowledge for liability purposes under the statute.
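
A simple way to picture those age brackets is as a routing rule. The sketch below is illustrative only, follows the brackets described above, and uses made-up function names and messages rather than statutory language.

```python
# Illustrative routing of the sale-of-data consent requirement by age bracket.
def consent_rule_for_sale(age: int) -> str:
    if age < 13:
        return "affirmative parental or guardian opt-in required before any sale"
    elif age < 16:  # the 13-16 bracket: the minor must opt in themselves
        return "consumer's own opt-in required; no sale by default"
    else:
        return "opt-out model: honor a 'Do Not Sell' request for at least one year"

for age in (10, 14, 30):
    print(age, "->", consent_rule_for_sale(age))
```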

Compliance initiatives. Again, that first bullet point there is right from the statute itself: "Businesses that sell personal information have got to have a conspicuous link on their home page entitled Do Not Sell My Personal Information." Employees have to be trained on how to properly verify an opt-in or opt-out request, and trained in how to differentiate the different age categories of consumers, since different standards apply. There have got to be processes in place to identify when PI is collected from minors under the age of 16.

Timeline compliance. Safeguards have got to be implemented that will track and honor opt-out requests for the legally required time period. And then of course, we've talked about parental or guardian consent: in order to have that opt-in, you've got to have that parental or guardian consent, and that implies a process to obtain it.

The next right is the right to non-discrimination. A business shall not discriminate against a consumer because the consumer exercises any of their consumer rights. And I think there's going to be a little gray area in terms of the financial incentives that are permissible under the statute. The CCPA is clear that there should be no discrimination, in terms of pricing or level of service, against consumers who exercise their rights. But the CCPA does then, in another section, provide for the permissibility of financial incentives, which are permissible if the difference is reasonably related to the value provided to the consumer by the consumer's data.

And I think that is an ambiguous standard. Obviously the California Attorney General is going to have to do some rule-making around that standard to more succinctly define when financial incentives are permissible. As of right now, it's ambiguous. Does that mean that you can offer discounts to consumers if you use their personal information? And if so, what is reasonable in that context? It's going to be, I think, the focus of a lot of amendatory action, perhaps this year, and AG rule-making in California.

Compliance initiatives. Again, review policies and procedures to ensure that consumers are not treated differently for invoking their CCPA rights. You've got to inform consumers of financial incentives or compensation for the sale of their PI to third parties. And you've got to obtain consumers' opt-in consent before enrolling them in any kind of financial incentive program.

Enforcement and penalties. As we discussed previously, the enforcement and penalties under the CCPA are different than under GDPR. Under the CCPA, the California Attorney General has enforcement authority. It's up to $7,500 per intentional violation. For unintentional violations, the California AG can invoke penalties of up to $2,500 per violation, but there has to be notice and a 30-day opportunity to cure. And if the business fails to cure the unintended violations within that 30-day period, then the Attorney General can invoke that civil penalty of up to $2,500 per violation.
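
As a back-of-the-envelope illustration of how those figures add up, here is a toy calculation. It is not legal advice; it simply follows the per-violation amounts and the 30-day cure window described above, and the function name and inputs are made up.

```python
# Toy civil-penalty exposure calculation based on the figures above.
def max_civil_penalty(unintentional_violations: int,
                      intentional_violations: int,
                      cured_within_30_days: bool) -> int:
    if cured_within_30_days:
        return 0  # notice and a 30-day opportunity to cure come first
    return unintentional_violations * 2_500 + intentional_violations * 7_500

print(max_civil_penalty(unintentional_violations=100,
                        intentional_violations=0,
                        cured_within_30_days=False))  # 250000
```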

Private enforcement actions, as Aaron mentioned, are permissible under CCPA, as are class action lawsuits. Statutory damages range from $100 to $750 per consumer per incident, or actual damages, whichever is greater. Again, there's a notification that private plaintiffs must provide to the business to allow an opportunity to cure the alleged violation.

Now I'll hand it back over to Aaron to talk a little bit more in detail about some compliance strategies as we prepare for the implementation of CCPA.

Aaron: Yes. If you think about everything we've talked about so far as being what do you need to do, I wanted to really dive into how we think some of this stuff should get done. I mentioned earlier that whether or not you've been through GDPR, there will be some work you need to do that really comes back to the foundations of how well do you understand the way that personal information under the CCPA definition, as we looked at earlier, is collected, used, managed and shared through your organization. And we've got seven steps of what we think are some of the big pieces to really work through how you can get to a good place with respect to compliance.

Starting with the privacy policies. Really updating your notices and policies to ensure that they meet the CCPA requirements. And I'm using notices and policies in the same way that the IAPP does, in that notices are externally facing, the commitments that you're making to your customers, and policies are your internal-facing documents that tell your employees how they should be handling personal information, and your internal processes. Both of those are going to need to be updated, including some of the things that Jay was talking about. For example, if you don't already have a dedicated 1-800 number for privacy requests and complaints, you're going to need to make sure that that's prominent in there as well.

With data lifecycle management, whether or not you already have an existing data map ... And I don't necessarily mean a graphical map, but an understanding of where that personal data comes from and how it's flowing through your organization. Which technologies and systems does it touch? Which business processes does it support? What are the third parties that are involved in each process? And where you have personal information, do you have those key controls, like encryption, that will help protect you and give you a safe harbor from some of those breach provisions? Or do you have ways of separating some of that personal information from other information so that you can reduce your risk? So really getting that data lifecycle mapped, whichever way you do it, whether you're using a top-down, more manual process of interviews and gathering information, or whether you're using a tool such as the several that are on the market right now that go out and look for personal data, or a combination of those two. This is really foundational for a lot of the rest of the work that you need to do.
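
One way to picture what such a data lifecycle inventory might capture is a simple record per business process. This is just an illustrative shape with made-up field, system and vendor names, not a prescribed format.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DataFlowRecord:
    business_process: str        # e.g. "online checkout"
    pi_categories: List[str]     # categories of personal information involved
    source: str                  # where the data is collected from
    systems: List[str]           # internal systems the data touches
    third_parties: List[str]     # processors and other recipients
    encrypted_at_rest: bool      # relevant to the breach-related provisions
    retention_days: int          # how long the data is kept

checkout = DataFlowRecord(
    business_process="online checkout",
    pi_categories=["name", "email", "payment card"],
    source="consumer-facing website",
    systems=["orders_db", "crm"],
    third_parties=["payment_processor"],
    encrypted_at_rest=True,
    retention_days=730,
)
```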

From a privacy by design perspective, it's really making sure, similar to what we've done for years with information security, that privacy is built into every way that you use personal information. The way that I think about the difference between privacy by design and some of the security controls is that with a lot of security controls, you can often run tools or go through and look at code, look for specific vulnerabilities, and get a report that says this is where you need to go and look for things. From a privacy perspective, often the privacy controls you're looking for are more process-driven. It's, did we collect more information than we need to? Do we understand what information we got consent for, and what did we actually get consent to do? Did we get consent to send someone a package to their home address, but not necessarily to send them marketing material using that same information to that same address? When you're thinking through privacy by design, again there's a lot of good guidance out there, some of which was really gone through and defined in the context of GDPR. But you can reuse it for CCPA as well.

Although CCPA doesn't include an explicit privacy by design requirement, it's similar to the data lifecycle piece in that it's going to be very, very hard for you to effectively say, "Yes, we know that we've actually got a good understanding of the data, and therefore we can actually respond to those data subject rights completely and accurately," without having gone through this step.

The individual rights processing. Really this is what Jay was just talking about with going through those individual rights. For each of those rights, you've got to think about what does this actually mean? What data is in scope? How am I going to actually gather or manage the data that is in the scope of this particular right? And then how am I going to communicate back to the individual that requested to exercise their right that I have actually done what they said?

This is something where I wanted to take a second and talk about how CCPA really has two different kinds of scope. One is around if you have any personal information, then those rights around data subject access and being able to delete data will always come into effect. If you think back to the definition of selling the data, it might be that you don't actually ... You may not sell data under that definition. If you're a first party, so if you're an organization that has a direct relationship with consumers, you're not a data processor on behalf of others and you don't sell that data ... You may use service providers, but you're not actually monetizing your users' data. It might be that the requirements around saying stop selling data and then being able to turn that off in the background just don't apply to you.

Even if they do apply to you, because this was not an explicit right under GDPR, even if you've gone through the GDPR prep you may not necessarily have thought about where does selling data get included in contracts? Do I need to go back and revise my contracts to make it clear that anyone you're sharing that data with can't use it for their own purposes? There's a lot of things all bound up in that individual rights processing area.

With information security, again, with those penalties that attach to security breaches, go back and think about doing a risk assessment, and then, as we're saying here, close the high and medium-risk gaps. My philosophy is generally that you can't do everything, but you need to get, for all of these areas, to a legally defensible position. As I think we mentioned earlier, neither Jay nor I are practicing lawyers, although Jay is a lawyer by background, and we're not intending to provide legal advice in this presentation. From my perspective, definitely close the high-risk gaps from a security perspective, and close the medium-risk gaps. But as someone who in the past has run security teams, you get to the point where there are diminishing returns, and you are better off spending time really thinking about how do we detect when something's gone wrong, and then minimize that damage and have a really good response process, rather than trying to close every single tiny gap.

From a data processor perspective, I just mentioned that you're going to need to go back and review your contracts, and potentially update service-level agreements. The things that you're looking for in those contracts are going to be around who is the data processor and who is the data controller? Which way does that accountability flow? And then what is the secondary usage that any data processors are able to do with your data if you give it to them for one purpose?

Let's say you've got a payroll service you're working with to perform payroll. If they can turn around and then resell that data or do something else with it that's outside of providing services to you, even if it doesn't necessarily say that you're selling it, under the definition in CCPA that could be considered valuable consideration. And you may be in a position where a consumer could say, "Hey, I don't want you to be able to do that. I just want my payroll to be run. I don't want all those other services that I can't opt out of." Those are the kinds of things where you're going to need to look back and say, are we really sure that we're not selling data unless we absolutely intended to? And do we have a way to stop selling it if someone asks us to?

The other piece, then, obviously with any change management initiative, is that you need to think about training and awareness. Who's going to be in your call center handling these calls? Are you going to actually train everyone? Are you going to have a dedicated group that just gets special privacy training? Is the 1-800 number going to go through to a small hunt group who have special training? You need to think about how you're going to deploy some of this, and then what's the right training for those different roles that may be involved with different aspects of this?

I wanted to circle back and look at some of the areas where, if you've already done GDPR, these are some of the things that you're still going to have to do. I mentioned the sales opt-out. That really is a different scope of work, even if you've gone through GDPR. You need to work out if your company sells, or is planning to sell or monetize, personal information in the future. If you do, you then need to have this very specifically worded Do Not Sell My Personal Information button on the home page of your website, which is going to cause whoever runs your website to be like, "Well, we don't want to do that." That's when you have that choice: if you do sell personal information but it's a really small part of your business, is there a way to effectively say, "We can pull that back and have a good story that says we've gone through it. We really don't sell personal information. So we may not need to go and make that change to our home page"?

You also need to think about what is actually personal information, and can you get a lot of information into this bucket of de-identified information, where the law then doesn't specifically apply? What does that look like? Would that damage the way that you actually need to use that information? Or are there some scenarios where de-identified information is good enough? And again, it means that you can potentially sell aggregated information, which doesn't identify individuals, but not information that allows identifiability down to the individual level.

Again, those are some choices you need to be making. And that's not really a choice you're making from a privacy or a compliance perspective. That's a business decision that needs to have a much more broad discussion. And I can tell you that with some of the clients I'm working with, we're having that conversation at a board level because that's actually a very significant business decision, and not one that really should be purely dictated by a compliance mandate.

In terms of personal information, you're going to need to go back and look at what does that expanded definition mean for any data mapping work you've done in the past? And then potentially you're going to need to go back and at least revisit and tweak what you've already done. From a privacy notice perspective, there are some new requirements that are very specific to CCPA. But do think again, if you're a global or a multinational company, do all of your global sites need that change? Do they actually provide services to California residents? From a scoping perspective, if you have 50 global websites, do you need to change the privacy notice on all 50? Probably not. You can go through and do a risk analysis and think about what's that process? What changes need to be made, where?

From an access perspective and data subject rights, this is where a lot of the work is going to be, just because of some of the changes. The 1-800 number is a specific requirement. That wasn't a requirement for GDPR. You're going to need to develop processes to really think through meeting the requirements of CCPA where they go beyond GDPR in several ways. And then making sure that you have processes to make sure that you're continuing to comply with these requirements on an ongoing basis. This isn't a point in time that it gets to January the 2nd, like Y2K, and you're all done. This is going to need to be an ongoing process where you build it and then it just becomes business as usual.

Training, we talked about a little bit already. For deletion and de-identified data, make sure that you go back and look at your policies and say, where do we really need to use identifiable information? And how can we potentially reduce the scope of the effect of this law by taking some information, potentially older information or things that are archived at the moment in raw form, and aggregating and de-identifying it? Then you reduce the scope of things where you need to go through and delete from all of those systems, or provide access to all of that information if somebody asks.
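
As a very simplified picture of that aggregation idea, the sketch below collapses identifiable raw records into counts. The data is made up, and real de-identification takes much more care, for example suppressing small groups that could still be re-identified.

```python
from collections import Counter

# Hypothetical identifiable archive records.
raw_archive = [
    {"email": "a@example.com", "zip": "94105", "purchases": 3},
    {"email": "b@example.com", "zip": "94105", "purchases": 1},
    {"email": "c@example.com", "zip": "98101", "purchases": 2},
]

# Keep only aggregate facts and drop the direct identifiers entirely.
customers_per_zip = Counter(record["zip"] for record in raw_archive)
print(customers_per_zip)  # Counter({'94105': 2, '98101': 1})
```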

Aaron: The contract piece we talked about already. But CCPA does have a four-part test to make sure that even if you are selling de-identified data, re-identification does not occur. Make sure that that's explicit in any contract where it's relevant. And then, once the Attorney General's office does release their guidance, make sure that you understand how you should verify consumer requests. Because we've already seen with GDPR that there's a lot of abuse of the system. People ask for other people's information. And unless you have a good way of actually validating that the person asking for the information is who they say they are, there's a potential that you could effectively cause a data breach through the processes that you're putting in place to try and comply.

And then make sure that if you do ask for proof of identity, for example, that as soon as you've verified that individual, that that information gets deleted in a timely manner. The last thing you want to be doing is gathering more information when somebody asks to exercise their privacy rights. And then effectively creating a worse privacy problem by collecting and storing that information when you don't need it.

One of the things that I did mention, from a strategic perspective, is that some of these things you're going to need to take back to the business. If you're in a compliance role or a privacy role, you're going to need to think about how to have this conversation strategically with the board of your company if monetization of user data is really a large part of your business. It is still unclear what's actually going to go into effect in 2020. There is a lot of pressure from the technology industry in particular, and you can probably imagine the sectors that actually make a lot of money out of monetizing user data are screaming the loudest to try and get the law changed before it comes into effect. The initiative backers have stated that if the law changes too far, they're just going to go back to square one and say, "We're going to put on the ballot what we originally wanted anyway."

So there's this tension between trying not to push too far, but also trying to minimize some of these issues, particularly with the definitions, and really just aligning those with things that businesses have already had to comply with. Why would you have a different definition of children from the definition that's already in COPPA? Why would you have a different definition of personal information from a lot of the other laws that are out there? Why would you have things like olfactory information, which always makes me laugh. Literally the way that somebody smells. If for some reason you've captured that information, that's now considered personal information. You could re-identify someone from the way they smell.

Aaron: Those are the kinds of things where I know why those things are in there. They're trying to be forward-looking and thinking about new technologies. But also, it's a nightmare for an organization that's like, "Well, we have to essentially prove a negative, that we don't have any of that information. Otherwise, we'd have to go and deliver it to someone if they ask for it." I have already seen, and Jay touched a little bit on this earlier, but Jay and I have already seen variations of this law in other states. I've seen the Washington State bill, I've seen New Jersey. I've heard anecdotally that there are several others out there, some of which are closer to the GDPR language, and some of which are closer to CCPA. My fear is that we're going to have a real mess on our hands in the next year, 18 months, unless the federal government takes some action.

And I'm also involved in some of the efforts at a federal level, before this becomes a real issue for businesses, to try and push for a federal law that gives a level playing field. And really takes a lot of the good things from the European law, from the California law, and gives something that's going to work in the US legislative context. Even if you've gone through GDPR, there's still work to be done. I do think there's a lot of work from GDPR that can be leveraged. And if you haven't gone through GDPR, I think there's a lot that you can learn from the pain that larger organizations with operations in Europe went through in the last couple of years, that will help you get to the right place, quicker.

But I do think that the way that this is moving, and we've already seen this with some of the fines in Europe over the last couple of weeks, laws are trying to limit companies' ability to monetize user data without users really being in control. There's a lot of trying to balance in CCPA about, you can still monetize data. You can still sell it. But you can't hide that fact from the user any more. You can't just claim because it's a free service, that they have to provide you with all of the data you want.

There has to be more transparency, and there has to be more ability for a user to make an informed decision about what they're doing. And I don't think that's a bad thing for companies that can innovate and have good relationships with their users. But if the way you're making money right now is grabbing as much data as you can, and if the users find out about it, that's too bad because they can't do anything about it, then those are the business models that are going to be explicitly challenged by this law and others.

Just a little bit on some of the stuff that I'm doing and Jay and I are doing at Sentinel. Really the way that we're looking at this is CCPA is one law. It's the flavor of the month. It's the thing that we're concerned about right now. But fundamentally, a lot of these requirements are similar to not just GDPR, but also to other both proposed and current laws. The way that we're looking at this is if you have the right privacy program that's going to be reflecting your ethical values, your business strategy, where you are and how you make money, as well as the legal and contractual requirements that you're under, that's really something that's going to stand you well for the long-term. Yes, you may have to make tweaks when new laws come out, but you're not going to need to have a wholesale revisiting of your business every time a new law comes into effect. That's a lot of the stuff that we do, is really helping organizations to think strategically about what they have and how they comply, not just today but over the long-term as well.

This is really just a summary. If you're looking for a one-pager just to go and discuss with people, this is really a summary of the seven points I covered earlier. You could really print this one slide out and say, "Here are the seven things," if you want to make them seven workstreams and go and write a work program for each of them. You'll see, as I mentioned, that a lot of consulting firms are going to have something similar. The requirements are the requirements. But I like this for really thinking about how can we break this down into bite-sized chunks that we can then go and assign individuals to work on, and make sure that we're moving forwards on multiple paths at the same time.

Camille: Well with that, just wanted to thank you again, Aaron and Jay. This was just such an informative presentation. And I think it really broke it up in a way that made it seem more manageable. It sounds kind of scary, all of the things that are involved with the California Consumer Privacy Act, and how to prepare for that. So, really appreciate you breaking that down for us. I know everyone's eager to get to questions. That'll be the next slide. We've got some great questions coming through. So, I'll go real quick through this.

After the presentation, a lot of people have been asking: the slides will be available, and we'll also be providing a free e-book that will have what you need to know and, again, reiterate some of the steps that Aaron and Jay laid out and some preparations to take. One other thing to mention is that privacy training is available for those on your team who are going to be really impacted by this, and a Certified Information Privacy Professional is going to be needed in the coming weeks and years, even more so than before. So, we do offer flexible training options to help you with that certification, as well as an exam pass guarantee with our live, online Flex Pro options. You'll receive the link to learn a little bit more about that as well. But if you're interested now, you can go to InfoSecInstitute.com/IAPP.

And with that, let's go ahead and move on. We're running short on time here, but we're going to get to as many questions as we can. The question I'd like to start out with is from Carla. And the question, Aaron and Jay, is: has the California legislature addressed the issue as to whether a company's employees are considered consumers? It'd be kind of interesting to talk about what data you're allowed to have in regards to your employees.

Aaron: That's a great question. The law, as it stands, is ambiguous. And if you've seen, at the moment the California Legislature is running a series of events in California where people can give feedback. One of the areas that's been brought up and raised for clarification is that, at the moment, the way that the law is written makes it look like employees could be in scope. Some of the draft language that's actually in my inbox for review right now talks about making a distinction between an employee in the role of a consumer and an employee in the role of an employee. And they're seeking to exclude the employee in the role of an employee from scope.

But let's say that you were an employee of a grocery store and you actually shopped there as well. What they're trying to do is to say, well, you could still very validly ask for the information about you as a consumer of that grocery store. But they want to try and exclude the employee context, because a lot of the way that the rights are written doesn't make as much sense in an employee context. At the moment it's still ambiguous, and I haven't yet seen the final answer on that. There are several of these things, which is why there's still pressure not just from the industry but also from a lot of privacy lawyers that I know, really trying to say, we want to get to the point where the rights are clear, so businesses know what they need to do, but also that we're not exposing these things beyond the scope that would appear to be intended by a law that literally says "Consumer" in the title of the act.

Camille: Perfect. Thank you. Another question here from David. Many privacy experts, he says, are predicting that there will be additional changes or amendments to the CCPA, which I know you touched on earlier. He asks, do the panelists agree that amendments are likely? And if so, any crystal ball predictions about what those changes will be? Maybe some of the big areas you expect to change before 2020.

Aaron: Yeah. Whether or not anything actually changes is ... I've heard various percentage bets on whether anything's likely to change or not. I've seen lists. I mean, I've got one document that's 15 pages of proposed changes, which clearly not all of those are going to go through. I think a lot of the changes that I've seen, the areas where there is a lot of concern, are around some of the definitions. The employee one's a big one. There's also definitions in there around family information, and then household information is defined slightly differently from family. The definition of personal information itself, the definition of sale. I think a lot of those, there are concerns around just the breadth of those definitions and the difficulty in actually complying.

I think some of those are a bit more reasonable to expect, maybe some nuanced changes. I have seen some other proposed changes, which are things like, instead of actually providing back the information ... Let's say I have information about Jay. Instead of providing, in response to the access request, a list of information about Jay, I've seen one proposed amendment under which we would just provide the categories of information. Instead of saying, "Your name is Jay Rodne, you live at 1 Arcadia Avenue," we would just say, "Hey, Jay. We have your contact information." Which, again, I think is pushing way too far away from the original intent of the law, and I don't see that actually coming through as an amendment that would be accepted. I think we've always got that backstory that if there's too much push for changes, the original backers, as they say, can always go back and say, "You do what you want with the law. We're going to take everything that you cut out and just put it on the ballot."

Jay: I will add that there are clear conflicts in the CCPA penalty provisions with the penalty provisions of other consumer protection laws under the California code, so there's got to be some work around either harmonizing the penalty provisions across a number of different statutes, or repealing some of the other statutes, or specifically carving out CCPA from those statutes. The penalty provisions piece needs to be clarified and harmonized, certainly before the law takes effect.

Camille: I think it'll be interesting to watch as this starts to take effect, and to see what provisions other states adopt as they learn from some of the things that happen in California. I bet there'll be a lot of changes after the fact as well.

Another question here. Looks like we'll have time for it. Just maybe one or two more before wrapping up here, and I want to thank the audience for asking such great questions for us. Does the CCPA allow any provisions for the complexity of backups in other systems that make it very difficult to erase data? Or, selective data in that case? Is there anything regarding that, that you know of?

Aaron: Backups and archiving are always an issue with the scope of any of these laws. It really comes down to ... The way that I've seen this done from a GDPR perspective is: say I back up Jay's data today, and Jay says tomorrow, "Well, I want you to erase my data," and we erase it from the live system but not from the backup. The way that you can handle that, which I've seen people argue is defensible (and obviously talk to your outside or inside counsel), is that you have a process so that when you restore a backup, if you ever do, you then go back and effectively clean that backup of anything that was erased since you actually recorded it.

That means you've got to have a great system for, "Okay, we backed up on Monday. The deletion request was on Tuesday. Therefore any deletion requests that came in after that backup was made, if we restore that backup the first thing we do is we purge all of those records before we use it for anything else." And again, I'm not a lawyer but I've heard people say well, that's a reasonable compromise. It's not explicitly called out in the law, but that's what I've seen done where there's been other provisions in similar laws.
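
To picture that restore-time clean-up, here is a minimal sketch with hypothetical data and function names; as noted above, confirm the approach with counsel. The idea is simply to keep a log of deletion requests and, whenever a backup is restored, replay every deletion that arrived after the backup was taken before the data is used for anything else.

```python
from datetime import date
from typing import Dict, List

# Hypothetical log of verified deletion requests, kept alongside the backups.
deletion_log = [
    {"consumer_id": "c-123", "requested_on": date(2020, 3, 3)},
    {"consumer_id": "c-456", "requested_on": date(2020, 3, 10)},
]

def clean_restored_backup(records: List[Dict], backup_taken_on: date) -> List[Dict]:
    # Re-apply every deletion request that arrived on or after the backup date.
    to_purge = {d["consumer_id"] for d in deletion_log
                if d["requested_on"] >= backup_taken_on}
    return [r for r in records if r["consumer_id"] not in to_purge]

backup = [{"consumer_id": "c-123"}, {"consumer_id": "c-789"}]
print(clean_restored_backup(backup, backup_taken_on=date(2020, 3, 1)))
# [{'consumer_id': 'c-789'}] -- c-123's post-backup deletion was replayed
```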

Chris: This concludes today's episode of CyberSpeak with InfoSec Institute. Thank you all for listening. Remember, if you enjoyed today's episode, you can find many more, including webinars, tutorials, and interviews with security thought leaders, by visiting InfoSecInstitute.com/CyberSpeak for the full list of episodes. To see our current promotion for podcast listeners considering a class sign-up, please check out InfoSecInstitute.com/podcast to learn more. Also, if you'd like to try our free SecurityIQ package, which includes phishing simulators you can use to fake phish and then educate your colleagues and friends in the ways of security awareness, please visit InfoSecInstitute.com/securityIQ. Thanks once again to our guests, Jay Rodne and Aaron Weller, and thank you all again for listening. We'll speak to you next week.

Join the cybersecurity workforce

Are you a cybersecurity beginner looking to transform your career? With our new Cybersecurity Foundations Immersive Boot Camp, you can be prepared for your first cybersecurity job in as little as 26 weeks.


Weekly career advice

Learn how to break into cybersecurity, build new skills and move up the career ladder. Each week on the Cyber Work Podcast, host Chris Sienko sits down with thought leaders from Booz Allen Hamilton, CompTIA, Google, IBM, Veracode and others to discuss the latest cybersecurity workforce trends.


Q&As with industry pros

Have a question about your cybersecurity career? Join our special Cyber Work Live episodes for a Q&A with industry leaders. Get your career questions answered, connect with other industry professionals and take your career to the next level.


Level up your skills

Hack your way to success with career tips from cybersecurity experts. Get concise, actionable advice in each episode — from acing your first certification exam to building a world-class enterprise cybersecurity culture.