[Webcast Transcript] The State of US Privacy Law: What to Expect in 2023 and Beyond

Editor’s Note: On September 29, 2022, HaystackID shared an educational webcast on the topic of US privacy law. As privacy continues to move to the forefront of not only information consideration but of business concern for today’s cyber, data, and legal discovery professionals, it is increasingly important for those responsible for the planning, implementation, and management of privacy programs to have a firm understanding of the current and future landscape of privacy legislation and regulation.

This session was developed and shared by a team of privacy and legal discovery experts and highlighted the changing US privacy landscape. The presentation also included discussion and anecdotes designed to paint a compelling picture for business, information technology, and legal discovery professionals and practitioners seeking to better understand and respond to privacy challenges in today’s communications-connected world.

While the entire recorded presentation is available for on-demand viewing, a complete transcript of the presentation is provided for your convenience.


[Webcast] The State of US Privacy Law: What to Expect in 2023 and Beyond

Presenting Experts

+ Chris Wall
Data Protection Officer and Special Counsel, Global Privacy and Forensics, HaystackID

+ Jennifer Kashatus
Partner, DLA Piper

+ Kevin Clark
Discovery Counsel and Vice President, Analytics and Review Operations, HaystackID


Presentation Transcript

Moderator

Hello everyone, and welcome to today’s webinar.

We’ve got a great presentation lined up for you today. Before we get started, there are just a few general admin points to cover.

First and foremost, please use the online question tool to post any questions that you have, and we will share them with our speakers. Second, if you experience any technical difficulties today, please let us know using that same questions tool and a member of our admin team will be on hand to support you. And finally, just to note, this session is being recorded and we’ll be sharing a copy of that recording with you via email in the coming days.

So, without further ado, I would like to hand it over to our speakers to get us started.

Kevin Clark

Hello, and I hope you are having a great week so far. My name is Kevin Clark, and on behalf of the entire team at HaystackID, I would like to thank you for attending today’s presentation and discussion entitled The State of US Privacy Law: What to Expect in 2023 and Beyond.

Today’s webcast is part of HaystackID’s regular series of educational presentations developed to ensure listeners are proactively prepared to achieve their cybersecurity, information governance, and eDiscovery objectives. Our expert panel for today’s webcast includes two individuals with a deep understanding of privacy and compliance developments, challenges, and concerns, especially with the changes to US state law coming in 2023.

We encourage the audience to make this participatory. If you have questions during the presentation, please put them in the text box and I’ll do my best to have those questions addressed by the panel.

So, with that, let me have our panelists introduce themselves. To begin with, as I mentioned, my name is Kevin Clark. I joined HaystackID earlier this year and currently serve as the Discovery Counsel and Vice President of Analytics and Review Operations, focusing on advising and managing the support for review-centric operations.

Next, we have Jennifer Kashatus. Jennifer.

Jennifer Kashatus

Thanks, Kevin. Hi everyone, my name is Jennifer Kashatus, I’m a Partner in the Washington DC office of DLA Piper focusing on privacy and cybersecurity, and I look forward to speaking with everyone today. Chris.

Christopher Wall

Thanks, Jennifer. My name is Chris Wall, and I’m a privacy lawyer, and it feels good to get that out there. Like Kevin, I joined HaystackID earlier this year, and I serve as HaystackID’s DPO (Data Protection Officer) and Special Counsel for our Global Privacy and Forensics Group. I help Haystack and our clients navigate the privacy landscape that we’re going to be talking about today, both within the US and around the world, specifically as it relates to technical privacy and data protection compliance, cyber investigations, data analytics, and discovery.

Back to you, Kevin.

Kevin Clark

Thank you, Chris. So, today, we plan to cover the following topics:

  • An introduction to privacy and where some of these privacy concerns come from.
  • We’ll then talk about some of the notable contours of privacy and the data protection landscape in the US. We won’t delve into data protection laws outside the United States or data transfers. Our focus will be primarily on the US.
  • We’ll follow that up with a very brief discussion of US Federal Privacy Law.
  • We’ll spend some time then talking about the US state legislation, specifically five US states that have laws that take effect in 2023.
  • And at the end, we’ll leave some time for questions, to address anything you want to talk about and to cover the questions we didn’t get to during the discussion.

So, remember, please ask your questions in the question box.

So, let’s get rolling with today’s discussion.

Our world is connected. No one needs to tell any of you this. I’ll get things rolling with our panelists. Chris, why are we talking about privacy rights right now?

Christopher Wall

Yes, so I think everybody is logged into this because privacy is a big deal; it’s a kitchen table topic, as I like to refer to it. And maybe the Wall kitchen table is a little different from everybody else’s kitchen table, but we do talk about this kind of thing. As I talk to my kids about their use of social media, privacy comes up. As I talk to them about where they share their personal information, whether that’s at school or wherever else they go, it’s a topic we talk about. And I think if everybody who is listening or watching today asks themselves, “Have I checked my Twitter feed or posted my daily TikTok, checked Facebook, purchased something online, set up my online payments for the month, or checked my kid’s school progress on ParentVUE or Canvas or whatever my school’s online system is?”, then you should be concerned about your privacy. You should be thinking about your privacy.

If you’ve used your phone today, all of that data is stored and you need to know who has access to that data, and how much of that data you’ve given permission to other people to access or to use in certain ways.

And so, when we talk about privacy and data protection, that’s what we’re talking about, at the end of the day. We’re talking about your data, our data, the data that belongs to us, and how other people use it, and what permissions we give to other people to use that data.

Let’s go to the next one, Kevin.

Kevin Clark

Thank you, Chris. I think you added this slide. What are we looking at here?

Christopher Wall

Yes, so I did. I kind of added this slide just to illustrate. So, around my house – and I might not be representative, because I’m maybe a little bit of a geek. But I looked at my own home network over the weekend, and I currently have around 140 devices connected to my home network. That is the Internet of Things. And some of those devices are laptops or printers, things that my wife and I use for work, for instance.

But other devices connected to my home network collect data about us and my family, but I installed them knowing that. That includes my thermostats, window blinds, humidity sensors in the basement storeroom, little sensors on my freezer and refrigerator to let me know whether my kids have left the door open and all of my ice cream is thawing.

Anyway, a host of devices connected to my home network.

And the important thing is that I read my end user license agreement for every one of those devices, so when they are collecting data, I know what they’re collecting, and I know what rights I have to that data. In short, I opted in, I guess, to a lot of those. And I also actively monitor what data leaves my network, so that I know which of those devices, if any, are phoning home and sharing my personal information. How many times did I open the freezer for ice cream? How many times are they sending that information out for others to use?

So, the purpose of our conversation today is to talk about the idea of whose data that is, and how that data is being used, and what states in the US are doing – and briefly, we’re going to talk about what Federal efforts there are to control how that data is being used, and what people have to do if they’re going to use our personal information.

Let’s go to the next one, Kevin.

Kevin Clark

So, data was of little concern 15 years ago. Today, we’re going to talk about specific laws in the US, including privacy laws around education, children, financial transactions, health records, et cetera. That is the general approach of the United States: covering the specific parts of our lives that we’re most sensitive about with specific regulations. We’re also going to talk about some of the cultural background behind those laws. And we’re going to talk about what happens when, or whether, that data travels to unseen places around the world.

Christopher Wall

So, Kevin, we talk a lot about privacy like it’s new, but there really is nothing new under the sun. And we’ve heard it said that when you use social media, you’re not the customer, you’re the product. That’s a meme even today.

But keep in mind that the idea that you are the product has been used to criticize media decades before social was appended to that word “media”. A version of that maxim, “You are the product” predates not just Google, Facebook, Insta, and Snap or Twitter, but the internet itself.

In the 1970s, network TV was essentially society’s Facebook. And back in the ‘70s, this idea of network television was ground-breaking technology that was beamed straight into our homes, and it was a mindless escape – remember Gilligan’s Island, the Brady Bunch – a mindless escape, but it was also a source of news, and it was widely criticized back then for distorting, for oversimplifying, for sensationalizing information in the hunt for advertising dollars. Not too different from what we deal with today, or what we look at today with social media.

So, then with the Snowden revelations, it became clear that we’re not only the product but the subject of all kinds of big data analyses. And again, there’s nothing new, necessarily, under the sun.

There was the Church Committee Report in 1975, which revealed that the FBI had cataloged over 26,000 individuals at one point who were supposed to be rounded up in the event of some national emergency, and about half a million first-class letters were opened and photographed by US intelligence agencies up until 1973. And millions and millions of private telegrams were obtained by the NSA up until 1975 for the same purpose.

So, all the time, every day we seem to hear about some new breach or data loss involving our personal information being snooped upon. And we certainly have a history of dealing with this kind of thing, and a history of grappling with how best to manage either us being the product or us being the subject of big data analyses.

The difference, I think, today – the reason why we’re talking about privacy today – boils down to the ease of access of that data, and also the volume of that data.

Let’s go to the next one, Kevin.

Kevin Clark

So, that’s all fine and good, but if we look at the results on us and business, and generally on society, why is privacy becoming more and more a thing?

Jennifer Kashatus

And I think we have several different considerations here. So, first and foremost, the number one we put on the slide is regulatory penalties, followed by brand and reputation damage, exposure, business disruption, and legal action. But the key is we start with looking at the GDPR. And again, this is probably the only mention we’ll have of European or non-US privacy, but to highlight the fines: fines of up to 2% to 4% of global turnover got companies’ attention, including US-based companies that have operations in Europe. And it made companies more aware of the real risk of exposure.

In addition, you’ve now seen Europe, and even member states, issue really large fines for various types of privacy breaches. And I don’t necessarily mean just personal data breaches, but breaches overall. This, plus the expanding US state laws, which are really the primary focus of our discussion today, means we are now seeing an increase in attention by consumers. Consumers do care about their privacy. Employees – as we’ll talk about for California – also have privacy rights, and employees do care about their privacy. So, this is an area where many companies that might not previously have had to deal with these issues will have to do so. And there is enough public interest and concern and discussion that it’s really hard to hide anymore.

We also see a lot of misinformation. People in Minnesota, for example, try to exercise data subject rights, but Minnesota is not there yet. Still, right now, there is a real focus on these issues.

One thing I want to mention briefly is that I’m often asked, “What’s the biggest question you get? What are you spending most of your time on today?” And what we are now seeing, which goes to this point, is a lot of companies in the B2B space that had very little touch on the CCPA, because of the partial employer exemption and the limited approach to business contact information, are now all of a sudden brought in.

So, we are now seeing a real focus in the B2B community on these issues, even if they don’t have a lot of data. And I think that point illustrates just how far-reaching these new regulations are.

Kevin Clark

Thank you, Jennifer. So, let’s take a look at the privacy landscape and what it looks like today. What are some of the contours of – and the tectonics that got us where we are today?

Christopher Wall

So, Kevin, I think it’s helpful – again, we’re not talking about other countries – but I think it is helpful, at least, to look at GDPR as Jennifer pointed out.

Europe, of course, has a fraught history with the use of personal information, whether it was race, religion, ancestry, ethnicity, or whatever it might have been, along with personal and real property ownership even. And that unfettered collection and use of personal information led to seizures and deportations and concentration camps.

And so, while we can see those principles from European data protection trickle into US laws, the US takes a relatively hands-off approach to business, and we arguably have a different social perspective on the use of data. We invented social media oversharing in the United States.

But what is interesting from a comparative law standpoint is that with each of these laws, the GDPR or any of these other foreign laws, even China’s data protection law (PIPL) or the US State Privacy Laws, they really do reflect the culture from which they spring.

In the US, it is all about protecting the rights of the consumer. In Europe, the GDPR is designed to protect the rights of the individual. And in China, broadly speaking, PIPL and the interwoven DSL and CSL are laws designed to protect the communal good or the state.

So, following World War II, Europeans weren’t the only ones thinking about privacy. And I mentioned that in 1975 with the Church Report, privacy was an important thing in the US. And in the 1970s, legislators were looking at what to do with aggregations of data in the US held by the Federal Government.

And so, it’s a law that we don’t talk a lot about in the United States, but we actually do have something called the United States Privacy Act of 1974. And that law was, of course, created in response to how these databases – which was kind of a new idea back then, these nice collections of aggregations of data stored in nice rows and columns – would affect individuals’ privacy rights.

And so, what the Privacy Act does is it safeguards the privacy of data held by the United States Federal Government. But that’s where it stops. It just focuses on the US Government and what they can do with your data.

And the idea was that the Privacy Act would be a broad Federal law, but then we would take a sectoral approach and we would protect personal information using laws that are focused on where the greatest aggregations of consumer data lay, like finance, or education, or healthcare. And we would leave the other details to the states. And so, that’s the reason we have, in the United States, Gramm-Leach-Bliley, which we’re all familiar with. It gives us an opportunity to opt-out or to look at the privacy notice that each of our financial institutions have. It gives us FERPA, which deals with educational information. And of course, HIPAA, which technically isn’t a privacy law, but we all consider it to be because it protects our personal information as it relates to healthcare.

Actually, there’s one more thing we should mention here, I think, with respect to the unique contours of privacy law in the United States.

In the United States, we have this great concept of free speech, and it’s explicitly guaranteed in the First Amendment of the Constitution. Privacy, on the other hand, unlike the explicit human right that it is in the EU, is an implicit right under the US Constitution.

Again, the details are left to the states, and so there are a lot of US state constitutions that include privacy as an explicit right, but not all of them. So, we’re going to talk about how these states have approached privacy as they try to protect consumer rights, individual rights. We are also going to talk a little bit about the American Data Privacy and Protection Act (ADPPA), which we’ll touch on shortly.

Kevin Clark

Thank you, Chris. So, we’re going to start off with a polling question.

Christopher Wall

Hold on, so I’m going to give you the facts first.

Kevin Clark

Yes, sorry, jumped ahead here.

Christopher Wall

All right, we do have a little case study, and this was ripped from the headlines. We have a company that had a good privacy policy in place, it gave privacy notice to its users and its customers. However, they didn’t notify customers about the planned sale of those customers’ data, and the customers opted out of specific uses of their data, but the company didn’t honor those requests. And finally, the company was made aware of the issue involving opt-outs, but essentially ignored it. Now, it’s that last point that might give it away.

So, let’s go to the polling now. Let’s go to that first polling question.

What is the appropriate remedy?

Well, “appropriate” maybe isn’t the right word there. What was the actual remedy? Because it’s an actual case, and it was recently in the news. We’ll give everybody a second here.

All right, we have a well-informed… here we go… oh! All right, I was excited, those who knew, knew. The poll is closed, and the majority got it there.

As I was watching the poll numbers here, the first responders there – almost everybody who knew, knew right off the bat. This is, of course – go to the next one – this is, of course, Sephora. And the correct answer is “All of the above”.

This was Sephora, of course, based on the very general facts I provided there. The California Attorney General – and that should have been a giveaway, since California is the only US state that currently has a comprehensive privacy law in place – essentially alleged two major violations of the CCPA.

First, Sephora didn’t inform its consumers, its customers that it sold their personal information to third parties. And second, it didn’t honor those customers’ opt-out requests, even though they knew there was an issue with honoring those. The California AG gave them notice that there was an issue. They didn’t remedy it.

And so, at the end of the day, Sephora not only failed to disclose that it was sharing or selling consumer information to third parties, but it had represented in its privacy policy that it would not do that in the first place. And second, Sephora didn’t have a valid data processing agreement in place with any of the third parties to whom it sold data or had process data on its behalf. Both of those are violations of the CCPA.

And as for the opt-out issue there, the California AG found that while Sephora was receiving consumers’ opt-out requests, they were really, effectively, ignoring them and they continued selling those consumers’ opted-out data, or the data for those consumers who had opted out.

So, let’s go to the next case study if we can. Good job to our audience there. You nailed that one. Let’s go to the next one.

Jennifer Kashatus

And this one ties to a point that Chris made very early in the presentation, which is that in looking at his home devices, he was guided by what they’re actually doing, and he made decisions based on reading privacy policies, which some of us actually do read. And candidly, the people who bring lawsuits and care about them do read them, so they are important. This is a message you’re going to hear us keep repeating throughout the entire presentation, because what you say is really important.

So, here, we’ve got a company. It had a privacy policy in place for its mobile application. And in the published notice, the company said it would only share user data for the purpose of servicing the app. This was somewhat health-related data – this was an ovulation and fertility application – and the company specifically said it would not share any health-related data with marketing or analytics firms.

According to the FTC complaint, which the company ultimately settled, the company did share health information with third parties for those third parties’ unrestricted use. In particular, the third parties at issue were Google, Facebook, and some other social media entities or analytics providers who then presumably – again, according to the complaint – would use the data for their own purposes including building profiles.

So, we’re going to open up the next poll when we go to the next slide. What do you all think happened in this case?

This is a fun one. We’ll give a few more seconds on this one.

Okay, in this one, we’ve had quite a variation in responses here. So, the most prevailing answer was “Two of the above”. If we can go to the results, and it is indeed – I think it’s on the next page. Yes, the next slide.

The company was required to enter into a long-term audit. It was exactly this, two of the above. There was no fine, no A, and no D, but B and C were in play.

Importantly, the reason we mention this one is that although there was no regulator fine, the regulator involvement was significant, and the processes that the company was required to put in place are, candidly, probably going to be very expensive to implement. There’s also class-action litigation, which is ongoing. And so, even if you don’t get fined by the regulator – and I don’t want to use the term “sensitive” as a defining term – people did consider this information private, and they relied on the disclosures, presumably, when deciding to download and use the app.

And so, again, this is a theme that’s carried through in the state laws that we’re going to be discussing, some of which conflict with one another, some of which require transparency to a point that the regulations are so stringent that they almost become less transparent because they’re harder to read. But again, emphasizing that one of the main focuses of privacy laws we’re talking about today is do what you say, and say what you do.

Kevin Clark

Thank you, Jennifer. So, let’s move on and talk about US Federal Privacy Law.

Jennifer Kashatus

This is purposefully blank.

Kevin Clark

Exactly.

Christopher Wall

This is what we call dramatic effect. This does represent the sum total of comprehensive US privacy law. There is none today.

Jennifer Kashatus

As we said before, the US has been built on a patchwork of sector-specific laws. We touched upon HIPAA, FCRA, GLBA, FERPA, and COPPA. What I love about GLBA, in particular, is that for years I’ve had companies say, “We’re not financial institutions, we don’t have to comply with this,” and all of a sudden, the CCPA came into play and they said, “No, we think we are a financial institution; we’ve been overlooking it,” because it’s a lot better, a lot more fun to be a financial institution – a lot less onerous than complying with the CCPA requirements. It’s not a complete exemption under the CCPA, we should note. But we mentioned the patchwork of requirements.

So, I think one question we’re often asked is, will there be a Federal privacy law? What’s happening with the ADPPA? Chris, do you want to make a prediction on this one?

Christopher Wall

I’m not a betting man, but if I were a betting man, I would say at some point in the future, we will have a Federal comprehensive privacy law. I’m just not going to hold my breath for that to happen. I don’t foresee it coming in the next year.

And we can talk about – in fact, let’s do. Kevin, why don’t you go to the next one. Let’s talk about where we are with that.

I think there’s a lot of appetite in the US to have a comprehensive privacy law similar to the GDPR, or similar to what some of these states are trying to do. And to demonstrate that, back in July, on a 53-2 vote, the American Data Privacy and Protection Act moved out of the US House of Representatives Committee on Energy and Commerce. That’s incredible support. But it still would need to pass the full House and the Senate, and I know those negotiations are ongoing.

And the rumor is that the White House will support it if the bill passes, but that’s a big if, because I think ADPPA, as it’s currently written, would allow individuals to file suit over violations – which is good, that private right of action – but most importantly, to California anyway, is that it allows Federal preemption over some of the state data privacy laws. Well, actually, over all of them as they’re currently written.

And so, the idea of an ADPPA is fantastic. I think, generally, most people would support the idea. But the devil is in the details, particularly for California.

I should just mention the reason why I think it’s probably not going to go anywhere: of course, the Speaker of the House currently represents the State of California. And I think there’s been a real concerted effort to convince Speaker Pelosi to allow the bill to be considered and passed, but she has said, in representing California, that she will not allow the bill to be considered by the House. So, it might be dead, but again, I’m hopeful.

Jennifer, I don’t know if you’re hopeful too.

Jennifer Kashatus

I don’t think it’s going to go anywhere. Even with moving a little bit on preemption and allowing the California data breach provisions to proceed, it really would largely preempt much of what California has done. I think it would be interesting because even if it were to go through – and we’re so many iterations away from what it would look like – understanding how the preemption would work and what’s left to the states would still be very challenging for companies. But as you said, I think right now, for the first time in a long time, there actually is a real appetite for a Federal law, because it is increasingly complicated for companies to comply with the myriad of privacy laws.

Christopher Wall

Agreed.

Kevin Clark

Okay, so let’s jump to the state laws, and in 2023, we have five new state privacy laws that will take effect. What can we expect?

Christopher Wall

Kevin, you said in 2023, we have five new state privacy laws that will take effect, and that’s important, and I hope that’s the reason why most of the people joined us today is to talk about these five new privacy laws.

But that’s not all. In 2022, this last legislative season, lawmakers in 29 states and DC considered data privacy bills. Like I said, this is a kitchen table topic, or at least it’s a legislative topic around the country in at least 29 states and DC. 23 states held committee hearings on the topic. 14 states passed bills out of committee. Seven states passed a bill through one chamber. And in 2022, two states actually passed laws: Connecticut and Utah.

I think that’s important. Again, we talk about this momentum and this enthusiasm or appetite for privacy regulation, and the fact that 29 states around the country are talking about making some movement here demonstrates that.

And when we look at what they’re doing, you’ve got, essentially, two models for these state laws. You’ve got one which is the Washington State model, and then you’ve got CCPA. You’ve got the California model. And to date, only California follows that California model, which itself was based on the GDPR. And all of the others, largely, follow that Washington State model which, ironically, still isn’t on track to become law, but it was the original model upon which the others are based.

And that’s important because while the other laws – other than California’s – have some differences, which we’ll talk about here, they could eventually become something like a UCC for privacy, where states tweak the model to suit their specific needs.

And that said, while you’ve got more states following that Washington State model, while no other state has really enacted CCPA-like laws, California is a big state, and CCPA still covers nearly twice as many consumers as all of the other four states’ privacy laws do combined. So, I think there’s a reason why California continues to be a dominating influence in the privacy space, especially after enforcement of CPRA begins in July.

Kevin Clark

Thank you. Well, let’s dive more into California since they were first and they’ve had such an impact.

So, Jennifer, what do you think – looking at California now, what do you think is going to happen in 2023?

Jennifer Kashatus

So, a couple of things on that. First of all, the five laws we’re talking about are five laws plus the CCPA – or plus the CPRA, depending upon how you want to look at it. These are comprehensive privacy laws. In addition to the many other states that considered but didn’t pass comprehensive laws, there are what I would call extremely narrow, issue-specific privacy laws that did pass – employee monitoring notices, GPS tracking, and all the different other laws.

So, our focus is really on these five laws, but there are a bunch of other laws that are very issue-specific that did go into effect in the past year or that were proposed and will continue to be proposed. So, there’s so much more out there that complying with CCPA alone and CPRA seems like a full-time job, and you add the others on top of it, we certainly feel everyone’s pain here.

So, what do I think is going to happen? If you look at the different laws – I’m going to take Virginia as my example. When I print out Virginia’s law, it’s eight pages. Yes, it’s tightly spaced, but it’s eight pages. The CPRA draft regulations alone, I think, are, what, 80 to 90 pages. And it takes people – if you can actually make it through them – days to think about them. And then you’re constantly going back and rethinking.

So, what do I think is going to happen? I think people are going to be scrambling. I think they’re scrambling, in part, because there’s privacy fatigue in handling these regulations. There’s budgetary fatigue. People are transitioning back to offices. And so, what do you do with remote versus in-person? There are all sorts of monitoring questions that are coming up. There’s just a lot to handle right now, and we’re running out of time and we don’t have final regulations.

So, I’m going to take a step back and briefly cover what the CCPA does. It took effect in 2020 and is being amended by the CPRA, which takes effect January 1. Very important: the CCPA applies to for-profit entities with gross revenues greater than $25 million, or with the personal information of 50,000 or more consumers, or deriving more than 50% of revenue from selling personal information.

Let’s go to the next slide, and I want to highlight a couple of really important points about the CCPA. The CCPA has a revenue component. So, if you don’t hit $25 million and you don’t meet the other two thresholds, you’re out. The other states are not necessarily revenue-based. One of the things that we saw in California is that you might be a local, California-only company, but you still had to hit that threshold. What we’re seeing in other states is that you might have a Colorado-based company that has the required number of people, isn’t anywhere close to $25 million, but is still going to be brought in under the number of people it has.

So, I think that you’re finding some local businesses are now brought in to having to deal with privacy where they didn’t before.
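As a purely illustrative sketch, and certainly not legal advice, the three-pronged applicability test described above can be expressed as simple threshold logic. The function and parameter names here are our own, not drawn from the statute:

```python
# Illustrative only; not legal advice. The thresholds paraphrase the
# CCPA applicability test discussed above; consult the statute for the
# authoritative criteria.

def ccpa_applies(gross_revenue_usd: float,
                 consumers_with_pi: int,
                 pct_revenue_from_selling_pi: float) -> bool:
    """Return True if any one of the three CCPA thresholds is met."""
    return (
        gross_revenue_usd > 25_000_000            # revenue threshold
        or consumers_with_pi >= 50_000            # volume threshold
        or pct_revenue_from_selling_pi >= 50.0    # revenue-share threshold
    )

# A local, California-only business can still be covered if it crosses
# the volume threshold alone:
print(ccpa_applies(2_000_000, 60_000, 0.0))   # True
print(ccpa_applies(2_000_000, 10_000, 10.0))  # False
```

The point the sketch makes is the one Jennifer raises: the prongs are disjunctive, so a small local business can be swept in by any single threshold.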

So, if we look at the CCPA, though, or the CPRA, we have penalties of up to $7,500 per intentional violation, plus your California data breach class-action exposure. But most importantly, we want to highlight this last bullet: under the CCPA, a business has 30 days in which to cure a violation. That cure period goes away under the CPRA. If you remember anything else, remember that the cure period goes away.

And so, companies before that might have said, “We’re not sure what a financial incentive is, we don’t really know what to do with sale”. They don’t have that luxury anymore. And I think we know that California is going to be enforcing, and so it’s really time to take a look at these issues and to make some hard decisions about how you want to approach doing business in California. Next slide.

Consumer rights under the CPRA: you have a right to know what’s collected, a right to delete, and a right to opt out of sales of your personal information. California is unique in how “sale” is defined and in how the opt-out must be handled. And then there’s non-discrimination. Many of the other states have these rights, but not all the rights are identical. Let’s go to the next slide.

Christopher Wall

And we should also mention here the interrelation between the CCPA and CPRA and Federal law, and how those interact. HIPAA is the one most often cited in relation to the CCPA or CPRA.

It does not apply to data that is otherwise subject to those Federal privacy laws or regulations, including HIPAA.

Jennifer Kashatus

And Chris, you said the operative word was “data”. It does not apply to the data, but it is not a complete entity-level exemption, which is different from the other states. And this is an area that companies in the financial privacy and healthcare privacy spaces really need to look at closely, because the operative word, as Chris said, is “data”.

Christopher Wall

Yes.

Jennifer Kashatus

So, one of my favorite things to discuss is the second point: you have to have a privacy policy. You probably already have one anyway.

And one question we receive a lot is: what should my privacy policy look like? I’ve got five states now. Do I add a US section for the states that have privacy laws? Do I make the whole privacy policy look like California’s? And the answer is it really depends. It depends on whether you’re global, it depends on whether you’re B2B, and it really depends on your appetite for how you’re able to implement the laws.

So, you might say we’re going to treat everyone the same. We give up. There are too many states. We’re already saying no to people, turning people away. We don’t think it’s consumer friendly. So, we’re giving everyone the same rights. You can do that, great, that’s fine, that’s up to you.

But some companies are saying, “Gosh, California is so different, we want to treat California in one bucket and then everyone else in another”. That also can work.

So, these are decisions that companies will need to make internally, and part of what goes into the decision-making process is your resources.

Chris, you and I were talking a little bit about this yesterday. Do you think we’re going to see a shift in consumer rights requests?

Christopher Wall

I do. Today, there are some companies that are receiving lots of these Subject Access Requests, or what we call DSARs, these Data Subject Access Requests. But I think the number of companies, and the number of these requests will go up significantly as more and more states allow for that kind of request.

And what those are, essentially, is you being able to ask a company to provide you with what information they have about you, what personal information they have, how it’s used. And in some states – and we’ll talk about this here in a second – in some states, you have the right to correct it, some states you have a right to have them delete it, some states you have the right to have that company export it and give it to you so you can take it to another company to use.

Anyway, to answer your question, Jennifer, I think in the coming year or two these Data Subject Access Requests are going to go through the roof. As people become more familiar with this right that’s available to them, that belongs to them, they’ll start exercising it, if for no other reason than that they can.
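To make the state-by-state variation Chris describes concrete, here is a purely illustrative sketch of a rights matrix a compliance team might maintain. The states and rights listed are placeholders for demonstration, not an authoritative survey of the statutes:

```python
# Illustrative only; the entries below are placeholders to show the
# shape of the problem, not an authoritative summary of any statute.

DSAR_RIGHTS_BY_STATE = {
    "CA": {"know", "delete", "correct", "port", "opt_out_sale"},
    "VA": {"know", "delete", "correct", "port", "opt_out_sale", "appeal"},
    # Other states would be filled in only after reviewing each law.
}

def available_rights(state_code: str) -> set:
    """Look up which DSAR rights this illustrative matrix grants to
    residents of a given state (empty set if not listed)."""
    return DSAR_RIGHTS_BY_STATE.get(state_code, set())

print("correct" in available_rights("CA"))  # True
print(available_rights("TX"))               # empty: not listed in this sketch
```

The design point is simply that a DSAR workflow cannot treat every requester identically; which rights apply depends on the requester’s state, so the matrix has to be maintained as laws change.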

Jennifer Kashatus

Exactly, and we’ve also seen companies that are out there that will put in a request for you, including to companies that you’ve never interacted with.

And what’s interesting is that some of these companies do this by going in and looking at your email, which to me seems strange if you care about your privacy, but everyone likes to take a different approach. We mention this because looking at historic trends and saying “we’ve never received a Data Subject Access Request, so we’re not going to receive any” may not be the right answer in the future. We just don’t know. In the B2B space, maybe it is. In the consumer space, maybe it isn’t.

So, those are considerations to think about as you design your privacy policy. We want to be careful and note that the content requirements for privacy policies are extremely specific, some of which require very specific terminology. The approaches for opting in and opting out for some items are also different. And so, you really need to know exactly what you’re doing.

And as Chris was saying at the outset, he knows what data is being collected in his house. And this is a great analogy for the business house. You need to know what your business house is collecting so that you can figure out how fast to implement these policies.

Christopher Wall

Yes, again, we looked at those two cases at the beginning, both Sephora and Flo, and both were related to notice, the privacy policy, or the notice that they share with their customers and consumers. And those two cases are practical, cautionary tales, I think, about (1) having a policy in the first place, and (2) following it.

Even if they had had an agreement in place with the companies to whom they sold customer or consumer information, that wouldn’t necessarily have made their problems go away. It was the notice that they gave to their customers, and what they did with those customers’ data in relation to that notice. In other words, they contravened the notice that they were giving to their clients.

The notice that they give to their customers has to be easy to understand. It’s got to be in the language, obviously, that the business provides most of its information in. It’s got to be readily and obviously available on their website for anybody doing business with that company, and reasonably accessible across the board.

Sorry, I’m trying to answer questions here. We had a question about whether a solid DPA (data processing agreement) with those entities, Google or Facebook, in the case of Flo for instance, whether that would have helped.

It might have helped. It might have helped Flo be more aware of what they’re doing with data in relation to their privacy notice and privacy policy. But to really keep them out of trouble, I think what they needed to do was make sure that their privacy policy and their privacy notice actually reflected what they were doing with consumer data.

Kevin, let’s go to the next one.

Jennifer Kashatus

This is just to highlight, briefly, some of the differences among the state laws. I could have made this slide three pages, but I just wanted to focus on a few key points.

One is: what’s the scope? California is going to cover employees and applicants, and also B2B contacts. That is a big change. It also means that, on top of privacy issues, you’re dealing with employment privacy issues, so bring in your employment counsel as well, because California is California. They all give data subject rights, but they’re not all equal.

For example, the right to correct is not available everywhere. The right to appeal is also not available everywhere. Not noted here, but also important: how you describe the right to appeal, what you have to say, and where you have to say it differ by state. Some states are very prescriptive. This is going to make your privacy policy 37 pages just for the US.

And how you make that easy to understand is different. They all have data minimization, and I wanted to flag that because it’s not often a concept in the US. Here, the question is usually how long we can keep the data. And if it’s in file boxes dating back to 1901, is that okay? “I know where everything is, it’s labeled.” Data minimization is important.

Christopher Wall

We probably have a lot of legal professionals on this webinar too, and we’re the worst. We love to keep our documentation. And I was kidding with you about this yesterday, Jennifer, it used to be that information governance was something that we always knew that we needed to do. It was like those good eating habits that we should be doing. We knew 20 years ago that we needed to be mindful about where our data is and what we did with it, and what controls we had in place to protect it, and to make sure that we weren’t keeping any more than absolutely necessary for too long.

Well, along comes our midlife data doctor checkup, also known as the CCPA, and the GDPR, and now four additional US states. And during that checkup, the doctor tells us, “Hey, your privacy is out of whack, especially your DML (your data minimization level)”.

And so, it’s privacy that makes it abundantly clear just how important it was all along for us to be doing those things for the last 20 years.

Again, not to put too fine a point on it, but privacy is the cholesterol of the information governance world. And now, that our data hygiene is called into question because of these privacy regulations, we need to take a good look at what our practices are. And it’s that catalyst that makes companies realize that “Well, it’s time we do something about our information governance and our cyber policies and practices”. And if privacy is the catalyst that gets it done, then fantastic, because it’s something we probably should have been doing for a long time.

Sorry, I’m going to step off my soapbox.

Jennifer Kashatus

I agree with you there. I see the same thing on a daily basis. And just the last point, on sensitive data: we almost didn’t include it, but there are restrictions on the use of sensitive data, and consent, in that context, means affirmative opt-in. There are also restrictions and opt-out options. So, if you are processing data that is deemed sensitive under any of the applicable laws, please read those provisions carefully and understand the nuances and the differences in the consent models and the opt-in and opt-out rights, because they are quite strikingly different from one another.

Kevin Clark

So, what does this mean in practical terms?

In other words, we have these five laws. What do we need to do to prepare and make sure that we’re getting our privacy houses in order?

Christopher Wall

So, the deadline for the GDPR was May of 2018, and the CCPA took effect on January 1, 2020. We’d like to think we were all prepared, especially going into the CCPA. And then, bam, what happened in January of 2020? We had a pandemic, one that has lasted from December 2019 through, well, whenever it officially ends.

What progress did we make before it made us take this hiatus?

So, I think a lot of organizations during the pandemic simply said, “All right, we’ve got other things to focus on”, and maybe didn’t focus on privacy, even with the GDPR and CCPA there to focus on. So, I think we have to assess where we are as companies, as organizations.

And I think the further question we have to ask ourselves is: what do companies need to do to go back and make sure that the compliance programs we stood up before, to be compliant with the GDPR or the CCPA, meet all of the privacy requirements they need to today? How do we keep them evergreen? How do we keep them well documented? And what processes and procedures do we have in place to make sure they stay audited? That way, we can augment those internal systems with whatever we need. Maybe we didn’t finish, maybe we didn’t get ourselves up to speed by January 1, 2020, or maybe we never did get ourselves GDPR-compliant. This is a good time for us to take stock and see what we need to comply in Virginia, Colorado, Connecticut, and Utah, and with the amendments in California.

I don’t know, anything else you want to add there, Jennifer?

Jennifer Kashatus

I think it goes back to what we’ve been saying all along: you have to know your company and know what data you collect. And that doesn’t just mean what you collect in your direct interactions. I cannot tell you how many times we see companies that have personal data all over the place, and the people in charge, essentially the legal group or compliance or wherever this falls in the organization, don’t have full visibility, either for confidentiality reasons or for other reasons, or because you’ve got developers who are so creative they don’t think to consult with legal.

And so, you have to know what you collect, and you have to know why you collect it. As a nod to the state laws, they also have unique provisions on deidentified data, so something to keep in mind even if you don’t think you’re dealing with personal information. But you have to know what you’re doing.

And it is really important now that you’re accurate about what you do. It always has been, but it’s even more so now. You can’t build a privacy program unless you know what you’re doing. And that collection of data is really time-consuming, or it can be very time-consuming.

Christopher Wall

So, we’d like to leave everybody with some takeaways here in our last five, six minutes here of our webinar.

And we want to start with privacy practices. That starts with asking the questions: do you have a privacy policy? Do you have a privacy notice that you give to your customers? And do you follow it? When was the last time you looked at it? Do you know where your data is? The second piece of this is: do you know what data you have? And, coming back to that idea of data minimization, are you only keeping the data that you must have?

Again, we’ll keep coming back to this idea of data minimization again and again and again. It’s the cholesterol of the information governance world.

Jennifer Kashatus

Data processing agreements. Many of you who are familiar with the GDPR might have had these in place. It is now going to be really critical that you have these in place under the US state laws, and consider that some of the states have very specific requirements. Some of them also have specific flow-down requirements for sub-processors, not unlike the GDPR.

So, you want to think about whether you need one (the answer is yes, you do) and why you need one.

Chris, do you want to talk about the discovery one? I think it’s a great point to add.

Christopher Wall

A data processing agreement, do you need one? The threshold question there is are you processing personal data for any reason? And if you, as a company, are asking a sub-processor or a contractor to process data on your behalf, yes, you need a data processing agreement.

And so, if you think about it, again, with a lot of legal professionals on the call here: do you have a data processing agreement in place with your outside counsel? Your outside counsel is definitely making decisions with respect to that data, maybe at your behest as a company. They may be a co-controller with you, but that needs to be outlined in the data processing agreement, which covers the scope and the purpose of any processing being done with that data. It covers what data is going to be processed, how the data is going to be protected while it’s being processed, the bounds of the relationship between you and your vendor, whoever that vendor might be, and the obligations of each party with respect to that data, for instance in the event of a data loss or a data breach.

We’d be happy to talk to anybody, if you want to reach out to us after the webinar, and provide you an example of what a DPA looks like.

But as I talk with a lot of companies, they ask me, “What the heck is a DPA? Why would I need one?” A data processing agreement, or addendum, is required by every one of these five states and by the GDPR.

Kevin Clark

Now, DPAs are one thing, but what about a DPIA? Just another alphabet-soup term. What is that, Jennifer?

Jennifer Kashatus

So, this is going to be an impact assessment, a data protection impact assessment. It’s used if you’re processing certain types of data: data that would be considered sensitive under the law, or data processed in particular situations where it’s going to have an impact on the individual.

This is an area where companies that are service providers also need to think about this, because they might not necessarily know what they’re processing or know the full content of it. Or, to the extent that they do, they may need to help their customer in preparing one. So, if you are processing data that falls within these sensitive categories, or within certain enumerated situations, you’re going to need an assessment that shows, essentially, a balancing and a minimization and all the concepts that we’ve discussed before, to ensure that you are protecting the rights of individuals.

But again, for service providers, you may need to help your customers in putting it together, because it’s more than just saying, “Oh, we think it’s fine”. It’s going to look at the security of this data as well.
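As a rough, hypothetical sketch of the trigger logic Jennifer describes, where both the sensitive-data categories and the high-risk situations are examples made up for illustration rather than lists drawn from any one statute:

```python
# Illustrative only; these category and situation names are examples,
# not an exhaustive or authoritative list from any state law.

SENSITIVE_CATEGORIES = {"health", "biometric", "precise_geolocation",
                        "racial_or_ethnic_origin", "childrens_data"}
HIGH_RISK_SITUATIONS = {"targeted_advertising", "sale_of_personal_data",
                        "profiling_with_significant_effects"}

def dpia_indicated(data_categories, processing_purposes):
    """A DPIA is indicated when sensitive data or an enumerated
    high-risk processing situation is involved."""
    return bool(set(data_categories) & SENSITIVE_CATEGORIES
                or set(processing_purposes) & HIGH_RISK_SITUATIONS)

print(dpia_indicated({"email", "health"}, set()))           # True
print(dpia_indicated({"email"}, {"targeted_advertising"}))  # True
print(dpia_indicated({"email"}, {"order_fulfillment"}))     # False
```

Either trigger alone is enough, which mirrors the panel’s point that you must know both what categories of data you hold and what you do with them before you can tell whether an assessment is needed.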

Kevin Clark

Well, we are running out of time, so we are going to jump ahead here.

Christopher Wall

Yes. We had one last slide there that talks about how to keep your privacy practices and privacy policies evergreen. This deck, and a recording of this webinar, will be available after the fact.

But if we could leave you with three takeaways here. Jennifer, let me ask you for your first takeaway, because I’d like to leave you with three real key ideas.

Jennifer Kashatus

Know your data.

Christopher Wall

Know your data. Know what you’ve got. The second thing, I would say, is to put DPAs in place with your vendors, or with your controller if you’re a vendor, to make sure that you’re compliant in that respect. If you haven’t done that, make sure you have those in place by January 1st.

And then, look, I keep coming back to these original – those case studies we looked at, Flo and Sephora. If you don’t have a privacy policy and effective privacy notice, put one in place. And if you do have one in place, make sure you review it, and make sure that you’re actually doing what you say you’ll do.

Jennifer, anything else you want to add? I’ll give you the last word here.

Jennifer Kashatus

Train your folks. Some of the laws require it, but even if you’re still cleaning up on the back end, you want to make sure people on the front end know that there are privacy laws you need to comply with when people call you.

Kevin Clark

So, that is it for today. I want to thank Jennifer and Chris for sharing their wealth of information and insight. We also thank everyone who participated, taking time out of your busy schedules to attend today’s webcast. We know that your time is valuable, and we truly appreciate you sharing it with us.

Today’s webcast will be available on-demand on our website beginning tomorrow, Friday. And a full transcript of the webcast will be available early next week. We also hope that you will have an opportunity to attend our October webcast. It’s currently scheduled for October 19th.

This upcoming webcast will feature an expert presentation and discussion titled Now You See it, Now You Don’t: eDiscovery Challenges in Apple’s iOS 16 Release. And it’s led by John Wilson, an industry-recognized expert in digital forensics and HaystackID’s Chief Information Security Officer and President of Forensics.

You can learn more about and register for upcoming HaystackID webcasts, and review our extensive library of on-demand webcasts, at haystackid.com. The link is on your screen right now.

Again, I thank the panelists, I thank you all for attending today, and have a fantastic day. Take care.


About HaystackID®

HaystackID is a specialized eDiscovery services firm that supports law firms and corporate legal departments through its HaystackID Discovery Intelligence, HaystackID Core, and HaystackID Global Advisory offerings. In addition to increased offerings, HaystackID has expanded with five investments since 2018. Repeatedly recognized as a trusted service provider by prestigious publishers such as Chambers, Gartner, IDC MarketScape, and The National Law Journal, HaystackID implements innovative cyber discovery services, enterprise solutions, and legal discovery offerings to leading companies across North America and Europe, all while providing best-in-class customer service and prioritizing security, privacy, and integrity. For more information about its suite of services, including programs and solutions for unique legal enterprise needs, please visit HaystackID.com.