
Simplify for Success - Conversation with Colleen M. Yushchak


This year saw several comprehensive state-wide privacy laws introduced, some of which take effect in the coming year. 


- Will the regulatory landscape evolve further in 2024, with new privacy laws being introduced? 

- Do the nuances and differences between different state privacy laws matter when ensuring compliance? 

- What types of personal information drew the most focus from enforcement agencies like the FTC this year? 

- What tools/technologies impacted user privacy the most in 2023? 



Tune in to our podcast on #SimplifyForSuccess with privacy experts Priya Keshav and Colleen M. Yushchak as they discuss these questions. The experts also talk about the significant changes in the privacy world, citing different laws introduced by lawmakers.


Thank you to Fesliyan Studios for the background music.   


Views and opinions expressed by guests do not necessarily reflect the view of Meru Data.





Transcript


Priya Keshav 

Hello everyone. Welcome to our podcast around simplifying for success. Simplification requires discipline and clarity of thought. This is not often easy in today's fast-paced work environment. We've invited a few colleagues in the data and information governance space to share their strategies and approaches for simplification. This will be our final podcast of the year, and as we approach the final week of the year, we have Colleen Yushchak on the call with us today.  

Hi Colleen. How are you? 

 

Colleen Yushchak 

I'm good. How are you? 

 

Priya Keshav 

I'm great. And thank you for joining us on the show today. So, before we kick it off, would you like to introduce yourself to the audience? 

 

Colleen Yushchak 

Sure. So, my name is Colleen Yushchak. I am a senior managing director at Ankura Consulting. I work out of the DC office, and I specialize in privacy consulting. So, really helping companies to operationalize their privacy programs, often using technology. So, I've been doing that for about 8 years, starting with, you know, clients, helping them to get ready for GDPR, then more in the US, and now pretty much globally. As you know, there are regulations everywhere from Canada to China to Australia. So that's sort of what I've been working on for the last 8 to 10 years. 

 

Priya Keshav 

So, Colleen, as I just mentioned, we have about one more week left in the year. And as companies think about their current state and gaps in implementation and, you know, compliance with the existing privacy laws, they also have to consider a whole bunch of new laws that go into effect in the next 12 months. In fact, I was kind of intrigued to read a post, sometime back actually, from Baker Hostetler, where they were talking about how, by the end of 2024, nearly 40% of the people in the United States will be covered under one of the US comprehensive privacy laws. So, what are your thoughts on some of the laws that go into effect next year and, you know, preparing for them from a compliance perspective? 

 

 

Colleen Yushchak 

Yeah. So, I mean, I think we've all known this is coming. You know, when it started in Europe with GDPR and then came to California, at least folks in the privacy space geared up for additional states, given that no federal regulation is currently close to, you know, being in place. So, we sort of saw this coming. And as you mentioned, you know, we have at least 3 states that are going to have privacy regulations go into effect in 2024: Texas, Oregon and Montana. And for my clients, the one that I think is causing the most work is Texas. I have at least 2 new clients that were previously not impacted by any other states but are impacted by Texas. So, I do feel like that state is definitely going to cause companies to have to engage in compliance activities. And then beyond that, there are 5 more states that go into effect in 25 as of now, right. There are 2 more states that have proposed regulations; that's New Hampshire and Wisconsin. And if they get finalized, they would also go into effect in 25. So that would be 7 states in 2025. So, I think those are good estimates that, you know, at some point or another, 40 to 50% of all folks in the United States are going to be covered by these regulations in some form or fashion. And I will say most of the regulations that I'm seeing come out have so much overlap with either Colorado or Virginia or California. That kind of same story where, if you prepared for one of these other regulations, especially for California, you're probably pretty, pretty ready for these other regs, but there are still little nuances and differences. 

 

 

Priya Keshav 

Yeah, I think it's the nuances that you have to pay attention to as you kind of look at operationalizing or getting ready for these states; small things, but they matter. Like, for example, Texas doesn't carve out small businesses and requires them to get consent for selling consumers' sensitive personal information. You know, if you look at Oregon, you're supposed to produce, I mean, and this is something that was part of, if you remember, the initial draft of the Colorado regulations. We were talking about a list of specific 3rd parties, and then somehow at some point it changed into categories of 3rd parties. But we're back to square one in terms of having to produce the exact list of specific 3rd parties to whom you disclose sensitive personal information. So, it's those nuances that matter. And also, I think there's another place where there are probably variations: definitions of what counts as sensitive data. You see variations in the age at which you have to do age gating for kids, whether it's 18, 16, 13. I mean, there seems to be more and more variation around it, which kind of also impacts compliance. 

 

Colleen Yushchak 

Yeah, absolutely. And regarding the sensitive data, you're right. I keep a little tracker, and I have, what is it, 12 to 20 different state regs across it. And when you look across all the sensitive data elements, each one in some way or fashion introduces some new sort of sensitive data element. So, it's really a matter of just tracking everything, then figuring out which states are in play and making sure you've got that covered from the sensitive data angle. 

 

 

Priya Keshav 

So, what are your predictions for 2024? Are you expecting more states to pass laws introducing a comprehensive privacy law? I suppose the trend will continue, but what are your thoughts? 

 

Colleen Yushchak 

I love this question because it makes me sound like I can predict things, because normally I have no idea. But I feel pretty confident that there will be more state laws that are both passed and kind of enforced in 24. And, you know, I'm sure we'll talk in a little bit about California, but we're sort of at the beginning of the enforcement. So, I just think that when 24 comes, that's what we're going to see: more states passing laws, more state laws going into effect, causing some heartburn for companies, and then enforcement really kicking in. And that, I think, is what is going to drive a lot of compliance. As companies see what these different states are focused on, they can take that right back and operationalize it to reduce their risk. 

 

Priya Keshav 

So yeah, you were talking about enforcement, right? So, shortly before the CPRA modifications were set to become enforceable in July of this year, a Sacramento Superior Court judge issued a ruling on June 30th pushing enforcement of the CPRA regulations from July 1st, 2023 to March 29th, 2024. Now that we are close to March 2024, what does that really mean from an enforcement perspective? I mean, obviously, last year at this time we couldn't stop talking about Sephora. But, you know, it's been slightly quieter on the California enforcement side this year. 

 

Colleen Yushchak 

Yeah, that's true. So, I do wonder what's going to happen on that front. I mean, I feel like the California AG and their new enforcement, you know, agency really put forth a lot of guidance. So even though the big enforcement that we're aware of is the Sephora enforcement, there have probably been a lot of things happening on the side. I think that they've been really clear about what they are going to enforce. So, in some ways, yes, I think it's going to get busy with enforcement potentially in March, starting in March, April, May, as California starts to enforce some of the parts of their regulation. They obviously can't enforce the 3 rulemaking categories that they're still working on, which are the cybersecurity audits, the risk assessments and the automated decision making. But a lot of the other areas of the CPRA can be enforced, and I don't think we're going to be surprised by what they're enforcing. I think, if anything, it'll just light a little fire under companies that were holding off for whatever reason, as they start to see companies beyond just Sephora start to get fined. But I'm curious about your thoughts on what's going to happen with California enforcement. 

 

Priya Keshav 

I wish I had a crystal ball and could look it up and tell you. But some things are very clear, right? Like, obviously, you know, we were talking about tracking technologies, and we've been talking about tracking technologies for the last few years, where we went from no sale, to yes sale, to yes, we need to really watch pixels, we really need to watch cookies. And I think, to some extent, you see a lot of maturity. Then there is also still quite a bit of tracking that happens that is kind of, in my opinion, you know, just a mess. I was on a website just this morning, and I noticed that I couldn't find a reject button. I had to go in to manage cookies and read really carefully for a second to understand whether I was rejecting or not. And, you know, you'd kind of think that by now that should be standard, but it's not. Some of that might be enforcement that drives change. But you also see, I mean, at the December 8th meeting, I think some topics came up, and one of them was the need for GPC to be universally present in all browsers. And I think if you also look at Colorado, at least what has been proposed, it seems like the universal opt out and most of the options that were being suggested were very simple. So, from an enforcement perspective as well as a trend perspective, I would kind of assume that, you know, the simpler the better, and make it easy for people to opt out. And when they say opt out, it is a true opt out. So, that's one area where you can clearly see the trend for enforcement will continue.  

But beyond that it's kind of hard to really predict. The areas that might be relatively important, though, might be healthcare and children's data. Yeah, like I said, I wish I had better insights into what they expect, but you can already see some trends. 

 

Colleen Yushchak 

That would make our jobs so much easier, right? Like, listen, we know this is coming. We have a crystal ball. But going back to that board meeting on December 8th that the CPPA held, I actually attended that, and I'd never attended one of those before, but I carved out some time because I was really curious to hear what they were going to discuss around the risk assessment piece of it. And I actually thought the meeting was really interesting. It was about 2 and a half hours long. And again, I'd never attended one before, but it was really eye-opening to hear kind of how they were talking about things and what was important. And they talked about a variety of things, but they started off talking about those 3 areas of rulemaking: the cyber audit, risk assessments and automated decision-making technology. And with the cyber audit, it was interesting that they were talking a lot about the threshold that would trigger the audit, and they talked about the different levers, whether, you know, it would be focused on revenue, or would it be employee count, or would it be the number of individuals whose data is processed. And they were talking about how these cyber audits could sometimes cost upwards of $100,000 for the larger companies, so they have to be careful about kind of who gets caught up in the requirement, which I thought was interesting. They talked a lot about how they have to go back to the economists and look at numbers again. So, it sounds like that was pretty far from getting close to final.  

And then the privacy impact assessments, or the risk assessments, you know, we talked about this, maybe we even talked about this on the last call, but I don't recall. We talked a little bit about how the proposed draft regulations are really saying that companies are going to need to submit these risk assessments to the CPPA, and that there's like an abridged form that they're going to need to submit. But the fact that they have to submit them to the CPPA means if companies don't submit anything, it's pretty clear they're not running these risk assessments. So, I feel like there's a huge risk area for companies that don't have anything in place and won't be ready for this.  

We've got time, but it's still quite an undertaking to implement and as you know, because we're both trying to help a lot of companies implement these risk assessments. So, I thought that was interesting. 

Oh, and another requirement is that the business's highest-ranking executive who's responsible for oversight of the risk assessment compliance has to sign a certification of compliance with these risk assessments, which is crazy. We have one client that's refusing to do it; he basically went to his team and was like, I will not sign this until certain things get in place. Like, they're just not ready, and he's not ready to put his neck out on that front. So, I thought that was interesting.  

And then there was a lot of discussion about, you know, how often you would have to submit to the CPPA. Is it just the first time and then any time you make a significant change, or is it annually? It's written as annually right now, and they were talking about how time-consuming these assessments can be and that they can be 40 to 80 pages long, which seems really, really long. I just don't think I have any companies that have a risk assessment quite that long yet. So, you just sit there and listen, and I'm like, wow, this is so interesting. All the things that they're talking about and sort of what they're worried about and what they're expecting from some of these risk assessments. So that was interesting.  

And then they talked a little bit about automated decision-making technology, and really just how broad the definition is right now. And that was a sticking point for a lot of the people involved in the meeting, just kind of saying that maybe it's too broad. And so, I think they'll kind of revisit some of that too. So that's all up for change potentially.  

And like you said, at that meeting they also approved a legislative proposal that would require browsers to offer the opt-out preference signal. So, as you know, there's, what is it, DuckDuckGo, and there are a couple of other ones that actually have that opt out natively within the browser. And I think about 10%, let's say, of the browser market share has these opt outs natively, but most of the browsers that people use, say the Googles of the world, don't have it natively. The only way you can get these opt-out signals to work right now is by adding a browser plug-in, and so this switch that they're talking about would make that go away; it would basically make all browsers have to have this opt out natively. And like you said, the Colorado AG also, you know, just recently put out like 3 different kinds of opt-out solutions that they're considering under their regulation that would qualify and work for the opt out of data sales.  

So yeah, I just thought it was really interesting, and I'm embarrassed to say I'd never listened in on one of those calls before, but it was valuable. 

 

Priya Keshav 

No, I agree. And I think one of the points that they were raising was the need to kind of revisit the automated decision-making technology piece of the rulemaking process. I think it was interesting to listen to the fact that they thought it was important to look at, for example, how it's important to have automated decision-making technology that monitors truck drivers for safety, and so they kind of had some disagreements on whether the employee should always have the right to opt out of some of those technologies. So, it definitely looked like maybe the cybersecurity one was ready to move forward, but the other 2 were going to go back. But yeah, I mean, some of those provisions are difficult to implement, or maybe more onerous, and will add to the existing list of onerous requirements that the laws themselves place, so it's kind of interesting to see how these things progress.  

So, moving on, should we talk a little bit about, as I mentioned briefly, how California is not the only enforcement agency? We also have the FTC, and the FTC in particular has been focusing a lot of effort on enforcing around health data. Maybe you have some thoughts around it? I'll let you talk first, and then I can kind of continue. 

 

Colleen Yushchak 

Yeah, sure. So, if we want to talk health data, we can talk about the FTC; there are also the new state health regulations that are out, so we can talk about that. But let's focus on the FTC. So, you mentioned that, relating to health data, there have been a lot of violations under the FTC Act and the FTC's Health Breach Notification Rule in the last year. So that's clearly an area of enforcement for the FTC, and as a result, I am seeing a lot of the companies I work with get advice from outside counsel to take a closer look at their data inventories and try to figure out what data they have, you know, that they're currently processing, that would fall under the scope of the Health Breach Notification Rule. And it's broader than just HIPAA, right? So, it's going to be health-related data that falls outside of HIPAA. In some cases, it's identifiable health information, and it can be, you know, employee health data, or it could be, if you're a company that runs an app that captures health data, right?  

So, there are a lot of different ways companies might kind of fall under the scope of this breach notification rule. And so, I think that's something a lot of my clients have focused on: using the data inventory, taking a real close look at the definition, and then trying to figure out what they have that would require that notification if they had a breach. 

 

Priya Keshav 

Yeah. And I think the FTC sent letters to approximately 130 hospital systems and health providers regarding potential risks associated with online tracking technologies. In particular, the letters highlighted, you know, what kind of information could be sent via these tracking technologies as part of your mobile app or website and things to kind of watch out for. And their enforcement, you know, has been focusing on multiple aspects of health data. So, one of the key parts of it has been around tracking technologies and passing on health-related information to, you know, the Facebooks of the world. But they've also been looking at the need for express consent before somebody collects or discloses sensitive health information, and for clear disclosure.  

And then also having a robust security program to protect the sensitive health information, and paying more attention to, you know, the sensitive nature of health information in particular. And I think some of their enforcement actions, in the case of BetterHelp and GoodRx, or for that matter their litigation with Kochava, you know, have been centred around pretty much those topics, which has been interesting to watch. Because, obviously, like you said, that, plus some of the regulations around health-related data, like the Washington My Health My Data Act and the similar legislation other states have passed, has made everybody go back and take a look at what types of data they have that might qualify under some of these laws. 

 

Colleen Yushchak 

Yeah. The 3 state laws that, you know, cover health data are the Connecticut Data Privacy Act, where they did some amendments that got a little more specific around covering health data; the Nevada consumer health data law; and the Washington My Health My Data Act, which is in effect now, at least for the geofencing requirements. And then non-small businesses, I think, are in scope as of March. So, we're only like 3 months away from that, and then small businesses need to be compliant by June, so again, pretty close time frames.  

But what I'm seeing companies struggle with, besides just figuring out what data falls under the scope of these regulations, is that they're kind of uncertain about how to comply. I think they're having problems with implementation because, even if they can identify what's in scope, a lot of their systems and processes and operations have been in place for a very long time, you know. Years-old processes going on here, really old systems, old kind of software and solutions that are processing this data. And to really get them in a position where they're not processing data that falls under these regulations, they have to change some significant work streams and the way that they are naming data, the way they're using data.  

And I think that's what I'm hearing from companies: it's going to be a long process to make the changes that need to happen, and it's going to be very disruptive. So that's what I'm seeing, at least with companies trying to manage My Health My Data, plus the FTC enforcement related to health as well. And like you said, you mentioned the pixel litigation and then the 2 enforcement actions against the telehealth companies.  

But I also had seen that the agency was looking at online tax preparation companies like H&R Block as another kind of target, because of the rules curtailing commercial surveillance. So that's another area. And then another similarly related area, because it's health-related data, is biometric data. So, I believe the FTC put out a statement around combating unfair or deceptive acts and practices related to the collection and use of biometric data and the marketing and use of biometric information technologies.  

I was also reading the other day about how Macy's and Walgreens in particular got in trouble because, I didn't realize they were doing this, but they were using facial recognition cameras. And I think a lot of companies out there that are in the retail space are probably like, oh yeah, we do that, we did that all the time. I didn't realize facial recognition cameras were actually being used basically to manage shoplifting. But then, as they do, the marketing teams were like, great, let's also use this to understand buyer behaviour, like tracking emotions in people's facial expressions and trying to understand behaviours. So that's what they got in trouble for. And of course, nobody was notified that they were being, you know, recorded and that facial recognition technology was being used. So that's sort of why they got in trouble. But when you talk about FTC enforcement, I mean, there's so much that falls under that health space related to that. 

 

Priya Keshav 

Yeah. So, we'll probably talk a little bit about AI later on. But, you know, when you talk about facial recognition, the FTC yesterday talked about Rite Aid, and so we should talk about that too, but we'll probably cover that later. But before we move on to AI, you know, we should probably talk about Canada. A vast majority of Law 25's amendments to the Quebec Privacy Act came into effect in September 2023. You know, so should we talk about Quebec and other parts of Canada before we move on to artificial intelligence and facial recognition? 

 

Colleen Yushchak 

I love it. I think if you would have asked me about Canada on our last call, I would have told you there was really nothing we need to talk about there. I mean, it has been on the back burner for every company that I've ever supported. It's so low priority and I think you know the current privacy regulation PIPEDA just doesn't have a lot of teeth. It's not nearly as robust as GDPR, and so I think it's just been a backburner thing for a lot of companies.  

And what I've noticed in the last 6 months is that these new laws you mentioned, the Quebec bill and the Digital Charter Implementation Act, or Bill C-27, are regulations that I think are going to bring PIPEDA more in line with the GDPR. So, all of a sudden, companies need to refocus on compliance from a Canada perspective.  

Some of the requirements under the Quebec bill and Bill C-27 include that you have to have a formal privacy program, you know, with policies and practices and procedures, and you need to identify somebody that is going to be the equivalent of a DPO, kind of like the GDPR approach. Which, you know, right now, I mean, there are a couple of states in the United States, there's at least one, that have something like that, but a lot of them don't have that requirement that you have to have a DPO exactly. You have to have opt-in consent for marketing activities and processing sensitive data, so getting kind of closer to some of the US states on opting in for sensitive data. There is a storage limitation requirement, you have to have contractual requirements for vendors, and you have to conduct privacy impact assessments and implement privacy by default.  

Plus, there is this new requirement where, if you're using an automated process to make a decision about personal information, you actually have to include that in your privacy notice. So, from an operational standpoint, what I'm trying to do with a lot of the clients I work with is first and foremost working on privacy notices, kind of like we do in the United States, because right now those aren't going to be compliant. There are a lot of additional rights that are given to folks and things that need to be divulged to them in the privacy notice. So, almost starting there and then working from there to get all of the back-end systems able to manage an opt-out request or something like that.  

But what else are you thinking of or seeing on the Canada front? 

 

Priya Keshav 

So, it's also the logistics, you know? You just talked about the back end, and I just remembered something that I wanted to mention on the healthcare side, but it kind of applies to Canada as well. So, Canada mimics the GDPR a little bit more, where you need opt-in consent for a lot of things, and it's a little stricter than the US ways of approaching privacy. So obviously you keep that in mind, you know, as you sort of implement solutions. There are many companies that are probably global and have already implemented the GDPR version, right? So, it's not that complicated. But for a U.S. company with just exposure to Canada, it becomes a new thing, because the opt-out process that you use in the US won't really work from a Canada standpoint.  

So being able to kind of look at those nuances and adjust for them becomes a little bit of a challenge. And the same applies, and I'm kind of jumping back to healthcare here, but the same issues also existed there. You talked about deletion, so one of the challenges that I see is that the deletion has to be absolute. There are no exceptions allowed under, for example, the Washington My Health My Data Act, right? And of course, the scope of what you're deleting under it is just health data versus everything. So, it's about being able to know those nuances, and as the number of laws increases, the nuances also increase. And you can't quite streamline and make it all the same in this case because, like I said, if the exceptions don't apply, the exceptions don't apply.  

Which essentially means that you have to look at actually deleting the health data completely and properly, and again, the scope is not all personal information, just health information. So, it's about being able to implement those little things. 

I think maybe I've mentioned this to you, but, you know, while we may have been doing this for so many years, it's so easy to think that by now everyone should understand the definition of what personal information means, that by now everyone should understand all of these things. But for an average person to comprehend all these nuances as they operate, the training becomes quite a challenge as well, because even if you write an SOP, you know, nobody's going to read a 50-page document. Then being able to know the differences between X and Y and Z becomes quite a challenge. Even if you automate, there are pieces of it that require human intervention, and that requires your entire team to be on board with all the requirements, which I think is a particular challenge that we see with these laws. 

 

Colleen Yushchak 

I was thinking about how you were saying, you know, for the layperson, the common person, just the concept of personal information. And I was laughing a little bit in my head because I was thinking about every project I've ever worked on: we always start, you know, with a training session around privacy, and we cover the definition of personal information, and we talk about things like IP address, or somebody's favourite colour, or their title, all these data elements that we know about people that are all personal information, even things like device ID, and people are like, yeah, yeah, that makes sense. And then we'll kick off the data inventory, and we revisit, and we train again and say, don't forget, personal information is this. And then we start asking questions, you know - what are you guys using that involves personal information? And it's still like a disconnect, because I think that no matter how many times you hear it, until you live it and you're thinking about data the way that you and I think about data, Priya, it's not natural for folks to get that in their head. So, I'm just laughing because I'm like, that's most of my job, trying to teach people what personal information is. 

 

Priya Keshav 

Yeah, and also, you know, when you're looking at deletion, you could just say, I removed Priya's information from the search history, and now my search history is no longer relevant. But if I've got individualized search histories associated with Priya, then that search that I typed, the fact that I typed, let's say, Quebec or whatever, that particular search is personal information. That's not something that, I guess, you know, like you said, if you live and breathe this, you understand it, but for someone whose day job is something totally different, having to start thinking about it is complicated. 

 

Colleen Yushchak 

It's so complicated, yeah. 

 

Priya Keshav 

And then you go, well, in the case of health data, there is no exception, and this is the health information. But in the case of, you know, regular deletion, there are exceptions. 

 

Colleen Yushchak 

The definition of health data under each of the new state health laws is slightly different, but generally the same. But then if you look up the definition of health data under the FTC's Health Breach Notification Rule, that's just different, there are just different words involved, right, and words mean a lot, especially if you're a lawyer. So, it's interesting that across all these states, the definition of personal information, while it all sort of means the same thing, is not word for word the same. So not only are you trying to get folks comfortable with these new concepts of what these things mean, but all these different regs have slightly different explanations as to what they mean. It just keeps it funny. 

 

Priya Keshav 

So, we couldn't really talk about privacy without talking about artificial intelligence. In fact, I feel like it would probably be appropriate to mark 2023 as the year of artificial intelligence, not because artificial intelligence didn't exist before, but simply because ChatGPT and generative AI made such a big impact that it's pretty much been the year of AI. We've talked about the need for governance around AI, the need for regulation around AI. There have been a lot of conversations about AI, and also conversations about how some of the things that we're doing in privacy also apply to AI, how the two go together. So, maybe we need to start with the new EU AI Act, because it will probably be the first and probably the most foundational regulation around AI. I'll let you talk about the EU AI Act, and then maybe I can add more to it. 

 

Colleen Yushchak 

Sure. Yeah, so, as you mentioned, the new EU AI Act. In early December, just a couple of weeks ago, a provisional agreement was reached between the European Council and the European Parliament on the text of this new AI Act. And my understanding is there's some work that still needs to be done before it's truly finalized, but for most purposes, we're assuming that it's agreed upon, that it's going to go forward, and that it'll be enforced in 2024. It's a very interesting model in that it's very risk-based. I found online this great little chart, I think the IAPP put it out, that explains the basics of the Act and its approach in a nice chart format. And like I said, it's risk-based, and they basically sort different types of AI activities into categories. There are activities that are prohibited, you can't do them at all, and interestingly, those include things like emotion recognition at work and in education, social credit scoring, behavioural manipulation, and targeted scraping of facial images.  

Again, some of the stuff that we've already talked about today would not be allowed at all. So, there's all this prohibited AI. Then there's high-risk AI, and then there's general-purpose AI. Each of these categories of risky AI requires different kinds of things to be in place to protect and control it. And there are penalties and enforcement, and they're definitely higher than what we've seen in other regulations: 7% of global annual turnover or revenue at the high end for prohibited AI violations, and 3% for most other violations. So, they're definitely hefty.  

And what I think is most interesting about this, aside from the prohibited AI piece, which I do find interesting, is what I think it's going to do: kick off a flurry of other AI regulations globally. Because I feel like this is what happened with GDPR, right? Europe was the first to put out that kind of privacy regulation, and it totally influenced all these other countries. And I think now that we actually have something from Europe on AI, it's going to cause maybe even the US to get closer on some of the regulations we put forth, and it might influence them. 

Because when you think about the CPPA board meeting, there was a lot of discussion about what we can learn from GDPR, a lot of, how did GDPR handle DPIAs, and what can we learn from that? So, you know that's going to happen in the United States, because this law is going to go into effect, it's going to be enforced, it's going to be operationalized, so we can all see how it achieves its end goal and then hopefully learn from it. 

 

Priya Keshav 

No, I agree. And I think we could do a separate podcast on just the EU AI Act, what it means from a regulatory standpoint, and the good, the bad, and the ugly parts of the Act. But overall, I agree with your assessment. If you look at the number of proposed AI regulations, it seems to be following the trend of the privacy laws, in that the states are all considering passing regulations around AI. So, we'll probably see one from each state, and they'll probably all be slightly different. 

I'm not sure what will happen on the national side. I'm not good at predictions, but my guess is probably not much; it's just going to be the same thing. Though I do think that you already see enforcement around AI.  

I said I'd talk about facial recognition technology and the FTC enforcement action against Rite Aid. The FTC found Rite Aid was tagging consumers, particularly women and people of colour, as shoplifters, and that there wasn't adequate governance to look at what kind of data was being used and whether that data was biased. They felt there wasn't enough effort put into governance on Rite Aid's part, and there was an enforcement action around it: a proposed order under which Rite Aid will implement comprehensive safeguards to prevent these types of harms to consumers in the future. So, it's an interesting read. It came out yesterday, so it's hot off the press, so to speak.  

But you already see a lot of enforcement on the AI side, especially with respect to biometric data and the usage of images. And with more regulation, there's probably going to be more enforcement as well.  

So, it's an interesting area to watch. A lot is already happening. You see a number of companies that have already started looking at AI and at the kind of exposure they have due to AI. Maybe at this point all they're doing is waiting and watching, or maybe they've learned from some of the privacy-related activities of the past and have started doing data mapping, or setting up governance structures or a steering committee to manage how AI is being used and governed within their organizations. But it's an area that is hot. 

 

Colleen Yushchak 

Yes, here's what I think. I think during this transition period, before a lot of these AI regulations become active and before they're enforced, companies are just going to have more compliance questions than answers, right? And I think it's just going to take time for it all to get settled out. So, a lot of companies, even the company I work for, are just monitoring it. We're keeping our eyes and ears open as to what struggles companies are having in this space. We're making sure that our team is building up AI experience, because it's not going away, right? And so, I think it's more that we want to be ready when things start to shake out, but until they do, everyone's just going to be talking about it and wondering what's going to happen. 

 

Priya Keshav 

Well, there are many topics that we haven't talked about. I think it at least requires mentioning that there is a growing trend toward managing data brokers. You see Oregon and Texas, and even California and Vermont, passing laws around data brokers, having them register with the state. So that's something we haven't talked about, but perhaps in our next episode. We also haven't had a chance to cover children's data, which is a topic in itself. But any other closing thoughts before we wrap up? 

 

Colleen Yushchak 

Uh, no, not specifically. I actually had notes for a couple of other things that we could have talked about but didn't: the New York Department of Financial Services' amended cybersecurity rules and some of the new biometric laws, the new regulations in India, and the EU-US Data Privacy Framework. There's so much going on right now that it's really hard to get it all in. So, I feel that 2024 is probably going to be more of the same. That's what I'm fully expecting, and I think the thing that companies will struggle with the most is where to put their time and effort.  

The prioritizing is going to be the hard part, and obviously it's going to involve scoping, right? Do I need to worry about this or that from an in-scope perspective? But even then, a lot of companies, the bigger ones do, but mid-size and smaller companies, they don't have a big budget for privacy, and they don't have fully built-out teams. They're still leveraging consultants, they're leveraging technology, and they won't be able to address all of these additional requirements. So, it just needs to be a risk-based approach where they prioritize what they're going to focus on. 

 

Priya Keshav 

No, I think it's all about that, right? Every day, think about what's the most important thing to tackle, because the list of what needs to be addressed just keeps growing, and it's going to be a matter of prioritizing, like you said. Well, thank you so much for joining me, Colleen. It was a pleasure to talk to you. 

 

Colleen Yushchak 

Always a pleasure, Priya. Hope you have happy holidays and talk to you soon. 

 

Priya Keshav 

Same to you. Thank you. 

 

 

 

 
