
Simplify for Success - Conversation with Samantha Ettari



Samantha Ettari was on #SimplifyForSuccess, a podcast series presented by Meru Data and hosted by Priya Keshav.


Samantha spoke about biometrics and the various biometric laws in the US. She also discussed cases regarding the use of biometrics and the regulatory aspects that organizations should pay attention to.







Thank you to Fesliyan Studios for the background music.


*Views and opinions expressed by guests do not necessarily reflect the view of Meru Data.*






Transcript


Priya Keshav:

Hello everyone, welcome to our podcast around simplifying for success. Simplification requires discipline and clarity of thought. This is not often easy in today's rapid-paced work environment. We've invited a few colleagues in the data and information governance space to share their strategies and approaches for simplification.

Today, we will be speaking to Samantha Ettari from Perkins Coie. We'll talk about biometrics and the various biometric laws in the US. But before we get started, a few disclaimers: The information covered does not and is not intended to constitute legal advice. All views expressed are those of the individuals in their individual capacity and not of the firm they represent.

Hi Samantha. Welcome to the show.

Samantha Ettari:

Hi Priya, thank you for having me, I really appreciate being here.

Priya Keshav:

Tell me a little bit about yourself and what you do.

Samantha Ettari:

I am an attorney at Perkins Coie in their Dallas office. They're a national firm, but I'm based here in Dallas. I am a long-time litigator. That was my initial practice area since I graduated from law school eons ago in 2005, and I spent many years litigating in various subject matters and areas of the law.

Priya Keshav:

So how did you get into privacy?

Samantha Ettari:

Privacy really took the stage here in the US in part following the GDPR over in Europe, with American companies realizing that if they did any business in Europe or triggered any of the reaches of the GDPR, they would be within scope. My practice began to expand into privacy as I learned along with many other American lawyers. The GDPR was a brand-new law in 2018, and in the US many states quickly started adding consumer privacy laws, well, primarily California, and now we have two other states that are going to be adding consumer privacy laws next year. This area already had sectoral privacy laws, people are familiar with HIPAA and GLBA, there were privacy laws in many sectors already. But as that consumer privacy law base started to grow, my practice in privacy grew with it, to the point where I am primarily a privacy lawyer, advising companies on privacy compliance. I am still a litigator, and I'm sure we'll discuss today some of the litigation that is well and active in the privacy space.

Priya Keshav:

So we plan to talk about biometrics today, right? Biometrics has become increasingly pervasive, though I don't know whether we realize that or not. When we unlock our smartphones, we're probably using facial recognition or fingerprinting technology. Our photographs are automatically detected, which again is probably using facial recognition technology. We're maybe using fingerprinting to record attendance, or to unlock locks and things like that. So, there have been a number of laws passed in the US, mostly state-based, around biometrics. And as soon as you think about biometrics, for some reason, BIPA seems to be the most popular. Do you want to talk a little bit about BIPA and what it is?

Samantha Ettari:

Yeah, sure, you're absolutely right. It's fascinating to watch the explosion in biometric data collection, use, and technology in some really incredibly innovative ways that, you know, I personally think are great for society. You mentioned the security aspects, the kind of business drivers that make things more efficient, like the ability to unlock cash registers or access cash rooms, that just bring a heightened sense of security to the business place, or to the home if people are using devices that have some kind of facial recognition for security purposes or, as you said, unlocking the phone. There are so many uses for biometrics in technology, and they're ever growing. Just such incredibly creative ways that tech companies are thinking about efficiencies and convenience, and often that is tied to how can we use biometrics. What are biometrics, just to take a step back? You know, how can we use fingerprints, face geometry, iris scans? And how can we use these things to increase the functionality of products, whether for business, consumer, home, or the office? There are lots of creative ways that these tools are being developed and biometrics are being used. And you mentioned BIPA, which is the Illinois Biometric Information Privacy Act, passed, I believe, in 2007 or 2008, and it is probably, I would agree with you, the most well-known, particularly amongst businesses that are possessing, collecting, using, storing, or retaining biometric information. And not because it's the oldest, but because it is unique in that it provides a private right of action, which means that consumers, employees, individuals whose biometrics are collected in a way that they believe runs contrary to the requirements of BIPA can themselves bring litigations in state or federal court to pursue relief for the alleged violations. There are other biometric laws that don't have that private right of action.
They primarily or exclusively vest enforcement in a state AG's or DA's office, or some other law enforcement or regulatory body. And generally we have not seen, publicly, a lot of enforcement around those laws. But with BIPA, with that private right of action, there is just a plethora; there's not a word for the amount of litigation that has been brought in the past decade. It's well over 1,000 putative class actions brought under BIPA. And that is why it is so well known, and there have been settlements, some fairly significant. So for businesses that are collecting, using, storing biometrics, that's the statute that sort of puts the fear of God in them, because it is just more likely that there could be enforcement, through a private right of action, under that particular statute.

Priya Keshav:

So let's talk a little bit about some of the...as you mentioned, right? BIPA has obviously been more popular because it's got the private right of action and people have taken advantage of it, and number of cases have been filed under BIPA and so we can't obviously cover all of those cases. But maybe we can talk a little bit about, you know, a few of them and some key takeaways that have sort of gotten attention and things that sort of seem to be problematic from a BIPA perspective.

Samantha Ettari:

Well, that's a great question, and these cases fall into a handful of different categories that we can walk through. I think it will be helpful to listeners to have just a little bit of an understanding of the basics of BIPA; it is a little bit longer and more detailed than maybe some of the other biometric statutes. It sets out requirements for entities that collect biometrics, including public notice, you know, a biometrics policy that is available and states whether they are collecting biometrics and how they're using them, and getting consent. A lot of the litigations that we're seeing are brought by, let's call them data subjects; they can be employees, consumers, whomever. The individuals from whom the biometrics are being collected are alleging either that they didn't get that disclosure around the biometric collection and use, and/or that they did not give their consent. BIPA also has requirements around retention and deletion, which also have to be disclosed and obviously acted on. So there are claims arising around allegations that the individual did not receive notice about the retention and deletion policy, or that the policy wasn't followed. There are other nuances, but at a high level, I think that is sufficient to understand the underpinnings of many of the allegations that are brought in these litigations.

And so, as I mentioned, these data subjects can fall into a number of different categories. You're seeing a lot of litigations brought by putative classes of employees at companies or businesses, saying you collected and used my fingerprints as a way of timekeeping. Right, this is becoming a very popular method of timekeeping, rather than, think about the old punch cards or the little, you know, clock-in machine at the door of the store or the restaurant where you punch in your employee ID. Many companies have moved to fingerprints as a method of clocking in. So, with the rising use of biometrics in the workplace, whether it's clocking in or accessing a cash register, et cetera, we are seeing a category of class actions or putative class actions brought by employees where they are alleging: one, that they were not given the notice that the biometrics were being collected, with all the various components that the notice requires, and/or that they did not give consent under BIPA, which requires written affirmative consent for the collection of biometrics. Or they might bring allegations around retention and deletion, either that they didn't get the policy or that the company or employer is not actually following the policy around the biometrics. So that's one area that has generated a lot of litigation in the employment context.

Another area where we are seeing a lot of litigation is the consumer context, right? As these biometric components of products are rolled out or used at, for example, entering into amusement parks or venues, there are a lot of tools now that will allow subscription or season pass holders to enter a venue using biometrics. And we've seen litigations around that from consumers, again alleging they did not get the notice, did not give the written consent, retention, destruction, kind of similar claims, but in a consumer context. We're also seeing, in the consumer context but not necessarily around venue access, lots of other similar groupings of litigations. For example, we've seen some really, I think, quite interesting products like mobile app products where individuals can scan their face and try on makeup, try on clothing, try on jewelry. Those are, in my personal opinion, really neat products, but there are allegations that they're collecting the kind of biometrics that would trigger BIPA, and whether or not individuals are giving informed consent or getting the notice, that is another area of cases we're seeing. Very interesting cases are coming out of the restaurant and retail space. For example, there are some cases working their way through the courts now around the use of AI and voice in ordering at fast-food chains at the drive-through. There have been some fairly well-known fast-food restaurant chains that have had actions brought concerning the use of technology where the allegations involve the collection of the voice to process the order using AI. The way this technology works is that an individual pulls up to the order station at a fast-food chain and says their order, and to facilitate the placement of the order by the employees inside, there is some AI technology that picks up keywords, and those keywords appear on a screen to help the human being who's actually placing the order. So for example, I pull up to a fast-food chain and I say I want fries and a burger, but I want my burger with no tomatoes and no pickles, only lettuce and special sauce, or something like that, and this AI technology hears that and then puts up on a screen, or in some other fashion: fries, burger, no tomato, no pickles, lettuce, special sauce. So it's a way to, again, drive efficiencies at these fast-food chains, and we're seeing litigation arising around that.

And these cases are so interesting because the statutes can be somewhat perplexing. There are terms that are not always defined, so we're seeing a lot of legal issues being worked out through these cases around how to interpret these statutes. Is this actually capturing biometrics? Is it recording them? How does the product actually work? So there are really interesting litigations all around.

One more that I think is noteworthy, and this is a lot, I think, arising out of the pandemic: we saw a lot of online test-taking tools become very prevalent. I personally sat for some IAPP certifications during the height of the pandemic, and those have tests at the end, which were typically given in person pre-pandemic; because of the pandemic, they were being given remotely using testing software, and that was very common right at the height of the pandemic. Schools were administering tests, and all of these tests were being administered through remote software tools. There are now a number of lawsuits against those companies alleging that they have used biometrics and did not get the proper consent or give the proper notices. So these litigations are springing up anywhere you can imagine involving the relevant people, which would be Illinois residents; BIPA does not reach outside of Illinois. But companies that operate nationally, if they are collecting or using biometrics, should be looking at their programs to make sure they're complying with BIPA if they're doing any activity or have employees in Illinois.

Priya Keshav:

Now, you bring up some really great examples and points, right? We talked a little bit about security, you know? Security has been a major concern with a lot of breaches, and we've only seen the security concerns grow, and obviously biometrics is viewed as a way to sort of get rid of passwords. And the other example that you brought up is with respect to the testing, even voice recognition, right? We have all made large shifts from a physical way of interacting with customers to more online, more remote. A lot of technologies are being adopted, some looking at efficiency, some looking at scale. And obviously biometrics plays a role because there are cameras involved, there are voice recognition technologies involved, and so on. But when there are voice recognition technologies and cameras involved, you somehow hit the biometric laws that have been put in place in various states. So it's easy to not notice; maybe it's not so obvious as you're implementing some of these technologies that these are actually biometrics that you're recording and storing. And when consent and disclosure and, of course, retention requirements are not taken care of, it leads to problems, which has also resulted in class action lawsuits. So, makes sense.

Samantha Ettari:

Well, yeah, and you really hit on something that I think is important, which is that this is not uniform, right? We have three states that have specific biometric laws: we have BIPA in Illinois that we talked about extensively, we have CUBI in Texas, and Washington state has a biometric law. Then we have these consumer privacy laws that are defining the data within their scope to include biometrics, and we have the state consumer privacy laws layered on top. So right now we have California's CCPA, the law that is in effect at the moment, but it's being amended in January of 2023 by the California Privacy Rights Act, the CPRA, which puts heightened obligations on businesses that collect biometric information. And it has a very expansive definition of biometric information, which goes to the point you were just making: companies have to be thinking about where do we operate? Are we in these jurisdictions that have biometric laws? Are we in one of the three states with biometric-specific statutes? We have some municipalities that have biometric laws; New York City has two, Portland, Oregon has one. So are you in a city that has some specific biometric ordinances, or are you in a state that has consumer privacy regulations and statutes that capture biometrics within a broader definition of personally identifiable information?

So California, soon to be joined by Colorado and Virginia, and it looks like Utah is on the horizon. So you've got this kind of mishmash of states and municipalities. And then, layered on across them, there are differences in the definitions. So as I mentioned, the CPRA has a very broad definition, with a number of biometric identifiers that are listed, but it has, I would say, a limiter: all those identifiers, to be captured by the statute, have to be used singularly or together with other information to identify an individual. I don't know how much that will actually limit, but there is that limiter. CUBI, the Texas biometric law, first of all has a shorter definition of biometric identifiers than the CPRA, but also, to come within the scope of CUBI, the information has to be used for a commercial purpose. Now, we can talk about what that means; it's not defined under the statute. But again, it just goes to that point you made a few moments ago: if you are a company that is collecting, using, storing, or processing biometrics, any or all of the above, you need to be mindful of what jurisdictions you're in and what the definitions are. How are you using this data? Being very cognizant of what products or services you are offering that are collecting biometrics, and what biometrics, and then figuring out how that ties to the definitions in the jurisdictions that you're operating in.

Priya Keshav:

No, you bring up a really good point. I remember having a conversation, probably a year ago, with someone around the definition of biometrics. And then I started going down the path of definitions, only to realize all the nuances in every jurisdiction, and the nuances being so complicated. So yes, you bring up a very good point that there are so many nuances in the definitions of what biometrics are in each of these jurisdictions as well. But we mentioned CUBI; it was largely forgotten until a recent enforcement action sort of brought it back. So maybe we should talk a little bit about CUBI, when it was enacted, and what it is about?

Samantha Ettari:

Yeah, definitely, especially as we stand here in Texas right now, me in Dallas and you in Houston. So CUBI, while it sounds like the cutest acronym amongst the biometric and privacy alphabet soup, is, as you said, the oldest biometric law on the books in the US. It was enacted in 2001, and I don't think many people realize that it predates BIPA, or by how much it predates BIPA, because, as we talked about at the outset, CUBI is not one of these biometric statutes that has a private right of action. It is only enforceable by the Texas Attorney General, and I am unaware of any public enforcement action under CUBI, and certainly no litigation, until just this past month. So for two decades CUBI has been on the books, and for companies that collect biometrics and are thinking about compliance, it's not like it's an unheard-of concept, right? Typically, if they were operating in Illinois, they were setting their compliance program to BIPA, because BIPA is a little more rigorous than CUBI in its requirements. So if you are generally complying with BIPA, you are likely complying, plus some, with CUBI. Most companies, if they were operating in Illinois, were thinking BIPA, and then if they were aware of CUBI or the Washington biometric law, they assumed a BIPA-compliant program would satisfy that compliance. Although now we're learning a little bit more through this recent Texas AG action, which suggests, again going back to these complexities, that CUBI might have its own complexities around biometrics that BIPA does not. So, just to pull back, companies that were not operating in Illinois might not have had CUBI on the radar; obviously some were aware of CUBI or Washington if they're sitting in those jurisdictions, or they've got a really vigorous compliance program. But I would say that CUBI was definitely a sleeper, right? I mean, if BIPA wasn't on the radar, it would be a really stellar compliance program that had CUBI on the radar. And now CUBI is definitely on the radar, because the AG last month brought this enforcement action against Meta.

Priya Keshav:

So let's talk a little bit about the recent Meta case, right? What can we learn from it? Because obviously it was a surprise to see the enforcement action happen. And, maybe it's a long question at this point, but what are some of the significant differences that you see between CUBI and BIPA, apart from just the private right of action?

Samantha Ettari:

So, as you said, I think the case was a surprise because CUBI had so long not been enforced by the Texas AG, at least not publicly. And then in the middle of February, this complaint was filed against Meta, alleging that Meta had violated CUBI and a general Texas consumer statute in its collection of biometrics, allegedly for use in tagging, you know, the feature that Facebook, and according to the allegations, Instagram, previously had where, when individuals uploaded pictures, the tagging feature would prompt: is this so-and-so? And the allegations in the complaint are that through that tagging functionality, there was not the proper notice that biometrics were being collected, or the consent that is required under CUBI. So those are the basics of the allegations. And to your question about what this enforcement action teaches us about CUBI: CUBI is a really short statute, right? It's much shorter than BIPA. It has a kind of similar biometric identifiers definition. CUBI also was narrower in that it wasn't just the capture of biometrics, as defined under CUBI, but their use for a commercial purpose. But as I mentioned, what does commercial purpose mean? That was not defined in CUBI. And then there's an obligation under CUBI, a retention obligation, that within a year of the expiration of the purpose of the collection of the biometrics, they be destroyed. And then there are some limited carve-outs. BIPA explicitly carves out information that's regulated by HIPAA or GLBA; CUBI doesn't even have that carve-out, it has a very narrow carve-out for some GLBA voice data. So it's a very short statute, and it doesn't appear to require express consent. It appears to require notice that biometrics are being collected, and it has some language about consent, but it appears from the reading that consent can be implied. Again, with many of these biometric statutes, the actual language is being fiercely contested through the courts, same with BIPA. What is a violation? That is not defined in these statutes. Is it every time an employee goes and scans their hand? Is it per employee? Is it the first time?

So the way the Texas AG's complaint reads, the AG alleged in the complaint double-digit millions of Texas residents as Facebook users. And the penalty, this is statutory, is $25,000 per violation. So take that and multiply it by millions; it's an astronomical figure. But what will the per violation be if this is actually litigated? Again, another point of litigation that might arise through this action is what is a commercial purpose. The complaint does not allege any kind of sale of data; the commercial purpose appears to be grounded in the use of biometrics to improve internal products. So that's interesting. What is commercial purpose?

Another thing that's interesting is that CUBI has very strict restrictions on the sharing of biometrics, they say sale and sharing, and then there are four or five enumerated instances in which the data can be shared, but they are so narrow. I have them in front of me and I can read a few just to show you how narrow they are. The individual has provided consent for identification purposes in the event of the individual's disappearance or death. OK, so not freestanding consent, but consent in the event of their disappearance or death; a very narrow carve-out for sharing. Or that it's necessary to complete a financial transaction that the individual requested or authorized. That makes sense, because in the financial transactions context there may need to be sharing of biometrics for security and authentication purposes, but again, a very narrow exception for sharing and sale. And the rest are kind of law-enforcement based, or permitted by federal or state law, so think about uses in litigations or things like that, but very narrow law enforcement and judicial process type exemptions. Other than that, sharing is generally restricted under CUBI, and many of these biometric laws have similar restrictions, with very narrow instances where data can be shared. What is interesting about this Texas AG complaint is the sharing allegation: the Texas AG is alleging that there was a violation of CUBI where the sharing was amongst subsidiaries and affiliates of the brand. That is a very expansive view of sharing, because that's within brands and affiliates, so that will be another area to watch. It'll be very interesting to see if that component gets litigated, because it would seem to restrict sharing amongst affiliates completely without notice of the sharing and consent to the sharing.
So all of that initial CUBI compliance for the company that is doing the collecting, if they're going to share amongst affiliates, this litigation suggests that all of the same CUBI compliance has to be handled and implemented in the compliance program for any affiliates that data sharing will be done with.

Priya Keshav:

We talked about a lot of things. So what would your advice be if I'm a company thinking about biometrics, or maybe thinking about implementing technology that might even be closely related to biometrics? We talked a lot about consent, disclosure, being mindful of sharing, retention. Is it just a matter of walking through some of these things and being careful not to violate them, or would it be to steer clear as much as possible? What would be your thoughts? Or seeking legal advice, which can be one of them.

Samantha Ettari:

Seek legal advice, because, as I think has been clear from our conversation today, it is a complicated area, jurisdictionally and in the definition of what exactly is a biometric. All of these things are complicated, and seeing how CUBI is being enforced has added layers and layers of complexity to a plain-language reading of the statute. So yes, definitely, if you are an entity within the scope of any of these statutes or regulations and you are collecting biometrics, I think it is wise to consult with a lawyer who understands this space. But then to step back and get into some of the granular compliance: you said, maybe almost jokingly, just don't collect biometrics. But that is one thing to think about. There's a concept of privacy by design, and that is thinking about privacy from the very beginning of conceptualizing a product, ingraining privacy in with your engineers and your software developers so that privacy is a main consideration throughout the development of a product. So when it comes to biometrics: does this product really need to capture biometrics to function? That should be question number one at the privacy by design stage. How are you going to capture the biometrics? Do you need to have other identifiers, other PII? Because if not, and if you can successfully separate the biometrics so that there is no ability to identify an individual, that may take you out of the scope of some of the biometric laws. So that is privacy by design, right at the outset of a product. Then you've got the products where, no, this product actually will not be functional without biometrics; the functionality or the utility or efficiencies of the product do require actually identifying an individual. You've now moved through that privacy by design analysis, and you're at the place where you're coming within some of these statutes and regulations.
And so, yes, to your second point, making sure that you have the disclosures, that you are making your consumers or employees aware that you are collecting biometrics: the scope of the collection, the retention policy, how you're going to destroy the data, how you're going to use it. Very interestingly, circling back to CUBI and the Meta allegations, there are allegations in there that the term biometric was not used in the disclosures. That's one of the allegations. So does that suggest that the Texas AG expects the privacy policy, or the biometric policy that's disclosing the collection of biometrics, to actually use that word? It might be suggesting that, so it might not be enough to just say we are collecting your fingerprints for the following purposes. So yes, going back and looking at your policies: are you in a jurisdiction that requires written consent? Affirmative consent? Can you rely on click-throughs, checkboxes? These are all questions whose answers may differ depending on your jurisdiction and which biometric law you fall under.

And then, of course, follow the lifecycle of the data from privacy by design and the initial conception, through rollout to consumers, with all those consumer interfaces like the privacy and biometric notices. Do you need consent all the way through? How long are you keeping this data? You know, CUBI says that if it's employee data and the employee is terminated, or that relationship no longer exists, the company has no business retaining the data after a certain point. So make sure that if you're operating in Texas and collecting from employees, you are implementing the right kind of retention and deletion program, so that one of the features of your separation process is that, if you've collected biometrics, as you're separating that employee you're also ensuring you're complying with CUBI data retention and destruction. And similar data retention and destruction issues are typically present across these biometric regulations. So, looking at your jurisdiction through the lifecycle of the data, there are a lot of touch points where these laws require action.

Priya Keshav:

I agree, you brought up something very important, right? Data minimization. We all need to take a pause and think about, do we really need this data? And that's something that takes getting used to. Any other closing thoughts?

Samantha Ettari:

Well, I think that is such a great point. Data minimization, I mean, that is a driving principle; it is embedded in the GDPR, it is embedded in the CPRA. If you look at all these biometric laws and consumer protection laws springing up, it is a driving principle. Do you need to collect this data, for what purpose, for how long? And there's the whole other issue of the vulnerabilities that exist, and the security incidents and the breaches. If you are holding biometric data that you didn't need anymore, you're putting your company in a vulnerable position, and there are consumer privacy laws, like the CCPA and CPRA, which are primarily enforceable by AGs, but when there is a breach, there is a private right of action. So do you need to hold on to this data? If your company has a security incident, are you going to be vulnerable to litigation or regulatory action that, if you had just kept a cleaner house with your data and purged data you no longer needed, you would have obviated or eliminated the risk of? So I think that's such a great point to end on. And it applies to all data, but it's particularly acute with biometrics, because biometrics are getting treated as sensitive data; biometrics are categorized under the CPRA as part of sensitive information, and you can see that from how states are calling out biometrics. We've highlighted three states today, Illinois, Texas, and Washington, but there are a number of other states with biometric laws working their way through the legislatures. So again, just to circle back to that data minimization concept: if you don't need to collect biometrics, this heightened, sensitive form of data, don't; or if you do, fine, but have rigorous compliance and rigorous retention programs. When you don't need it, get rid of it.

Priya Keshav:

Thank you so much, Samantha, for such a great conversation around biometrics. It's an area that we all need to pay a lot of attention to. We do, but I think there's still quite a bit of complexity when it comes to the implementation of technology and the pace at which we're implementing technologies in companies. So, thank you so much.

Samantha Ettari:

Thank you, my pleasure.


