
Simplify for Success - Conversation with Debbie Reynolds



Debbie Reynolds, the Data Diva, founder, CEO, and Chief Data Privacy Officer of Debbie Reynolds Consulting, was on #SimplifyForSuccess, a podcast series presented by Meru Data and hosted by Priya Keshav, to discuss information governance programs.



Debbie discussed individuals' right to privacy, transparency around data, and how companies can use privacy as a business advantage.



When asked about her take on third-party cookies, Debbie said the future of advertising will be about the consent of individuals. She also stressed the importance of data minimization for businesses to comply with the dynamic regulatory landscape.








Listen to the podcast here:

*Views and opinions expressed by guests do not necessarily reflect the view of Meru Data.*


Transcript:


Priya Keshav:

Hello everyone, welcome to our podcast around simplifying for success. Simplification requires discipline and clarity of thought. This is not often easy in today's fast-paced work environment. We've invited a few colleagues in the data and information governance space to share their strategies and approaches for simplification. Today we will be talking with Debbie Reynolds.

Debbie, the Data Diva, is the founder, CEO, and Chief Data Privacy Officer of Debbie Reynolds Consulting, LLC. Ms. Reynolds is a world-renowned technologist, thought leader, and advisor to multinational corporations on global data privacy, cyber data breach response, and complex cross-functional data-driven projects. Ms. Reynolds is an internationally published author, a highly sought-after speaker, and a top media presence on global data privacy, data protection, and emerging technology issues. Ms. Reynolds was named to the Global Top 20 Cyber Risk Communicators by the European Risk Policy Institute in 2020 and recognized as one of the stellar women who know cyber by Cybersecurity Ventures in 2021. Hi Debbie. Welcome to the show.

Debbie Reynolds:

Hi, thank you for having me. I'm pleased to be here.

Priya Keshav:

So, Debbie, we were just beginning to talk a little bit about privacy, given the number of regulations and the complexity of complying with the various regulations that are out there. We were just talking about how much of this is cross-functional.

So, what are your thoughts about the challenges for the privacy team in engaging cross-functional people within the organization to talk about privacy and understand the privacy requirements?

Debbie Reynolds:

Yeah, that is a great question. I would say privacy is a right of individuals to understand and have transparency about their data and how it is used. So, when companies are talking with people about privacy, everyone has their part to play. I think the problem with privacy, and I don't think this is necessarily a privacy problem, but it manifests itself in privacy very much, is that a lot of times the people who touch data, the people in technical roles, are very siloed from the legal folks or compliance folks who are thinking about regulation.

I think trying to bridge the gap between those groups is very important, and then also, anyone in privacy has to be able to talk to people at all levels of the organization. Many organizations have operated in a siloed fashion, and that's created privacy problems, because a lot of times there's a disconnect. So, when companies get in trouble for privacy or have cyber issues, it's always about operations. You get fined not for what you say you do, but for what you actually do or what's actually happening.

So, there's kind of an empirical nature to privacy, right? The way technology is right now, you should be able to demonstrate in an empirical fashion what you're doing with data. So I think it's important to connect that legal and regulatory sense with the fact that companies are stewards of individuals' data, and that they need to find ways to change how their business handles that data. You can't really do that with what I call paper promises. The on-paper stuff is only going to get you so far, so you really need to dig down into what you have and be able to show some empirical evidence about it.

Priya Keshav:

So, you brought up some great things. You talked about being stewards of the data. Yesterday I was talking to somebody on an analytics team, and they brought up this idea of self-service, which is giving people the data that an organization has and empowering them to do analytics on their own. It's a great idea, obviously, because it empowers people to use data as an asset and make informed decisions, and we've all talked about how data is one of the most important things a company can own. But there is another side to that coin.

What are your thoughts around stewardship? As an organization, and as employees of organizations, both of us, and that's true of others whom we meet, do you think stewardship is only a theoretical concept? Or how do you develop it as a privacy professional managing privacy for an organization? How do you develop that culture of stewardship, saying the data is important, you hold it, but it's given to you by a customer, and if you don't take good care of it, that affects their life?

Debbie Reynolds:

Right, right, exactly. So, you know, on the stewardship thing, I tell people that the data belongs to the individual who gave it to them, and the company is a steward. So, that data is being loaned to you, right? For a certain purpose, for a certain time period. And your ability to be transparent with the individual and also take care of the data has two different impacts, actually three.

So, one is obviously the regulatory issues. Companies may be concerned with getting fined by regulators over privacy issues. That's one.

The second one may be cyber security, where if companies aren't protecting that data in a certain way, they may end up in hot water with insurers and regulators. They may take a reputational hit from consumers as well.

And then the third is, individuals will do more business with companies that they trust. So, if your customers do not trust you, they will not use your service. I guess that's a double-edged sword, right?

You know, I always tell people that companies can make privacy a business advantage, and we are seeing Apple right now show that that's true. I don't think it's a coincidence that last year they had the biggest quarter they've ever had, and a lot of that is because of their privacy push, because people are really interested in having businesses help them protect their data. So, I don't think companies should see privacy, and the things that they do in privacy, as a tax or a penalty for doing business. It should be an advantage that draws more people to them. The more customers trust you, the more data they will share with you, and if you take care of it, that can increase your bottom line.

Priya Keshav:

No, I agree with you. I definitely think about that a lot, you know, in my choice of whether I pick up an iPhone or an Android phone, simply because I feel there is a level of trust from a privacy standpoint.

But let's play the devil's advocate here, right? So, you saw massive breaches with, let's say, Target, that's a very old one, and Equifax, SolarWinds, Colonial Pipeline, and some others making the news. But my question to you is, you also see those brands bounce back, and the reputational damage they suffered is short-term; at least that's been documented. I'm just playing devil's advocate here. I truly believe that customer trust matters and that being a steward of the data directly impacts your bottom line, but I'm asking your thoughts on those who point out that 'hey, I see that the reputational damage is short-lived.' What would be your thoughts on those who make that argument?

Debbie Reynolds:

I think a lot of these bigger companies have a lot more money, right? They have a lot more support. Equifax is a totally different thing, and I will separate them out. It was almost better for their business that they had a breach, because they had credit-monitoring products that they were able to sell, so they actually made money from that.

But a lot of times, you know, not every company is a Target, right? In the US, for example, they say 99% of businesses are considered small businesses. Target is not a small business; Equifax is not a small business. These companies have money. They have these huge cyber policies that get paid out. They have a board of directors, they have shareholders, and so on.

But for the average person or average company that experiences ransomware, malware, or some type of cyber-attack, a majority of those that are small businesses end up going out of business within six months. That's the part you don't see, because those businesses aren't big.

So, I think taking these things seriously, not only on a customer-trust level but also on a security level, is very important. And even when we think about Equifax and Target, the privacy regulations weren't as strong then as they are now. So, I think today it would be a very different story for those companies.

Priya Keshav:

And you also see them coming back and showing the world, because one of the things that happened with those companies is that they invested heavily in cyber security, which they had ignored before, and in privacy and risk management in general.

You know, I was talking to a Chief Compliance Officer a few podcasts back, and he was telling me his position was created after a massive incident that led to a huge fine, and that's when they realized they needed to take compliance seriously. So sometimes it takes going through an incident to realize how important it is. But hopefully most of us will do that before.

Debbie Reynolds:

Right? Yeah, I would love to see more focus on being proactive. We're seeing a lot of things in the news about ransomware, and people forming task forces about, you know, someone who had a cyber breach. But the best thing, or the best investment, for any company is to be proactive, because you don't want to be the lowest-hanging fruit, right? You want to have some maturity in your security program, and you want to be able to stay in business.

For a lot of companies that pay ransom, it's because the choice they have to make is, 'Can I do business? Will I go out of business tomorrow?' You know what I'm saying? They feel like they are in a bad spot, but a lot of that can be mitigated proactively if they just take even some foundational steps toward getting more mature in their cyber security stance.

Priya Keshav:

So, coming back, we just saw the news that Colorado passed a privacy law. I haven't tracked whether the governor has signed it, but he has 30 days to sign it.

What are your thoughts around state-level privacy regulations? Do you think we'll see a national one yet? Also, even as somebody who lives and breathes privacy every day, sometimes it's overwhelming to keep track of all the different regulations and the nuances and definitions and changes in definitions.

I don't know how much of this we are going to have to keep up with before there's some standardization. I guess to expect it globally would be impossible, but what are your thoughts around that?

Debbie Reynolds:

Yeah, I would love to see some agreement at higher levels in the US on what we think should be federal, right? And the problem is, the way Congress is, we can't seem to agree on almost anything. So I don't think we're going to see anything at the federal level for several years, if at all.

And if we do see something at the federal level, I think it'll be very thin, like a wafer-thin law that doesn't have a ton of features, because it's just hard. State by state, you know, the states can pass these laws much faster than the federal government, and they have been taking the reins on privacy in the US over the years, so I think we're going to see a lot more state-level privacy laws before we see anything happen at a federal level.

For me, you know, a lot of these laws get thrown into the legislature, and a lot of them don't ever get signed or become law. So even though I keep an eye on which states have things moving, I don't really pay too much attention unless it actually becomes a law, because a lot of them don't reach the finish line.

Priya Keshav:

That's true.

Debbie Reynolds:

Yeah, I mean, I always say, 'let me know when the cake is baked.' Once the cake is baked, then I'll look at it. Because a lot of times, if they pass, they change a lot; they may have to make some compromises and such. So being able to see what the final version looks like, I think, would be helpful, but it's nice to know that there is at least some movement at the state level.

Priya Keshav:

It's such a moving target with respect to compliance.

Debbie Reynolds:

Oh, it is very much a moving target. I try to tell people, instead of lurching from one law to the next, they need to think about the high-level principles behind why these laws are being passed, right? Part of it is that consumers, humans, want transparency in how their data is being handled. They want the data to be secure. They want to be sure the data is used for the purpose it is supposed to be used for. If companies keep that in mind, I don't think they'll be in so much trouble in terms of regulation. But what we're seeing now is a huge shift, or really two big shifts.

One is transparency, or what I call the 'rise of the individual,' the rise of individuals' rights. Before, it used to be, well, the company has more rights, so once we give the data to the company, the individual's rights sort of stop. What these regulations are saying is, 'no, the individual's rights stand, and you have to respect them.' Companies that can't be transparent with individuals about their data are going to have a hard time in the future. A lot of data systems that were built in corporations were not meant to be transparent, right? So companies have to rethink privacy by design and how they are handling that data, to make sure they're being transparent with the individual.

Also, a lot of privacy laws have provisions about data retention, meaning that once the company has completed its business process with the data, it is supposed to delete it or return it to the individual, and that's unprecedented. In the past, when companies thought about deleting or keeping data, it was for a statutory reason, like, you know, you have to keep certain data for seven years, or ten years, or twenty years.

So, this is a very different animal where they're saying you know, think about your business process. Think about what you're doing, and then once you no longer need the data, you need to get rid of it.
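To make that retention shift concrete, here is a minimal Python sketch of a purpose-based retention check. The purposes, periods, and record fields are hypothetical, and real retention schedules would come from counsel and the applicable statutes; the sketch assumes timestamps are timezone-aware:

    from datetime import datetime, timedelta, timezone

    # Hypothetical schedule: how long a record may be kept after the
    # business purpose it was collected for is complete.
    RETENTION_AFTER_PURPOSE = {
        "order_fulfillment": timedelta(days=90),
        "tax_reporting": timedelta(days=7 * 365),  # statutory keep-for period
    }

    def is_due_for_deletion(record, now=None):
        """True once the record's purpose is complete and its window has lapsed."""
        now = now or datetime.now(timezone.utc)
        window = RETENTION_AFTER_PURPOSE.get(record["purpose"])
        completed_at = record.get("purpose_completed_at")
        return bool(window and completed_at and now - completed_at > window)

A nightly job could sweep records through a check like this and route expired ones to deletion or return, instead of leaving clean-up to ad hoc requests.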

Priya Keshav:

Yeah, and like you said, most people thought about those statutory retention requirements as a minimum number of years they had to keep the data, so all they had to make sure of was that they didn't delete it before then. Nobody cared about getting rid of it afterwards. Yes, storage was a concern, but storage was cheap as well.

And there was always this thought process of, maybe it will be useful for me tomorrow, so I'll store everything forever. And as you store everything forever, one of the things that happens is that when your garage is full, you have no idea what you have anymore, so you can't separate the good from the bad, and it becomes an impossible scenario to even start the clean-up process. Most corporations are in that mode. They're like, OK, I know I care, and I want to be able to bring transparency, but I have a garage that has been full for a long time.

Debbie Reynolds:

I think it's a mind shift. Data obviously is an asset to your business, but if it is out of date or just saved up, it has less usefulness to the business. And you don't know where things are. You don't know what's in that data. It may be data that should have been deleted a long time ago. That data becomes a liability.

Priya Keshav:

Yep, agreed. So, let's talk a little bit about cross-border. Obviously it started with Europe, but you see a lot of countries adopting a more stringent process around how data moves across borders, which has been the opposite of the trend to globalize that we saw in the 90s.

Now, that comes with its own set of problems. As global organizations, we've put data everywhere. Cloud and SaaS make it easy for us to not be very clear about how data moves. And so, once you have cross-border restrictions on the movement and processing of data, unraveling that has been a challenge.

Debbie Reynolds:

Right, yes, absolutely. These cross-border data moves have been happening for decades. I think the challenge now is that we're seeing the rise of regulations saying, instead of a firehose of data going back and forth, you need to really tie your data movements to a purpose, making sure that you're not indiscriminately capturing data and moving it around.

And like you said, with the cloud, the thing people liked was that you didn't need to know exactly where in the world your data was. But now we have to think about that, and now, when companies store stuff, they do ask those questions. They want to know where the data is in the cloud and what regulations apply to that data.

So I think there are companies that have been doing these moves for decades, and now, because of regulations, there are newer companies that never had to think about those data moves, or about standard contractual clauses, and they now have to think that way. Sometimes it has to happen on a case-by-case basis, as opposed to a wholesale, you know, data-dump basis.
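As an illustration of tying a transfer to a purpose rather than moving data wholesale, here is a hedged Python sketch. The regions, purposes, and mechanisms in the allowlist are invented for the example and are not legal guidance:

    # Hypothetical allowlist: which declared purposes justify moving data
    # between regions, and under which legal mechanism.
    APPROVED_TRANSFERS = {
        ("EU", "US"): {"customer_support": "standard contractual clauses"},
    }

    def check_transfer(source_region, destination_region, purpose):
        """Refuse any cross-border move not tied to an approved purpose."""
        mechanisms = APPROVED_TRANSFERS.get((source_region, destination_region), {})
        mechanism = mechanisms.get(purpose)
        if mechanism is None:
            raise PermissionError(
                f"no approved basis to move data {source_region}->{destination_region} "
                f"for purpose '{purpose}'"
            )
        return mechanism  # worth recording in a transfer log

    # check_transfer("EU", "US", "customer_support") -> "standard contractual clauses"
    # check_transfer("EU", "US", "marketing") -> raises PermissionError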

Priya Keshav:

What about cookies and all the changes that are happening? With the Apple opt-out, you mentioned Apple taking a more proactive approach, and then there is Google's decision to phase out third-party cookies, and of course Facebook's Limited Data Use (LDU) option for California customers, where you can turn off sending Facebook information for tracking.

So, you see a lot of changes from an ad tech, ad tracking, and marketing perspective, and there are a lot of options, but they are new; there isn't an established trend yet. So, what are your thoughts around some of the challenges, and some of the good things, that you're seeing from this trend? And what would you advise your customers on some of these changes?

Debbie Reynolds:

Yeah, I guess the big change that I see, well, first of all, there are a lot of cookie lawsuits happening in Europe right now against some of these bigger players about how they do their advertising. And what we're seeing with Apple and Google, at a high level, is that they're trying to reduce their third-party data risk by limiting the data that they're sharing with other parties, and trying to shore up their first-party relationship with individuals, right? So, if you're an individual who consents to using Apple products or Google products, they can do more with you and with your data than they can if you don't consent.

What I'm seeing is businesses trying to reduce their risk in that regard as well. That's part of the reason why they're trying to get rid of third-party cookies, mostly because a lot of these websites really harp on the cookie part of things. And then the other part is, from Apple's perspective, they're saying, 'we're going to give third parties less information; if they want more information, they can ask you, and then you can consent.' So, a lot of what we'll be seeing in the future is about the consent of individuals, especially in B2C-type situations.

We'll see how much of this advertising stuff people are going to buy into. And then we also see companies like Google trying to do this FLoC thing, where their ad business hasn't changed, but they're changing the types of information that they're giving to marketers about individuals. They're not intending to do FLoC at all in the EU, so right now it's kind of an experiment in the US and other countries that have weaker data privacy laws about how data about an individual gets disseminated.
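The consent-first model described here can be pictured as a simple gate in code. This is a hypothetical sketch, not Apple's or Google's actual API; the registry and the partner call are stand-ins. Tracking events simply do not leave the system unless the individual has opted in:

    # Hypothetical consent registry keyed by (user, purpose).
    consents = {}

    def record_consent(user_id, purpose, granted):
        consents[(user_id, purpose)] = granted

    def send_to_ad_partner(user_id, event):
        # Stand-in for a real third-party call.
        print(f"sharing {event} for {user_id} with a consented partner")

    def track_event(user_id, event):
        # No opt-in, no sharing: the event is dropped rather than sent.
        if consents.get((user_id, "ad_tracking")):
            send_to_ad_partner(user_id, event)

    record_consent("user-123", "ad_tracking", True)
    track_event("user-123", "viewed_product")  # shared
    track_event("user-456", "viewed_product")  # silently dropped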

So, the thing that I advise my customers to do is, you know, limit what you ask for. Think through the data that you need, and make sure you're not asking for more. I recently had a situation where someone wanted to do some SEO, and there's nothing wrong with us doing SEO, but we also have to think about whether it makes sense to use some of the methods that have been used before, like tracking pixels. You have to make a judgment for yourself.

So, one thing I would advise my clients to do is: think about what data you're collecting, and minimize the data. If you know you don't need it, don't ask for it, don't collect it. Be a good steward of the data that you collect from individuals.
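That collect-only-what-you-need advice can be enforced right at the point of collection. Below is a minimal sketch assuming a hypothetical signup form; the field names are illustrative:

    # Fields genuinely needed for the declared purpose, and nothing more.
    REQUIRED_FIELDS = {"email", "display_name"}

    def collect_signup(submitted: dict) -> dict:
        """Keep only allowlisted fields; anything extra is never stored."""
        minimized = {k: v for k, v in submitted.items() if k in REQUIRED_FIELDS}
        missing = REQUIRED_FIELDS - set(minimized)
        if missing:
            raise ValueError(f"missing required fields: {sorted(missing)}")
        return minimized

    # collect_signup({"email": "a@b.com", "display_name": "A", "birthdate": "1990-01-01"})
    # returns {"email": "a@b.com", "display_name": "A"}; the birthdate is dropped.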

I think that marketing is going to change dramatically as a result of a lot of these privacy laws, including GDPR, really clamping down on this third-party data transfer issue. Where before, and I've actually had conversations with developers about this too, it's pretty funny, they're proud of the stuff they've created, and traditionally, if they build something at the behest of a company, the regulator, if there's a problem, will go after that company, right? The initiating company, not the third party.

What a lot of these privacy regulations are doing is putting more skin in the game for the third parties as well as the first-party data holders. And what we're seeing is that companies that have a first-party relationship with an individual are being a lot more circumspect about who their third parties are and what data they're sharing with those third parties.

So, if you think about the Facebook Cambridge Analytica issue, that was a big deal because Cambridge Analytica was a third-party company that took a lot of data, and a lot of people were asking, why were they given all this data? So, I think companies are really rethinking that third-party data thing. There's responsibility on both sides. The responsibility is not all on the first-party data provider, and it's not all on the third party, but there is more responsibility in that relationship that you need to watch out for.

Priya Keshav:

No, I agree. When I talk to clients too, before it used to be, let's just collect it all; if there are twenty options, let me collect all of them, because who knows what I might do with it, right? So, I'm just going to keep it. And now you have to spend some time thinking about: Do I need it? Am I using it for anything? If I'm not using it, why collect it and store it and increase my liability around it? And if I am using it, then use it wisely, disclose it, of course, and obtain consent.

But once it's done, you know, it makes sense to get rid of it. Some things are done for user experience, and I like that too. You probably also like the fact that if I've been browsing for something and, let's say, my browser crashed, I can go back to the same place. That makes a lot of sense.

But at the same time, it's spooky. The other day I went to a website to look for an airline ticket, and I got three emails right away saying, make your decision, go to X, Y, and Z. And I'm like, I don't think I gave my email address to you. I was just looking at whether I should travel, and there it is, I get emails from them when I never proceeded to that stage. So obviously they found my email through LinkedIn or Facebook or some other way of tracking that information, because I don't even have an account with them. Those kinds of things are where you start saying, OK, that's not something I expected to see.

Debbie Reynolds:

Right? Yeah, that's creepy, definitely. Entirely too much data sharing, entirely too little transparency. We want to minimize the data sharing to what's necessary, or what people consent to, and then we want to raise the level of transparency so that people really are clear and understand. If someone asked you, when you're searching for a plane ticket, whether you want emails from other people about your purchase, you would probably say no, right?

Priya Keshav:

This wasn't even a purchase; it was just browsing, with no personal information entered into any of the sites.

Debbie Reynolds:

So yeah, you would probably say no to that, right, if someone asked you. But that's kind of the problem that we need to get over in marketing. And so, I think it's going to be very interesting, probably within the next, I don't know, 18 months or so.

Priya Keshav:

Yeah, and I also look at the data, right? Just because somebody sent me an email, do you think I'm more likely to buy? Or am I going to make my decision based on other factors that are more influential for me? So how many customers do you really convert from those kinds of transactions, versus how many consumers may start feeling like, oh, I can't trust this brand, you know?

So, it's a question of what is too much. And since we're talking about data being an asset, you can make decisions about it in a very logical fashion as well.

Debbie Reynolds:

Right, yeah. A lot of that is based on psychology too. It's like, if you don't buy at all, that's bad. It's the abandoned-shopping-cart thing, where they're like, hey, come back here, finish your purchase. I guess if even one person decides to buy after that, they think that's a win, right? Because otherwise they would have lost the sale altogether. So it's just crazy.

Priya Keshav:

That's true, that's true. So, I just saw your post about ethics in AI. What do you think about ethics in AI? It's a large topic, so I could spend an hour, or you could spend an hour, talking about it.

Debbie Reynolds:

I think the thing that concerns me, and I've been concerned about this for many years, is, to me, it's almost like the TV shows and movies where they talk about ethics or robotics. A lot of times in movies, it's an evil robot, right? Humans aren't working with robots; the robot decides that it's sentient, and it takes over and wreaks all this havoc.

In a way, I feel like in the real world, when we think about AI, people somehow abdicate their responsibility, because they think, OK, this artificial intelligence is really smart, so it helps me not to make decisions on my own. I think in order to have ethics, you have to have humans involved, because I don't think AI can be ethical on its own; ethics requires judgment. Computers are doing what you tell them to do; they're not making judgments. You can say, well, one choice is better than the other, or one is different from another, but I don't think you can replicate ethics in technology.

I really don't. So, for me, it's important that humans don't abdicate their responsibility and their judgment to artificial intelligence. In a way, I almost think about the people, and I'm not picking on Tesla, who have a Tesla car and decide to get out of the driver's seat. To me, that's an example of abdicating your responsibility as a human when you're dealing with technology.

Priya Keshav:

That's true, but most of the time I also think that they don't realize they're abdicating. What are your thoughts around assurance? Do you think of assurance as a function, especially from a risk-management point of view? We just talked about so many meaty topics, and these are not easy topics, right?

Changing the way we think about data, which is everywhere within an organization. Thinking about privacy and consent and transparency with your customers, which is not something we ever did before. They're all related, but they're all individually such big shifts from even two or three years ago. And then you're talking about AI: understanding decisions, understanding ethics in decisions, keeping the responsibility with humans, when some of these decisions, as they're being incorporated into AI, probably aren't even transparent to the human who is responsible for that particular software.

Debbie Reynolds:

Right?

Priya Keshav:

So, what do you think of assurance? Do you think of assurance as a function? And again, this may be an interesting question, or a debate: do you need to bake it into your culture, or do you need to do assurance where you police for some of these things on a periodic basis? Maybe both are required, or should one be adequate? What are your thoughts around that?

Debbie Reynolds:

It can't be a one-and-done thing. It can't be, OK, we built this product and it's great, and then we're going to go out and have it make all these decisions. There has to be a calibration, a continual look, with changes and adjustments that have to be made.

And then, you know, there's been so much black-box technology that's just been rolled out with all these promises. Like all these things about emotional AI, where they claim they can tell whether people are happy or sad and things like that. To me, that's more of a novelty; I don't think there is any science behind it. For example, just because I smile, that doesn't mean I'm happy. So, if you're thinking that the AI is going to tell you what my internal emotions are, that's not a credible way to get that information.

Right, so I think part of it is knowing how far you can go with AI, and then knowing that it needs to be transparent and constantly looked at. Not every algorithm is made for every purpose, so you have to make sure it is utilized and implemented in a way that does not harm individuals, and that there is transparency.

Priya Keshav:

Any closing thoughts?

Debbie Reynolds:

Closing thoughts, closing thoughts. I'm really interested to see what's going to happen in the marketing space as it relates to this whole cookie thing. You know, a lot of the tracking and tracing that people have experienced in the past, I don't think you can blame it all on cookies. So I think if we're only looking at cookies, we're not looking at the bigger picture, right?

So, I think the idea is: how do we get transparent with users? How do we earn their trust? How do we find a way, if there has to be a third-party transfer, to make sure it's tied to a purpose and isn't indiscriminate data transfer? And then hold companies accountable as stewards of data, whoever they are.

Priya Keshav:

Well, thank you for your time. It was a very interesting conversation, and as usual, it's been a while since we last connected, so it was fun.

Debbie Reynolds:

Oh, thank you so much. You ask great questions, so it's always interesting for me to have these chats.

Priya Keshav:

Thanks Debbie.

Debbie Reynolds:

You're welcome. Have a good day.
