
Simplify for Success - Conversation with Robin Meyer



Robin Meyer was on #SimplifyForSuccess, a podcast series presented by Meru Data and hosted by Priya Keshav.


Robin discussed the technique of tokenization and the advantages offered by it. She also spoke about her professional journey and TokenEx’s cloud data protection platform.





Thank you to Fesliyan Studios for the background music.


*Views and opinions expressed by guests do not necessarily reflect the view of Meru Data.*







Transcript


Priya Keshav:

Hello everyone, welcome to our podcast around simplifying for success. Simplification requires discipline and clarity of thought; this is not often easy in today's fast-paced work environment. We've invited a few colleagues in the data and information governance space to share their strategies and approaches for simplification.

The priority placed on data privacy and security has increased tremendously today for individuals, companies, and regulatory agencies. Businesses are having to rethink how they approach privacy, given the increasing awareness among consumers and the escalating focus from regulators. The cost of privacy compliance can be significant, and it is very difficult to address privacy issues retroactively. A proactive and pragmatic approach to privacy is the only way forward. Companies are realizing this and incorporating PETs into their privacy-by-design process.

PETs, or privacy-enhancing technologies, are a broad range of technologies that, in conjunction with changes to policies and business frameworks, make it possible for companies to be data-driven without compromising the privacy of their customers and employees. PETs help address privacy and security challenges in numerous ways to enable anonymity, pseudonymity, unlinkability, and unobservability of data subjects. Today, we will talk about one such PET, tokenization, with Robin Meyer, General Counsel of TokenEx. But before we get started, a few disclaimers: the information covered does not and is not intended to constitute legal advice. All views expressed are those of the individuals in their individual capacity and not of the firm they represent.

Hi Robin, welcome to the show.

Robin Meyer:

Hi Priya, thanks for having me.

Priya Keshav:

So tell me a little bit about yourself.

Robin Meyer:

I am General Counsel for TokenEx, a company that provides business-to-business services. We do data tokenization, which I think we'll talk about a little bit more in a minute. Before this, I was in the public sector. The state where I live had a consolidated IT environment, and I was the first IT lawyer they hired. I covered, gosh, everything in tech: fiber, telecom, hardware, software, professional services. Whatever it was, it came through us, and I was embedded with the CIO, the CISO, the infrastructure group, and all that. So one thing that came up pretty quickly was following the data, and that just stayed with me. And here I am today at TokenEx.

Priya Keshav:

Tell me, what is tokenization? It's a new term for most of us. What does it even mean?

Robin Meyer:

So what's funny about that: in spite of what I just said about having looked at every kind of contract I thought there was in IT, when this job came up I knew I was going to be pivoting out to the private sector. It just came up one day on LinkedIn, and I looked at it and thought, oh my gosh, this is exactly me, except I didn't know what data tokenization was. So I had to go out to YouTube and dig around and find our CEO explaining it to other people who didn't know what it was. And then, to find out it's in the middle of things we do all the time, but tokenization is invisible to most of us.

So here's how it works. I'm just going to make this up. Let's say I get on the Virgin Atlantic website tonight to make an airline reservation. I put in my credit card information, I put in where I'm going, what I'm doing, and maybe some other personal information. When I put that credit card information into the website, one of our solutions, invisibly to the person, picks up that personal data that's been identified by the company.

In this instance, that would be the airline. The data is picked up by TokenEx and put through an algorithm that transforms, let's say, that credit card number and then sends the result back to the airline, so they never have the full credit card number or other information about it. They have what they need, still visible, so that they can work with it in their business. But if they had a breach, my credit card number is not in their environment, and the way I just described it, it's not in the TokenEx environment either. It literally transforms, so what used to be a credit card number or some other information about the card is no more. It is turned into a format-preserving field: if it was 16 digits long, it's still 16 digits long. That's what I mean by format-preserving, but in place of the numbers there are non-sensitive characters, so it kind of scrambles most of it.

But the beauty of it is that the business can still use it, so that number in my reservation can still flow through their system without them having to see the whole number or seeing none of it, which is how it goes with encryption. So that's basically what tokenization is. When you're on a website and you put in your credit card information, if somebody is on their game, TokenEx, or someone like TokenEx, but let's just say it should be TokenEx, steps in right there, picks it up, transforms it through the algorithm, and sends it on. There are card-present applications now too, but this particularly amped up, as we all know, during COVID, with what's called card-not-present transactions, meaning you're putting your card number into a website or something like that. You could be giving it to someone over the phone, but that's the idea.
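The format-preserving behavior described here can be sketched in a few lines of Python. This is purely an illustrative stand-in, not TokenEx's actual algorithm: a real tokenization platform maintains a secure vault or keyed transform so a token can later be detokenized, whereas this sketch only shows what a length-preserving token that keeps the last four digits looks like.

```python
import secrets
import string


def toy_token(pan: str, keep_last: int = 4) -> str:
    """Replace all but the last `keep_last` digits of a card number with
    random non-sensitive characters, preserving the original length."""
    # Uppercase letters can't be mistaken for real card digits.
    alphabet = string.ascii_uppercase
    scrambled = "".join(secrets.choice(alphabet) for _ in pan[:-keep_last])
    return scrambled + pan[-keep_last:]


token = toy_token("4111111111111111")
print(len(token), token[-4:])  # 16 characters long, last four intact
```

The airline's systems can still route, display, and match on a value shaped like a card number, but a breach of their environment exposes only the token.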

Priya Keshav:

So tell me a little bit more. We are all used to encryption, and encryption does something similar in the sense that it takes the credit card number and changes it into something else so that somebody cannot identify the credit card number. But the TokenEx tokens are quite different from encryption. Why would I choose to tokenize versus encrypt?

Robin Meyer:

Right, right. So with encryption, data is more or less either encrypted, and you're not going to see it, or unencrypted, and you are going to see it. Say you have data in your environment that is unencrypted, and you've got several departments that need to work with whatever that piece of personal data is. If it's merely encrypted or not encrypted, then when they're working with it, it's what we call clear data: the whole number of whatever it is. Maybe it's a Social Security number; it could be any number of things. With encryption alone, you've either got it in the closet or you've got it out of the closet, but you don't have both.

Tokenization gives you the same effect in that, if there's a breach, the bad guys don't get the information. But it doesn't leave the data in its original form; for instance, it doesn't leave the Social Security number in the environment. Yet the company still needs it, and they need it a lot of the time. When we talk to somebody on the phone, they'll say, what's the last four of your credit card number? What's the last four of your Social? What's the last four of your phone number? That's clearly something they're using to identify us in their system, maybe along with a few other things. So having tokens allows you to keep using the data without worrying that you've got all of it out there for the world to see if there's a bad guy in your system. And the other thing that happens too, everybody kind of knows this now, but a few years ago it wasn't as apparent.

When you have a bad guy in your system, well, you don't always know it. You try to put in security layers to figure it out, but that's how they get information, right? They come in and you didn't know. Let's say the data is encrypted and you decrypt it to do something the business needs to do, and there's a bad guy in the system. Well, then the bad guy has the data. But with a token, that's not the case, because that piece of data isn't there. Does that make sense?

Priya Keshav:

Yeah. So, in other words, you still keep it in a 16-digit format, so it's a number that can be used, and it's a number that represents the transaction, but it's not a number that represents your credit card. So if somebody took that 16-digit number, they couldn't just go use it and start buying stuff.

Robin Meyer:

Exactly. And if the business then wants to send it to another merchant or some other company they have a business relationship with, they can put it back through to us to be detokenized, and we detokenize it and send it over to whoever they direct us to. We don't see that customer data, so that's another thing that's really good about the way the tokens work: in the way I was describing with the algorithm, we don't have the personal data in our environment either. You have a token, and then there are a couple of things that need to come in to TokenEx for us to verify it's really our customer, and then we'll detokenize it and send it where they tell us to.

Priya Keshav:

I'm going to take a little bit of a step back. GDPR introduced the terms pseudonymization and anonymization. Pseudonymization is defined as masking the data in a way that is not permanent: it's masked, or maybe separated, or something happens so that it is secure and protects the information, but it can be reversed or linked back. Anonymization, by contrast, is true anonymization: once I anonymize data, I pretty much can't go back. There is no connection to the original data in any way, shape, or form; it's irreversible and not traceable to that transaction or person. And GDPR also establishes that if data is pseudonymized, it's privacy protection, but it still needs to be managed within the scope of the privacy regulations, whereas if it's anonymized, it is pretty much out of the scope of privacy regulations, because it's safe; it's not identifying information anymore. So with that context, how do you look at TokenEx? Because you are able to detokenize it back, it's more pseudonymized data as opposed to anonymized data. Am I correct?

Robin Meyer:

Absolutely correct, yes. We are still, for instance, a processor under the GDPR for a lot of our clients. It would be nice if the data were anonymized and we could fall outside some compliance obligations, but that is not the case. It's pseudonymization.

Priya Keshav:

OK, but it's still a very important technology and it's privacy-enhancing. And under GDPR and elsewhere, both the ICO in Europe and, here, the AGs' offices recommend that we pseudonymize as much as we can, because when you pseudonymize, you do secure the data. So it's one of the technologies that would be called a PET, am I right?

Robin Meyer:

Oh absolutely, absolutely. Well, there are several things; I keep saying one thing and then I say ten more, but TokenEx is a dual solution, that's the way I look at it. It's a security layer, because you're securing the data by not having it in the environment, and that helps companies with audit requirements and helps them reduce the scope of otherwise burdensome standards or laws like the Payment Card Industry Data Security Standard. If you tokenize the data, there are a lot of compliance obligations you meet by tokenizing it. The NIST Privacy Framework has the idea of tokenization scattered throughout as a control, and the NIST Privacy Framework kind of holds hands with the NIST Cybersecurity Framework; tokenization is a method you can use with the data that goes to both of those. So it's nice: there are some security solutions a company can implement that are security, and that's that, and there are some that are privacy but just aren't about security, but tokenizing data checks both those boxes.

Priya Keshav:

What other use cases do you see for tokenizing beyond payments? Obviously, like you said, there are some obvious use cases because you meet the PCI requirements, and then of course it makes a lot of sense. But I'm sure there are people using TokenEx for other reasons as well. What are some of the use cases that you see?

Robin Meyer:

Right. Well, there are some use cases now, and there are some additional, more expanded use cases coming this year that I find so exciting. But the ones that are there right now: whatever the environment is, if you've got structured or semi-structured data, that can be tokenized. So it could be, like I said earlier, a Social Security number; of course, the payment information; and we've got some clients who use it in relation to HIPAA data. Wherever the personal data is, if it's in a structured or semi-structured format, it can be tokenized.

We recently had an enterprise customer ask us about their legacy systems. They have a lot of disparate systems, and some of them are legacy, and I mean really old. Not legacy of three years ago, really legacy systems. They were asking us about protecting that information, because a lot of the legacy systems they have in place aren't interoperable with a lot of other technology that might be out there today. So the question that came up was, "We've got databases with all these different fields. Can you tokenize those so that we can get out of the world of just being encrypted or unencrypted?" There was still business use for those databases, but they're based on really old technology. So that was a use case.

Some things that are coming up, one in particular I'm just super jazzed about: it's the ability to tokenize a whole record, instead of a structured or semi-structured data element like we just talked about. When I think about a record, I take everything back to a file cabinet. Maybe that's the number of birthdays I've had, but once I think about it in a file cabinet, it'll make sense. So in the file cabinet, if I pulled out a form and I had a magic marker, the way we did back in the day, if we were going to put something in the trash, we would take the magic marker and cross out our account number, or cross out our birthday, or cross out whatever. That's a way to think about tokenization electronically; that's what we're doing, technology-wise.

And this new technology that's coming along is going to be able to take what I just described and tokenize all the information that would have been on that form. Let's say it's an explanation of benefits: I've gone to the doctor, and it tells where I went and what I did and a lot of things. Being able to tokenize a whole record about somebody is really important. There will still be use cases for what we've been talking about, but for a customer whose records are set up in a way that they would rather tokenize the whole record, rather than just kick out the fields they want to tokenize, it gives them that ability. That immediately makes me think, and this is probably just from my background, of healthcare records and insurance records. Those two industries have just such a crazy amount of information about all of us, so the ability to tokenize a record versus certain fields that are structured, it's a game-changer.

Priya Keshav:

No, you bring up a really good point. I think we were talking on another podcast about using synthetic data, which is not the same as tokenization, because in that case you're creating fake data. But one of the things that was brought up in that podcast, and I think it's interesting: before you could generate synthetic data, the data was sensitive, so nobody could look at it; there had to be a lot of restrictions on access. When you create a fake data set that is a representation of the original data set, people have access to it, because it's no longer the same data; it's not sensitive because it's not real data. And so people started noticing things, because the data was more available for analysis, and they learned a lot about the data itself. The same could be true here: if you have a credit card number and, because of PCI compliance, you have decided to restrict access to the entire database, there's so much you cannot do because of the need for compliance and restrictions and security. Whereas if it is a token, the rest of the data is now usable. So in some ways, it makes a big difference to be able to mask or change the sensitive information, not just from a security standpoint, but also from a data utilization standpoint, I'm presuming.

Robin Meyer:

Absolutely, what you just said is so spot on. When you're able to tokenize a record, you can make pieces of it available to, say, different departments. I might need to see one piece, you might need to see another piece, somebody else might need to see half the record, and somebody else all of it. So it really gives the ability to split up access based on role, and the principle of collect as little as possible, use as little as possible, disclose as little as possible, and so on. You're spot on about the way it opens up the ability for different roles to have access to it without having access to all of it.
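That per-role access to a tokenized record can be sketched like this. Everything here is hypothetical for illustration (the field names, tokens, roles, and the toy in-memory vault, which in reality would live with the tokenization provider, never beside the tokens): each role detokenizes only the fields its policy allows, and every other field stays a token.

```python
# Toy vault mapping tokens back to values. In practice this mapping is held
# by the tokenization provider, never stored alongside the tokenized records.
VAULT = {"tok_7f3a": "Jane Doe", "tok_91bc": "123-45-6789", "tok_04de": "A12.3"}

# A record where the sensitive fields hold tokens; visit_date is non-sensitive.
RECORD = {
    "name": "tok_7f3a",
    "ssn": "tok_91bc",
    "diagnosis": "tok_04de",
    "visit_date": "2023-05-01",
}

# Hypothetical policy: which fields each role may see in the clear.
POLICY = {
    "billing": {"name", "ssn"},
    "clinician": {"name", "diagnosis", "visit_date"},
}

def view_record(record: dict, role: str) -> dict:
    """Detokenize only the fields the role is allowed to see; everything
    else stays as its token, so no single role gets the full clear record."""
    allowed = POLICY.get(role, set())
    return {k: (VAULT.get(v, v) if k in allowed else v)
            for k, v in record.items()}
```

So `view_record(RECORD, "billing")` exposes the name and SSN but leaves the diagnosis as `tok_04de`, while an unknown role sees nothing but tokens.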

Priya Keshav:

No, I agree. I think when you work with large companies and their data teams, one of the things they always say is, oh, privacy means I'm not going to be able to use data. And it's not just about not using data; it's about collecting as little as possible when it comes to sensitive information, and then masking or using PETs to protect or anonymize the sensitive information. You can still decentralize or democratize data; it's just a matter of doing it the right way without exposing the customers or the employees or any other data subjects to risk. Because technically, if you look at the data analytics team, they're not really interested in that sensitive information; they're mostly interested in the aggregate patterns. So it's a way to make data democratization and analytics possible without compromising privacy.

Robin Meyer:

Exactly, that's exactly right.

Priya Keshav:

But if I may ask you a question, Robin. You mentioned that you are able to detokenize, so that means you have an algorithm or a way to reverse the tokenization, which essentially means that if somebody got hold of it, they could technically reverse engineer it and detokenize all of the credit card information. Is that true?

Robin Meyer:

Right. So, for instance, our customer would end up with a token, and then something else, and I'll describe this non-technically, so for anyone technical listening, just close your ears right now. The customer would have the token in their environment, and somewhere else in their environment they would have an API key that makes the call back to TokenEx. And then TokenEx also checks, and I don't remember the technical term for this, but basically, when an API call comes in saying, here's the call to detokenize this, it can't just come from anywhere. It has to come from a specified place in the customer's environment. Those together are what will detokenize. So on one hand, you've got a breach situation where the clear data is not there, like we've already talked about. And the key is, you wouldn't ever store the token right next to the thing that detokenizes it; that's just logical, you wouldn't store them together. The API call would have to come over with the key, and it would have to come from a certain place in that customer's environment in order to even work. So if a token got taken and a bad guy tried to send it over together with the key, it would be rejected, because it wouldn't be coming from the place it's supposed to be coming from. And my guess is there are more technical layers to it, but that's not my complete knowledge set. I know there are multiple steps to being able to detokenize, so you can't do it with just the key.
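The multi-factor check Robin sketches (token, API key, and a registered calling location) might look roughly like this. All the names and the specific two-condition rule are assumptions for illustration, not TokenEx's actual controls; the point is simply that a stolen token, or even a token plus a key sent from the wrong origin, gets refused.

```python
# Hypothetical registration for one customer: the API key they were issued
# and the network locations their detokenize calls must come from.
CUSTOMER = {"api_key": "key_abc123", "allowed_origins": {"10.0.5.20"}}

def authorize_detokenize(token: str, api_key: str, origin: str) -> bool:
    """Permit a detokenize request only when the API key matches AND the
    call arrives from a registered origin; a stolen token alone is useless."""
    return (api_key == CUSTOMER["api_key"]
            and origin in CUSTOMER["allowed_origins"])

# A bad guy holding the token and even the key, calling from elsewhere:
authorize_detokenize("tok_7f3a", "key_abc123", "203.0.113.9")  # False
```

Real platforms layer further controls on top (as Robin notes, there are more steps than she describes), but even this sketch shows why possessing the token by itself reverses nothing.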

Priya Keshav:

OK, this might be more of a personal question, but I like to ask it, because one of the things I always talk about is my own personal journey and the satisfaction I get from working on privacy and information governance. I just feel like, through my work, by helping companies improve their privacy posture, I'm improving the posture for all of us from a data perspective. But you came from a very different environment to a startup, working for TokenEx. Any personal thoughts in terms of how you feel about the job that you're doing?

Robin Meyer:

Well, I should start by saying I love my job and I love TokenEx. It's a really good company from the inside out, and that was really important to me coming from the public sector to the private sector, because what I did was jump the fence from being a buyer to being a seller. Having seen a lot of technology sellers, some of them just wouldn't be a good alignment for me; I don't want to work for an organization that browbeats a customer or plays hardball with their customers in terms of the relationship. I mean, I'm in contracts all day long, every day. I will say I was greatly relieved when I took a look at our terms, and they were some of the fairest terms I'd seen, and then I saw that that's exactly how we treat our customers. On one of my first days there, it just kind of happened, it wasn't really about me being there fresh, but I heard our CEO on a call, and I don't know if he was talking to someone externally or internally, but I've heard him say this since then too. He made the comment that we want to delight our customers.

So on one hand you have this technology service, this business-to-business thing, which doesn't sound, I don't know, delightful to me; delightful sounds kind of like a picnic in the park or something. But that was the phrase he used, and he's used it since. That was something that was really important, and it makes this a really good fit: being with an organization that provides value that's easy to understand and saves our clients a lot of money when they put us in as a tokenization layer. Recently there are also some additional solutions coming along that have to do with fraud prevention and with increasing authorization rates, and all of those mean money to our customers. So that's a nice part too. And I don't have to convince anybody about data privacy. I talk to my friends and they tell me, OK, really, you just need to stop, we don't want to hear any more. But I don't have to convince anybody at work, because it's the very core of our solution. So everybody already understands.

It's just been wonderful to talk to you. It's just been a great conversation.

Priya Keshav:

Thank you so much for joining us, Robin. It was a great conversation.


