Apr 14 2010

Forrester: Explaining tokenization & E2E in layman’s terms

Published at 9:14 pm under General

If your company is trying to understand tokenization and end-to-end encryption (E2E), you could do a lot worse than purchasing a copy of the Forrester research paper, Demystifying Tokenization and Transaction Encryption.  And by ‘you could do worse’ I mean that I haven’t seen a better paper that tries to explain tokenization and end-to-end encryption, though I do have a couple of bones to pick with it.  As someone who’s looking at both of these technologies on an almost daily basis, I think John Kindervag glossed over several important points that anyone considering either technology needs to be aware of.  But if your company wants to be proactive about removing credit card data from its environment and wants to learn about these emerging technologies, there’s more than enough in this paper to get you started.

First off, let’s talk about what I like about the paper.  The opening sections compare the credit card ecosystem to a back-alley poker game.  For me, this metaphor works, and it works well; there’s something about comparing the credit card companies and merchants to a gambling house and its players that’s less than flattering, but it does an excellent job of highlighting the fact that most of what goes on behind the scenes in card processing is a complete mystery to most merchants and to all but the most savvy customers.  And quite frankly, like back-alley poker games, it’s not something most of us deal with (or want to deal with) on a daily basis.

The metaphor is extended to help explain tokenization.  In and of itself, a poker chip is of very little value, but it can be turned in to the house for money.  Similarly, if your company has tokenized the credit card numbers it stores, those tokens no longer hold much value to a thief, and the financial incentive for a thief to target your company disappears.

The explanation of tokenization was good, but it raised one of my first issues with the paper.  It assumes that tokenization happens between the merchant and the acquiring bank: once the credit card has been authorized, rather than storing a credit card number, the merchant stores a token that the acquiring bank has provided as a placeholder for that card number in the merchant’s database.  This is one form of tokenization, but it completely ignores another form that’s been on the rise for several years: internal tokenization by the merchant, with a (hopefully) highly secure database acting as a central repository for the merchant’s cardholder data while the remainder of the card flow stays the same as it is now.  There are several companies selling solutions that let merchants perform this internal tokenization independent of the acquiring bank, using their own hardware and software.  Several of the solution providers interviewed for the paper offer merchants this form of tokenization, so it should at least have been mentioned.
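To make the internal model concrete, here’s a minimal sketch of the kind of token vault such a solution maintains.  The class and token format are my own illustration, not any vendor’s implementation; a production vault would be a hardened, access-controlled database with the card numbers encrypted at rest:

```python
import secrets


class TokenVault:
    """Illustrative in-memory token vault (assumption for this sketch --
    a real vault is a hardened database with the PAN encrypted at rest)."""

    def __init__(self):
        self._token_to_pan = {}
        self._pan_to_token = {}

    def tokenize(self, pan: str) -> str:
        # Reuse the existing token if this card was already tokenized,
        # so the same card always maps to the same token
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        # Token keeps the last four digits (useful on receipts) and
        # replaces the rest with random digits of no value to a thief
        token = "".join(secrets.choice("0123456789")
                        for _ in range(len(pan) - 4)) + pan[-4:]
        self._token_to_pan[token] = pan
        self._pan_to_token[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back to the real card number
        return self._token_to_pan[token]


vault = TokenVault()
token = vault.tokenize("4111111111111111")
```

The point is that every system outside the vault stores and passes around only the token; the real card number surfaces only inside the heavily protected repository, which shrinks the part of the environment a thief would need to compromise.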

Another criticism I have of the paper is that while it does a good job of explaining that true end-to-end encryption runs from the POS to the acquiring bank, it doesn’t do as good a job explaining the complexities and pitfalls of point-to-point encryption (P2P).  It may be that I deal with this daily and see the pitfalls a point-to-point solution can have at each of its encryption points, but I wish more time had been spent filling some of these out.  And then there’s the issue that a P2P solution can be combined with an internal tokenization solution, which really complicates things.
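To illustrate the distinction, here’s a rough sketch of the end-to-end idea: card data is encrypted the moment it’s read at the POS and decrypted only by the acquirer, so none of the hops in between ever see the card number.  The key handling and cipher here are deliberately simplified assumptions of mine (real terminals derive per-transaction keys, e.g. via DUKPT, inside tamper-resistant hardware); a point-to-point deployment, by contrast, may decrypt and re-encrypt at intermediate points, and each of those points is a pitfall to evaluate:

```python
import hashlib
import hmac
import os


def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # HMAC-SHA256 run in counter mode as a toy stream cipher --
    # illustrative only, standing in for the terminal's real cipher
    out = b""
    counter = 0
    while len(out) < length:
        block = counter.to_bytes(4, "big")
        out += hmac.new(key, nonce + block, hashlib.sha256).digest()
        counter += 1
    return out[:length]


def encrypt_at_pos(terminal_key: bytes, pan: str) -> tuple:
    # Card data is encrypted the moment it is read at the POS...
    nonce = os.urandom(16)
    ks = keystream(terminal_key, nonce, len(pan))
    ciphertext = bytes(a ^ b for a, b in zip(pan.encode(), ks))
    return nonce, ciphertext


def decrypt_at_acquirer(terminal_key: bytes, nonce: bytes,
                        ciphertext: bytes) -> str:
    # ...and decrypted only at the acquiring bank; every hop in
    # between handles only ciphertext
    ks = keystream(terminal_key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks)).decode()


key = os.urandom(32)
nonce, ct = encrypt_at_pos(key, "4111111111111111")
```

In a point-to-point chain, the equivalent of `decrypt_at_acquirer` runs at each intermediate segment boundary, and every place the clear card number reappears has to be secured and assessed separately.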

I think that overall these are minor critiques from someone who may be too close to the issue.  If you’re looking to educate yourself and your company on both of these technologies, this paper is an excellent start.  But you have to be aware that it’s just a primer, a starting point for truly understanding the complexities of the emerging technologies of tokenization and end-to-end encryption.  And this is also only the first part of the paper; there may be more to follow that answers some of my gripes.  It rightly points out that there’s very little set in stone about either technology, so you owe it to yourself to start getting educated now, so you can know the difference between a bluff and a full house when it comes time to place your bets.
