Tokenization and your store

New approach shapes how retailers protect private information, and consumer confidence, against data breaches

With stores located in various states and, in some cases, overseas, chain stores face a unique data security challenge. The plethora of recent state breach notification laws and European privacy laws, along with industry mandates such as the Payment Card Industry Data Security Standard (PCI DSS), puts enormous pressure on chain store CSOs to find reliable ways to protect consumer information against a data breach.

Many retailers have already adopted localized encryption and follow data security best practices. For some companies, however, this may not be the most efficient way to protect credit card numbers and other forms of personally identifiable information (PII), such as customer loyalty data, employee Social Security numbers and commercial driver's license numbers.

With traditional localized encryption, the encrypted data is stored in applications and databases in place of the original unencrypted data, which means it is located in many places throughout the enterprise. Every system that contains encrypted data is a point of risk and remains "in scope" for PCI DSS compliance and audits. What's more, encrypted data takes more space than unencrypted data, requiring costly programming modifications to applications and databases, along with increased data storage costs.

To solve these challenges, a new data security model -- format-preserving tokenization -- is beginning to gain traction with retailers. Tokenization reduces the number of points where sensitive data is stored within an enterprise by replacing encrypted data with data surrogates (tokens) and storing the encrypted information in a central data vault. Not only does this make data security easier to manage and add an extra layer of security, it also takes systems "out of scope" for PCI DSS compliance.

Tokenization explained

With traditional encryption, when a database or application needs to store sensitive data, those values are encrypted and the cipher text is returned to the original location. With tokenization, a token -- or surrogate value -- is returned and stored in place of the original data. The token is a reference to the actual cipher text, which can be stored locally ("in-place tokenization") or, as in the newly emerging model, in a central data vault. As long as the token is format-preserving, it can be safely used by any application, database or backup medium throughout the organization. This minimizes the risk of exposing the actual sensitive data and allows business and analytical applications to work without modification.

Format-preserving tokens can either match the expected data type or expose a subset of the original value, simultaneously protecting the information and enabling applications and job functions to continue unmodified. For example, the token might expose the last four digits of a Social Security number or credit card number to enable call center operations.
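The surrogate described above can be sketched in a few lines. This is an illustrative sketch only, not a production token generator: a real token server would also guarantee token uniqueness and record the token-to-cipher-text mapping in the central vault. The function name and the sample card number are assumptions for illustration.

```python
import secrets

def format_preserving_token(pan: str, expose_last: int = 4) -> str:
    """Return a surrogate with the same length and character class as
    the original card number (PAN), preserving only the last four
    digits. The leading digits are random, not derived from the PAN."""
    head = "".join(secrets.choice("0123456789")
                   for _ in range(len(pan) - expose_last))
    return head + pan[-expose_last:]

# Same length and same last four digits as the original value,
# so downstream applications and database schemas need no changes.
token = format_preserving_token("4111111111111111")
```

Because the token matches the original field's length and type, it drops into existing column definitions and screen layouts unmodified.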

Tokens use the same amount of storage space as the original clear text data instead of the larger amount of storage required by encrypted data. And since tokens are not mathematically derived from the original data, they are arguably safer than exposing cipher text. They can be passed around the network between applications, databases and business processes safely while leaving the encrypted data they represent securely stored in a central data vault. Authorized applications that need access to encrypted data can only retrieve it using a token issued from a token server, providing an extra layer of protection for sensitive information and preserving storage space at data collection points.

Encryption, tokenization, or both: What's right for your enterprise?

There are two distinct scenarios where implementing a token strategy can be beneficial: to reduce the number of places sensitive encrypted data resides or to reduce the scope of a PCI DSS audit. The hub-and-spoke model is the same for both and contains these three components:

* Centralized encryption key manager to manage the lifecycle of keys.
* Token server to encrypt data and generate tokens.
* Central data vault to hold the encrypted values, or cipher text.

These three components make up the hub. The spokes are the endpoints where sensitive data originates, such as point-of-sale terminals or in-store servers, departments at headquarters, a call center or a Web site.
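The hub's token server and data vault can be modeled as a minimal sketch. The class name and flow are assumptions for illustration, and two hub components are deliberately stubbed: a production hub would encrypt vault entries with keys from the centralized key manager and enforce per-application authorization before detokenizing.

```python
import secrets

class TokenHub:
    """Toy token server plus central data vault (sketch only).
    Vault entries are stored in clear here; in practice they would be
    cipher text under keys from the centralized key manager."""

    def __init__(self):
        self._vault = {}      # token -> original value
        self._by_value = {}   # value -> token, so repeat values reuse one token

    def tokenize(self, value: str) -> str:
        """Issue a format-preserving token; the spoke stores only this."""
        if value in self._by_value:
            return self._by_value[value]
        token = "".join(secrets.choice("0123456789")
                        for _ in range(len(value) - 4)) + value[-4:]
        self._vault[token] = value
        self._by_value[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Authorized retrieval of the original value from the vault."""
        return self._vault[token]

hub = TokenHub()
t = hub.tokenize("4111111111111111")   # spoke keeps only the token
original = hub.detokenize(t)           # hub resolves it on demand
```

The design point the sketch shows: the spoke never holds cipher text at all, only the token, which is why those systems can fall out of audit scope.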

In the traditional model, data is encrypted at the stores (spokes) and stored there, or encrypted at headquarters and distributed back out to the stores. Under the tokenization model, encrypted data is stored in a central data vault and tokens replace the corresponding cipher text in applications available to the stores, reducing the instances where cipher text resides throughout the enterprise. This reduces risk because the only place encrypted data resides is in the central data vault until it is needed by authorized applications and employees.

In the second scenario, the model is the same but the focus is on using only tokens in spoke applications, thereby reducing scope for a PCI DSS audit. In this case, employees need only a format-preserving token that exposes enough of the original value for them to perform their jobs. For instance, the token might expose the last four digits of a credit card number. In the traditional encryption model, cipher text resides on machines throughout the organization, and all of these machines are "in scope" for a PCI DSS audit. In the centralized tokenization model, many of the spokes can use tokens in place of cipher text, which takes those systems out of scope for the audit.

Format-preserving tokenization is ideal for some chain store enterprises, while a hybrid approach is better for others. Localized encryption is the default when stores are not always connected to a central data vault. In instances where stores are electronically connected to the data vault, tokenization is often the solution of choice. For many chain store companies, using a combination of localized encryption and tokenization is a practical approach for improving data security.

Format-preserving tokenization protects payment card information and employee information as well as all types of customer PII and loyalty data collected by many chain store marketers. Not only does the technology provide an extra layer of security in an extended enterprise, but it reduces storage space requirements and the scope of PCI DSS audits.

Gary Palgon is VP product management for data protection software vendor nuBridges, and is a frequent contributor to industry publications and a speaker at conferences on eBusiness security issues and solutions. He can be reached at gpalgon@nubridges.com. 

