A recent study conducted by PricewaterhouseCoopers on behalf of the Payment Card Industry Security Standards Council shows that end-to-end encryption and tokenization are the top choices among companies seeking emerging technologies to protect payment card and other critical data. Both approaches have public proponents, including Heartland Payment Systems (HPY) CEO Robert Carr, who has been encryption's most vocal supporter in the wake of his organization's historic breach.
But what are the pros and cons of each approach? We turned to a panel of information security experts for their analyses of tokenization vs. end-to-end encryption.
Defining the Solutions
A quick look at the essence of these two solutions:
Tokenization replaces sensitive card data with unique identification symbols, or tokens, that retain all the essential information about the data without compromising its security. This approach has become popular as a way to increase the security of credit card and e-commerce transactions while minimizing the cost and complexity of complying with industry regulations and standards - especially the Payment Card Industry Data Security Standard (PCI DSS).
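The idea can be sketched in a few lines of Python. This is a toy illustration only - the `TokenVault` class and its methods are invented for this example, and a real tokenization system (such as a PCI-validated vault) would add encrypted storage, access controls, and auditing:

```python
import secrets

class TokenVault:
    """Toy token vault: swaps a card number (PAN) for a random token.
    Only the vault can map the token back to the real card number,
    so systems downstream of the vault never handle sensitive data."""

    def __init__(self):
        self._by_token = {}   # token -> real card number
        self._by_pan = {}     # card number -> token (reuse existing tokens)

    def tokenize(self, pan: str) -> str:
        if pan in self._by_pan:
            return self._by_pan[pan]
        # 16-digit token: a fixed leading "9", 11 random digits, and the
        # card's last four digits preserved for receipts and lookups.
        token = "9" + "".join(str(secrets.randbelow(10)) for _ in range(11)) + pan[-4:]
        self._by_token[token] = pan
        self._by_pan[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        # Restricted operation in a real system; unguarded here.
        return self._by_token[token]

vault = TokenVault()
t = vault.tokenize("4111111111111111")
```

The token is useless to an attacker who steals it, because the mapping back to the real card number lives only inside the vault.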
End-to-end encryption, which Visa also calls data field encryption, is continuous protection of the confidentiality and integrity of transmitted data: the data is encrypted at its origin and decrypted only at its destination. The encrypted data can travel safely through vulnerable channels, such as public networks, to its recipient, where it is decrypted. One example is a virtual private network (VPN) that uses end-to-end encryption.
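The encrypt-at-origin, decrypt-at-destination flow can be sketched as follows. To keep the example self-contained it uses a one-time-pad XOR, which is an assumption of this sketch and not for production; a real payment deployment would use a vetted cipher such as AES via an established library, plus proper key management:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

# Sender and receiver share a key out of band (hypothetical setup).
message = b"PAN=4111111111111111;EXP=12/29"
key = secrets.token_bytes(len(message))   # one-time key, same length as message

ciphertext = xor_bytes(message, key)      # encrypted at the origin
# ...the ciphertext crosses an untrusted network; intermediaries see
# only random-looking bytes, never the plaintext or the key...
recovered = xor_bytes(ciphertext, key)    # decrypted only at the destination
```

The point the definition makes is visible in the structure: no hop between origin and destination ever holds both the ciphertext and the key.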
For many organizations, the question is not either/or, but which approach best fits into the organization's existing security architecture.
Pros and Cons
Size is a factor for organizations weighing tokenization and end-to-end encryption, says Dave Shackleford, former chief security strategist at EMC and now principal at Blue Heron Group. "I would probably choose tokenization for smaller organizations, but larger ones will likely benefit more in the long run from looking to implement robust encryption practices and technologies," Shackleford says. Tokenization may not encompass all the data that larger organizations need to protect, he adds.