Organizations that struggle with mobile data security find tokenization to be an efficient way to protect data that travels outside the corporate firewall.
The technology, which replaces sensitive data with a representative token value, is primarily used by retail merchants to obscure credit card information for Payment Card Industry Data Security Standard (PCI DSS) compliance. But tokenization is also applicable for organizations wary of cloud security.
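The core mechanic can be sketched in a few lines. The example below is a hypothetical, minimal token vault, not any vendor's implementation: sensitive values are swapped for random surrogates, and the lookup table that maps tokens back to real data stays behind the firewall. Real products add encryption at rest, access control, and format-preserving tokens.

```python
import secrets

class TokenVault:
    """Minimal illustrative token vault; the mapping never leaves the firewall."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token if this value was already tokenized
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_hex(8)  # random surrogate with no mathematical link to the data
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only systems behind the firewall can resolve a token
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert token != "4111111111111111"                  # the real card number is never sent out
assert vault.detokenize(token) == "4111111111111111"
```

Because the token itself is random, intercepting it outside the firewall reveals nothing about the original value — which is why the token tables must remain inside the protected network.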
The traditional firewall is no longer the security perimeter for IT organizations, and they need new security tools to account for that, said Kennet Westby, founder of Coalfire Systems Inc., an independent IT compliance auditor based in Louisville, Colo.
The technology allows IT to secure older systems while taking advantage of the cloud's flexibility.
"The new modes of working require us to bolt security onto these older systems and that's what tokenization helps to solve," said Lawrence Pingree, a security analyst at Stamford, Conn.-based Gartner Inc. "It addresses new emerging risks by enabling older infrastructure without a significant investment in complete rip and replace."
Tokenization pros and cons
One large financial institution headquartered in New York City considered cloud applications in mid-2011, but the organization's IT department had reservations about placing sensitive client and customer data outside the firewall.
They began using a product from PerspecSys Inc. to tokenize data being sent out to cloud apps. The token tables generated through PerspecSys remain behind the company's firewall for security purposes.
"We are not tokenizing just one or two fields for credit cards," said the financial institution's director for strategic projects, who requested anonymity. "We tokenize anything and everything. We won't send any client-related information outside of our firewalls.
"It is peace of mind and enables us to take advantage of the cloud offerings that require us to have data outside the firewall," he added.
It's also a cost-effective approach to data security. For instance, organizations with strict PCI compliance requirements have to either re-architect their entire back-end systems and applications to handle the necessary encryption, segment the sensitive data by isolating it behind a network separator, or swap out that data with tokens, said industry watchers.
"The first is overkill and won't happen in enterprises," Westby said. "Same with the segmentation approach. For every system requiring compliance, you are reducing a tremendous amount of cost with tokenization. It's the simplest and most cost-effective of those three strategies."
While there are a number of benefits of tokenization for data security, there are also several drawbacks.
Numerous token formats have created vendor lock-in, despite the PCI Security Standards Council publishing standard guidelines for the tokenization process last year. Some tokenization systems also have scalability issues, some can't work with sensitive data beyond a credit card number and, worse, data collisions occur periodically, Pingree said.
"There are only so many tokens you can generate before you mathematically wrap around to the same token," Pingree explained. "Different data sets end up being represented by the same token, if you have a huge data set and not a large enough token table. You tell the token server to look up the data you need and then it accidentally returns the other associated data set represented by the same token."
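The collision problem Pingree describes can be demonstrated with a deliberately undersized token space. This is a hypothetical simulation, not a real product's behavior: if tokens are drawn independently from a space that is small relative to the data set, and the generator never checks for uniqueness, two different records eventually share the same token.

```python
import random

random.seed(1)                              # fixed seed so the run is repeatable
TOKEN_SPACE = 10_000                        # deliberately tiny: only 10,000 possible tokens
values = [f"card-{i}" for i in range(500)]  # 500 distinct records to tokenize

assigned = {}
collisions = 0
for value in values:
    token = random.randrange(TOKEN_SPACE)   # naive generation: no uniqueness check
    if token in assigned and assigned[token] != value:
        collisions += 1                     # two different data sets now map to one token
    assigned[token] = value

print("collisions:", collisions)
```

Even at five percent fill, the birthday effect produces collisions; a lookup against a collided token can return the wrong record, exactly the failure mode Pingree describes. Production systems avoid this by checking uniqueness on generation or by sizing the token space far beyond the data set.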
To counter the problem of data collisions, Voltage Security Inc. added a stateless tokenization feature to its Voltage SecureData Enterprise data security platform at the end of December, combining encryption, tokenization, data masking and key management in one product.
Stateless tokenization removes the need to generate and store a new token every time a database call is made. Instead, a pre-computed table unique to each customer sits in a virtual appliance and supplies a random token when needed.
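The stateless idea can be illustrated with a fixed substitution table. The sketch below is purely conceptual — Voltage's actual algorithm is proprietary — but it shows the principle: a pre-computed, customer-unique random table maps input deterministically, so the same value always yields the same token and no token database has to grow or be consulted per call.

```python
import random

# Hypothetical pre-computed table, seeded per customer; in a real system this
# table would be generated once and kept inside the virtual appliance.
random.seed(42)
TABLE = [random.sample(range(10), 10) for _ in range(16)]  # one substitution row per digit position

def tokenize(pan: str) -> str:
    """Map each digit through its position's pre-computed row -- no stored state."""
    return "".join(str(TABLE[i][int(d)]) for i, d in enumerate(pan))

t1 = tokenize("4111111111111111")
t2 = tokenize("4111111111111111")
assert t1 == t2          # deterministic: repeated calls need no vault lookup
assert len(t1) == 16     # format-preserving: the token looks like a card number
```

Because each row is a permutation of the digits 0-9, every input maps to a unique token, eliminating collisions by construction rather than by ever-larger token tables.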
Voltage remains one of the few vendors out there that offers stateless tokenization to enterprises, Westby said. Tokenization is a relatively new security tool outside of retail merchants, and the stateless version has yet to gain traction, he added.
Crutchfield Corporation, an e-commerce consumer electronics company based in Charlottesville, Va., began a beta test of Voltage's stateless tokenization in the summer of 2012 in an effort to migrate away from encryption and eliminate its encryption-key rotation process.
"The worry was the encryption keys could be stolen," said Alex Belgard, information security engineer for Crutchfield. Now, the goal for Belgard is to get the new tokenization system up and running full-steam ahead of the company's next PCI assessment.
So far, the new security approach has reduced the number of servers needed to hold the company's entire store of customer payment information from 30 to three. Each of those servers is located in a separate, physically secured data center with extremely limited Internet access as an added precaution.
Further, the use of tokenization has helped reduce the company's overall PCI compliance footprint from 300 servers to approximately 15 to 30, depending on which applications are accessing credit card information at any time, Belgard said.
Voltage and PerspecSys aren't the only tokenization security vendors. CipherCloud, nuBridges and various third-party service providers also offer tokenization.
Voltage SecureData Enterprise is licensed by the number of data centers and applications an organization has. Pricing varies, but a company spokesperson said a typical enterprise deployment costs in the low six figures.