Tokenization: Today's Challenge

Tokenization is a data protection technique comparable to encryption: it offers a different approach through which applications can render data unreadable so that it has no value to an attacker if it is stolen or accidentally exposed. While encryption transforms data using a specific algorithm, tokenization substitutes a surrogate value (the token) for the data that needs protection. Multiple methods exist for generating tokens and protecting the overall system, but in contrast to encryption, no formal tokenization standards exist.

One common approach is to deploy a centralized tokenization service that generates tokens, performs the substitution, and stores each token alongside the corresponding original data, allowing it to de-tokenize (substitute the original value back for the token) when an application needs to use the original data. Alternative approaches avoid the need for a central token service and repository by using secret, pre-generated look-up tables that are shared with applications. The distinction between encryption and tokenization is often blurred, but both approaches rely on secrets to protect data, and in both cases the protected data is only as safe as the transformation process itself and any secrets that underpin it. Just as encryption keys must be protected from unauthorized use, so must the tokenization process and token store.
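
To make the centralized model concrete, the sketch below shows a minimal token vault that issues random, format-preserving surrogates and reverses them on request. It is illustrative only: the TokenVault class and its methods are hypothetical, not any vendor's API, and a production vault would be an encrypted, access-controlled service rather than an in-memory map.

    import secrets
    import string

    class TokenVault:
        """Minimal centralized tokenization service (illustrative).
        Issues a random surrogate for each sensitive value and stores
        the mapping so the original can be recovered on request."""

        def __init__(self):
            # token -> original value; a real vault would be an
            # encrypted, access-controlled datastore
            self._store = {}

        def tokenize(self, pan: str) -> str:
            # Format-preserving surrogate: same length, digits only,
            # so the token fits systems that expect a card-number shape
            token = "".join(secrets.choice(string.digits) for _ in pan)
            while token in self._store:  # guard against rare collisions
                token = "".join(secrets.choice(string.digits) for _ in pan)
            self._store[token] = pan
            return token

        def detokenize(self, token: str) -> str:
            # Only callers authorized to reach the vault can reverse a token
            return self._store[token]

    vault = TokenVault()
    token = vault.tokenize("4111111111111111")
    assert vault.detokenize(token) == "4111111111111111"

Because each token is generated randomly, it reveals nothing about the original value; the security of the whole system rests on protecting the vault and controlling who may call the de-tokenization operation.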

As with encryption, and Point-to-Point Encryption (P2PE) in particular, tokenization is commonly employed by applications in retail operations subject to PCI DSS compliance, because tightly formatted data such as cardholder data is a natural fit for substitution-based approaches like tokenization. Tokenization helps reduce the scope of compliance audits: customer credit card numbers, for example, are exchanged for tokens as soon as they are captured at a point-of-sale terminal, and the downstream data falls out of compliance scope because it no longer contains actual card numbers. Data remains in tokenized form by default, so any system that cannot access the de-tokenization service has the potential to be out of scope. To take advantage of this potential to reduce scope, organizations need to follow the guidelines issued by the PCI Council on deploying tokenization. Hardware security modules (HSMs) can play an important role in ensuring adequate levels of security, just as they do in encryption systems, since all tokenization deployments depend on cryptography.

Risks

  • Tokenization systems are attractive targets for attackers because a single compromise can yield access to large volumes of valuable information.
  • The various approaches to tokenization, the security practices associated with token generation and token stores, and other aspects of access control are not standardized, and best practices and product certifications remain immature compared to those for encryption. This places a greater onus on organizations and auditors to validate the security properties of the systems they encounter.
  • Software-based tokenization systems are vulnerable to attack, as is any software-based security application; attackers who gain access to the tokenization system can steal account data. Organizations should consider hardened security platforms such as HSMs, just as they would for any encryption-based system.
  • Tokenization is best suited to highly structured data of a specific format; compared with encryption, it may be less extensible to data protection needs outside of PCI DSS.
  • Many approaches to tokenization rely on real-time connections to central or even cloud-based tokenization services, a practice that can raise concerns over resilience and latency and potentially impact business continuity and capacity.

Tokenization: Thales e-Security Solutions

Products and services from Thales e-Security can help you implement effective, high assurance tokenization solutions to protect customer information, reduce scope, and contain the cost of compliance. nShield HSMs are independently certified to meet FIPS and Common Criteria standards and are approved under PCI DSS guidelines for other scope-reducing approaches such as Point-to-Point Encryption. Using HSMs, organizations can protect token stores and the tokenization process and increase the performance of token generation. nShield HSMs create a trusted environment where tokens can be generated, stored, and managed, and where tokenization and de-tokenization can be performed safely and securely. This trusted layer compensates for the fact that the purely software-based environments in which applications typically execute are not, by themselves, sufficiently trusted to meet the needs of a tokenization system.

Whether you tokenize account data using your own in-house developed software, a third-party commercial tokenization product, or a shared service, nShield HSMs can play an important role. These devices are already certified to integrate with many leading tokenization products, assuring you of fast deployments and seamless integration with your existing systems.

Benefits:

  • Deploy high assurance tokenization solutions to protect account data and reduce compliance costs.
  • Utilize industry best practices recommended by auditors and PCI DSS guidelines to protect the integrity of tokenization systems.
  • Accelerate deployments; nShield HSMs are pre-qualified to integrate with products from leading vendors.
  • Take advantage of purpose-built cryptographic offload capabilities to accelerate token generation, particularly where token values are cryptographically related to the source data (see the sketch after this list).
  • Take advantage of a choice of performance ratings and HSM form factor options; deploy exactly what you need and only what you need, and upgrade easily as your needs change.
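
For the cryptographic-offload point above, one common pattern is to derive tokens with a keyed hash so that each token is deterministically related to its source value. The sketch below is illustrative only and does not describe nShield internals; in a high assurance deployment the key would be generated and used inside the HSM rather than held in application memory.

    import hmac
    import hashlib

    def derive_token(pan: str, key: bytes) -> str:
        """Derive a deterministic, card-shaped token from a PAN using
        HMAC-SHA-256 (illustrative). The same PAN always yields the
        same token, but reversing it without the key is infeasible."""
        digest = hmac.new(key, pan.encode("ascii"), hashlib.sha256).digest()
        # Map the first 8 bytes of the MAC onto a 16-digit surrogate
        return f"{int.from_bytes(digest[:8], 'big') % 10**16:016d}"

    # Demo only: a real key must never be hard-coded in source
    key = b"demo-key-not-for-production"
    print(derive_token("4111111111111111", key))

A deterministic scheme like this avoids a central token store, but it concentrates the risk in a single key, which is exactly the kind of asset an HSM is designed to generate, hold, and use without exposure.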