
Want to Limit PCI DSS Scope? Use Tokenization

September 11, 2014 by Security Ninja

Every organization should take a proactive rather than a reactive approach to the threats, risks and vulnerabilities to which its IT infrastructure is exposed; failing to do so can lead to data loss, regulatory penalties, lawsuits and a damaged reputation. Along the same lines, the Payment Card Industry Data Security Standard (PCI DSS) was created to reduce credit card fraud by limiting the exposure of cardholder data. In this article we will look at the various ways in which PCI DSS scope can be reduced using tokenization.

Introduction

The Payment Card Industry Security Standards Council (PCI SSC) has developed a standard known as the PCI Data Security Standard (PCI DSS), which comprises 12 core security requirements to protect cardholder data from theft, misuse, etc. These requirements apply to all entities involved in payment card processing, including merchants, processors, and third-party service providers that store, process or transmit cardholder data. PCI DSS originally began as five different programs:

  • Visa's Cardholder Information Security Program.
  • MasterCard's Site Data Protection.
  • American Express' Data Security Operating Policy.
  • Discover's Information Security and Compliance.
  • JCB's Data Security Program.

Every organization that wants to be certified against PCI DSS has to undergo a very detailed audit of its PCI DSS environment. Reducing the scope of the PCI DSS environment will not only lower the cost of PCI DSS compliance for the organization, but will also significantly increase the chance of a successful PCI DSS audit.

What exactly is in the PCI DSS scope?

According to the PCI SSC, the scope of PCI DSS is as follows:

The PCI DSS security requirements apply to all system components included in or connected to the cardholder data environment. The cardholder data environment (CDE) is comprised of people, processes and technologies that store, process, or transmit cardholder data or sensitive authentication data. "System components" include network devices, servers, computing devices, and applications. Examples of system components include but are not limited to the following:

  • Systems that provide security services (for example, authentication servers), facilitate segmentation (for example, internal firewalls), or may impact the security of (for example, name resolution or web redirection servers) the CDE.
  • Virtualization components such as virtual machines, virtual switches/routers, virtual appliances, virtual applications/desktops, and hypervisors.
  • Network components including but not limited to firewalls, switches, routers, wireless access points, network appliances, and other security appliances.
  • Server types including but not limited to web, application, database, authentication, mail, proxy, Network Time Protocol (NTP), and Domain Name System (DNS).
  • Applications including all purchased and custom applications, including internal and external (for example, Internet) applications.
  • Any other component or device located within or connected to the CDE.

So it is clear that the scope of PCI DSS can be significantly reduced by reducing the number of contact points with cardholder data, such as the places where cardholder data is stored.

Tokenization

In general terms, tokenization is the process of replacing a piece of sensitive data with a non-sensitive substitute called a token. In PCI DSS, tokenization replaces the sensitive PAN, i.e. Primary Account Number, with a non-sensitive token. A token is a string of random numbers produced by a token generator, and it poses no threat to the security of cardholder data if stolen by an attacker.
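
As a minimal sketch of such a generator (in Python, with a hypothetical helper name), a token can be drawn from a cryptographically secure random source so that it carries no information about the PAN it will stand in for:

```python
import secrets
import string

def generate_token(length: int = 16) -> str:
    """Return a random numeric token; the digits are drawn from a
    cryptographically secure source and reveal nothing about the PAN."""
    return "".join(secrets.choice(string.digits) for _ in range(length))

print(generate_token())  # e.g. "8302917465018274"
```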

How Tokenization and De-Tokenization Work

Tokenization and de-tokenization operations should occur only within a clearly defined tokenization system that includes a process for approved applications to submit tokenization and de-tokenization requests. Below are the steps used for tokenization and de-tokenization by an approved application; a short code sketch follows each list.

Tokenization

  1. The user accesses an application, such as a POS terminal, and provides authentication information, such as a PIN.
  2. The application captures the information and passes the PAN along with the authentication information for verification.
  3. The deployed tokenization system parses this request and checks the authentication.
  4. If authentication fails, the tokenization operation fails and the event is logged for monitoring.
  5. If the authentication check passes, the tokenization system generates a new token for the PAN and records the mapping between the generated token and the PAN in the card data vault.
  6. The generated token is then returned to the application.
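
The sketch below (Python, with hypothetical names and an in-memory dictionary standing in for the card data vault and the credential store) illustrates steps 3 through 6; a real tokenization system would of course use hardened, access-controlled services:

```python
import logging
import secrets
import string

logging.basicConfig(level=logging.INFO)

# Hypothetical stand-ins; real systems use hardened, audited stores.
vault: dict[str, str] = {}                  # token -> PAN
approved_pins = {"pos-terminal-1": "1234"}  # application credentials

def tokenize(app_id: str, pin: str, pan: str) -> str | None:
    # Steps 3-4: authenticate the request; log and fail on mismatch.
    if approved_pins.get(app_id) != pin:
        logging.warning("Tokenization failed: bad credentials for %s", app_id)
        return None
    # Step 5: generate a random token and map it to the PAN in the vault.
    token = "".join(secrets.choice(string.digits) for _ in range(16))
    vault[token] = pan
    # Step 6: return the token to the application.
    return token

token = tokenize("pos-terminal-1", "1234", "4111111111111111")
```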

De-tokenization

  1. The user accesses an application, such as a POS terminal, and provides authentication information, such as a PIN.
  2. The application captures the information and passes the token along with the authentication information for verification.
  3. The deployed tokenization system parses this request and checks the authentication.
  4. If authentication fails, the de-tokenization operation fails and the event is logged for monitoring.
  5. If authentication succeeds, the tokenization system queries the card data vault for a record associated with the token and retrieves the PAN if found. If no such token exists, the de-tokenization operation fails and the event is logged for monitoring.
  6. The tokenization system returns the PAN value retrieved from the card data vault.
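
The complementary sketch (same hypothetical setup as above) covers the de-tokenization path, including the failure cases in steps 4 and 5:

```python
import logging

logging.basicConfig(level=logging.INFO)

# Hypothetical stand-ins, as in the tokenization sketch above.
vault = {"8302917465018274": "4111111111111111"}  # token -> PAN
approved_pins = {"pos-terminal-1": "1234"}

def detokenize(app_id: str, pin: str, token: str) -> str | None:
    # Steps 3-4: authenticate the request; log and fail on mismatch.
    if approved_pins.get(app_id) != pin:
        logging.warning("De-tokenization failed: bad credentials for %s", app_id)
        return None
    # Step 5: look the token up in the card data vault.
    pan = vault.get(token)
    if pan is None:
        logging.warning("De-tokenization failed: unknown token %s", token)
        return None
    # Step 6: return the PAN retrieved from the vault.
    return pan

print(detokenize("pos-terminal-1", "1234", "8302917465018274"))
```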

How Tokenization reduces PCI DSS scope

Below are the tokenization properties that organizations should understand before implementing tokenization in their PCI DSS environment.

No relationship between token and data values

A token is in no way related to the data value it replaces; there is no way to recover the data value from the token, so the only relationship between them is referential. Tokens are used in the cardholder data environment (CDE) and pose no threat should they be compromised. They can be passed around the network between applications and other business processes safely, all the while leaving the encrypted data they represent securely stored in a central data vault. Whenever an application needs access to the data that is stored in encrypted form in the vault, its request has to pass through the tokenization engine. Tokenization thus provides an extra layer of security around sensitive credit card data.

Centralized data vault

All credit card data is encrypted and stored in a central data vault. Outside the vault, the encrypted data is referenced only by its token, and it is this token that is used in applications and databases. Tokenization thus removes credit card data, even in encrypted form, from everywhere outside the data vault, except when the data is captured in the initial transaction and when it is later accessed by an authenticated application and user.

For example, suppose a credit card is used for shopping. In the initial transaction, the credit card number is transmitted in clear text and the request is parsed by the tokenization engine. The tokenization engine generates a token for the sensitive credit card number and returns it to the application for future reference. That token is then used for all other business purposes, such as fraud detection and analytics. The data vault, where the sensitive credit card number is stored in encrypted form, is exposed only to the token server, and the real credit card number is never stored in plain text anywhere.
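
Continuing the hypothetical sketch, the vault can hold only ciphertext: the PAN is encrypted before it is stored, and only the token ever circulates outside the vault. The example below uses the Fernet recipe from the Python cryptography package purely as an illustrative cipher:

```python
import secrets
import string

from cryptography.fernet import Fernet  # pip install cryptography

# The vault key lives only inside the vault service (hypothetical setup).
cipher = Fernet(Fernet.generate_key())
vault: dict[str, bytes] = {}  # token -> encrypted PAN

def store_pan(pan: str) -> str:
    """Encrypt the PAN, store the ciphertext in the vault, and return
    the token that applications will use in place of the PAN."""
    token = "".join(secrets.choice(string.digits) for _ in range(16))
    vault[token] = cipher.encrypt(pan.encode())
    return token

token = store_pan("4111111111111111")
# Applications keep only the token; the PAN appears in the clear only
# transiently, during the initial transaction.
```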

Centralized key management

With tokenization in place, cryptographic keys are needed in only a few places, which keeps key distribution to a minimum. This helps to minimize the scope of PCI DSS compliance and reduces the risk of a key compromise. In a tokenization solution, cryptographic key management applies to the keys used for encrypting PANs in the card data vault, as well as any keys used in the generation of the tokens themselves. It is important to note that if token generation is based on cryptographic keys, a compromise of those keys could result in a compromise of the tokens; in that case, secure management of the cryptographic keys must be in place.
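
As one illustration of why centralization helps, the hypothetical sketch below rotates the vault key by re-encrypting every stored record under a fresh key. Because only the vault service ever holds the key, rotation touches a single place, and the tokens held by applications never change:

```python
from cryptography.fernet import Fernet, MultiFernet  # pip install cryptography

# Hypothetical vault state: one current key and token -> ciphertext records.
old_key = Fernet.generate_key()
vault = {"8302917465018274": Fernet(old_key).encrypt(b"4111111111111111")}

def rotate_vault_key(vault: dict[str, bytes], old_key: bytes) -> bytes:
    """Re-encrypt every vault record under a fresh key and return the key.
    Applications are unaffected: the tokens they hold do not change."""
    new_key = Fernet.generate_key()
    rotator = MultiFernet([Fernet(new_key), Fernet(old_key)])
    for token, ciphertext in vault.items():
        vault[token] = rotator.rotate(ciphertext)
    return new_key

new_key = rotate_vault_key(vault, old_key)
```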

Tokens are used as references

Without tokenization, all credit card data is stored in encrypted form and that ciphertext is used in place of the original data. This slows processing and requires more storage, since the encrypted data is larger than the plaintext it protects. With tokenization, the ciphertext is stored in a vault that is exposed only to the tokenization server, and a token referencing it is used instead. The original value, and even its encrypted form, is thus never exposed; only the corresponding token is used in applications and databases. This significantly reduces the IT and business impact associated with PCI DSS compliance.
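
A quick comparison, under the same hypothetical setup, of what an application database would have to hold in each case:

```python
from cryptography.fernet import Fernet  # pip install cryptography

pan = "4111111111111111"
ciphertext = Fernet(Fernet.generate_key()).encrypt(pan.encode())
token = "8302917465018274"  # random token from the earlier sketches

print(len(pan))         # 16 characters of plaintext
print(len(ciphertext))  # ~120 bytes of Fernet ciphertext per record
print(len(token))       # 16 characters, the same footprint as the PAN
```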

PCI DSS Scoping of a Tokenized Card Holder Data Environment (CDE)

  • All components of a tokenized environment are part of the CDE and are in scope for PCI DSS.
  • Any system component with access to the tokenization/de-tokenization process is in scope for PCI DSS.
  • Any system component that exchanges a PAN for a token, or a token for a PAN, is in scope for PCI DSS.
