
Can tokenization reduce PCI DSS audit scope?

Nov 22, 2020


[Image: a data server and a credit card, in a cartoon style]

Background

In an older but still relevant Gartner report, Using Tokenization to Reduce PCI Compliance Requirements, it was found that large merchants with an average of 100,000 customer accounts potentially store cardholder data in 10–20 different locations in-house.

Since the PCI standard mandates that every system in the Cardholder Data Environment (CDE) must be audited, this common scenario creates many potential vulnerabilities.

A large number of storage locations increases the audit scope, which in turn requires more resources and time, resulting in higher costs.

Can you reduce PCI scope?

Many merchants ask whether cardholder data can be eliminated from the merchant environment entirely in order to reduce audit scope.

The answer is yes, and the solution is tokenization.

What is tokenization?

Tokenization replaces cardholder data with an “alias”, a separate randomly generated value called a token. The sensitive data is stored securely in a central token vault, while only token values are used and stored locally in applications and services. When needed, the process can be reversed through de-tokenization, where the token is translated back into the original data.

Tokenization can be implemented in different ways:

  • Through in-house applications applied to databases and sensitive data stores
  • As a service (SaaS), where a cloud provider manages tokenization and storage
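As a minimal sketch of the in-house approach, the core of a tokenization layer is a vault that maps random tokens to the original values. The class and method names below are hypothetical, and a real vault would live in hardened, access-controlled storage rather than in memory:

```python
import secrets


class TokenVault:
    """Toy in-memory token vault: maps randomly generated tokens
    to the original cardholder data (illustration only)."""

    def __init__(self):
        self._vault = {}    # token -> original PAN
        self._reverse = {}  # PAN -> token, so one PAN always maps to one token

    def tokenize(self, pan: str) -> str:
        """Replace a PAN with a random token; applications store only the token."""
        if pan in self._reverse:
            return self._reverse[pan]
        token = secrets.token_hex(16)  # random value, no relation to the PAN
        self._vault[token] = pan
        self._reverse[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        """Reverse the process: look the original value up in the vault."""
        return self._vault[token]


vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert token != "4111111111111111"                    # the token reveals nothing
assert vault.detokenize(token) == "4111111111111111"  # reversible only via the vault
```

Note that de-tokenization works only through the vault: a system holding just the token has no way to recover the original data, which is exactly why such systems can fall out of audit scope.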

How does tokenization reduce audit scope?

With a tokenization solution delivered via a SaaS model, cardholder data (CHD) never resides within the organization’s environment. While encryption focuses on protecting stored data, tokenization goes one step further by removing the data entirely from internal systems. In simple terms, organizations do not need to protect what they do not store.

This significantly reduces:

  • The amount of sensitive data in scope
  • Infrastructure complexity
  • The need for encryption key management

It is important to note that encryption without proper key management is ineffective — comparable to using a strong password but writing it down next to your device.

PCI DSS places heavy emphasis on key management (especially in section 3), which can be complex and costly to implement.

Differences between tokenization and encryption

There are key differences between tokenization and encryption:

  • Tokenization separates sensitive data completely from the token
  • Encryption maintains a mathematical relationship to the original data

The security of encrypted data depends on the encryption algorithm and the protection of the encryption keys. Tokenization, on the other hand, allows flexible token formats and lengths, removes any direct relationship to the original data, and eliminates the need for local key management.
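To illustrate the flexible-format point, a token can be made to keep the original card number's length and last four digits, so receipts and lookups keep working while the remaining digits are pure randomness with no mathematical link to the original. The function below is a hypothetical sketch, not a real library API:

```python
import secrets
import string


def format_preserving_token(pan: str) -> str:
    """Generate a token with the same length and last four digits as the PAN.
    The leading digits are random, so the token cannot be 'decrypted' back:
    recovery is only possible via a vault lookup."""
    random_part = "".join(secrets.choice(string.digits) for _ in range(len(pan) - 4))
    return random_part + pan[-4:]


token = format_preserving_token("4111111111111111")
assert len(token) == 16          # same length as the original PAN
assert token.endswith("1111")    # last four digits preserved for display
```

Encryption cannot offer this: ciphertext length and format are dictated by the algorithm, and anyone holding the key can always recover the plaintext.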

Business impact

By transferring cardholder data off-premise, organizations can reduce security costs, lower operational complexity, and minimize PCI DSS audit scope. The less sensitive data is stored locally, the easier and cheaper it is to secure systems and pass audits.