April 2010

Reducing PCI DSS Audit Scope

IT auditors can help merchants decrease the complexity and cost of PCI DSS audits while increasing audit success by introducing them to tokenization.

Gary Palgon
Vice President of Product Management
nuBridges Inc. 

The prevalence of digitally stored credit card information, combined with the increasing inventiveness and boldness of hackers and the inevitability of employee mistakes, makes the perfect storm for theft. Incidents such as the 2008 Heartland Payment Systems breach, which involved computers storing more than 100 million payment card accounts, have highlighted the risk to consumers’ credit card data. An earlier incident in 2001 that compromised 98,000 customer credit card numbers at Bibliofind, a subsidiary of online retailer Amazon.com, helped prompt the major credit card companies — American Express, Discover Financial Services, JCB, MasterCard Worldwide, and Visa Inc. — to align their security programs. In 2004, they jointly issued the Payment Card Industry Data Security Standard (PCI DSS) to help organizations that process card payments prevent credit card fraud through increased controls around data and its exposure to compromise; in 2006, they formed the PCI Security Standards Council to manage the standard.

Today, IT auditors play a critical role in helping merchants comply with PCI DSS. What’s more, the required annual audits provide a recurring opportunity for auditors to advise retailers on how to optimize their strategy for protecting payment card numbers. In a PCI DSS audit, all systems, applications, and processes that have access to credit card information, whether encrypted or unencrypted, are considered in scope. Simplifying and reducing the scope of compliance is an important part of any PCI DSS compliance strategy, and it can be achieved by shrinking the footprint where cardholder data resides throughout the organization. Reducing the scope makes it easier for auditors to complete their jobs and significantly increases the chance of passing the audit the first time, while lowering the cost and anxiety of compliance for merchants.

At the 2008 PCI Security Standards Council annual meeting, network segmentation was discussed as a method to isolate cardholder data in a secure segment. As companies began implementing network segmentation, they needed to audit only the portion of the network holding the cardholder data, accelerating compliance and reducing the cost and complexity of annual PCI DSS audits. A new data security model — tokenization — complements network segmentation to reduce the scope of PCI DSS audits while adding another layer of data security (see “Scope Reduction Considerations” below).

Scope Reduction Considerations

Best information security practices suggest that organizations should consider several aspects of their credit card process and IT infrastructure to implement a tokenization strategy that takes systems, applications, databases, and processes out of scope for PCI DSS audits.

  1. Review all systems to determine which can take advantage of format-preserving tokens to reduce the risk of unauthorized access to credit card numbers.
  2. Determine which systems are recipients of tokens vs. which have the ability to convert tokens into credit card numbers. Systems that only use tokens are considered out of scope for the PCI DSS audit.
  3. Determine whether systems that only use tokens are on a segmented network separate from those systems that have the ability to convert tokens to credit card numbers.
  4. Review the requirements of each employee’s job description to determine who should have access to unencrypted credit card numbers; access should be restricted accordingly.
  5. Verify that only those employees who need access to encryption key functions to perform their jobs actually have access to them.

HOW TOKENIZATION WORKS

With traditional encryption, when a database or application needs to store sensitive data, those values are encrypted and the resulting cipher text is returned to the original location. With tokenization, a token, or surrogate value, is returned and stored in place of the original data. The token is a reference to the actual cipher text, which is stored in a central data vault. Tokens can be safely used by any file, application, database, or backup medium throughout the organization, minimizing the risk of exposing the actual sensitive data, and allowing business and analytical applications to work without modification.
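
As a rough illustration of this flow, consider the Python sketch below of a simplified tokenization service. The class and method names are hypothetical, the "encryption" is a stand-in, and a production vault would use vetted cryptography, hardened storage, and collision checks rather than in-memory structures.

    import os
    import secrets

    class TokenVault:
        """A minimal sketch of a tokenization service (illustrative only)."""

        def __init__(self):
            self._vault = {}            # token -> encrypted card number (the central data vault)
            self._key = os.urandom(32)  # placeholder for a centrally managed encryption key

        def _encrypt(self, pan: str) -> bytes:
            # Stand-in for real encryption (e.g., AES-GCM) performed centrally in the vault.
            return bytes(a ^ b for a, b in zip(pan.encode(), self._key))

        def _decrypt(self, blob: bytes) -> str:
            return bytes(a ^ b for a, b in zip(blob, self._key)).decode()

        def tokenize(self, pan: str) -> str:
            # Return a random surrogate value; only the vault can map it back.
            # (A real implementation would also guard against token collisions.)
            token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
            self._vault[token] = self._encrypt(pan)
            return token

        def detokenize(self, token: str) -> str:
            # In practice, restricted to authorized, authenticated callers.
            return self._decrypt(self._vault[token])

    vault = TokenVault()
    token = vault.tokenize("4111111111111111")
    # Downstream files, applications, and databases store only the token.
    assert vault.detokenize(token) == "4111111111111111"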

Any organization that must meet the requirements of PCI DSS can benefit from tokenization. Consider requirement 3.1, which mandates that businesses keep payment data in a minimum number of locations. That is precisely what tokenization accomplishes — businesses reduce the number of locations where they retain cardholder information. Requirements 3.5.1 and 3.5.2 mandate that access to encryption keys be restricted to the fewest number of custodians and that keys be stored securely in the fewest possible locations. With tokenization, encryption is performed centrally when credit card values are tokenized, and keys are centralized on a secure server.

SIX ELEMENTS

Six elements of tokenization can contribute to reducing the scope of PCI DSS audits: a central data vault, tokens that act as surrogates for data, tokens that stand in for masked data, a one-to-one token/data relationship that maintains referential integrity, the absence of any mathematical relationship between tokens and the data values they represent, and centralized encryption key management.

1. Central Data Vault
Tokenization reduces risk because encrypted payment card data is stored only in a central data vault and is available outside the vault only when it is originally captured at the beginning of a transaction or accessed later by authorized and authenticated applications or users. This approach could have averted the security breach suffered last year by the Westin Bonaventure Hotel & Suites in Los Angeles. In March 2010, the hotel disclosed that its four restaurants and valet parking operation may have been hacked between April and December 2009, compromising customer names, credit card numbers, and expiration dates. Had these items been tokenized at the point-of-sale terminals, the stolen information would have been useless to the hackers, while the encrypted data remained safely stored in a central data vault.

2. Tokens Act as Data Surrogates
Encrypted data usually takes up more space than the original values, which can force changes to applications and databases. Tokens, however, can be engineered to preserve the length and format of the original data, making them nearly noninvasive to databases and applications: they require no changes to database schemas, application screens, or business processes. This significantly reduces the IT and business impact associated with PCI DSS compliance.

3. Tokens Are Surrogates for Masked Data
Tokens can be generated to preserve parts of the original data values. A typical pattern in PCI DSS scenarios is to generate tokens that maintain the original first two and last four digits of the credit card number. A token strategy provides the flexibility to define the format of the tokens, including what part, if any, of the original value to preserve.

Frequently there are applications within the enterprise that need only the last four digits of a credit card number to validate credentials. In this scenario, a format-preserving token — a newer variation of tokenization in which tokens maintain the length and format of the original data — allows users to perform their jobs. Because the application no longer contains credit card information — not even in encrypted form — the entire application is removed from PCI DSS scope. A word of caution, though: appropriate network segmentation is still required.
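
To make the pattern concrete, here is one hypothetical way a format-preserving, masked token could be generated in Python. The first-two/last-four pattern follows the example above, and the function name is illustrative rather than any product’s API.

    import secrets

    def masked_token(pan: str, keep_first: int = 2, keep_last: int = 4) -> str:
        # Preserve the original first two and last four digits and replace
        # the middle with random digits, so the token keeps the length and
        # all-numeric format of a real card number.
        middle = "".join(secrets.choice("0123456789")
                         for _ in range(len(pan) - keep_first - keep_last))
        return pan[:keep_first] + middle + pan[-keep_last:]

    # A 16-digit card number yields a 16-digit token with the same last four
    # digits, so a "last four" validation screen works unchanged.
    print(masked_token("4111111111111111"))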

Tokens and System Development

Using tokens as data surrogates, as stand-ins for masked data, and in a one-to-one token/data relationship solves an important security problem for today’s IT-dependent enterprises — the use of production data in development and test environments. Often, when application changes are required, testing must be performed across the entire system.

Today, most enterprises either allow developers to use production data, such as real credit card numbers, or go to great lengths to obfuscate the data and make it unrecognizable. The same applies to companies that rely on offshore development labs and don’t want to provide production data to them. With tokens, most application testing can take place without access to real credit card numbers. This decreases the operational effort needed to test applications and can reduce the PCI DSS compliance scope for applications that use only format-preserving tokens.
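
Continuing the hypothetical vault sketch from earlier, sanitizing a production extract for a test or offshore environment might look like this; the data and field names are invented for illustration.

    # Hypothetical sanitization step: replace the card-number column with
    # tokens before handing data to developers or an offshore test lab.
    production_orders = [
        {"order_id": 1001, "pan": "4111111111111111", "amount": 24.99},
        {"order_id": 1002, "pan": "5500005555555559", "amount": 9.50},
    ]

    test_orders = [
        {**row, "pan": vault.tokenize(row["pan"])}   # vault from the earlier sketch
        for row in production_orders
    ]
    # Testers see realistic-looking 16-digit values, never real card numbers.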

4. One-to-one Token/Data Relationship
Certain types of tokenization can ensure that there is always a one-to-one relationship between a credit card number and the token generated for it, so that referential integrity is maintained across multiple systems. For example, when retail marketers want to understand the buying patterns of consumers, they sometimes push transaction data into a data warehouse for analysis. This analysis might reveal that one day a consumer bought a stapler, staples, and a notebook using a credit card, and that a week later the same consumer, using the same credit card, bought staples, paper, and notebook tabs. The fact that the consumer bought a stapler followed by staples, and a notebook followed by paper, is what matters, and the only way to link these purchases is through the same credit card number.
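
One simple way to guarantee that property, again extending the hypothetical vault sketch above, is to look up an existing token before generating a new one. (A real implementation would index by a keyed hash of the card number rather than keeping plaintext values in the lookup table.)

    class ConsistentTokenVault(TokenVault):
        def __init__(self):
            super().__init__()
            self._by_pan = {}  # card number -> token, enforcing a one-to-one mapping

        def tokenize(self, pan: str) -> str:
            # The same card number always maps to the same token, so tokens
            # can be joined across systems just like the original values.
            if pan not in self._by_pan:
                self._by_pan[pan] = super().tokenize(pan)
            return self._by_pan[pan]

    vault = ConsistentTokenVault()
    t1 = vault.tokenize("4111111111111111")  # stapler purchase
    t2 = vault.tokenize("4111111111111111")  # staples purchase, a week later
    assert t1 == t2  # the data warehouse can link both transactions by token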

Maintaining referential integrity allows the data warehouse to perform transaction analysis with tokens rather than credit card numbers, removing it from the scope of the PCI DSS audit. The consequences of a data warehouse breach are also minimized, because unauthorized access yields only tokens rather than actual credit card numbers.

5. No Relationship Between Tokens and Data Values
With tokenization there is no mathematical relationship between a token and the data value it represents — the only relationship is referential. This is not the case with encryption or hashing, where an algorithm mathematically derives the output cipher text or hash from the input data. Tokens can be passed safely around the network between applications, databases, and business processes, leaving the encrypted data they represent stored securely in the central data vault. Authorized applications that need access to encrypted data can retrieve it only from the tokenization engine, providing an extra layer of protection for credit card data and shrinking the risk footprint that the retailer’s security team must manage and monitor.
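
The difference is visible in a few lines of Python: a hash is reproducibly derived from the card number, so an attacker who steals hashed values can recompute them across the relatively small space of valid card numbers, whereas a random token reveals nothing about the value it stands for.

    import hashlib
    import secrets

    pan = "4111111111111111"

    # Hashing is deterministic: anyone can recompute it from a guessed card
    # number, so hashed values are exposed to offline brute-force attacks.
    print(hashlib.sha256(pan.encode()).hexdigest())

    # A token is a pure surrogate: nothing about the card number can be
    # derived from it; the only link back is the lookup table in the vault.
    print(secrets.token_hex(8))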

6. Centralized Key Management
PCI DSS requires keys to be stored “securely in the fewest possible locations and forms.” Tokenization limits keys to use by the central token manager, minimizing the distribution of keys, as well as the scope of PCI DSS compliance, and reducing the risk of a key compromise.

AN EXTRA LAYER OF SECURITY

For IT auditors who want to help merchants reduce their PCI DSS audit scope, tokenization is a highly effective method that can be used alone or to augment strong localized encryption. The higher the volume of data and the more types of sensitive data a retailer collects and protects, the more valuable tokenization becomes. In addition to the time and cost savings that come from reducing the scope of PCI DSS audits, tokenization reduces storage requirements and can be used to protect any type of personally identifiable information — not just credit card numbers — in both production and system development environments (see “Tokens and System Development” above). Moreover, it can provide an extra layer of security throughout the extended enterprise.

Gary Palgon, CISSP, is vice president of product management for Atlanta-based data protection software company nuBridges Inc. He is a frequent contributor to industry publications and a speaker at conferences on e-business security issues and solutions.

To comment on this article, e-mail the author at gary.palgon@theiia.org.

 

