The Data Behind the Curtain
Beyond the big data hype are practical ways to probe information for insights on organizational risks.
Russell A. Jackson
Big data is a lot like the Wizard of Oz in the 1939 film classic. Like Dorothy and the citizens of the Emerald City, businesspeople don’t know exactly what it is, or what, precisely, it can do, but they know if you go about it correctly and do everything it tells you to do, it can make wonderful things happen. Pull back the curtain, though, and it’s just a more expansive way to analyze and leverage information to do your job better. “Big Data is just a lot of data,” says Norman Marks, a former chief audit executive (CAE) and current IaOnline blogger. “There is no difference except the volume.”
That understanding is crucial to maximizing data analytics for internal audit. “Too many shops make the mistake of starting with data and how they can run analytics against them,” Marks explains, “when they should be thinking about providing assurance and consulting services related to the risks that matter to the organization.” Internal audit departments, he says, tend to run the routines that come with the data-mining software they purchase — or the ones they’re used to writing — and come up with interesting information that has a fatal flaw: It relates to risks that aren’t important in the big picture. “The better approach to data analytics is to know what risks you need to address — the risks that matter to the organization as a whole,” he comments. “Only then consider how analytics can help obtain insights into those risks.”
Data, in other words, is data. And data analytics is simply inspecting, cleaning, transforming, and modeling that data to highlight the useful information it contains; to suggest conclusions that can be drawn from the data; and to support decision-making based on those conclusions. The insights gleaned through analytics can be historical, real-time, or predictive. They can be risk-focused, such as on control effectiveness, fraud and abuse, and regulatory noncompliance, or performance-focused, such as on increased sales or decreased costs.
PUTTING ANALYTICS TO USE
Key sources of data for analysis include the organization’s enterprise applications and databases — such as financial, human resources, customer relationship management (CRM), and purchasing systems — and external data warehouses. “Our audit team has a quick view into outlier transactions,” says Aneta Youngblood, IT audit director at Caterpillar Inc. in Peoria, Ill., “and we can focus on understanding them as part of the audit.” But many internal audit leaders have difficulty securing that data or running it through their internal systems, such as when different business units run on disparate IT solutions instead of a common enterprise resource planning system, she adds. “We found that partnering with business units we audit helps with obtaining the data as well as building out their existing business analytics capabilities.”
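The kind of outlier screen Youngblood describes can start as simply as flagging transactions that fall far from the norm for the population being audited. The sketch below is a minimal illustration of that idea; the field names and the three-standard-deviation threshold are assumptions for this example, not Caterpillar’s actual test:

```python
from statistics import mean, stdev

def flag_outliers(transactions, threshold=3.0):
    """Flag transactions whose amount lies more than `threshold`
    standard deviations from the mean of the data set."""
    amounts = [t["amount"] for t in transactions]
    mu, sigma = mean(amounts), stdev(amounts)
    if sigma == 0:
        return []  # all amounts identical; nothing stands out
    return [t for t in transactions
            if abs(t["amount"] - mu) / sigma > threshold]

# Thirty routine transactions plus one that an auditor would want to see.
txns = [{"id": i, "amount": 100 + i} for i in range(30)]
txns.append({"id": 99, "amount": 50_000})
print([t["id"] for t in flag_outliers(txns)])  # only the 50,000 item is flagged
```

In practice the population would be sliced by account, business unit, or period before screening, so that an amount unremarkable in one context still surfaces in another.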
Another challenge is getting data for ad hoc tests, considering the time it may take to find someone able to download the information, and considering that different systems involved may not communicate with each other, says Patrick Gras, audit director–global functions at F. Hoffmann-La Roche Ltd. in Basel, Switzerland. In some cases, securing data is not possible; one example is when detailed information, such as expenses, is kept on paper. “We define our data analytics tests based on data being commonly available,” he explains, “so for standard tests, we are not having trouble getting data.”
Beyond that, the definition of data analytics is unique to each internal audit department. The IIA’s Global Technology Audit Guide 10: Data Analysis Techniques provides guidance on using analytics technologies. But individual departments need to decide which analytics technologies to use, which data streams to target, and what internal audit hopes to accomplish through analytics.
“The data do not answer questions,” emphasizes Kirk Tryon, director of process and continuous audit at Texas Instruments in Dallas. “They do help us form a hypothesis about what might be going on. Then, with that hypothesis, we can target our questions and the areas of the company where we need to focus.” Sometimes internal auditors need additional data, meetings with business-unit leaders, or other sources before reaching their final conclusions, he explains.
In fact, data analytics often is a fairly blunt instrument, at least at first — and one that asks far too many questions. “The follow-up, the figuring out what questions need to be asked — and then getting those questions answered — is a very time-consuming process, especially with a new analytic,” says Todd Freeman, vice president, internal audit, at Chicago Bridge & Iron in The Woodlands, Texas. “Then you start to notice patterns that you can eliminate immediately from your results.”
First analytical reports too often provide information that requires additional, more focused analysis, Marks concurs. His departments “tried not to use analytics for general fishing,” he says. “We usually had identified the risk we were auditing, rather than trawling for odd-looking patterns.”
THE ANALYTICS PROGRAM
Auditors should expect that implementing a data analytics function in their department will require considerable time and resources. “We started data analytics and failed, then started again and failed again,” Freeman recalls. “We finally hired a dedicated resource for data analytics, and then things hit warp speed. It takes a while to build a house.” Once constructed, that house can serve myriad purposes, including making a basic internal audit function faster or more thorough, because so many more bits of information are being examined. What every application of data analytics has in common is this: It solves a familiar problem more effectively because more information is available, more quickly and more readily, to support an important decision.
“We are an assurance function,” Tryon explains. “We’re looking for instances where internal controls fail, or where we are at risk of policy violation, and through our data analysis, we can identify high-risk transactions and test at a very detailed level.” That approach helps his department — and Texas Instruments’ management — understand and assess the health of the control environment in a particular country or region, or companywide. In one engagement, data analytics helped identify employee fraud and policy violations. “In the case of fraud,” Tryon reports, “if we see a pattern, we can implement better controls to prevent it from occurring, or give management insight regarding the patterns so it can execute appropriate oversight.” In the case of policy violations, the internal audit department can use the test results to evaluate the policies and recommend changes or reinforcements to the policy where needed.
Freeman’s department at Chicago Bridge & Iron implemented a data analytics program to detect duplicate payments — what he calls “a direct bottom-line, money-saving activity” — and the program has since migrated to accounts payable (AP), where it detects duplicate payments before they’re made. “Now when we audit, we look at whether AP is running the program,” he says. “They run the audit for duplicate payments right before they print checks, so they can capture a duplicate in advance and kill it.” The company’s AP functions in London, Dubai, and Perth, Australia, all use the analytics program now, too.
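A duplicate-payment test of the kind Freeman describes typically keys each pending invoice on vendor, amount, and a normalized invoice number, then flags any key that appears more than once before the payment run. The sketch below illustrates that pattern under those assumptions; the field names and normalization rule are hypothetical, not CB&I’s actual program:

```python
import re
from collections import defaultdict

def find_duplicate_payments(invoices):
    """Group pending invoices by (vendor, amount, normalized invoice number)
    and return every group with more than one entry -- candidates to kill
    before checks are printed."""
    def normalize(inv_no):
        # Strip punctuation and whitespace so "INV-0042" matches "INV 0042".
        return re.sub(r"[^A-Z0-9]", "", inv_no.upper())

    groups = defaultdict(list)
    for inv in invoices:
        key = (inv["vendor"], round(inv["amount"], 2), normalize(inv["invoice_no"]))
        groups[key].append(inv)
    return [group for group in groups.values() if len(group) > 1]

pending = [
    {"vendor": "Acme", "amount": 1200.00, "invoice_no": "INV-0042"},
    {"vendor": "Acme", "amount": 1200.00, "invoice_no": "INV 0042"},  # re-keyed duplicate
    {"vendor": "Acme", "amount": 980.50,  "invoice_no": "INV-0043"},
]
print(find_duplicate_payments(pending))  # one group of two matching invoices
```

Running a check like this immediately before the payment run, as Freeman’s AP colleagues do, turns a detective audit test into a preventive control.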
INSIGHTS AND INVESTIGATIONS
The information gathered through data analytics shows up in the reports internal auditors make to clients, senior management, and their boards. Often, the analyzed data, as well as the summarized results, are presented to other parts of the organization. “We share the data with the business leaders of the organization we audit,” Youngblood says. “They often are impressed, as they don’t have the same analytics capability within the IT solutions supporting their business operations.” Additionally, she involves the business side of the enterprise in data validation.
The data is shared with audit clients at some organizations. “Data analytics results are quite often used to generate our samples and consequently are partly reviewed with auditees,” Gras says. “Full results can be shared with them, as well, if it’s deemed appropriate.”
When he was a CAE, Marks often applied data analytics for insights that contributed to his departments’ ability to provide assurance. In one such engagement, an audit of legal reviews of sales contracts, his analysis “demonstrated that the amount of time and effort given to small deals was very similar to the amount given to large deals, enabling us to demonstrate the need to cut back on small deals and spend more time on large deals, which improved the efficiency of the process,” he reports. That analysis helped free the legal team to focus on other important activities.
In another data analytics-based engagement, his department identified a high risk that an impending acquisition would cause the enterprise to lose key employees — owners of key controls for compliance with the U.S. Sarbanes-Oxley Act of 2002. “We used analytics to pinpoint where we were losing, or about to lose, such employees, so we could work with management to ensure those key controls continued to be performed reliably,” Marks says.
Another of his engagements targeted one of the company’s biggest risks: revenue fraud, in which sales executives would collude with customers to inflate orders and then issue a credit note early in the next quarter. “We used analytics to monitor the level of credit notes in the first month of each quarter,” he says.
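The monitor Marks describes boils down to totaling credit notes issued in each quarter’s first month, per sales executive, and comparing the total against a baseline such as that executive’s prior-quarter sales. A minimal sketch of that comparison; the field names and the ratio used are illustrative assumptions, not the actual test:

```python
from collections import defaultdict
from datetime import date

def first_month_credit_ratio(credit_notes, prior_quarter_sales, q_year, q_month):
    """For each sales rep, total the credit notes issued in the quarter's
    first month and express them as a fraction of prior-quarter sales.
    A spike suggests inflated orders quietly reversed after quarter close."""
    credits = defaultdict(float)
    for note in credit_notes:
        if note["date"].year == q_year and note["date"].month == q_month:
            credits[note["rep"]] += note["amount"]
    return {rep: round(amount / prior_quarter_sales[rep], 2)
            for rep, amount in credits.items()}

notes = [
    {"rep": "A", "date": date(2024, 4, 3),  "amount": 40_000},
    {"rep": "B", "date": date(2024, 4, 10), "amount": 2_000},
]
sales = {"A": 100_000, "B": 100_000}
print(first_month_credit_ratio(notes, sales, 2024, 4))  # rep A's 40% ratio merits review
```

What counts as a suspicious ratio is a judgment call for the audit team; the value of the analytic is that it makes the pattern visible every quarter rather than only when someone happens to look.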
While analyzing CRM data, auditors at F. Hoffmann-La Roche found evidence of ineffective customer segmentation and related activities, which were not aligned with internal guidelines. “Because the business was not monitoring it, it would have been difficult to get these insights,” Gras reports. Roche’s auditors run targeted data analytics. “We know why we run a test, how to interpret results, and what action has to be taken, depending on the test’s outcome,” he says.
Getting to the point where data analytics provides meaningful insights into specific risks takes time and effort. But making sure their analysis yields actionable information is a process familiar to all internal auditors: Test it, and ask more questions.
“We use an iterative process,” Tryon explains. “We create an analytic based on an assumption of a particular risk the company will face, then we run the analytic.” Typically the results will yield some false positives; Tryon’s team tests to separate those false positives from the legitimate issues and concerns. “Where we can eliminate the false positives without also eliminating the legitimate issues or concerns, we will modify the scripts and make our analytic more efficient,” he observes. “Also, we can combine data from different sources, so if some data do not have all the information we need to make the item actionable, we can add the information from another source to complete the data.”
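Tryon’s two refinements — suppressing patterns already confirmed benign, and joining in a second data source so each surviving hit is actionable — can be sketched as a single pass. The field names and the HR feed below are illustrative assumptions, not Texas Instruments’ actual scripts:

```python
def refine(hits, known_benign, hr_directory):
    """Drop hits matching (vendor, reason) patterns confirmed benign in
    earlier iterations, then join the survivors with a second source
    (here, an assumed HR feed) so each item carries the detail needed
    to act on it."""
    kept = [h for h in hits if (h["vendor"], h["reason"]) not in known_benign]
    return [{**h, **hr_directory.get(h["employee_id"], {})} for h in kept]

hits = [
    {"employee_id": "E7", "vendor": "V1", "reason": "split-po",      "amount": 5400},
    {"employee_id": "E2", "vendor": "V9", "reason": "weekend-entry", "amount": 120},
]
benign = {("V9", "weekend-entry")}  # pattern ruled out in an earlier iteration
hr = {"E7": {"name": "J. Doe", "dept": "Procurement"}}
print(refine(hits, benign, hr))  # one enriched hit, ready for follow-up
```

Each audit cycle grows the benign-pattern set, which is exactly the “modify the scripts and make our analytic more efficient” loop Tryon describes.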
Freeman agrees that a lot of work is involved. “When we’re digging through and eliminating false positives, we go down and talk to management and say, ‘Here’s what we found. We’d like to dig deeper,’” he says. “Sometimes they can give an explanation, or it can become an exercise that takes a few days to follow through to the end.”
THE ANALYTICS DIASPORA
Data analytics becomes even more time-consuming when programs migrate outside the internal audit department. In some organizations, the internal audit function has sole access to the program; in others, internal audit implements and customizes the program with the intention of pushing it out to business units to analyze their own information. Which approach an organization uses is a function of which department or function is footing the bill for the analytics technology, what management’s overarching goals are for the analytics function, and which specific types of analytics are being implemented.
There’s a role for internal audit in the data analytics diaspora. “Work with clients to understand what’s in the data,” Freeman advises. “They’re managing their own risk and monitoring their own data with it, so when we go back to do an audit, we look at what they’ve done with it. We over-test it to make sure they’ve done what they need to do to follow the process through to the end.”
Data analytics is difficult, and it takes a lot of time, but it also “has the potential to be one of the biggest game changers in the internal audit profession,” Youngblood asserts. That’s why audit executives such as Freeman so strongly recommend dedicating resources to its development and implementation. Freeman also stresses the importance of securing explicit support from management and the audit committee; without it, the effort will fail.
An essential element of marshalling support for data analytics is making sure stakeholders know exactly what internal audit is doing — and why. “Communicate, communicate, communicate,” Freeman says. “If your board, your senior management, and line management don’t know what you’re doing, it can trip you up.” If they see the value of data analytics, and internal audit does it right, it can make many aspects of auditors’ jobs much easier.
Russell A. Jackson is a freelance writer based in West Hollywood, Calif.