Keep it real: what is real-time auditing?

“Data is the new oil.” When mathematician and Tesco Clubcard architect Clive Humby made this proclamation in 2006, Facebook was just two years old and Apple’s iPhone was little more than a twinkle in Steve Jobs’ eye. Today that quote feels like an understatement. Microsoft, Amazon, Apple, Alphabet (Google’s parent company) and Facebook are the five most valuable businesses in the world; there is no oil company in the top ten at the time of writing.

The internet and smartphones have made data abundant – and it is becoming ever more abundant as more devices connect to the web, from wearables to cars and even homes. Businesses are now using algorithms to harness this big data, predicting everything from when customers are likely to transact, to when core assets will require maintenance work.

As data increasingly underpins businesses’ operations, processes and decision-making, it is incumbent upon internal audit to use the tools at its disposal to meet companies’ assurance needs within this digital environment. Periodically sifting through samples of payslips for evidence of payroll control weaknesses will no longer cut it.

Internal audit must learn to be more agile and forward-looking. The goal is to provide real-time (or close to real-time) assurance through data analytics, allowing for quick and responsive changes to audit plans and engagement scoping, and enhancing the depth and breadth of fieldwork without expending extra time and effort.

This is critically important as expectations of the third line of defence continue to rise. According to a recent survey by Gartner, the number of key enterprise risks that internal audit is expected to cover, without additional resources, doubled from six to 12 between 2015 and 2018. This is absorbing the buffer that audit functions once had to adapt their plans and assurance provision.

 

Analytics across the audit lifecycle

There is no function or domain that cannot be audited using data analytics in some way, provided relevant and high-quality data is available for analysis. This is especially true as organisations have come to adopt company-wide enterprise resource planning (ERP) systems such as NetSuite, Oracle and SAP. Analytics can also be applied across the entire audit lifecycle, from risk assessment and audit planning to engagement scoping, and then all the way through to the nuts and bolts of audit testing.

From a planning perspective, data-driven dashboards help to highlight areas of the business that may require the attention of internal audit, as well as providing insight on business processes and activities. “In a procure-to-pay process, if a spend trend matches the size of the market, and retrospective spend is only for justified exceptions, then that is an area we may not want to look at in much detail,” says Amit Desai, a senior audit manager who leads data analytics within Vodafone’s internal audit team. “At the same time, if invoices are not being paid on time or there is large variance in terms of goods receipts and invoices, that may be an area that requires more attention. That is how analytics helps our planning, by determining those higher risk areas.”
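The goods-receipt/invoice variance check Desai mentions can be sketched as a simple rule over purchase-order lines. The field names, figures and 5 per cent threshold below are illustrative assumptions, not Vodafone's actual logic:

```python
def gr_invoice_variance(lines, threshold=0.05):
    """Flag PO lines where the invoiced value diverges from the goods-receipted value
    by more than the given proportional threshold (assumed at 5%)."""
    flagged = []
    for line in lines:
        gr, inv = line["goods_receipt_value"], line["invoice_value"]
        if gr and abs(inv - gr) / gr > threshold:
            flagged.append(line["po"])
    return flagged

# Invented sample data for illustration
lines = [
    {"po": "PO-100", "goods_receipt_value": 1000.0, "invoice_value": 1010.0},  # 1% gap: ok
    {"po": "PO-101", "goods_receipt_value": 500.0, "invoice_value": 620.0},    # 24% gap: flag
]

print(gr_invoice_variance(lines))  # → ['PO-101']
```

In practice the same rule would run over an ERP extract rather than hand-typed records, but the logic of "flag the variance, then investigate" is the same.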

Analytics not only informs the annual audit plan, but allows the third line to reassess, on say a quarterly basis, how relevant the plan remains in light of incoming data. Vodafone completes around 300 audits a year across its global operations. If anomalous key performance indicators (KPIs) and key risk indicators (KRIs) begin to emerge from this continuous risk assessment activity, there may be a case for adapting the existing plan, bumping less urgent audits down the list in favour of higher priority domains, bringing assurance closer to real time.
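Continuous risk assessment of this kind can be as simple as flagging KPIs whose latest reading departs sharply from their recent history. A minimal sketch using a z-score test; the KPI names, figures and threshold are invented for illustration:

```python
import statistics

def anomalous_kpis(history, latest, z_threshold=3.0):
    """Flag KPIs whose latest reading sits more than z_threshold standard
    deviations away from the mean of their recent history."""
    flagged = []
    for kpi, series in history.items():
        mean = statistics.mean(series)
        sd = statistics.pstdev(series)
        if sd and abs(latest[kpi] - mean) / sd > z_threshold:
            flagged.append(kpi)
    return flagged

# Invented quarterly history and latest readings
history = {
    "late_payments_pct": [2.1, 2.3, 1.9, 2.0, 2.2],
    "credit_notes_pct": [0.5, 0.6, 0.4, 0.5, 0.5],
}
latest = {"late_payments_pct": 6.8, "credit_notes_pct": 0.5}

print(anomalous_kpis(history, latest))  # → ['late_payments_pct']
```

A real function would draw on many more indicators and a richer model, but even this crude test illustrates how a spike can prompt a re-prioritised plan.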

 

In the field

Financial audit engagements are some of the commonest examples of analytics applications at the testing stage. Sarah Mason, director of integrated assurance at Centrica, says one of the primary applications of the energy company’s internal audit data analytics activities has been the continuous monitoring of its billing engine in North America.

“We came up with a set of 12 KRIs, such as the unit price on the invoice not matching the unit price in the database. There might be valid reasons for it, but we ran that every week for a month and each time it threw out exceptions by region, product and so on,” she says. “We’ve now handed that tool over to the business to understand some of the drivers behind the exceptions and to design controls around it. We’re going to revisit the process next year and see what’s been done.” In this way, data anomalies are less proof of malfunctioning controls than evidence that processes may not be working as expected and therefore warrant further inspection by internal audit or by management.
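A KRI such as the unit-price mismatch Mason describes amounts to a reconciliation between invoice lines and a price master. A minimal sketch, with invented product codes, prices and field names:

```python
# Assumed master prices per product (illustrative only)
PRICE_MASTER = {"PROD-A": 0.145, "PROD-B": 0.210}

def price_mismatch_exceptions(invoices, tolerance=0.0):
    """Flag invoice lines whose unit price differs from the master price,
    or whose product is missing from the master altogether."""
    exceptions = []
    for inv in invoices:
        master = PRICE_MASTER.get(inv["product"])
        if master is None or abs(inv["unit_price"] - master) > tolerance:
            exceptions.append(inv)
    return exceptions

# Invented sample invoice lines
invoices = [
    {"invoice": "INV-001", "region": "TX", "product": "PROD-A", "unit_price": 0.145},
    {"invoice": "INV-002", "region": "NY", "product": "PROD-A", "unit_price": 0.152},
    {"invoice": "INV-003", "region": "TX", "product": "PROD-B", "unit_price": 0.210},
]

flagged = price_mismatch_exceptions(invoices)
print([e["invoice"] for e in flagged])  # → ['INV-002']
```

Run weekly against the billing engine's output, a script like this throws out exceptions by region and product, as described above, which the business can then investigate.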

One of the biggest challenges with continuous monitoring is that internal audit can quickly find itself drawn into second-line activity. “We’ve been having a debate within internal audit about whether this is really our job or if it is the second line’s job,” Mason says. “Our general premise at the moment is that we do it to support our ability to understand the effectiveness of processes and give an opinion on the effectiveness of the control environment, but going forward this monitoring should be passed to the second or even first line.”

The beauty of analytics scripts is that, assuming the systems and domains being analysed do not materially change, they can be left to run in the background once they have been set up. Internal audit can then build up a library of these tools over the long term and interrogate them as and when it chooses.

“There’s a maturity lens you can look at all of this through, which we call industrialisation,” says Paul Holland, head of internal audit technology at Vodafone. “If there’s something you’re going to look at frequently and regularly – and opex and procurement is a good example because it’s a fundamental process that is examined somewhere in the world every year – there’s a real justification for industrialising the data analytics so it becomes repeatable, standardised and well controlled. With sporadic audits the process or system might have changed the next time it is audited, so there is less justification for fully industrialising it.”

 

Slice and dice

Internal audit is uniquely positioned because it can access every corner of the organisation and take a view that cuts across every function. This is where the third line can get creative with analytics, by taking separate data points and contrasting them in new ways.

“When we overlapped customer debt with rebates and refunds, there were some unusual numbers that came up around: ‘If this customer is behaving like X then why are we treating them like Y?’” Mason says. “The business had never previously put those two datasets together using the unique customer reference number and, for the first time, it showed activity in the business that didn’t look right. We’ve since handed that back to the business because we couldn’t immediately determine whether there was a financial irregularity, but it just didn’t look right. So the business is investigating whether there is anything that needs to be done with that finding.”
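The debt-versus-rebates overlap is essentially a join on the unique customer reference number. A minimal sketch, with invented customer references and values:

```python
# Invented datasets keyed on customer reference number
debt = {"C001": 1200.0, "C002": 0.0, "C003": 850.0}   # outstanding debt
rebates = {"C001": 300.0, "C004": 75.0}                # rebates/refunds paid

def debt_with_rebates(debt, rebates, debt_floor=0.0):
    """Join the two datasets on customer reference and return customers
    who hold outstanding debt yet have also received rebates or refunds."""
    return sorted(ref for ref, owed in debt.items()
                  if owed > debt_floor and rebates.get(ref, 0.0) > 0.0)

print(debt_with_rebates(debt, rebates))  # → ['C001']
```

The anomaly is not proof of irregularity, as Mason notes; the join simply surfaces a combination (behaving like X, treated like Y) that neither dataset reveals on its own.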

More unusually, Vodafone’s internal auditors also apply analytics to customer retention. “We try to understand the trends of churned customers, assessing the key reasons for the churn and looking at whether offers drive retention. By looking at anonymised data we also validated the existing churn propensity methods,” Desai says.

There are two angles to this type of fieldwork. One is leveraging the second line’s dashboard to gain insights and knowledge, as well as providing assurance on how the analytics are generated and how data outputs from that analysis inform first-line decision-making. Another angle is running independent third-line analytics in search of discrepancies that have not yet been identified by another line of defence, which “fuels the company’s process of continuous improvement”, Holland explains.

Applying analytics in innovative ways does not always deliver results, but this should not deter those tempted to experiment. Indeed, Mason warns that running data analytics in the third line often raises more questions than it answers. “You have to have a relatively high risk threshold to try data analytics because sometimes you won’t get the answers you want or expect, or even get anything out of the data at all. There may be too many false positives because you haven’t specified the hypothesis correctly or the data quality isn’t high enough to allow the kind of analysis you want to do,” she says.

“That might not be what you set out to prove, but it’s by no means useless. It can tell you something about the data quality or what the process was trying to achieve and why that’s not possible. It’s all a step along the path to maturity. You have to try it, but be prepared to fail sometimes.”

This is a key lesson for internal auditors, who are usually less accustomed to failure than innovative IT functions are. Now that such technology is a critical part of internal audit’s toolkit, auditors need to accept that failure is part of experimenting with data analytics. The third line must be more daring in its ambitions and scope, but must also expect to fail, and fail fast, when harnessing this era’s most precious commodity.

For a selection of resources on IT auditing and analytics, visit iia.org.uk/resources/it-auditing-and-cyber-security

 

This article was first published in September 2019.