ACCOUNTING & AUDITING

Auditing

The Hidden Risk in Analytical Procedures: What WorldCom Revealed

By Neal B. Hitzig

Technological advances have brought analytical procedures (APs) to auditors’ personal computers. A recent article in The CPA Journal (November 2002) described some techniques that are readily available. Auditors generally agree that APs are valuable tools for any audit. APs are similar to the procedures of financial statement analysis, but with one critical difference: In financial statement analysis, all the data are assumed to be correct. This is not the case in an audit application, where the determination of data correctness is the objective of the procedure. Thus, an AP entails the use of data to test their own accuracy, which gives rise to serious limitations.

The authoritative literature recognizes two types of substantive procedures: tests of details, and analytical procedures. Unlike tests of details, for which the auditor performs vouching or tracing of individual accounts or transactions, APs do not involve direct verification of the data under examination. Instead, APs view information “from the top down.” That is, APs analyze aggregated data, taking those data as they are presented. Consequently, APs are generally less expensive to apply than tests of details, but they are less reliable.

With the exception of purely time-series procedures (which use only the passage of time to explain or predict the values of the accounting variables under audit), AP estimates are derived from other underlying data. To apply an AP, the auditor must specify a relationship between the underlying data and the data of audit interest. This aspect of analytical procedures stems from the basic premise underlying their application in an attest engagement: “[T]hat plausible relationships among data may reasonably be expected to exist and continue in the absence of known conditions to the contrary” (AU 329.02).

This fundamental assumption has serious implications for an AP’s use as a principal audit procedure, because it is too often false in those situations where its truth is essential.

When Analytical Procedures Fail: WorldCom

On July 8, 2002, Melvin Dick, formerly Arthur Andersen’s senior global managing partner for the technology, media, and communications practice, testified before the House Committee on Financial Services:

We performed numerous analytical procedures at various financial statement line items, including line costs, revenues, and plant in service, in order to determine if there were significant variations that required additional work. We also utilized sophisticated auditing software to study WorldCom’s financial statement line items, which did not trigger any indication that there was a need for additional work.

Dick’s statement is an acknowledgment that APs failed to detect the greatest management fraud in history. Why?

While the details of Andersen’s APs have not been disclosed, it would not be unreasonable to assume that Andersen used sophisticated procedures. The nature of the problem the firm faced may be illustrated by comparing key financial statement ratios for WorldCom with those of seven other publicly held communications companies: Sprint, AT&T, Nextel, Crown Castle, AmTelSat, U.S. Cellular, and Western Wireless. Five ratios, all related to revenues, expenses, and (gross) plant and equipment, are given in Exhibits 1 through 5 for the years 1997 to 2001. The information, taken from the companies’ SEC filings, is highly aggregated.

We know now that WorldCom’s revenues, expenses, and property and equipment were materially misstated in 2000 and 2001. The first two ratios—cost of revenues to revenues, and the change in cost of revenues to the change in revenues (Exhibits 1 and 2)—show declining trends for WorldCom, but nothing that would be characterized as unusual. By 2001, WorldCom is in the middle of the pack.

The ratios presented in Exhibits 3, 4, and 5, which are formulated with property, plant, and equipment in the denominators, reveal greater volatility in the WorldCom values, but normal values for the critical years 2000 and 2001. If anything, the graphs show unusual ratios in the years preceding the fraud (1996–1998).

Each of these ratios, or some variation, might have been considered by Andersen. Because the ratios are presented at high levels of aggregation, they may not be sufficiently sensitive to display unusual behavior. One can only assume that Andersen’s “sophisticated auditing software” disaggregated the data and analyzed them at a more refined level. Nevertheless, these ratios suggest why no unusual behavior was revealed to Andersen: Management had manipulated the data to conform to expectations. Writing in the Mississippi Business Journal (July 22, 2002), James R. Crockett, an accounting professor at the University of Southern Mississippi, noted that “WorldCom had previously invested heavily in capital equipment and had quit making as much investment. By shifting expenses to plant and equipment accounts, WorldCom was able to disguise the changing conditions by meeting expectations” (emphasis added). In other words, the historical trend no longer applied, but management manipulated the data to make it appear as though that trend continued to be valid.
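The mechanism can be sketched in miniature. The figures below are hypothetical (stated in millions of dollars) and are not WorldCom’s actual amounts; the point is simply that management that knows which ratio an auditor’s expectation rests on can reclassify just enough operating cost to property and equipment to hold the reported ratio at the expected level.

def cost_ratio(cost_of_revenues, revenues):
    # Cost of revenues as a fraction of revenues.
    return cost_of_revenues / revenues

revenues = 35_000.0          # hypothetical audit-period revenues
incurred_costs = 15_500.0    # hypothetical line costs actually incurred
expected_ratio = 0.42        # expectation formed from the prior-year trend

# Reported honestly, the ratio drifts above the expectation (about 0.443).
print(f"as incurred: {cost_ratio(incurred_costs, revenues):.3f}")

# Shift just enough cost into property and equipment to meet the trend.
capitalized = incurred_costs - expected_ratio * revenues
reported_costs = incurred_costs - capitalized

print(f"reclassified to PP&E: {capitalized:,.0f}")
print(f"as reported: {cost_ratio(reported_costs, revenues):.3f}  (matches the expectation)")

An AP that compares the reported ratio to its expectation sees nothing unusual, because the manipulation was calibrated to the very relationship the procedure tests.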

Meeting Expectations

What is surprising is not the failure of APs to detect the symptoms of the massive WorldCom fraud, but the persistent belief of so many auditors in the power of those procedures to do so. Auditors want APs to work. Their faith in the procedures has been buttressed by academic researchers, who devise “tests” of APs that are self-fulfilling and avoid exposing the procedures’ intrinsic weaknesses.

The values obtained from an AP are derived from, and therefore depend on, the underlying data used to form the auditor’s expectation and on the resulting expectation itself (the reasonable relationship). The emphasis in an AP is on identifying the unusual deviation. Whereas the observation of significant deviations may signal material misstatement, the absence of such deviations cannot be taken to indicate that there is no material misstatement.

Among the most important issues that hinder the development of effective APs are improper specification of the relationship between the explanatory variable and the dependent variable, misstatement in the explanatory or predictive variable, and misstatement in the dependent variable (that is, the variable describing the data under audit). Econometricians and statisticians are well aware of these and other issues, which auditors appear to have disregarded.

Improper specification of the relationship between the variables also undermines AP effectiveness. The AICPA’s Auditing Practice Release, Analytical Procedures, states that: “Forming an expectation is the most important phase of the analytical procedure process. The more precise the expectation (that is, the closer the auditor’s expectation is to the correct balance or relationship), the more effective the procedure will be at identifying potential misstatements.” Furthermore, the release states that: “The level of assurance provided by an analytical procedure is determined by the precision of the expectation. The higher the precision, the greater the level of assurance provided by the procedure.” Each of the foregoing quoted statements is incomplete and, insofar as its impact on the audit risk model is concerned, incorrect.

Misstatements in the data that are used to form the expectation are perhaps the most insidious source of bias, especially if they exist in data that the auditor assumes to be accurate. If only the data under audit are misstated, the AP’s effectiveness will depend on the auditor’s ability to specify the relationship, and on the volatility of the data themselves. Misstatement in historical data, however, becomes embedded in the estimated relationship, which results in audit-period estimates that appear to be normal and thus renders the AP ineffective. To the extent that the explanatory variable is misstated, any estimates derived from the AP will also be misstated. Although SAS 56 (AU 329) does not overlook this consideration, neither does it emphasize its importance. Instead, it offers only vague guidance, with no explicit statements as to how to apply that guidance.
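A minimal simulation, using invented figures and a deliberately simple linear regression (it is not a reconstruction of any firm’s actual procedure), illustrates how misstatement embedded in the baseline years contaminates the fitted relationship: the misstated audit-period balance falls comfortably within the expectation, while the honest balance would have been flagged.

import numpy as np

# Baseline years: cost of revenues genuinely runs at about 45% of revenues ...
revenues = np.array([18_000., 22_000., 27_000., 33_000., 39_000.])
true_costs = 0.45 * revenues

# ... but management began shifting costs to plant accounts in the last two
# baseline years, understating reported cost of revenues by 8% in those periods.
reported_costs = true_costs.copy()
reported_costs[-2:] *= 0.92

# The auditor fits the "plausible relationship" to the reported (contaminated) data.
slope, intercept = np.polyfit(revenues, reported_costs, deg=1)

# Audit period: the same understatement continues.
audit_revenue = 46_000.0
audit_true_cost = 0.45 * audit_revenue
audit_reported_cost = 0.92 * audit_true_cost

expectation = slope * audit_revenue + intercept
tolerance = 0.05 * expectation   # illustrative threshold for an "unusual" deviation

for label, amount in [("true", audit_true_cost), ("reported", audit_reported_cost)]:
    flagged = abs(amount - expectation) > tolerance
    print(f"{label:>8} cost {amount:>9,.0f}   expectation {expectation:>9,.0f}   flagged: {flagged}")

The contaminated baseline drags the expectation down toward the misstated figures, so the procedure “confirms” precisely the numbers it was supposed to test.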

What the Audit Risk Model Overlooks

By their failure to forthrightly address data and specification issues, both the authoritative literature and the academic literature fail to identify the real risks associated with the use of APs as substantive procedures. The authoritative literature accords to analytical procedures the ability to detect the presence of misstatement at some specified level of assurance: positive assurance. An auditor’s statement of assurance implies a risk of failure: the failure to detect the presence of material misstatement. The audit risk model expresses this risk as one of two components of detection risk (the other component is related to tests of details). Neither the authoritative literature nor any commentary or research on the audit risk model recognizes that AP risk also consists of two components.

The first component of AP risk is associated with the audit period data themselves. This is the component commonly thought of when considering detection risk; that is, that materially misstated data are not detected by the AP. The second component, which is unspecified and not discussed in the literature, is the risk that the prediction model (that is, AU 329’s “plausible relationship”) is incorrect. Quantitative APs such as regression analysis have always proceeded from the basic assumption that the chosen model is correct. To the extent that the “plausible relationship” is incorrect, however, inferences drawn from it will be flawed. In an audit application, the foregoing consideration could apply to a change in trend in the period under examination or to misstatement in the data that are used to develop the relationship. WorldCom may have been just such a case.

Unlike a statistical sampling procedure, where the expectation for estimated misstatement is mathematically provable, the expectation that drives an analytical procedure is specified by the auditor, often arbitrarily. Whereas the risk of accepting a materially false hypothesis (the risk of incorrect acceptance) is measurable in statistical sampling applications, that risk is absent from consideration in the statistical literature on such quantitative techniques as regression analysis. Detection risk in an AP is measurable only if one knows exactly the model that is generating the data to be analyzed. While this is the case for statistical sampling, it is not true for analytical procedures, whether or not sophisticated quantitative techniques are employed.[Note] Because the expectation is specified by the auditor, and because the expectation’s parameters are estimated from data that may be misstated, there is a risk that the reasonable expectation is incorrect. This specification risk is not measurable with available techniques. It is unenumerated. Nevertheless, the risk does exist.
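Another hypothetical sketch makes the point concrete. Two equally “plausible” expectation models are fitted to the same invented baseline data; whether the same understated balance is flagged depends entirely on which specification the auditor happened to choose, and nothing either procedure measures quantifies the risk of having chosen the wrong one.

import numpy as np

years = np.arange(1, 6)
revenues = 20_000.0 * 1.2 ** (years - 1)   # baseline revenues grow 20% per year
costs = 0.45 * revenues                    # costs actually track revenues

audit_revenue = 20_000.0 * 1.2 ** 5
reported_cost = 0.40 * audit_revenue       # materially understated (true cost would be 0.45 * revenue)

tolerance = 0.05                           # illustrative "unusual deviation" threshold

def evaluate(label, expectation, reported):
    deviation = abs(reported - expectation) / expectation
    print(f"{label:<18} expectation {expectation:>9,.0f}  "
          f"deviation {deviation:5.1%}  flagged: {deviation > tolerance}")

# Specification 1: costs are a constant proportion of revenues.
ratio = costs.sum() / revenues.sum()
evaluate("ratio to revenues", ratio * audit_revenue, reported_cost)

# Specification 2: costs follow a linear trend in time.
slope, intercept = np.polyfit(years, costs, deg=1)
evaluate("linear time trend", slope * 6 + intercept, reported_cost)

Under these assumed facts, the ratio model flags the understatement and the time-trend model accepts it. The data alone do not reveal which specification is correct, which is why this component of risk cannot be quantified by the procedure itself.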

The qualitative aspect of AP risk that is ignored by both SAS 56 and the Audit Practice Guide Analytical Procedures pertains to an AP’s ability to provide positive assurance or, alternatively, a specified risk of incorrect acceptance. In discussing the difference between APs performed in an audit, a review, and an attest engagement, the guide states only that: “The primary difference in analytical procedures performed in an audit versus a review is in the desired level of assurance.” The real issue is the nature of the assurance provided, regardless of whether the engagement is an audit or a review.

This issue, a direct consequence of specification risk, has an important qualitative impact on an auditor’s decision making. APs provide only negative assurance; they may alert the auditor to possible misstatement, but they provide no assurance as to the absence of misstatement if deviations are not observed, as illustrated by WorldCom.

The auditor can obtain positive assurance from an AP only if the specification risk can be controlled or measured. To date, there is no approach that can do so. Even if the auditor were to specify relationships properly, the possibility of employee collusion or management override of controls, which SAS 55 properly identified as inherent weaknesses in controls, would render the techniques incapable of reducing detection risk. The hard accounting numbers, those that are transaction-based and supported by data retained in accounting records, require more rigorous testing by traditional means such as inspection, observation, and confirmation.

APs and Accounting Estimates

APs may be the only source of assurance for tests of estimates (such as warranty or bad-debt allowances), which are the soft accounting numbers. For those tests, however, the auditor must have the ability to rely upon the underlying routine data that form the basis for the auditor’s expectation. Because accounting estimates are usually based on APs developed by the client, the auditor will sometimes develop alternative procedures, using one AP to test another. More commonly, however, an auditor will review the client’s procedures and base the decision on that review, as well as on the reliability of the underlying data.

If the underlying data are unaudited (whether or not they are obtained from independent sources), the auditor has no basis for presuming that those data are materially correct. A test of a provision or allowance is essentially a test of a forecast, for which reliability is intrinsically problematic. Consequently, an AP also provides only negative assurance for tests of estimates, an unfortunate fact that the profession must recognize.

An Exception

Most rules have exceptions. In the case of APs, the exception entails the selection of a representative cross section of a population, such as a chain of retail stores. If the auditor examines both the explanatory variable, such as sales floor area, and the dependent variable, such as inventory amount, in the selected cross section, then the auditor may apply regression analysis to estimate a relationship between them. That relationship can then be applied to estimate the values of the dependent variable in the remainder of the population. Although characterized as an AP, this approach is actually a sampling procedure using a well-known statistical method (the regression estimator), for which the requirements of SAS 39 apply.
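A minimal sketch of this exception, using invented store data, might look as follows. Both variables are audited for the sampled stores, and the fitted relationship is used only to estimate inventory at the stores not selected.

import numpy as np

rng = np.random.default_rng(42)

# Hypothetical population: 200 stores with known sales floor area (square feet).
N = 200
floor_area = rng.uniform(8_000, 40_000, size=N)

# Simple random sample of stores at which inventory is physically audited.
n = 30
sample = rng.choice(N, size=n, replace=False)
audited_inventory = 55.0 * floor_area[sample] + rng.normal(0, 25_000, size=n)

# Fit the relationship on the audited pairs only.
slope, intercept = np.polyfit(floor_area[sample], audited_inventory, deg=1)

# Regression estimator of the population total: audited amounts for the
# sampled stores plus predicted amounts for the remainder.
unsampled = np.setdiff1d(np.arange(N), sample)
estimated_total = audited_inventory.sum() + (slope * floor_area[unsampled] + intercept).sum()

print(f"estimated total inventory: {estimated_total:,.0f}")

Because the estimate rests on a probability sample of audited items rather than on an assumed relationship alone, sampling risk is measurable; a complete application would also compute an allowance for sampling risk, as SAS 39 contemplates.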

The Proper Role of Audit Procedures

However appealing APs may be, auditors must realize that the procedures generally cannot provide the positive assurance that is required of principal substantive test procedures. Nonetheless, APs play an important role in an audit. In planning and review, they can alert the auditor to unusual or unexpected behavior in data; however, they cannot be relied upon to do so in a substantive test because of the possibility of management override of controls, as WorldCom revealed. In the case of estimates of provisions and allowances, APs are the only procedures available to the auditor. Positive assurance that material misstatement does not exist may be unattainable at any level, regardless of the audit procedures employed to test an estimate. Thus, the opinion an auditor expresses on financial statements may necessarily rest on a mixture of positive and negative assurance.

The audit risk model conveys a false impression that risk of reaching an incorrect audit conclusion is controllable. To correct this impression, standards setters must provide a clearer understanding of the nature of assurance, including a clear-cut definition of positive assurance, as well as the conditions that distinguish between positive and negative assurance procedures. Standards setters should also better inform users of financial statements of the intrinsic limitations of an audit. Simply to state that auditors give “reasonable” assurance is no longer sufficient. Auditing standards must be revised to reflect the obvious realities illuminated by such scandals as WorldCom and to require the application of more rigorous audit test procedures. Otherwise, more audit failures will follow.


Neal B. Hitzig, PhD, CPA, is a professor of accounting and information systems at Queens College and a member of the NYSSCPA’s Auditing Standards and Procedures Committee. He is a retired partner of Ernst & Young.

Note: This article has been corrected from the original printed text.

