Quality Review
Working Toward an Ideal System for the Public and the Profession
MARCH 2006 - This white paper was approved by the New York State Society of CPAs Board of Directors on December 8, 2005. It was developed by the Quality Enhancement Policy Committee, chaired by Thomas E. Riley, of TFG CPAs. The current members of the committee are Brian A. Caswell, of Caswell & Associates CPAs P.C.; Andrew M. Cohen, of Weiser LLP; Mark Ellis, of Michael C. Fina; Martha A. Jaeckle, of Jaeckle Kearney & Lepselter; Stephen F. Langowski, of KPMG LLP; Vincent J. Love, of Kramer Love & Cutler LLP; Michael L. McNee, of Marks Paneth & Shron LLP; Robert E. Sohr; Stephen P. Valenti, of Stephen P. Valenti, CPA; Maryann M. Winters, of Sirchia & Cuomo LLP; and Margaret A. Wood, of Grant Thornton LLP. John H. Eickemeyer, of Vedder Price Kaufman & Kammholz, and H. Stephen Grace Jr., of H.S. Grace & Company Inc., serve as advisors to the committee.
Introduction
The
accounting profession’s first uniform peer review
program was created in 1977 by the American Institute of
Certified Public Accountants (AICPA). The program established,
for the first time, requirements to make certain all member
firms conducting attest functions adhered to a single set
of generally accepted auditing standards. The most important
requirement—peer review every three years—would
monitor adherence to those standards.
At
various times during the past 30 years, important, fundamental
changes to peer review have occurred because of its growing
importance to both the public and the profession. But the
peer review process was initially designed as—and
remains—an educational and remedial program to prevent
recurrences of problems and correct deficiencies in the
practice of member firms. It was not created with third
parties in mind or intended to supplant the enforcement
responsibilities of others. From the beginning, the role
of peer review was educational and corrective rather than
disciplinary. Members expected, and the program delivered,
confidentiality throughout the process. In fact, those working
on peer review and ethics in the AICPA and the state societies
cannot exchange information.
In
today’s world, however, this kind of opacity no longer
seems to be acceptable. Some businesses now use peer reviews
to assess the work of their accounting firms. In addition,
the universe of users that rely on peer reviews has expanded.
Peer review has taken on an increased significance to an
audience that now extends well beyond CPA firms to regulators,
clients, credit grantors, and others, all of whom expect
a monitoring system that is effective, public, and a model
of integrity.1 The
public’s perception of peer review is not one of a
cooperative coaching system among firms, but of a seal of
approval for a firm they might consider engaging. Given
this evolving professional environment, new reporting requirements,
and an atmosphere of enlightened accountability, close scrutiny
of the nearly 30-year-old peer review program is not only
prudent, it is vital to the betterment of our profession.
Indeed,
if the accounting profession does not soon strengthen its
peer review process, a governmental body is likely to take
over the program. One need look no further for evidence
of this than the creation of the Public Company Accounting
Oversight Board (PCAOB) by the 107th Congress after the
collapse of Enron and subsequent accounting scandals. In
regard to public companies, Congress has made it clear that
mere adherence to generally accepted auditing standards
(GAAS) is no longer enough. CPA firms are now prohibited
from preparing or issuing audit reports for U.S. public
companies unless and until they register with the PCAOB.
The PCAOB has also abandoned the entire premise of peer
review, opting instead to conduct inspections of firms through an internal staff of well-paid inspectors.
Several
other federal agencies have also adopted peer review as
part of their stringent reviews of finances and organizations.
The Government Accountability Office (GAO) has mandated
that firms that review the usage of most federal funds—including
monies given to almost every governmental unit and nonprofit
in the country—must undergo peer review. And the FDIC
now requires firms that review major banking transactions
to undergo peer review. The end result: while peer review
has been incorporated into many of the nation’s audits,
there have been no significant changes to the AICPA peer
review program.
QEPC Goal and Focus
Since the audits of Enron and the Roslyn school district2—both of which were performed by peer-reviewed firms that have since gone out of existence—many
questions have been raised about the efficacy of the AICPA’s
current peer review program. In an effort to address some
of the common criticisms of the peer review program and
determine how it should best be run, the New York State
Society of CPAs (the Society) formed the Quality Enhancement
Policy Committee (QEPC) in August 2004. The AICPA has also
created a task force to look at the peer review process.
The QEPC’s goal in this white paper is to outline the tenets of an ideal quality-review system that will promote consistency of quality throughout the profession and upon which the public can rely. Because of the breadth and depth of the
subject, the QEPC chose to deal with one issue at a time
and first develop its evaluation and recommended redesign
of peer review along conceptual lines that address some
common criticisms. Because the analysis is conceptual in
nature, it does not address all of the detailed implementation
issues, consideration of which could require alterations
to the committee’s recommendations.
It is important to note that the AICPA peer review is a national program, with some specific variations to accommodate differences among the 38 states3,4 that require it.
New York State does not currently require peer review, although
many CPA firms in New York State have chosen voluntarily
to participate in the peer review program and others have
entered the program because of federal requirements or AICPA
membership bylaws. The committee’s deliberations focused
on how peer review functions in New York State only and
on fundamental conceptual issues in addressing improvements
in the program in the practice environment of New York.
Summary of committee conclusions. The committee has concluded, first and foremost, that peer reviews should be considered part of a comprehensive quality-review process. This means the process should include elements such as: peer review; ethics;
provisions for progressive discipline; educational programming;
and self-monitoring. The committee has also concluded that
the Society does not have the authority necessary to implement
a strengthening of the current peer review program by itself;
legislation will be required in New York State to change
the current peer review program. In addition, the program
should be mandatory, and, to the extent feasible, consistent
with a national program. Finally, at some point in the future,
the committee feels that quality review needs to be extended
to all practice areas of the profession. But for now, the
committee has focused its attention on attest services, which are the bedrock of the profession.
Findings: Peer Review Background
The
QEPC began with an in-depth examination of the AICPA peer
review program, which the Society currently administers
in New York State as an agent. The committee also studied
the Society’s Professional Ethics Committee (PEC)
and Peer Review Committee (PRC) in order to best determine
its own direction and to avoid duplication of efforts.
Participation
in the AICPA’s peer review program is now a requirement
for all AICPA member firms that practice accounting, auditing,
or attestation under the Statements on Standards for Attestation Engagements (SSAEs). The program is directed by the Peer
Review Board of the AICPA (PRB), which consists of three
task forces: education (for training, alerts, etc.); oversight
(oversight of the state-administered programs); and standards
(setting of all peer review standards). The Society’s
PRC—which functions as a report acceptance body for
the AICPA’s peer review program—is made up of
about 20 members, all of whom are qualified as peer review
team captains. Society staff supports the technical review
and administration of the program. The AICPA requires that
its database system be used to administer the peer review
program.
There
is no contract between the AICPA and the Society. Instead,
an annual “plan of operation” (by means of a
questionnaire) is executed on behalf of the Society. Fees
of about $475,000 are charged by the Society to cover the
administrative costs of serving 2,329 firms.5
Peer
reviews are required once every three years for firms enrolled
in the program. The Society’s administrative staff
schedules the reviews. The firm to be reviewed selects its
peer review team captain (referred to as a TC in the case
of a System Review) from an AICPA-approved list of reviewer
firms. It also submits a profile on the nature of its practice
in order to attempt a proper matching to a TC with the appropriate
expertise.
The
PRC looks at between 600 and 700 reports a year. The committee
is divided into three-member report acceptance body (RAB) task forces that determine whether reviews are performed
in conformity with peer review standards. In the past few
years, there have been between five and seven such three-member
groups that have divided the workload proportionately. If
a firm’s report is unmodified with repeat findings,
modified, or adverse, the RAB may prescribe remedial action
for that firm, such as CPE, an accelerated review (i.e.,
another review sooner than the three-year cycle), remedial
direction from an outside consultant, or an outside monitor.
The review is not considered complete until the mandated
follow-up has been completed and accepted by the committee.
If
the RAB cannot agree on the findings of a report, it goes
to the full PRC. If there is still disagreement, the report
is forwarded to the AICPA peer review board. Finally, if
the report cannot be reconciled by the peer review board,
it goes before an AICPA hearing panel. If a report goes
to the panel because a firm has been uncooperative (or because of some other failure to meet standards), the panel
can recommend expulsion from the peer review program, which
by extension would mean expulsion from the AICPA. Expulsion
would not occur without due process before a joint trial board.
The
PRC conducts oversight of 12 to 20 reviewers (primarily
TCs) selected annually. An oversight task force reviews
the work of the TC. Oversights are done both on- and off-site.
High-volume TCs, or those with aberrations (e.g., all their
reviews are unmodified), are selected. There are many criteria
for selecting a review or reviewer for oversight. Starting this year [2005], one-third of the TCs must represent to the PRC that their stated areas of expertise are accurate and that their CPE is up to date.
There
are three types of reviews under the current standards:
Report
review. This review has no passing or failing
grade, and is somewhat like a management letter. Report
reviews are done for practices that issue compilations with
no disclosures. They are performed off-site and are restricted
to the contents of the reports. Recommendations for improvement
can be made. If there are no comments, the technical reviewer
can accept report reviews without the need for further review
by the PRC.
Engagement
review. This review encompasses the examination
of compilation and review engagements and attest engagements
under the SSAEs, but not audits. After analysis by the technical
reviewer, such reviews move to the PRC, which determines
whether to accept the recommendation of an unmodified, modified,
or adverse report. An unmodified report may have a letter
of comment (LOC), which is a recommendation for improvement
(i.e., for an item which is not a material departure from
professional standards). The firm must answer the LOC in
writing. LOCs are optional for unmodified reports, but are
generally required for modified reports. Under 2005 revisions,
findings that make the report modified are included in the
body of the report. Findings that do not cause a modification
of the report are included in the LOC. If an adverse report
is issued, all findings are included in the body of the
report and no LOC is issued.
System
review. This review is for firms that perform
audits or financial forecasts and projections under the
SSAEs. It is not engagement-driven but rather focused on
the firm’s system of quality control (QC). Review
steps undertaken include staff interviews, as well as the
examination of library resources, hiring and education practices,
and the firm’s ongoing monitoring of its QC system.
In addition, using a risk-based selection process, engagements
and their related working papers are reviewed.
The number of completed reviews, by review type, is shown in Exhibit 1. The results of these reviews are shown in Exhibit 2.
Nationally,
approximately 1% of reviews are adverse, 5% are modified,
and the balance are unmodified, with about 40% of these
having an LOC.6 National statistics incorporate
38 states with mandated peer review. New York State does
not currently mandate peer review, which may account for
any discrepancies between New York State and national figures.
There
are, in effect, three types of rejections of peer review
reports by the PRC:
- Type 1: The PRC disagrees with the technical reviewer’s recommendation to accept or defer a report. A Type 1 requires that a report change from unmodified without an LOC to unmodified with an LOC; from unmodified to modified; or from modified to adverse.
- Type 2: The PRC concludes that the reviewed firm should be subject to additional corrective action, or that different follow-up action is more appropriate than the action recommended by the technical reviewer.
- Type 3: The PRC concludes that additional feedback to the TC is necessary.
Although
exact statistics are not maintained on each type, an overall
review of the most recent years’ activity suggests
that Type 1 decisions are reached only on rare occasions,
and Type 2 and Type 3 decisions are reached infrequently.
The
AICPA peer review program is not for firms that are registered
with and inspected by the PCAOB. These firms are members
of the AICPA’s Center for Public Company Audit Firms
peer review program (Center PRP), and reviews are administered
on a national level by the Center PRP Committee. In New
York State, 116 firms are inspected by the PCAOB and the
Center PRP. In light of the recently enacted inspection
process for public companies by the PCAOB, Center PRP reviews
only look at non–public company engagements.
Although
the standards of the Center PRP and the Society-administered
program are ostensibly the same, all Center PRP reports
are available publicly, but reviews of firms that joined
the AICPA peer review program after 1988 are not. Many non–Center
PRP firms that voluntarily adopted peer review before 1988,
however, have participated in the public file. In addition,
because administration of the Center PRP is not handled
by the Society, the logistics of the two programs differ
to some extent. New standards for the Society-administered
peer review program—developed to increase transparency—went
into effect on January 1, 2005. The peer review report will now indicate whether the firm performs ERISA, GAO, or certain depository institution engagements. The report will disclose modified or adverse findings, instead of those findings appearing only in the LOC. There are also some changes in the method of selecting engagements, including the selection of one engagement that is not disclosed to the firm in advance of the review.
Analysis: The Current Peer Review System
The
QEPC’s analysis of the current peer review system
focused on the following issues:
Concentration.
Some peer review TCs perform so many reviews that they may not have an adequate practice base to maintain currency in all facets of accounting and auditing or to cover the breadth of practice of the firms they review. Ten peer reviewers account for more than 40% of
the peer reviews performed in New York State. There are
only 117 “active” reviewers, down from 206 five
years ago.
Disciplinary
authority. The program is set up to be remedial
and educational; the reviewed firm is the principal beneficiary.
The question of a disciplinary component arises in New York
State because all current state legislative proposals include
a peer review component that presumes that the public would
be the principal beneficiary rather than the reviewed firm.
There is effectively no disciplinary component to the program,
and the way in which engagements are currently selected
undermines the concept of using peer review as a disciplinary
tool. The scope of peer reviews is also limited in terms
of the amount of testing.
Fee
structure. Administrative fees charged by
the state societies are established by formulas tied to
the AICPA data system. As a consequence, the size of the
accounting and auditing practice itself is not necessarily
reflected in the administrative fee. In addition, the fees
charged by peer reviewers have raised concerns both from reviewed firms, about controlling costs, and from peer reviewers, about receiving a competitive fee for the work.
Guidance.
Although there are extensive checklists and lengthy manuals,
many issues arise in the performance and administration
of peer review for which no clear guidelines are available.
The guidance for reviewers, RABs, and the PRC is fragmented
into various manuals, and, periodically, practical issues
must be resolved through phone discussions between the administrative
agent’s technical reviewer and AICPA staff. There
are currently four separate peer review manuals—a
Peer Review Manual, an Administrative Manual, a RAB Handbook,
and an Oversight Handbook—as opposed to one integrated
and comprehensive guide.
Reviewer
qualifications. The program’s standards
for qualifications are very broad. In some cases, a two-day
course and current experience in accounting and auditing
over the past five years are the only requirements for a
reviewer. Because peer review is firm-on-firm, the only
training ground is within a firm that already performs peer
reviews in an area of practice. There is no place for inexperienced professionals to get the education necessary to perform the required work.
Scope
of the peer reviewer’s role. Many firms
in the AICPA peer review program view their peer reviewer
as their QC person or QC resource, and look to them for
advice on accounting, auditing, and independence issues.
This practice creates serious potential conflicts, where
the peer reviewers may ultimately review their own advice.
Selection
process. Firms undergoing peer review currently
choose their reviewer, and some firms could be reviewed
by “friendly” firms, an arrangement which could
undermine the integrity of the system. Smaller firms, in
particular, view peer review as a barrier to entry into
practice, and may look for friends to conduct their peer
review. Because peer review rules do not allow reciprocal
peer reviews, an alternative arrangement is a triad, where
Firm A reviews Firm B, Firm B reviews Firm C, and Firm C
reviews Firm A. The firms will agree to review each other
at a reduced cost in order to manage peer review costs and
still participate in the program.
Improving Peer Review: What Should It Be?
In
light of the information gathered, the QEPC chose to focus
its efforts on addressing, at a conceptual level, the attributes
of an alternative peer review model instead of following
the AICPA peer review model. This conceptual ideal can then
be used as a foundation on which to build a stronger peer
review system.
After
extensive deliberation, a presentation by a former chair
of the PRC, and a thorough review of a wide range of sources
and inputs, the committee narrowed its focus to areas of
particular concern and detriments—real or perceived—within
the current system. These included:
- The relative lack of an effective disciplinary component to apply after remedial failures (or the inefficacy of what currently passes for discipline);
- Concerns surrounding the idea of “who is the reviewer,” such as:
  - reviewer certification;
  - reviewer training;
  - reviewer qualifications;
  - the concentration issue;
  - the matching of reviewers with appropriate size and experience to a firm; and
  - the resources available to reviewers;
- The idea of a broader evaluation of the environment of quality within a firm, rather than a checklist-oriented review;
- The need for more openness and transparency in the program, for the benefit of the profession and the public;
- The need for some force of regulation to make quality review mandatory;
- Operational risk management; and
- Comprehensive program guides.
Based
on this analysis, the committee established a set of conceptual
ideals for a new quality review program, outlined below.
These ideals are meant to serve as a foundation for a restructured
quality review system that can monitor and enforce consistency
of quality for the benefit of the profession and the public.
Concept: Quality Review
-
The scope of peer reviews should be clearly focused on
quality. Quality reviews are more extensive than peer
reviews, and include internal control reviews, a chief
executive’s responsibilities relating to quality,
etc.
-
Peer reviews should be viewed as only one aspect of a
comprehensive quality review system. The goal of a quality
review should be to ensure that firms are complying with
their own continuous, self-imposed systems of quality
control.
-
The higher the risk, the more extensive the review required.
-
Third parties should be able to see the benefit of a quality
review.
Quality Review Implementation
Quality
reviews will include the following:
1) Review of Six Functional Areas
Review of Tone at the Top
The
primary objective of the review of the firm’s “tone
at the top” is to assess whether leadership’s
actions and communications demonstrate a commitment to audit
quality and professional standards in connection with the
firm’s performance of audits, issuance of audit reports,
and related matters involving clients.
The
subject areas for review and analysis will include, but
will not be limited to: the firm’s code of conduct
for employees; documentation related to the firm’s
ethics; organizational charts; information concerning reporting
relationships; and certain communications from management.
Review of Partner Evaluation, Promotion, and Assignment of Responsibility
The
objectives of the procedures in this area are to assess
the firm’s current policies and procedures for evaluating and measuring partner performance, for assigning responsibilities to partners, and for disciplining partners; and to evaluate
whether the design of the measurement and evaluation processes
as documented and communicated can be expected to achieve
the objectives of promoting audit quality.
Review of Independence Policies
The
objectives of the procedures in this area include gaining
an understanding of certain firm policies and procedures
relating to its compliance with independence requirements.
Reviewers will focus on independence issues related to the
provision of nonaudit services to clients and the firm’s
business ventures, alliances, and arrangements. Reviewers
will also inspect the firm’s requirements regarding
personal holdings of securities, as well as its programs
designed to monitor compliance with these policies.
The
subject areas for review and analysis will include, but
will not be limited to: inspection of the firm’s policies,
procedural guidance, and training materials pertaining to
independence matters and permissible service arrangements
with audit and nonaudit clients.
Review of Client Acceptance and Retention Policies
The
primary objectives of the procedures in this area are to
evaluate whether the firm’s client acceptance and
retention policies and procedures reasonably ensure that
the firm is not associated with clients whose management
lacks integrity, that it undertakes only engagements within
its professional competence, and that it appropriately considers
the risks involved in accepting and retaining clients in
the particular circumstances.
The
subject areas for review and analysis will include, but
will not be limited to: the policies and procedures for
acceptance and continuance of audit clients; and a list
of all new clients during the previous year.
Review of Internal Quality-Control Program
The
objectives of the procedures in this area are to evaluate
the effectiveness of the firm’s annual internal QC
program in enhancing audit quality, including evaluating
the results and the remedial actions taken, and to observe
and test the conduct of the internal QC program.
The
subject areas for review and analysis will include, but
will not be limited to: policies and procedures for the
firm’s risk and QC review, including the program’s
goals and objectives and the methods of selecting offices,
partners, and engagements to be reviewed; and the results
of the previous year’s internal QC review.
The review will also include examination of the firm’s quality control documents for periodic sign-off by the firm’s leadership responsible for quality assurance matters.
Review of Practices for Establishment and Communication of Audit Policies, Procedures, and Methodologies, Including Training
The
objectives of the procedures in this area are to obtain
an understanding of the firm’s processes for establishing
and communicating audit policies, procedures, and methodologies,
including training, in order to: evaluate whether the design
of these processes can be expected to promote audit quality
and enhance compliance; evaluate changes in audit policy
that the firm has made; and evaluate the content of the
firm’s training.
The
subject areas for review and analysis will include, but
will not be limited to: documentation explaining how the
firm develops and revises its policies and procedures; excerpts
from the firm’s policies-and-procedures manual and
other internal guidance; and training materials for training
programs conducted for audit professionals. Reviewers will
also: interview the firm’s risk management leader
to determine how the firm incorporates and communicates
changes in its audit policies, procedures, and methodologies;
evaluate the effectiveness of the design of the processes
for monitoring changes that might require additions to or
changes in the firm’s audit policies, procedures,
and methodologies; and evaluate the nature and content of
recent additions to, or changes in, selected firm audit
policies.7
2) Review of Selected Audit Engagements
Reviewers
will select an appropriate number of audit engagements to
review. The reviewed firm will not be allowed an opportunity
to limit or influence the selection process, and the engagements
to be reviewed will not be identified significantly in advance
of review.
Concept: Progressiveness (Education and Discipline)
A
quality review system should be progressively disciplinary
in order to satisfy the needs of today’s environment.
A progressive system is one that is both educational and
disciplinary. It maintains emphasis on the value of a system
that can impart knowledge to firms, but does not lose sight
of the fact that there may be some who lack the will to
comply with high standards without sanction. The disciplinary
aspect comes into play after a firm fails or repeatedly
fails to take corrective action. A quality review program
without an effective disciplinary component would not be
acceptable to the public or public officials.
-
The New York State Legislature should enact legislation
which delegates to the Board of Regents, the State Board
for Public Accountancy, and the State Education Department
the authority to set the standards for quality reviews.
New York State’s quality review program should be
consistent with national standards.
-
The Foundation for Accounting Education (FAE) has the obligation to handle the educational aspect of the quality review program.
- All
state-related disciplinary matters should be referred
to the Board of Regents and the State Education Department.
-
All Society-related disciplinary matters should be referred
to the Society’s Ethics Committee.
-
Progressive disciplinary action will require a system
which has:
-
Some degree of the force of law to enforce the discipline;
-
Public reporting;
-
Comprehensive and easily accessible guidelines for
reviewers;
-
A provision for feedback and corrective action by
the reviewed firm;
-
A means of referral to a disciplinary body, established
by the Board of Regents;
-
Documentation of findings and recommendations; and
-
A due process mechanism with realistic timeframes.
Progressiveness Implementation
In
a progressive discipline system, the severity of the penalty
increases with each infringement of the rules. Progressive
discipline establishes a process of clear, timely, consistent,
and documented communications with a firm, designed to ensure
an understanding of expectations, provide an opportunity
to correct behavior, improve performance, and ensure due
process.8 Among the advantages of a progressive
discipline system is the fact that reviewers can work with
a firm without having to resort to the most severe penalties
immediately. Typically, the discipline progresses through a series of escalating steps.
Usually,
after a specified time period passes without another infraction,
a firm will get a “clean slate.” Any later infractions
will start the process again with remedial steps. Some infractions
are so severe that the first one, two, or even three steps
may be skipped.
Concept: Reviewer Pool
A
pooled team of individuals from different firms should conduct
quality reviews.9 This would replace the current firm-on-firm structure, allow for an exchange of knowledge between reviewers and reviewed firms, and be more effective and equitable.
-
The assignment of qualified individual reviewers, not
firms, out of a pool would obviate the potential conflicts
of interest that result from allowing a firm to select
its own reviewer.
- A
pooled team of reviewers would make issues concerning
firm-on-firm reviews or the rotation of reviewers irrelevant,
since a new team would be designated for every review.
-
Reviewers should be systematically trained, certified,
and evaluated.
- Reviewers
should be chosen from a state-certified list and assigned
to firms based on a matching of the specialty of the practice
of the reviewed firm and the reviewer’s experience,
skill set, and area of expertise.
- The
selection of quality reviewers should be made by a coordinating
council, which would be selected by either the state or
the Society.
-
The size of the quality review team should be proportionate
to the size of the firm being reviewed.
-
Minimum levels of knowledge, expertise, experience, and
education should be required of reviewers entering the
pool.
-
Reviewers should be completely independent of the firms
they review, in both perception and reality.
-
Quality-review team members should be adequately compensated.
-
Reviewers should have access to a comprehensive guide
of review standards that provide for an effective, risk-based,
and thorough review of the quality of the firm’s
practice.
Reviewer Pool Implementation
All
qualified firms will be incentivized to place employees
on a state-certified list of reviewers. Possible incentives
include lower review fees for firms that place a certain
percentage of their employees in the reviewer pool, and
the benefit of exposing reviewers to the review process
and to other firms’ best practices. Potential reviewers
will submit their resumes and fill out a questionnaire designed
to gather information about their experience and areas of
expertise so they may be categorized as either specialists
or regular reviewers. Resumes and questionnaires will be
periodically and randomly reviewed, audited, and tested
to ensure they accurately represent reviewer qualifications.
Reviewers
can then be matched with a firm based on the compatibility
of their areas of expertise and the firm’s industry
specialization.
The
number of review team members on a team will be determined
by the size of the firm being reviewed. The Society will
ensure that there are no independence issues between reviewers
and the firm to be reviewed. A firm has the right to accept
or reject the selected review team for sufficient reason
and to ask the Society to select another team. The cost
of the review will depend on the type of review the firm
requires. The Society, if designated, will handle the administration
of the reviewer pool and bill firms for the program, with
the goal of breaking even.
Reviewers
will then perform the review, contacting the firm with any
questions and deficiencies noted. The Society will then
send the report and letter of comments (if necessary), requesting
a letter of response.10 The Society’s administrative
duties will include, but not be limited to: selecting and
forming reviewer teams; mailing engagement letters; coordinating
the exchange of fees; and reimbursing reviewers for certain
travel costs.
Concept: Robust Program Structure and Reporting
The
demands of today’s users of quality review will require
that the system be public and open.
-
Within the framework of an open system, reviewers shall
respect the confidentiality of client records.
-
The program should be established with appropriate authority to monitor and foster competency, to encourage compliance with the highest standards, and to report such compliance to the state.
-
The program will have an obligation to provide more education
on quality control.
-
Firms should have the ability to tap into the knowledge
base of reviewers (other than those who are performing
the firm’s review).
-
The quality review program should be self-funding.
Moving Forward
The
committee unanimously agrees that the quality review system
it seeks to implement must be a major step up from the current
program, in terms of standards and comprehensiveness. A new review methodology, employed by a pooled team of qualified professionals, should provide for progressive discipline and include an evaluation of the firm’s quality control environment as well as a review of engagement performance quality.
On
December 8, 2005, the New York State Society of CPAs Board
of Directors voted to approve as final this white paper on
quality review, and directed that the paper be sent to the
Society’s Legislative Task Force for implementation
of its recommendations in the Society’s legislative
agenda for 2006.
Notes

1 Robert L. Bunting, “Transparency: The New Peer Review Watchword,” The CPA Journal, October 2004.

2 The Roslyn school district was hit by a scandal that saw the school superintendent, the chief business administrator, the district’s auditor, and three other employees arrested for allegedly stealing millions of dollars from the district.

3 AICPA State Societies and Regulatory Affairs, August 2005.

4 Guam, a U.S. licensing jurisdiction, also mandates peer review.

5 As per AS400—the AICPA’s database system—as of August 2005.

6 As per AS400 as of August 2005.

7 PCAOB, “Report on 2003 Limited Inspection of PricewaterhouseCoopers LLP,” August 26, 2004.

8 San Francisco State University Human Resources, “Practice Directive P206: Progressive Discipline Guidelines,” www.sfsu.edu/~hrwww/directives/p206.htm. Revised March 2004.

9 Some concern has been expressed that employing the pool concept, which limits a firm’s ability to select its own reviewer, could be a concept that regulators would have interest in adopting for financial statement audits, resulting in both public and private businesses losing their ability to select their own auditor. The pool concept is practical only for quality review and not financial statement audits. It should not and need not be extended any further. For practical purposes, it is not feasible to expect that some regulatory agency could be created to assign an auditor to every public and private entity in the United States, or that the capital markets would accept that. In fact, Congress backed away from such considerations when it created the PCAOB.

10 Missouri State Society of CPAs, “Missouri’s Cart Program,” www.mocpa.org/peer_directory.html#cart.