6. Performance Auditing, Good Practice Guides and Inquiries

Report on the Efficiency and Effectiveness of the Office of the Auditor-General of New Zealand by an International Peer Review Team.

(Sections 16 and 18 of the Public Audit Act 2001)

6.1 Introduction

In this Section, we review the discretionary work of the Auditor-General; in particular, the performance audits and inquiries, which make up around 10 per cent of the Office’s activities but which are quite pervasive in their impact. There are also important discretionary activities associated with the financial statements and assurance activities, as covered in sub-section 5.3.

6.2 Performance Auditing

The Auditor-General has the authority to undertake performance audits under Section 16 of the Public Audit Act 2001. Such audits are a key element in the delivery of the Auditor-General’s five responsibilities under the Act; in particular to ensure that resources are obtained and applied in an economical manner (i.e. that taxpayers’ dollars are not wasted).

Performance audits cover both central and local government spending. Following a significant increase in funding in 2004, the number of performance audit reports published by the OAG rose steadily from 10 in 2004-05 to 15 in 2006-07. Between 19 and 21 reports (including special studies and inquiries) are planned for 2007-08, of which 14 will be performance audits. On average, performance audits take between 9 and 12 months to complete; most studies cost between $170,000 and $230,000, with the total estimated cost for the Performance Audit Group (PAG) for 2007-08 being around $2.4 million.

As well as performance audits, the PAG also provides wider assistance to the Office. For example, the Group supports inquiry work and has produced or contributed to the expanding range of good practice guidance published by the OAG in recent years, such as that on managing conflicts of interest for public entities.

Overall, we found a robust performance audit framework, with thorough processes and quality assurance arrangements, consistent with the findings of other external quality assurance reviews in recent years.

The reports themselves were clearly written and easy to follow. We must also acknowledge the positive comments made about the OAG’s performance reports in our interviews with Ministers, Select Committees and public entities. These were in keeping with the results of the OAG-commissioned 2007 Stakeholder Feedback Interviews with Select Committee Chairpersons and Deputy Chairpersons and other stakeholders from Treasury and the New Zealand Society of Local Government Managers which showed all respondents as satisfied with the quality of OAG performance audits and 86 per cent satisfied with their usefulness. The report also made reference to the general comment from Select Committee Chairpersons and Deputy Chairpersons that the feedback they receive from their Members is very positive about the Office.

We recognise that a large body of material has been generated in recent years appraising the functioning and outputs of the PAG. We are also aware of current initiatives by the OAG to sharpen the focus of the planning cycle and to improve the efficiency and speed of delivery of audit reports, in particular the Performance Audit Process Improvement Project.

Beyond interviews with a wide range of stakeholders, we reviewed an extensive set of documents, including procedural and process guidance, and all recent internal and external quality assurance reviews. We also examined a sample of six performance audits published in the later months of 2006-07, together with the related Post-Project Reviews, which represent an important quality assurance mechanism in their own right, providing a useful analysis of each audit against a set of common criteria.

Audit Strategy

The overarching elements of the Auditor-General’s Strategy for performance audits are set out in the Office’s 2004-09 Strategic Plan. These include increasing the number of performance audits, expanding the suite of good practice guides, and improving the timeliness of performance audits by introducing project management disciplines.

The Auditor-General’s Annual Plan for 2007-08 identifies four “core areas of interest” to the Auditor-General in the selection of performance audit topics:

  • public investment or liability management;
  • public revenue management or generation;
  • asset management or infrastructure spending or management; and
  • expenditure, including service delivery expenditure.

The Auditor-General has defined his role in the Annual Plan as being “to regularly provide assurance about the activities of public entities that are large and complex, and/or where it is difficult to assess their performance”.

Generally, the performance audits listed in the Annual Plan are completed in the year to which the plan relates. Inevitably, some are delayed or displaced by new or more pressing topics. Of the 20 performance audit reports and other studies published in 2006-07, five were additions to the original work programme. Also in that period, four topics were either deferred or presented to Parliament in a different form, while two others were deleted from the work programme and one, on e-government, was discontinued. The range of topics reported upon meant that there was good coverage of public expenditure, although, with the deferment of the major acquisition projects report to 2007-08, no audits were completed in the Defence area in 2006-07. Here we are conscious of the comments made in stakeholder feedback interviews by the Chairperson and Deputy Chairperson of the Foreign Affairs, Defence and Trade Committee; in particular their wish to see year-on-year financial and performance trend information in reports, including a deeper insight into the capital flows within departments.

Most audits in 2006-07 were about either service delivery or operational performance. There were no studies looking directly at capital spending, although there are a number planned for 2007-08, such as that on local government asset management planning.

The Auditor-General is considering whether the current balance of studies is appropriate; for example, whether there would be value in publishing one or two larger and more wide-ranging reports that go beyond the traditional examination of discrete programmes within an entity to look at the performance of that entity more generally, or across a sector or sectors. This would be in keeping with wider developments in New Zealand public administration focused on “joining up” public service delivery more closely around the needs of citizens and customers.

There is recognition at senior levels within the OAG that there is considerable scope to streamline the performance audit function. For example, a study in early 2007 identified a total of 69 steps involved in the conduct of a typical performance audit. In an effort to simplify these arrangements, without sacrificing quality, there is in place an initiative - the Performance Audit Process Improvement Project - which aims to reduce this number to around 24 steps. This, it is hoped, will also shorten the timescale for the completion of performance audits by around 12 to 17 weeks (the average length of a performance audit is currently around 42 weeks, plus around 4-6 weeks of fieldwork); reduce the bunching of publications around the financial year end; and make for more timely reports generally. Beyond these intrinsic benefits, it was put to us by staff in entities that, while a performance audit is on-going, there may be reluctance on the part of an entity to move forward with a reform or initiative, preferring to await the outcome of the OAG’s performance audit work. This may, in turn, risk delays to performance improvement.

Compliance with Auditing Standards

The OAG Performance Audit Manual is the standard reference for the conduct of performance audits. A review of the processes underpinning the performance auditing function (by reference primarily to the Manual), conducted by the OAG’s Accounting and Auditing Policy QA team in March 2007, concluded that appropriate systems and controls were in place to “mitigate exposure to identified potential risk and protect the Office’s role in conducting performance audits”. The review did, however, record that there was, at times, a lack of documentation to determine whether the controls had in fact operated.

It is OAG policy that performance audits should, wherever possible, conform to the Auditor-General’s Auditing Standards which are, in turn, aligned with the New Zealand Institute of Chartered Accountants’ Standards. In 2006, at the request of the Auditor-General, a review was conducted by the PAG to assess the degree of compliance with those Auditing Standards. The review found that the Group generally complied with the Standards but that improvements could be made to document this more clearly and consistently, setting out the changes to the Manual designed to bring this about. It concluded that “where performance audits are conducted in accordance with the Manual then compliance with the Auditing Standards occurs without additional effort.”

Following the PAG review, an amendment (dated 1 September 2007) was made to OAG procedures whereby the Assistant Auditor-General, PAG, is now required to sign off each audit, certifying that “I have reviewed the necessary documents and records, and have conducted the relevant inquiries, to be assured that this draft report and supporting work are consistent with the Auditor-General’s Auditing Standards”.

In 2008, as a further test of compliance, the OAG’s Accounting and Auditing Policy QA team will be looking at how Auditing Standards are applied in performance audits.

Audit Selection

Against the overarching criteria for topic selection set out in the Annual Plan, there is a well-established “Strategic Audit Planning” process for identifying and refining topics for examination. This starts with initial “environmental scanning” by the OAG’s leadership team, usually in February. It is followed by various stages of study development and refinement before a draft work programme is sent for comment to Parliament and to overarching bodies, such as the State Services Commission for central government, and the New Zealand Society of Local Government Managers for local government, around December each year. A further formal but short round of consultations on the statutory draft Annual Plan, taking in earlier views and comments, is undertaken in the period between the end of March and mid-May when the Auditor-General’s Annual Plan and budget are tabled.

Our consultations with Ministers, MPs and Select Committees showed a good general level of satisfaction with the consultation process, with a feeling that it did allow Members the opportunity to raise issues and concerns. The point was made, however, that this is essentially a passive process, undertaken through correspondence. It was put to us during our discussions with the OPC that there might be scope for a drawing together of comments and suggestions from the initial round of consultation as the basis for a discussion between the Committee and the OAG, before the Office finalises its performance audit programme. This would be a natural complement to the recommendation in the 2006 Governance Review that “a high level and non-attributed summary of the content of feedback from Parliament and stakeholders on the proposed scope and priorities in the Auditor-General's discretionary work programme, and of the Auditor-General's reasoning and conclusions on that feedback, should be disclosed with the Annual Plan or other documents containing the decisions on the work plan.” Comment on specific proposals for performance audits or studies that did not make it through to the final programme would also be useful.

The occasional comment was also made that some of the OAG’s performance audit reports addressed relatively minor issues (although these views were directed more at inquiries than performance audits). There was also a view that there was scope for performance audits to take a wider look at systemic issues and effectiveness - an issue also of interest to the Auditor-General.

It is current practice for the OAG to hold discussions with senior managers (usually Chief Executive Officers) about the intended performance audit work programme to seek their input, as well as providing feedback about the final decisions reached. Senior staff in public entities interviewed for this review were generally supportive of the OAG’s performance audit work; although they felt that on occasion there might be more important topics for the OAG to pursue. They also wanted to be kept informed about the emerging scope and design of specific audits. In the light of the evidence from this review, the OAG might wish to canvass the views of heads of agencies to determine how the consultation process with their organisations might best meet the objectives of all concerned. While ultimately it is, of course, for the Auditor-General to determine the programme of performance audits, senior stakeholders in public entities are well placed to understand where performance audits can have most impact in bringing about improvement. For this reason, it is also useful to keep under review any need to consult such stakeholders directly during the audit, rather than simply to rely on any liaison arrangements put in place at the outset.

Since 2006, effort has been focussed on securing a greater involvement of performance audit teams in study selection and design. While Sector Managers remain responsible for the initial selection work, both PAG and Sector Managers (a roughly 80/20 per cent division of responsibility) now carry out detailed scoping earlier in the planning cycle to develop and test a shortlist of around 25 audit proposals. This involves performance audit teams meeting with the relevant Sector Manager, the appointed auditor, and the auditee “to identify any potential issues that may be an impediment to…undertaking a performance audit”. A Preliminary Scoping Template is then completed.

In some instances, the scoping process has been protracted due to differing expectations of the performance auditors, Sector Managers, the auditee and OAG senior management about the focus of the audits. As part of the move towards closer joint working there have also been successful experiments with “issue analysis” meetings, where Sector Managers, performance auditors and representatives from the auditee have come together to discuss a study proposal and to achieve some consensus of views.

The issue of the scope of performance audits has been raised in external reviews as well as by the PAG themselves following a March 2007 exercise examining the Post Project Reviews for 19 performance audits completed in 2005 and 2006:

“Insufficient scoping being undertaken to understand the sector, and in particular, to establish whether the audit was relevant and well timed or might conflict with other reviews or activities in the sector.”

More positively, one Post Project Review also identified a case where in-depth research and scoping work had paid off, creating a “well focused clear and useful audit”.

On some occasions, issues with project timing have resulted from the scoping process not identifying imminent developments in the audited bodies. For example, a proposed study on tertiary education was deferred from 2004 to 2006 when it was discovered that the New Zealand Qualifications Authority was performing its own review that overlapped with the areas the PAG was expected to cover. During a performance audit of health information, the Ministry introduced a new information strategy which would arguably have been the better focus for the audit.

One potential measure of the soundness of the selection and scoping process for performance audits is the number of audits that are either not started or, ultimately, have to be abandoned. In 2006-07, two topics were deleted from the work programme, while only one (on e-government) was discontinued, as noted earlier.

Suggestions for Improvement

  • The Office of the Auditor-General should consult with the Officers of Parliament Committee and the Finance and Expenditure Committee to determine the value they would see in a more proactive discussion on the Office’s draft performance audit work programme. This would include commenting on the reasons for specific proposals for performance audits or studies not being included in the final programme.
  • The Office of the Auditor-General may wish to discuss with heads of major public entities how the existing consultation arrangements on the Office’s performance audit work programme might be further developed and refined; and, where a performance audit is pursued, whether there would be merit in more direct contact with the relevant Chief Executive during the key design and scoping stages of the audit.
  • The Office of the Auditor-General should undertake an analysis of Post Project Reviews covering performance audit reports published in 2007-08 to determine how the closer involvement of performance audit teams in the scoping of performance audits has paid dividends. The Office should also continue to explore new and innovative ways to understand topics in depth, such as making further use of “issue analysis” meetings, to ensure that it has full knowledge of entities’ plans and vice versa.

Audit Methodology

There are some very positive aspects of the methodology used in the performance audit work of the OAG. Many of the reports make effective use of examples to illustrate good practice and to draw lessons for future delivery. For example, the report on the effectiveness of the ‘Get Checked’ diabetes programme contained good examples of successful local initiatives that could be replicated. Similarly, the roads report contained a useful chapter on good practices in collaboration between local and central government under the title: “What makes collaboration successful”. The reports also reflect the views of service users as well as the auditees, giving the reader a sense of whether or not government services are held in high regard by the people who use them.

Against a backdrop of informative and well-received reports, we offer below some thoughts about possible areas for future development of performance audit methodology, drawing on comments made by other recent external reviewers and examples from the reports we have examined in order to help illustrate the better practice that might be considered. We should also record that the reports examined were prepared under the very arrangements that are currently the subject of OAG scrutiny and reform, and we hope that our observations will assist the OAG in shaping its new approach.

The OAG’s performance reports generally aim to provide assurance about specific issues or programmes and their management by the relevant public entity or entities. They usually focus on what might be termed “assessing performance against expectations”, by testing compliance with pre-determined targets or criteria. Where these are based predominantly on the entity’s own objectives and standards, there is a risk that, without some form of external benchmarking, the audit report may hold entities to modest or undemanding measures that do not encourage the stretching of performance. Here we were struck by the recent comment of an external quality assurance reviewer: “It is important not to trivialise performance auditing by making it a mere compliance report against unilaterally developed management standards.”

We raise this issue as a potential generic risk to the overall quality of performance audits. We are conscious, however, of recent examples where audits have successfully used external benchmarking, such as the Contact Centre report’s appraisal of auditee performance against industry good practice.

When commenting on particular activities, OAG performance reports are limited, at times, to a description of analyses or evaluations that the entities themselves have done, or to recording, for example, what the “council told us” or what the entity “believes to be the case” without offering a conclusion on whether or not the activity is well performed and cost effective. This leaves the reader with potentially unanswered questions, as the following extracts from reports indicate:

“Service Express has the potential to reduce the number of calls that need to be answered by Customer Service Representatives. Fewer calls could lead to savings for the contact centre and shorter waiting times for callers.” (Contact Centre report, paragraph 2.86);

“The Service has not compared its methods with alternatives. Therefore we were unable to form a view about whether it is the most effective and efficient method.” (Customs report, paragraph 2.57); and

“Transit had not formally compared the costs and benefits of collaborative agreements with other network management approaches.” (Roads report, Summary, page 7).

At times, the data simply does not exist within the entity, or elsewhere, to allow the OAG to undertake the necessary analysis, and it would be too resource-intensive for the OAG to generate it independently. For example, the complexity of the different arrangements covered in the roads report would have made it very difficult for the audit team to reach an independent and definitive conclusion on the costs and benefits of the various schemes. As it turned out, the report was only able to describe the financial computations done by the constituent local authorities and Transit.

An entity’s lack of information is itself a potential audit finding. Consequently, the OAG may want to consider being more ready to offer an opinion on whether or not it considers that the entity should have had the particular information available for analysis and decision-making.

The OAG makes use of a range of methodologies in its performance audits, including interviews, document examination, and checks upon processes and procedures. It also undertakes surveys and statistical analyses to provide new, independent insights into an auditee’s activities and performance; for example, the Roads audit involved the collection and analysis of road condition data. From our review of recently published reports, and comments from external reviewers, there is potential for the OAG to include more of this independent research and analysis in its reports, and rather less descriptive content.

The foregoing outcome might be achieved through the greater use of surveys, more external benchmarking of performance, as happened in the Contact Centre audit, or through the commissioning of statistical research or modelling to determine the key contributors to successful performance and/or to examine performance variations.

It may also be useful to demonstrate more formally the reasoning behind the methodologies used in a given report, by setting out in a separate appendix the rationale for, and details of, each method: why, for example, a particular sample size was chosen, how many interviews were undertaken, who was interviewed and at what level, and the purpose for doing so. This relatively straightforward measure would provide a discipline on the audit team to demonstrate and justify their choice of methodology and, in our opinion, give performance audit reports added authority and conviction.

We note that, at times, there is only limited financial analysis in performance reports. More quantification of the costs and benefits of government programmes would help to determine whether entities are choosing the most cost-effective means of achieving their objectives. For example, this might have worked well for the Contact Centre report exploring the relationship between staff costs and service levels, and within this, the financial impact of “Service Express”. Similarly, in the Diabetes report, various stakeholder views were recorded on the adequacy of the $40 fee paid to GPs for each patient check carried out. It might have been useful for the PAG, without questioning the policy, to commission further research into this issue to better understand, for example, the influence that the fee had on GPs to encourage patients to participate in the “Get Checked” programme compared with other factors.

Suggestions for Improvement

  • Where an entity’s performance is assessed against internally set standards, there is a risk that performance audit reports may not encourage the stretching of performance. At the audit planning stage, the PAG should test to see whether the methodologies proposed for the audit will yield sufficient independent benchmarking, research and analysis to provide an objective assessment of the auditee’s activities and performance, including whether the immediate targets and standards that the audited body has set for itself are sufficiently challenging.
  • The Office of the Auditor-General may wish to be more ready to offer an opinion on whether it considers that, where particular information is unavailable, the entity should have had that information available for analysis and decision-making.
  • More financial analysis would be helpful in some reports, particularly those that examine particular programmes and initiatives.
  • More quantification of the costs and benefits of government programmes would help determine whether entities are choosing the most cost-effective means of achieving their objectives.

Presentation in Reports

Overall, we found performance audit reports clear and easy to read. There were examples of particularly useful report structures, such as chapters three to five of the Immigration report which detailed clearly at the front of each section: what that section would cover; what the OAG expected to see; and how the department had performed against these expectations.

The structure of some reports meant that the reader did not always get immediately to the most important findings because there was rather too much descriptive material in the way. The Immigration report contained very incisive points on staff training and backlogs, but these were only covered in detail towards the end of the report. By contrast, the Contact Centre report highlighted its key findings in Part 2 of the document, but this meant that there were very few recommendations stemming from the remainder of the report. That report might have given more prominence to the useful and practical steps to strengthen the performance of the contact centre rather than just having them included in an Appendix.

Our reviews of performance audit reports raised questions at times about why particular issues had not been covered, suggesting that there could be more discussion in the text on what exactly the report was focussing on. For example, there is important commentary in Part 4 of the Customs report about the information currently available on levels and trends in compliance (that is, whether all the customs revenue due is actually collected). The scope of the audit at the front of the report, however, makes reference only to revenue collection arrangements and does not mention compliance issues, leaving the reader potentially unsure about the issues that the report will address. In fact, the answer in this case was shown in the Post Project Review as follows: “We discussed Customs’ comment that they would have liked the audit to have included an assessment of Customs’ capability to collect revenue. We agreed that this is a good topic, but would be an audit of its own.”

The view of OAG and entity staff was that to have covered both collection and compliance comprehensively in this particular performance audit would have made it unmanageably large. It might have been helpful, for reasons of clarity, to have included such an explanation in the report itself.

More summary financial detail in reports would be welcome. For example, while the Roads report did not reach an independent and definitive conclusion on the costs and benefits of the various schemes under review, some simple high level figures in the summary, such as the total cost of state and local road maintenance, would have been helpful.

While, in the main, the recommendations made in performance audit reports are clear and well targeted, it was also put to us that some are more about fine-tuning, or tend to confirm actions already planned or put in train by the entity in question. In addition, previous external reviewers, and this present review, have found some recommendations to be rather generalised in nature. While the purposes of the performance audit reports are different to those of Peer Reviews, we were conscious of this issue in making our suggestions for improvement. We were also conscious of the results of the PAG’s own March 2007 review of Post Project Reviews, which reported (in the context of follow-up audits) that some “original recommendations had not been framed in a way to allow easy auditing and had to be re-formulated for the follow up.”

Steps to sharpen recommendations include a recent amendment to PAG procedures to emphasise that recommendations should be significant and targeted. To these qualities might be added that of being measurable: setting out wherever possible clear timescales for action by the entity, and defining the criteria against which the impact of a recommendation should be assessed. To assist this approach, audit teams would benefit from practical examples of what significant, targeted and measurable recommendations look like under a range of different circumstances. Perhaps, too, there are lessons to be learned from those teams planning follow-up reports; in principle, good recommendations should be capable of “easy auditing” at some future date.

Making sound and helpful recommendations can be difficult and, at the time of writing, the UK National Audit Office, for example, was itself giving considerable thought to how the recommendations in its own value-for-money reports could be improved. The following five criteria, drawn from the views of UK National Audit Office staff, describe what makes a good recommendation. Recommendations should be:

  • logically linked to the findings of a study;
  • costed and time-related;
  • specific in the proposed action and clear in the outcome or benefits;
  • based on sound evidence; and
  • carefully targeted at the appropriate person/people.

The reports might also benefit from more frequent use of graphical presentations. Where graphics are used at present, they mainly set out management structures or information flows, whereas they could be an effective way of both summarising performance and financial information in the introductory sections of reports and describing the results of analyses. For example, the Immigration report might have benefitted from a simple graphic early in the report to illustrate how many people apply and are accepted/rejected under the different immigration streams covered in the report. Figures in the Contact Centre report demonstrated very well how graphics can contribute to analysis and reader understanding of the issues in the report.

Suggestions for Improvement

  • The structure and content of reports could be altered to reduce descriptive content and give more prominence to key findings and analysis in direct support of recommendations.
  • Reports could include more explicit explanation of their scope.
  • In addition to the recent new instructions to performance audit teams that recommendations should be significant and targeted, we would add that they should also be measurable: setting out wherever possible clear timescales for action by the entity, and defining the criteria against which the impact of a recommendation should be assessed.

Capacity and Capability

The PAG has around 15 audit staff reporting to a Manager and an Assistant Auditor-General. Typically, performance audits are carried out by a small team of OAG staff (1-3 personnel) drawing in expertise from other areas as necessary, including staff from Audit New Zealand and, within that, from the Special Assurance Services. Where more specialised expertise is required, the OAG will commonly draw upon the advice of outside experts. For example, the “Get Checked” study on diabetes drew upon the advice of an eminent university professor in that field of medicine.

The question of matching skills and experience to performance audits has been raised by various parties. The March 2007 review of performance auditing processes by the OAG’s Accounting and Auditing Policy QA team recommended that the PAG demonstrate more clearly at the outset that teams have the necessary skills and competence to perform the audit and identify what specialist advice or expertise should be sought. In response, in April 2007, a new template form was introduced to provide an explicit focus on staffing issues as part of the detailed audit planning.

More generally, in our discussions with entities, there was a perception that the OAG had faced some challenge in developing its capacity to match the significant increase in resource availability in 2004 and the related increase in the number of performance reports produced annually. As with other parts of the OAG, the capacity to recruit and retain staff also remains a challenge, as we have noted elsewhere in this report. Our review of Post Project Reviews showed that core teams on performance audits are often small, leaving audits potentially vulnerable to a loss of continuity should a staff member leave the team, and limiting the methodological depth that can be achieved. The OAG might consider reviewing staffing levels for performance audits, looking at issues of capacity and continuity.

Where particular skills are not available in-house, the OAG may need to buy these in from contractors. Beyond the conventional range of studies, the OAG has in the last few years tackled more technical subjects, most recently the New Zealand Debt Management Office. In recognition of the specialist nature of the issues at hand, this study was contracted out to KPMG. This required a different set of skills around the management of a contractor and, in keeping with good practice, the lessons learned from this exercise have been captured in the form of a detailed internal memorandum on the subject (Contracting out a performance audit: lessons and materials: January 2007).

Alongside immediate staffing issues there is also a question about whether the balance is right between the resources invested in the design and conduct of a performance audit and the amount of effort put into its oversight and review. As identified earlier, a performance audit team is typically small (1 to 3 personnel); yet there can be many times that number of personnel involved in reviewing the same report. In our own research, it was put to us that a performance report can be exposed to as many as 20 people during the various review stages.

For each performance audit there is a Project Steering Committee, the purpose of which is to ensure that governance and risk management issues are addressed effectively. To achieve this, each Committee consists of a wide range of OAG individuals including senior members of the PAG and other senior staff from within the OAG. The March 2007 review by the Accounting and Auditing Policy QA team found difficulty, however, in assessing how the composition of Committees had been determined and why the members selected were considered to be the “best people” for the particular performance audit.

Our examination of Post Project Reviews identified some positive comments about the value of the contributions made by individuals on Project Steering Committees. More generally, however, there was a view amongst OAG staff that these Committees can constitute an over-engineered and time-consuming process (with elaborate planning necessary to allow large numbers of staff to come together, rather than to necessarily meet the timing needs of the audit). External reviewers have also commented that the very numbers involved risk a dilution of responsibility, encouraging a sense that no one in particular is accountable. This raises questions about whether the resources involved are being used as effectively as they could be in producing the required outcomes.

Work continues on reforming this process, in particular potentially to replace the sometimes large and unwieldy Project Steering Committees with a much smaller, sharply focussed governance team of three senior OAG staff, but it is not yet entirely clear what role this team will play between high-level governance and more day-to-day management of the study. The OAG is piloting this new model on three audits as part of wider improvements from the Performance Audit Process Improvement Project, before deciding what changes can be made to make project management and project governance more efficient.

Suggestions for Improvement

  • With 14 performance audits to deliver in 2007-08 and a core audit staff of around 15, the Office of the Auditor-General might consider reviewing staffing levels for performance audits, looking in particular at continuity and at its capacity to do justice to the range and number of audits tackled each year.
  • The Office of the Auditor-General should investigate whether a better balance can be struck between the effort and resources committed to the conduct of a performance audit and that committed to its review.

Quality Assurance

The OAG uses a range of internal management arrangements to oversee the quality of audit reports and to monitor performance against time and budget constraints, including:

  • a Project Steering Committee which meets every 4-6 weeks to review the progress of the audit;
  • a substantiation review conducted by an independent member of the PAG to confirm there is evidence in the working papers to support the statements made in the draft report;
  • a formal peer review of the report again using an independent member of the PAG;
  • a ‘fatal flaw’ review of the draft report conducted by an independent member of the OAG. This happens after the substantiation and peer reviews take place, and once the audited entity has commented on the draft report;
  • two edits of the draft report by the Reports and Communications Group;
  • scrutiny by the OAG’s Accounting and Auditing Policy QA team;
  • involvement of the OAG management team, as members of the Project Steering Committee for individual performance audit reports or as fatal flaw reviewers; and
  • a review by the Auditor-General as the ultimate arbiter of, and authority for, the content and quality of OAG reports.

We are conscious of various positive comments in Post Project Reviews about the value added through the application of these processes, and plainly they represent an important and thorough assurance process. For example, the New Zealand Qualifications Authority performance audit team found it advantageous to have people with no previous involvement with the topic, and also found their comments very useful. Our own interviews with OAG staff and comments from other external reviewers have revealed, however, some questioning of procedures such as substantiation and whether, given the effort currently required (substantiation can take several days) and the level of assurance obtained, they represent the most effective way to safeguard audit quality.

As part of the current Performance Audit Process Improvement Project, an assessment is being made of the added value associated with each stage of the performance audit process, including the review mechanisms detailed above. For instance, this work has raised a question about whether, with revised oversight arrangements, there is a need for a separate peer review and fatal flaw review.

To give the processes more impact, we would endorse the suggestion reported in the Post Project Review of the audit of the New Zealand Qualifications Authority that there should be provision for substantiation, peer and fatal flaw reviewers to be present at Project Steering Committee meetings (or those of their successor body) to provide for more animated discussion and debate on the quality of the audit under review. As part of the improvements following on from the Performance Audit Process Improvement Project, the OAG will be piloting a more active ‘fatal flaw’ review at the draft report stage of the audit. This will involve the governance team referred to above, assisted by relevant specialists from within the OAG such as the Sector Manager, Substantiator and Peer Reviewer, questioning the audit team on the draft report with regard to areas of risk and likely challenge.

In addition to the standard quality control regime that applies to all performance audits, the OAG has in place a comprehensive system of regular external quality assurance processes. The results of these reviews represent one of the key performance measures in the Annual Plan for 2007-08. These include peer reviews undertaken by academic reviewers and other audit offices; in particular, a reciprocal arrangement with the Australian National Audit Office whereby two OAG performance reports are reviewed every two years. In 2007, as referred to elsewhere, the processes underpinning the performance audit function were subject to scrutiny by the OAG’s own Accounting and Auditing Policy QA team. There was also a separate internal review by the PAG of the results of 19 Post Project Reviews.

In our judgement, based upon our own research and interviews, we would endorse the findings of the various quality assurance processes and reviews (from which we have quoted throughout this Section of our report). These have shown the many measures that are working well across the OAG’s performance audit function, and have also signposted ways in which further improvement can be secured.

A key part of the quality assurance process is that there should be an independent means to check what actions have been taken. Each June the PAG is required to report to the OAG’s Audit and Risk Committee on the outcome of the internal and external reviews and on the actions taken by the Group in response. In addition, an important step in recent years has been the development of a mechanism whereby, instead of the reports for external review being selected by the Group, this is now done in consultation with the Committee, which is also involved in the decision on who should conduct the reviews.

In response to a request from the OPC, the Auditor-General has, since 2004, provided the Committee each year with a performance audit evaluation report commenting on the impact of three performance audits selected by the Audit and Risk Committee for internal review by the PAG. This is designed to allow the Committee to assess the contribution made by the performance audit function to the work of the Office, focussing upon:

  • the desired impact of the performance audits;
  • Select Committee feedback on reports; and
  • external peer review and feedback.

The Annual Stakeholder Feedback exercise offers an important source of independent information about the regard in which the Office is held by Select Committees, Treasury and the New Zealand Society of Local Government Managers. In 2007, the exercise was based on 16 interviews, and asked two main questions to determine stakeholders’ satisfaction overall with performance audits, and how useful they felt they were.

While these interviews are very useful information for the Office, there is scope to widen the criteria and related questions in the Annual Stakeholder Feedback survey to garner views on specific aspects of the OAG’s performance auditing activities, for example, the coverage of topics, and the quality of recommendations.

The Stakeholder Feedback Interviews make a number of general points about the Office’s activities that may have a bearing on the design of the future performance audit work programme including the need to:

  • concentrate on the nub of issues, providing a clearer analysis of financial trends from year to year and alerting the Select Committees to future capital risks; and
  • clarify the steps it is taking to improve the performance of poorly performing departments.

At present, much of the quality assurance effort is focussed towards the end of an audit or, indeed, after an audit report has been published. We consider there would be merit in targeting more of this resource at earlier points in the audit lifecycle. Resources currently absorbed in the external QA reviews of published reports could be redeployed to instigate an external review process of reports at a draft stage. External review might also be introduced at the planning stage of a performance audit (beyond the internal peer review already in place) to offer some independent assessment of audit scope and whether methodologies proposed are fit for purpose.

Suggestions for Improvement

  • At present, much of the quality assurance effort is focussed towards the end of an audit or, indeed, after an audit report has been published. We consider that there would be merit in targeting more of this resource at earlier points in the audit lifecycle. This, we feel, could bring further depth and rigour to the scope and design of a performance audit and would provide a useful “challenge mechanism” to test the range of methodologies and approaches planned for an audit.

Implementation of Report Recommendations

The effectiveness of performance audit is largely dependent on the effective implementation of its recommendations.

Acceptance of, and the responses made by, entities to recommendations in performance audit reports are shown as a key performance measure in the Auditor-General’s Annual Plan for 2007-08.

At the point when a performance audit report is tabled in Parliament, the OAG offers a briefing to the Finance and Expenditure Committee or the relevant Select Committee. This is sometimes followed by a hearing of the particular Committee, at which evidence is taken from the public entity or entities in question. What happens then depends upon: the reaction of the entity itself; whether a Select Committee hearing was held; and how the OAG itself plans to follow up its work. Unlike in some other jurisdictions, there is no formal requirement for entities to respond to recommendations made by the OAG.

The principal mechanism for follow-up is in practice the briefing provided to the Select Committees in advance of the particular entity’s Estimates or Financial Review hearing by the relevant OAG Sector Manager. This is plainly an important accountability mechanism, but it relies for its effectiveness on the strength of the briefing provided. To reinforce this process, the OAG may wish to explore with Select Committees whether a more formal process should be introduced requiring the entity to set out its response to the recommendations made, against which its subsequent actions can be assessed. This would strengthen accountability by placing a clear obligation on public entities to take necessary action. (As part of its programme of reviews in 2008, the OAG’s Accounting and Auditing Policy QA team is considering looking at the implementation of the recommendations in performance audit reports.) It would also apply a discipline upon OAG performance audit teams to ensure that their recommendations were practical and capable of implementation.

There is also the annual performance audit evaluation to the OPC on the outcome of three reports. This is an important mechanism because it provides structured feedback on the impact of the reports and the actions planned in response. It covers, however, only three reports and it is not clear what actions follow by way of obtaining feedback from the Committee about their views on what is shown. We suggest that the OAG might wish to brief the Committee about the outcome of this quality assurance process to provide for a more proactive dialogue.

Suggestions for Improvement

  • There is scope to strengthen the processes for following up the recommendations made in performance audits, for example by public entities being invited to set out formally their immediate response to the Office of the Auditor-General’s recommendations, and at a later date, for the Auditor-General to review the steps they have taken to implement those recommendations. This would strengthen accountability by placing a clear obligation on public entities to take necessary action. It would also apply a discipline upon the Office of the Auditor-General’s performance audit teams to ensure their recommendations were practical and capable of implementation.
  • There may be a case to enhance the annual Performance Audit Evaluation Report to the Officers of Parliament Committee, for example, by increasing the number of audits covered in the evaluation and, if the Committee so wished, for the Office of the Auditor-General to make a presentation to them in support of the Report.

Follow-up Audits

Beyond the follow-up of recommendations, the OAG also publishes a small number of follow-up reports revisiting earlier performance audits. These can be prompted by the OAG itself or by a request from another party, for example a Select Committee.

Such reports represent an important accountability mechanism allowing the OAG and Select Committees to revisit subjects of continuing concern, or where there is doubt about whether recommendations from a previous performance audit have been taken sufficiently seriously. It was put to us by the OAG that a more immediate way to enhance the impact of the performance auditing function would be to make sure that there are effective mechanisms to ensure that all recommendations are acted upon, with follow-up reports being a useful additional device if needed. (Inevitably, only a limited number of follow-up performance audits are conducted each year, given the many other new and important areas of expenditure for the OAG to examine.)

At a practical level, we note that the PAG’s March 2007 review of Post Project Reviews identified some issues around the scope of follow-up audits and the difficulty of managing the auditee’s expectations about what might, or might not, be covered. This included cases where the situation had moved on from the original audit, meaning that a tightly focussed follow-up could not readily address current issues. As mentioned earlier, this review also highlighted some difficulties where the “original recommendations had not been framed in a way to allow easy auditing and had to be reformulated for the follow up.”

6.3 Good Practice Guides

In addition to performance audits, the OAG has also published a small number of good practice guides. These are produced by different Groups within the OAG. In recent years there have been several guides focusing on the proper handling of conflicts of interest in both the central and local government arenas, as well as, for example, guides on Controlling Sensitive Expenditure and Local Authority Codes of Conduct.

Good practice guides were well received by stakeholders, in particular by senior staff in entities. They were seen as a useful and appropriate activity for the OAG to be engaged in, with little perceived risk of a conflict of interest arising from the Auditor-General acting as, in effect, both standard-setter and post-hoc auditor. There was also a general recognition that the authority of such guides could be enhanced where they can be produced jointly with other bodies such as the New Zealand Treasury and the State Services Commission.

In our experience, good practice guides have become a common output of audit offices generally, representing a useful way of communicating wider messages arising from their audit work. Perhaps inevitably, some commentators saw the guides as being too general in nature while others felt they were too detailed. Good communication with, and the involvement of, relevant entities would tend to minimise such difficulties.

Suggestions for Improvement

  • When preparing good practice guidance, the Office of the Auditor-General should consider whether such guidance, in some circumstances, would carry added weight and authority if it were jointly prepared and published with other bodies, including the Treasury, the State Services Commission, or the New Zealand Society of Local Government Managers.

6.4 Performance Reporting

Summary performance information on the performance audit function is included in the Auditor-General’s Annual Plan and subsequent Annual Report. These include short commentaries on the results of the Stakeholder Feedback Interviews and on the various internal and external quality assurance reviews in the period. There is also a section entitled “Progress against our Annual Plan” which lists the reports completed as well as providing explanation of any variations - mainly audits that have been removed or deferred.

6.5 Inquiries

Under Section 18 (1) of the Public Audit Act 2001, the Auditor-General has the power to inquire into any matter concerning a public entity’s use of resources, either in response to requests from Select Committees, Members of Parliament, Councils or concerned citizens or groups, or on his own initiative.

May 2006 saw the introduction of an Inquiries Manual designed to rationalise processes and procedures across the OAG, centred on good practice established in earlier inquiries; in particular, to provide a consistent decision-making framework for handling requests and, where an inquiry is initiated, to indicate how it should be conducted.

Ultimately, it is up to the Auditor-General and his Leadership Team to judge the appropriate course of action in response to requests for inquiries; whether to proceed and in what ways. The Inquiries Manual represents an important decision-making tool and is an important means for the Auditor-General to demonstrate, with clear and comprehensive documentation, that processes have been diligently and consistently applied.

Requests for inquiries are classified by sector; termed “ratepayer” for local authorities and “taxpayer” for central government. (Requests from Members of Parliament are logged separately.) In 2006-07, the OAG received 250 requests (compared with 228 for the previous year) as follows:

  • 72 Taxpayers;
  • 169 Ratepayers; and
  • 9 Members of Parliament.

Of those, 173 were dealt with informally, where the OAG judged that the request was either not in its power to examine, or that it was a straightforward issue typically dealt with by simple correspondence or telephone call. 77 requests were, however, the subject of a formal inquiry with 76 classified as “routine” and one as “sensitive”. There were no “major” inquiries in 2006-07 (compared with seven in the previous year).

There are clear processes for monitoring the progress of inquiries. In particular, a report is provided to each Leadership Team meeting on outstanding inquiries and the timeliness of dealing with inquiries, so that progress against performance criteria, as defined in the Annual Plan, can be monitored.

As with performance audits, the Auditor-General’s Annual Report for 2006-07 includes a section showing details of the OAG’s performance in handling inquiries; recording, in particular, the timeliness of the Office in responding to requests for inquiries, and for their subsequent completion. Details of the costs of inquiries were not immediately available.

Inquiries are carried out by different OAG Groups (primarily Parliamentary and Local Government) with support often provided by Audit New Zealand and its Special Assurance Services. The PAG also plays a role either in directly undertaking an inquiry, or more usually, in providing support to inquiries undertaken elsewhere in the Office.

In these circumstances, where inquiries are conducted by different OAG Groups with, inevitably, different management styles and approaches, the new Inquiries Manual is a particularly important means to help ensure consistency.

In practice, it is often the relevant Sector Manager who has prime responsibility for the conduct of an inquiry, but for larger inquiries a discrete team may be necessary. For inquiries that take fewer than 15 hours, there is a general time code under each category of inquiry (that is, Ratepayer, Taxpayer, Parliamentary) with the costs absorbed within the general budget of the relevant Group. Larger inquiries, however, are assigned their own time code for recording the resources used.

As with the OAG’s performance audits, the 2007 Stakeholder Feedback Report found 100 per cent of respondents satisfied with the Office’s handling of inquiries, compared with 75 per cent in 2006. While this is plainly a very important body of stakeholders, it is also a relatively small cross-section of opinion, particularly when many inquiries are prompted by individual citizens. While the inquiries system will not always meet the expectations of often concerned or aggrieved parties, the OAG may nevertheless want to develop the means to canvass a wider body of views, and ask a wider set of questions about the quality of its inquiries work, upon which further development and refinement to the inquiries process can draw.

With the exception of the inquiry into election advertising, which because of the unique constitutional circumstances we do not comment upon here, our interviews with MPs, Select Committees and public entities found a high level of satisfaction with the OAG’s handling of inquiries. No substantive concerns were raised, only the occasional questioning of whether all the subjects really warranted the scrutiny of the OAG.

We reviewed a selection of inquiries from 2006-07, although with 76 out of 77 classed as “routine” and no major inquiries during the year, the evidence base was relatively sparse.

In 2006-07, there were, however, two specific quality assurance exercises, which provided a very useful body of evidence. We are also conscious that in October 2007, shortly after our visit, the OAG published a report on a major inquiry into the proposed new sports stadium in Dunedin.7 Given the report’s recent publication, we have not sought to judge its content or quality but have reviewed it to see whether there are any useful similarities or differences to highlight between this type of document and performance audit reports.

The first quality assurance exercise was a Review conducted in July 2007 by an independent consultant who was also responsible for the November 2006 Governance Review of the Office. This focused on two major inquiries initiated and completed in 2005-06 and involved consideration of the extent to which those inquiries followed the broad approach established in the Inquiries Manual. These inquiries examined Housing New Zealand and Christchurch City Council.

The Review commented favourably on the interview and consultation processes adopted in one of the inquiries and concluded overall that nothing emerged from the exercise to suggest any deficiency in the outcome of the two inquiries as a result of the processes each followed. More generally, the Review made a number of recommendations about areas of the Inquiries Manual that might need strengthening, for example, in respect of the need to establish early in an inquiry the expectations and benchmarks that determine the standards against which the actions of public entities are to be assessed. The Review also raised the issue of the importance of compliance with the Manual, and the proper recording of key decisions (for example, in respect of the decision to hold an inquiry, the approval of terms of reference, and completion of the QA processes).

Major inquiries are often conducted against tight timescales, driven by the expectations of Parliament and the public. The Review raised the importance of an early and realistic assessment of the trade-off between the time needed to complete the inquiry processes and the time pressure facing the inquiry.

A second review, which reported in August 2007, was conducted by the OAG’s own Accounting and Auditing Policy QA team. This included an appraisal of the Inquiries Manual itself, together with an assessment of how well the procedures set out in the Manual were applied in practice, based on a sample of 20 inquiries completed in 2006-07 and judged by the OAG to be representative of inquiries more generally. There was only one major inquiry during the period, on Parliamentary Advertising Expenditure. We consider, however, that given the uniqueness of the subject and its related high public and political profile, this report does not represent a typical inquiry.

The review concluded that appropriate systems and controls were in place. An assessment against 10 high level risks (in particular a loss of credibility and reputation, and a loss of stakeholder confidence), found that the likelihood of each was low, on the basis of the identified key controls in place. The review did, however, identify a need in some inquiries for more documentation to positively demonstrate compliance with the Manual. Where an inquiry was not instigated, the review concluded that a clear reason had been given to the correspondent.

While we recognise the importance of proper processes being evidenced against the Inquiries Manual, it may be an unnecessary duplication, particularly in smaller inquiries, to separately document the rationale for decisions when these are clearly recorded in the final response.

It is apparent that many of the audit processes and disciplines are the same for inquiries and performance audits. For example, for each there is an internal peer review and substantiation and fatal flaw reviews. As a general principle, both need to conform to the same Auditing Standards. While they are plainly different products, this does raise the question of whether the two disciplines can learn from one another and whether there is more scope for knowledge transfer between the different Groups. This would include consideration by the OAG of the applicability of the various suggestions for improvement identified for performance audits earlier in this Section.

When developing the forward programme of performance audits, Sector Managers and the PAG should consider whether any topics lend themselves to an inquiry-style approach. For instance, the shorter timescale of inquiries and the need for the OAG to provide a clear expression of opinion require disciplined and focussed evidence gathering around clearly defined issues.

The Dunedin Stadium report was in the form of a 17 page letter, considerably shorter than the average performance audit report. By including some of these shorter studies, the PAG would be able to offer a wider range of products and increase the flexibility of its approach. This would also be in keeping with the reported wishes of Select Committees for a more varied diet of small, medium and large performance reports produced more quickly.

Suggestions for Improvement

  • The Office of the Auditor-General may wish to develop the means to canvass a wider body of views concerning satisfaction with the handling of inquiries, and ask a wider set of questions about the quality of its inquiries work, in order to further develop and refine the inquiries process.
  • Consideration should be given to strengthening areas of the Inquiries Manual, for example in respect of the need to establish early in an inquiry the expectations and benchmarks that determine the standards against which the actions of the public entities are to be assessed.
  • The Performance Audit Group should consider whether any topics for future audit lend themselves to an inquiry-style approach. By including some of these shorter studies the Group would be able to offer a wider range of products and increase the flexibility of its approach.

6: The six performance audit reports in question were:

  • Ministry of Social Development: Performance of the Contact Centre for Work and Income, December 2006.
  • NZ Qualifications Authority: Monitoring the Quality of Polytechnic Education, May 2007.
  • Ministry of Health and District Health Boards: Effectiveness of the ‘Get Checked’ diabetes programme, June 2007.
  • Assessing Arrangements for Jointly Maintaining State Highways and Local Roads, June 2007.
  • New Zealand Customs Service: Collecting Customs Revenue, June 2007.
  • Department of Labour: Management of Immigration Identity Fraud, June 2007.

(We also looked at a seventh performance audit report - that on the New Zealand Debt Management Office. This was an audit undertaken by a contractor and we were interested to see whether there were any wider learning points from the experience of a contracted-out performance audit.)

7: Inquiry into Dunedin City Council and Otago Regional Council’s Funding of the Proposed Stadium under cover of a letter from the Auditor-General to Mayor Peter Chin and Chair Stephen Cairns, dated 24 September 2007.
