Part 2: Outcomes report

Annual Report 2003-04

What we set out to achieve in 2003-04

Our key objectives for 2003-04 in relation to our Statement of Intent (as set out on pages 26-30 of our Annual Plan 2003-04) were:

1. As a result of our work, to maintain and/or improve our desired impacts. Our desired impacts are:

  • improved public entity operations;
  • public entities that act legally, ethically and with probity; and
  • parliamentary control of expenditure.

2. To improve our own development, measurement and reporting of outcomes; specifically:

  • better define our outcomes; and
  • determine how we can collect reliable and relevant data and information to truly measure our outcomes.

Our outcome framework

We prepared our Annual Plan 2003-04 in keeping with the “Statement of Intent” model. We acknowledged, however, that

… the formulation of our Annual Plan is ongoing and we expect to achieve a steady increase in its quality over the coming year. We will be putting more work and effort into defining our impacts, and into how we can collect reliable and relevant data and information so that we can truly measure the difference we make.

During the development of our Annual Plan 2004-05, we reviewed our outcome framework. As a result, the outcome framework detailed below differs slightly from that contained in our Annual Plan 2003-04. However, our outputs remain unaltered.

Figure 1: Our outcome framework.

Our measurement framework

As part of the development of our Annual Plan 2004-05, we also reviewed our measurement framework. This review considered the measures we need to assess the extent of our progress with, and our performance in, the following areas:

  • Outcomes
  • Outputs
  • Governance
  • Risk
  • Capability
  • Strategic Plan implementation.

In addition, Parliament requested that we develop, in consultation with the Treasury:

  • an evaluation framework for measuring the impacts of our performance audits, and
  • a three-year evaluation framework for measuring the success of our Strategy (this is not due until 2007).

As a result, we now have a comprehensive measurement framework which is being integrated into our monitoring and reporting framework. In this Annual Report, we start to use parts of our new measurement framework to provide you with better information about the performance of the Office. In next year’s Annual Report, we will fully report using our new measurement framework.

Measuring our impacts

Our Annual Plan 2003-04 set out the impacts of our work as contributions to:

  • improved public entity operations;
  • public entities that act legally, ethically and with probity; and
  • parliamentary control of expenditure.

It also set out our proposed measures and indicators, noting that these were:

… our first attempt at defining how our impacts will be identified. We will continue to refine the measures and indicators over the coming years to determine those that are the most appropriate and useful.

For the purposes of this Annual Report, we report against the measures set out in our Annual Plan 2003-04. These measures were refined during the development of our five-year Strategic Plan and Annual Plan 2004-05.

Impact 1: Improved public entity operations (Output Classes D1, D3 and D4)

Intervention logic

If we assess entity operations, and then tell relevant stakeholders, then there is a greater likelihood that we will continuously improve the:

  • sufficiency and appropriateness of public entity information systems;
  • effectiveness and efficiency of entity operations; and
  • accuracy and adequacy of entity reporting.

Measures

1. Improvements in aspects of entity management, as measured through our assessments (including local government in the future).
2. The level of uptake of formal recommendations we make in our reports.
3. Others taking action as a result of our work (including adoption of our advice by Select Committees in their own reports).
4. Fewer non-standard audit reports being issued over time.

Measure One: Improvements in aspects of entity management, as measured through our assessments

Since 1994, we have analysed trends in the assessments our auditors make each year for five particular aspects of financial management and service performance management. These are known as “the five management aspects”, and each is assessed as Excellent, Good, Satisfactory, Just Adequate or Not Adequate. The five aspects assessed are:

  • Financial control systems
  • Financial management information systems
  • Financial management control environment
  • Service performance information systems, and
  • Service performance management control environment.

The purpose of these assessments is to identify specific areas of management where there are weaknesses, and to make recommendations to eliminate those weaknesses. For each entity, we report these assessments to the Chief Executive and Responsible Minister, and also to the Select Committee which conducts the financial review of the entity. If our recommendations are adopted, then we would expect to see an eventual overall improvement in entity management.

Until the 1999-2000 financial year, these assessments related only to government departments. They were then extended to include District Health Boards, Crown Research Institutes and State-owned Enterprises.

It was our intention to review the five management aspects in 2003-04 and to assess their applicability for local government. This review was deferred and has now been incorporated in our proposed Research and Development Plan. It is likely to take place in 2005-06.

Results

The first table below summarises the results of this analysis, comparing assessments of the five management aspects in 2002-03 with those in 2001-02. The second table repeats the results reported last year (2001-02 compared with 2000-01), for comparison.

CHANGES IN MANAGEMENT ASPECT ASSESSMENTS 2002-03 COMPARED WITH 2001-02

Entity type                 Unit of measure   Higher   Same    Lower   Total
Government Departments      Number            13       199     3       215
                            %                 6.0      92.6    1.4     100.0
District Health Boards      Number            8        89      8       105
                            %                 7.6      84.8    7.6     100.0
Crown Research Institutes   Number            0        43      1       44
                            %                 0.0      97.7    2.3     100.0
State-owned Enterprises     Number            3        57      3       63
                            %                 4.8      90.5    4.8     100.0
TOTALS                      Number            24       388     15      427
                            %                 5.6      90.9    3.5     100.0

CHANGES IN MANAGEMENT ASPECT ASSESSMENTS 2001-02 COMPARED WITH 2000-01

Entity type                 Unit of measure   Higher   Same    Lower   Total
Government Departments      Number            14       187     9       210
                            %                 6.7      89.0    4.3     100.0
District Health Boards      Number            5        86      14      105
                            %                 4.8      81.9    13.3    100.0
Crown Research Institutes   Number            2        41      2       45
                            %                 4.4      91.1    4.4     100.0
State-owned Enterprises     Number            6        51      6       63
                            %                 9.5      81.0    9.5     100.0
TOTALS                      Number            27       365     31      423
                            %                 6.4      86.3    7.3     100.0

Commentary on results

Overall, there has been a net improvement in the assessments of the five management aspects from 2001-02 to 2002-03 (i.e. there are 24 higher assessments and 15 lower). This would tend to indicate that the quality of management in public sector entities has improved – albeit marginally.
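
The comparison above is, in essence, a tally of how many assessments moved up, stayed the same, or moved down on the five-point scale between the two years. A minimal sketch of that tally in Python follows; the entity names, aspects and ratings in it are illustrative only, not actual assessment data.

    # Tally year-on-year movement in management aspect assessments.
    # The five-point scale runs from worst to best; the data below is illustrative only.
    SCALE = ["Not Adequate", "Just Adequate", "Satisfactory", "Good", "Excellent"]
    RANK = {rating: i for i, rating in enumerate(SCALE)}

    # (entity, aspect) -> rating for each year (hypothetical sample data).
    prior = {
        ("Department A", "Financial control systems"): "Good",
        ("Department A", "Service performance information systems"): "Satisfactory",
        ("Department B", "Financial control systems"): "Just Adequate",
    }
    current = {
        ("Department A", "Financial control systems"): "Good",
        ("Department A", "Service performance information systems"): "Good",
        ("Department B", "Financial control systems"): "Not Adequate",
    }

    counts = {"Higher": 0, "Same": 0, "Lower": 0}
    for key, old_rating in prior.items():
        new_rating = current[key]
        if RANK[new_rating] > RANK[old_rating]:
            counts["Higher"] += 1
        elif RANK[new_rating] < RANK[old_rating]:
            counts["Lower"] += 1
        else:
            counts["Same"] += 1

    total = sum(counts.values())
    for movement, n in counts.items():
        print(f"{movement}: {n} ({100 * n / total:.1f}%)")
    print("Net change:", counts["Higher"] - counts["Lower"])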

Within each class of entity, we observe:

  • Government Departments continue to have a net improvement, with fewer assessments being lower than in the previous year’s analysis.
  • District Health Boards appear to have stabilised, with no net change. This compares favourably with the previous two years’ analyses (where a net worsening of assessments had occurred in relation to the two service performance aspects).
  • Crown Research Institutes remain at similar levels to the previous year’s analysis, with negligible change.
  • State-owned Enterprises have no net change, which repeats last year’s analysis and indicates that the quite significant improvements made in 2000-01 continue to be sustained.

Measure Two: The level of uptake of formal recommendations we make in our reports

Measuring the extent to which entities “take up” the recommendations we make in our reports as part of annual audits provides an indication of the effect we are having on improving entity management practice. We have assessed this by:

  • sampling Audit New Zealand’s top 30 entities by fee and by region.

Note: We did not report on the level of uptake of formal recommendations made in our reports of annual audits for 2002-03. The results for 2003-04 therefore establish a benchmark.

Results

LEVEL OF UPTAKE OF OUR FORMAL RECOMMENDATIONS (Annual Audits)

                                                               Number    %
Total number of recommendations (from sample of 30 entities)      228    100
Number of recommendations accepted                                 110    48
Number of recommendations rejected                                  17     8
Number noted or under consideration by management                   60    26
Number where client has made no response                            41    18

Commentary on results

This suggests that, based on a sample, almost half of the recommendations we made in our management reports on annual audits over the 2003-04 year were accepted, with a further quarter either noted or under active consideration by management. Less than 10% of our recommendations have been rejected by management.

We expect to further develop this measure over the 2004-05 year, and include data from our other audit service providers (ASPs). We may also consider relevant benchmark data from our Australian counterparts, if available.

Measure Three: Others taking action as a result of our work (including adoption of our advice by Select Committees in their own reports)

We measure “take-up”; i.e. the extent to which Select Committees use the advice in the briefing materials we give them – not only to review public entities, but also to report and make recommendations to the Government on issues. We assess this by reviewing a sample of reports provided to Select Committees and identifying the number of issues we suggested that are referred to in the Committees’ reports to the House.
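
One plausible way to compute these figures is to record, for each briefing report in the sample, how many issues we suggested and how many the Committee referred to, then derive per-report rates and an overall rate. The Python sketch below illustrates that approach under this assumption; the report data is hypothetical.

    # Take-up of suggested issues in Select Committee reports (hypothetical data).
    # Each tuple is (issues suggested in our briefing, issues referred to in the Committee's report).
    reports = [(6, 4), (3, 3), (9, 3), (5, 5)]

    per_report_rates = [taken / suggested for suggested, taken in reports]
    overall_rate = sum(taken for _, taken in reports) / sum(suggested for suggested, _ in reports)

    print(f"Reports reviewed: {len(reports)}")
    print(f"Proportion of suggested issues taken up: {overall_rate:.0%}")
    print(f"Range of take-up rates: {min(per_report_rates):.0%}-{max(per_report_rates):.0%}")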

Results

The following table indicates the rate of “take-up” for the 2003-04 year. The previous year’s take-up rate is also included for comparison.

TAKE-UP RATES IN 2003-04 COMPARED TO 2002-03

                                            2003-04    2002-03
Number of reports reviewed                       19         33
Proportion of suggested issues taken up         74%        72%
Range of take-up rates                      33-100%    33-100%

Commentary on results

We consider a take-up rate of 74% to be satisfactory. The briefing reports reviewed relate to briefings on Annual Financial Reviews and on Estimates across a range of entities and a range of Select Committees. Our briefings, in addition to advice on subject areas, include a series of questions that the Committee may wish to pose to the entity concerned. In general, the Committee will request the entity to respond in writing to the questions posed. This provides the Committee with valuable material in considering the entity.

Measure Four: Fewer non-standard audit reports being issued over time

A “non-standard audit report” is one issued in accordance with the Institute of Chartered Accountants of New Zealand Auditing Standard No. 702: The Audit Report on an Attest Audit (AS-702). It contains:

  • A qualified audit opinion (i.e. a “disclaimer of opinion”, an “adverse” opinion or an “except-for” opinion); and/or
  • An explanatory paragraph.

A full definition of a “non-standard audit report” is set out in our report Central Government: Results of the 2002-03 Audits (parliamentary paper B.29[04a], 2004, pages 35-37).
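
As a rough illustration of this classification (not a substitute for the AS-702 definition), a report can be flagged as non-standard if its opinion is qualified and/or it carries an explanatory paragraph. The Python function and examples below are hypothetical.

    # Simplified classification of an audit report as standard or non-standard.
    QUALIFIED_OPINIONS = {"disclaimer", "partial disclaimer", "adverse", "except-for"}

    def is_non_standard(opinion: str, has_explanatory_paragraph: bool) -> bool:
        """A report is non-standard if the opinion is qualified and/or it
        contains an explanatory paragraph (simplified reading of AS-702)."""
        return opinion in QUALIFIED_OPINIONS or has_explanatory_paragraph

    # Hypothetical examples:
    print(is_non_standard("unqualified", has_explanatory_paragraph=False))  # False - standard report
    print(is_non_standard("unqualified", has_explanatory_paragraph=True))   # True - explanatory paragraph
    print(is_non_standard("except-for", has_explanatory_paragraph=False))   # True - qualified opinion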

Results

The following table provides an analysis of all non-standard audit reports issued in 2003-04. Information for the previous year is provided for comparison.

NON-STANDARD AUDIT REPORTS ISSUED IN 2003-04

Type of non-standard audit report                                Schools   Other   Total
Unqualified opinion
  With explanatory paragraph or reference to a breach of law         186      24     210
Qualified audit opinion
  Disclaimer of opinion                                                2      29      31
  Partial disclaimer of opinion                                        0       0       0
  Adverse opinion                                                      0      15      15
  Except-for opinion                                                  65      29      94
TOTAL                                                                253      97     350
Total of all audit reports                                          2682    1388    4070

NON-STANDARD AUDIT REPORTS ISSUED IN 2002-03

Type of non-standard audit report                                Schools   Other   Total
Unqualified opinion
  With explanatory paragraph or reference to a breach of law         112      46     158
Qualified audit opinion
  Disclaimer of opinion                                                1       6       7
  Partial disclaimer of opinion                                        2       0       2
  Adverse opinion                                                      1      21      22
  Except-for opinion                                                  99      44     143
TOTAL                                                                215     117     332
Total of all audit reports                                          2900    1292    4192

Commentary on results

There has been an increase in non-standard audit reports as a proportion of all audit reports issued during the year – from 7.9% in 2002-03 to 9.0% in 2003-04.

There were a number of reasons for this increase:

  • There has been an increase in school audit reports containing explanatory paragraphs. This increase is primarily due to an increase in the number of school network reviews that were undertaken during 2003-04. As a result of these reviews, more school audit reports included references to:
    • the uncertainty over the future of the school, as this was still to be decided; or
    • the fact that a decision had been made to disestablish the school.
  • There has also been an increase in the number of audit reports containing a disclaimer of opinion. This increase arose because a number of statutory bodies came within the Auditor-General’s mandate for the first time in 2003-04, and their financial statements for the previous year had not been audited. Because the prior year’s closing position had not been independently verified, we were unable to provide assurance about the current year’s result.

On the other hand:

  • There has been a decrease in the number of school audit reports containing “except-for” opinions. The decrease is due to fewer school boards than in the previous year having difficulty in complying with FRS-15: Provisions, Contingent Liabilities and Contingent Assets. FRS-15 requires a board to include in its financial report a provision for its obligation to the Ministry of Education to maintain the properties from which it operates in “good order and repair”.
  • There has been a decrease in the number of audit reports for other entities containing “except-for” opinions. The number of audits in arrears at the end of 2003-04 was similar to that at the end of 2002-03. However, the number in arrears at the end of 2002-03 had been lower than the year before, because a large number of “old” audits were completed during 2002-03. It was the completion of those audits that led to more “except-for” opinions being issued for other entities in that year.

Impact 2: Public entities that act legally, ethically and with probity (Output Classes D1, D3 and D4)

Intervention logic

If we are vigilant for any behaviour below expectation as part of our day-to-day activities, and highlight and fully investigate any irregularities we come across, then there is a greater likelihood that we will ensure an effective deterrent exists to unlawful or unethical actions, or actions undertaken without probity.

Measures

1. Transparency International ranking.
2. Analysis of themes emerging from the enquiries we receive.

Measure One: Transparency International ranking

The Transparency International (TI) Corruption Perceptions Index ranks countries in terms of the degree to which corruption is perceived to exist among public officials and politicians. As such, it provides one possible indicator of the extent to which public entities in New Zealand are perceived to act legally, ethically and with probity. (Note: It does not distinguish between administrative and political corruption.)

The Corruption Perceptions Index is a composite index, drawing on 17 different polls and surveys from 13 independent institutions, carried out among business people and country analysts and including surveys of residents, both local and expatriate. In 2003, the index ranked 133 countries.

Results

In 2003, New Zealand was ranked 3rd equal with Denmark (behind Finland and Iceland), the same ranking as in 2002. New Zealand’s TI score was 9.5, on a scale where the top score was 9.7 and the lowest score 1.3.

Commentary on results

New Zealand has sustained its international ranking on the TI Corruption Perceptions Index. We acknowledge that this is a high-level indicator and is not solely attributable to our work. However, it may be possible to draw the tentative conclusion that our work does act as a deterrent and/or uncovers and stops unlawful or unethical behaviour, or behaviour undertaken without probity.

Measure Two: Analysis of themes emerging from the enquiries we receive

Another possible indicator of the impact our work has on the behaviour of public entities and public officials is the extent to which we see continuing themes in the enquiries we receive. The assumption is that the occurrence of themes should lessen over time, if our assurance interventions are targeted appropriately.

Results

We responded to 179 enquiries from ratepayers, taxpayers and Members of Parliament in 2003-04, and 80 enquiries under the Local Authorities (Members’ Interests) Act 1968. A large proportion of these related to:

  • issues around processes and procedures;
  • issues of appropriateness of expenditure;
  • matters arising from the sale and purchase of assets; and
  • unusual payments and the legal authority for such payments.

Three common themes emerged from our review of all enquiries received:

  • contract management practices (in Local Government – issues such as contracting for infrastructure development, and in Central Government – issues such as contract management for funding NGOs);
  • conflicts of interest – both of a pecuniary and non-pecuniary nature; and
  • advertising guidelines for parliamentary and local government elections – i.e. the need for clear and consistent rules to guide expenditure of public funds on advertising and communications.

Commentary on results

This is the first time that we have attempted to report on key themes emerging from the enquiries we receive. It is intended to provide a benchmark for future years’ reporting. We intend to develop mechanisms to monitor the recurrence of these and any other themes.
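
One simple mechanism for monitoring recurrence would be a year-by-year tally of the themes tagged against each enquiry, along the lines of the Python sketch below; the theme labels and enquiry data are illustrative, not actual case records.

    # Year-by-year tally of themes tagged against enquiries, to monitor recurrence.
    from collections import Counter

    # Each enquiry carries zero or more theme tags (illustrative data only).
    enquiries_by_year = {
        "2003-04": [["contract management"], ["conflicts of interest", "contract management"],
                    ["election advertising"], []],
        "2004-05": [["conflicts of interest"], ["contract management"], []],
    }

    for year, enquiries in enquiries_by_year.items():
        tally = Counter(theme for themes in enquiries for theme in themes)
        print(year, dict(tally))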

Impact 3: Parliamentary control of expenditure (Output Class D2)

Intervention logic

If we properly carry out our Controller function, then there is a greater likelihood that the mechanisms that are designed to ensure Parliamentary control of public money will function properly and no funds will be released without appropriate authorisation.

Measures

1. No funds are released to government departments without appropriate Parliamentary approval.

Measure One: No funds are released to government departments without appropriate Parliamentary approval

The Controller function is a statutory function carried out in accordance with section 22 of the Public Finance Act 1989. The Office acts as a monitor, on behalf of Parliament, to control the issue of funds from the Crown Bank Account.
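
Conceptually, the check is that every release of funds can be matched to an appropriation with sufficient remaining authority. The Python sketch below is a highly simplified illustration of that idea only; the data model and figures are hypothetical and do not represent the Office’s actual systems or the detail of section 22.

    # Simplified illustration: every release must be covered by a Parliamentary appropriation.
    # Appropriation balances and releases below are hypothetical.
    appropriations = {
        "Vote A: Output Class 1": 1_000_000,
        "Vote B: Output Class 2": 500_000,
    }
    releases = [
        ("Vote A: Output Class 1", 400_000),
        ("Vote B: Output Class 2", 450_000),
    ]

    breaches = []
    for appropriation, amount in releases:
        remaining = appropriations.get(appropriation)
        if remaining is None or amount > remaining:
            breaches.append((appropriation, amount))  # release not covered by authority
        else:
            appropriations[appropriation] = remaining - amount

    if breaches:
        print("Releases not covered by an appropriation:", breaches)
    else:
        print("100% compliance: all releases covered by appropriations.")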

Results

100% compliance; i.e. all warrants issued by the Governor-General for the release of funds, and daily amounts released to departments to fund their activities, were supported by appropriations and were for lawful purposes.

Commentary on results

There have been no breaches of this measure. This gives Parliament and the public a high level of confidence that funds are released from the Crown Bank Account within a robust framework.

Summary

1. Did our work maintain and/or improve our desired impacts?

We believe that, as a result of our work over the 2003-04 year, our desired impacts have been maintained at existing levels.

  • We saw ongoing strengthening of entity management, as evidenced by our comparison of the five management aspect results for 2002-03 with those for 2001-02.
  • Select Committees continued to draw on our advice in making their own recommendations, with a take-up rate of 74%.
  • New Zealand maintained its ranking on the Transparency International Corruption Perceptions Index.

We have identified the following as areas of our work that we need to strengthen and/or improve to enhance our desired impacts (the Auditor-General has already acknowledged these in his five-year Strategic Plan):

  • we need to improve our timeliness, especially for performance audits and inquiries;
  • we need to enhance annual audits to place a stronger emphasis on non-financial matters, waste, probity, governance and accountability; and
  • we are being asked for more performance audits in order to provide greater depth and breadth in areas of value-for-money, effectiveness and efficiency.

These areas of strengthening and/or improvement have been incorporated into our Annual Plan 2004-05.

2. Did we improve our own development, measurement and reporting of outcomes?

We believe that, arising from the work on our Strategic Plan and Annual Plan, we have made significant improvements in our own development, measurement and reporting of outcomes.

  • We comprehensively reviewed our outcome and measurement frameworks as we developed our Annual Plan 2004-05.
  • We developed, in consultation with the Treasury, an evaluation framework for measuring the impact of our performance audits. The Officers of Parliament Committee has since approved this framework.
  • We started to use our revised measurement framework to improve the information provided in this Annual Report.