2.5 Service Levels

Local government: results of the 2002-03 audits.

In reports to Parliament in recent years, we have expressed concern that asset management plans have lacked information about service levels, or that such service levels had not been established through a public consultation process.23

The international infrastructure management manual Creating Customer Value defines service levels as service parameters or requirements for a particular activity or service area against which service performance may be measured. Such service levels can relate to dimensions such as quality, quantity, reliability, responsiveness, environmental acceptability, and cost.

In the Local Government Act 2002 (the 2002 Act), the concept of service levels is applied to all activities undertaken by local authorities, rather than solely to asset management planning. Local authorities must specify in their long-term council community plans (LTCCPs):

  • service levels for groups of activities;
  • performance targets and measures; and
  • the estimated expenses of achieving and maintaining these levels of service.

The LTCCP must specify this information in detail for the first three years, and in outline for subsequent years.

In the past, we have not always been convinced that the costs of groups of activities in long-term financial strategies – the predecessors of LTCCPs – were linked to, or driven by, service level decisions. To meet the LTCCP content requirements of the 2002 Act, local authorities will need to link levels of service more explicitly with the estimated expenses in underlying work programmes and budgets.

In our view, the LTCCP should clearly and logically state the relationship between the outcomes and the services undertaken, and the resulting service levels and performance expected for activities or groups of activities.

Information in LTCCPs about groups of activities is likely to improve in the next 2-3 years, as local authorities undertake the community outcome process under the 2002 Act, and improve the robustness and integration of underlying information.

We reviewed service levels for asset management-related activities in our review of asset management plans (see pages 52-59). Feedback from our auditors and local authorities has suggested that approaches to setting service levels for non-infrastructure-based services are proving more problematic. Therefore, for this review, we looked at service levels for two activities undertaken by each local authority that were not infrastructure-intensive. In most instances, the activities we reviewed were for services associated with:

  • consent processing;
  • animal control;
  • arts, culture and recreation services; and
  • community and democratic services.

Results of Our Review of Non-infrastructure-based Service Levels

Understanding Services

Our review found that many local authorities do not appear to have a strong understanding of the services that they provide, or of the contribution that these services make to outcomes. This is despite both the 2002 Act and the Local Government Act 1974 (the 1974 Act) requiring local authorities to provide information in planning documents about the rationale for the delivery of services.24

The proliferation of terminology around concepts of performance reporting – such as outcomes, outputs, service levels, objectives, and deliverables – also appears to be creating confusion for both local authorities and the public.

In our view, a clear understanding of a local authority’s reasons for providing a service, including how it contributes to outcomes being sought, is critical to creating a meaningful framework for determining service levels and assessing performance against these levels.

When the nature and effect of a service is not well understood:

  • like services cannot be grouped well – which can make explanations in external planning and reporting documents difficult for readers to understand; and
  • the selected service levels may not be the most appropriate or relevant, which can make it difficult for both local authorities and the public to assess the effectiveness of services and their value to the community.

Limited Consultation with Communities

The majority of local authorities that we reviewed had used their annual planning process to consult with their communities on service levels and performance.

Local authorities then concluded that, because the performance measures had remained unchanged in annual plans for some years, the public had agreed to these service levels. However, we noted that, in some instances – regardless of the extent of change or the actual results achieved – performance measures did not change over a number of years in planning and reporting documents.

Local authorities frequently undertake consultation associated with specific activities or decisions – such as for the development of new policies, asset management plans, or redevelopment of facilities. However, such consultation often:

  • does not focus on service levels, or on changes to service levels that are likely to result from the activity or decision; or
  • does not result in consideration of the service levels to be specified in planning and reporting documents.

Legislative Requirements Dominate

Many local authorities set service levels based on legislative requirements (such as the resource consent processing time frames under the Resource Management Act 1991). Legislative requirements must be observed, and, in many instances, may be relevant and appropriate measures of service. However, the role of a local authority in delivering services is to promote the well-being of its district.25

Local authorities should therefore consider, based on community feedback, whether other measures or indicators might be more relevant or important to the community than those set out in legislation.

Over-reliance on ‘Satisfaction’ Surveys

Ratepayer or user satisfaction can be a useful and relevant way of obtaining feedback on services. However, in our view, some local authorities tended to rely too heavily on user satisfaction or ratepayer survey results when assessing the quality of services and service levels. Such feedback is useful where:

  • the standard of performance being sought is clear, and feedback is relevant to the dimension of performance;
  • respondents are asked questions that they could reasonably be expected to understand and hold a view on; and
  • the survey allows feedback to be collected on elements of the service that respondents are not satisfied with, so that this information can be used to improve service quality.

In noting the over-reliance on satisfaction surveys, we are not discounting the importance of public views and satisfaction. Rather, we are suggesting that performance against service levels is best evaluated through a range of indicators, such as ratepayer or user views, scientific data, quality control procedures, and numeric quantity or cost data.

Why Do Service Levels Matter to the Auditor-General?

Under the 2002 Act, we will be required to give an opinion on LTCCPs, including:

the extent to which the forecast information and performance measures provide an appropriate framework for the meaningful assessment of the actual levels of service provision.26

Assessing the effect of change on service levels is also important to decision-making under the 2002 Act. If a proposed change has a significant effect on service levels, a local authority may be required to, for example:

  • undertake the special consultative procedure; and
  • amend its LTCCP, including obtaining an audit opinion on the amendment.27

When issuing opinions on amendments to LTCCPs adopted in 2003, we found the specification of service levels problematic in a small number of instances. The major issue was that, because we did not have a mandate to audit the base document being amended, we could not assess the extent to which the performance measures provided an appropriate framework for the meaningful assessment of service provision. In attempting to issue opinions on amendments that affected service levels, we also encountered many of the issues discussed above in relation to our review of service levels.

What Does This Mean for Local Authorities?

The purpose of the LTCCP, according to the 2002 Act, is to:

(a) describe the activities of the local authority;

(b) describe the community outcomes of the local authority's district or region; and

(c) provide integrated decision-making and co-ordination of the resources of the local authority; and

(d) provide a long-term focus for the decisions and activities of the local authority; and

(e) provide a basis for accountability of the local authority to the community; and

(f) provide an opportunity for participation by the public in decision-making processes on activities to be undertaken by the local authority.28

The outcomes to which activities contribute, the service levels determined for those activities, and the measures and targets by which the achievement of these service levels is assessed are central to the purpose of the LTCCP. In our view, many local authorities will need to create a comprehensive performance model that takes account of the various information requirements of the 2002 Act so that communities are able to:

  • understand the reasons for local authority services being undertaken, and the outcomes to which these services contribute;
  • meaningfully assess the extent to which services have actually been delivered, by comparison with forecast projections; and
  • participate in debates about the services and levels of service sought, the value and cost of those services to communities, and how those services should be funded.

What Would a Comprehensive Performance Model Look Like?

The 2002 Act specifies the considerations that a local authority is required to take into account, and the contents to be included in the LTCCP. Within these statutory requirements, a local authority is able to determine the performance model that it will use.

In our view, a comprehensive performance model would:

  • consider all elements of a comprehensive model of performance;
  • incorporate a time dimension;
  • choose useful reporting levels;
  • select relevant information from each element of the model to an appropriate extent; and
  • include commentary on uncertainties and strategy.29

The following diagram is drawn from our 2002 report Reporting Public Sector Performance, and illustrates the relationship of the elements that we see forming a comprehensive model of performance.

[Diagram not reproduced: elements of a comprehensive model of performance.]
We will be building on our understanding and expectations of planning and reporting by local authorities, so that communities can assess and understand the sustainability and effects of their local authorities' decisions. We are aware that the Society of Local Government Managers (SOLGM) is considering ways to support local authorities in addressing service levels, and we intend to work closely with SOLGM on these initiatives.

Footnote 23: See, for example, Local Government Looking Back and Looking Forward 2002, page 28, and Local Government: Results of the 1999-2000 Audits, page 14.

Footnote 24: Clause 2(1)(b) in Schedule 10 of the 2002 Act, and section 122L(b) of the 1974 Act.

Footnote 25: Sections 10 and 11.

Footnote 26: Sections 84(4)(c) and 94(c).

Footnote 27: Section 97.

Footnote 28: Section 93(6), as required by section 84(4)(c).

Footnote 29: Reporting Public Sector Performance, 2nd edition, 2002, page 9.
