Part 9: Performance measures and targets for outputs

Statements of intent: Examples of reporting practice.

To provide a balanced and rounded performance story about service delivery, it is usually necessary for output performance measures to cover several dimensions of performance (for example, quantity, quality, and timeliness of service delivery). It may also be necessary to include a variety of different measures within each dimension (that is, several quantity or quality measures).

The Securities Commission adopts a clear and orderly format, grouping together measures of quantity, quality, timeliness, and cost in its forecast SSP (Example 23). We have reservations about several of the performance measures themselves, but consider the output information to be clearly presented and well organised.

The clear categorisation of dimensions presented by the Securities Commission in its SOI can be easily applied to the Information Supporting the Estimates of Appropriation, as shown by the Ministry of Social Development (Example 24). The budgeted and estimated actual figures for 2007/08, required in the Information Supporting the Estimates of Appropriation, provide a useful background to the 2008/09 targets.

Performance measures need to be meaningful. They need to measure the service aspects that represent good performance, and that are within the control of the reporting entity. Therefore, performance measures need to be relevant, controllable, valid, verifiable, unbiased, complete and balanced, understandable, and comparable.

Specific and measurable targets should be attached to the performance measures, preferably with comparative data, including prior-year targets and actuals (see Example 24). The SOIs (including the forecast SSPs for Crown entities) provide scope for reporting longer-term historical trend information that gives users some context for the targets currently set. At the very least, forecast targets should be included for the forthcoming annual reporting period. Longer-term targets are more useful in helping users understand the direction and level of the entity’s intended performance. Comparative information from other entities, regions, or countries can serve as a useful benchmark.

Well-specified targets with contextual information allow users to gauge the entity’s intended level of performance for the period and to compare actual performance against intended performance. Targets should be reasonable (in that they represent best estimates), and should reflect the entity’s priorities, resources, decisions, and past performance.

Many of the Department of Internal Affairs’ performance measures in the Vote Internal Affairs Estimates material relate to quality (for example, error rates) and timeliness (for example, turnaround times for applications). Other types of measures are used in its Regulatory Services output class (Example 25). Of note in Vote Internal Affairs is the considerable amount of “activity information”, which is demand-driven.

Generally, data that record demand-driven events relate to matters outside the reporting entity’s control (for example, the number of applications received, the number of prisoners, and the number of students enrolled in compulsory education programmes). They are often not “true” measures of performance, because demand-driven data do not tell the user how well the service is being delivered. However, purely demand-driven data may provide context, and can be useful when they relate to real measures of performance (for example, the effect on the quality of services of a change in throughput volume).

Usually, reporting demand-driven data is unnecessary from an external performance reporting perspective. However, if the demand-driven data are included in a forecast SSP for contextual reasons, then this inclusion should be justified and explained. Importantly, reporting demand-driven data is not a substitute for true service performance measures and targets, or results.

If the demand-driven data are considered necessary to provide meaning to service performance reports, then they should be clearly identified and separated from the true performance measures. The Department of Internal Affairs has done exactly that and includes a useful footnote explanation (Example 25).

We consider that the New Zealand Qualifications Authority (NZQA) provides one of the better examples of a Crown entity forecast SSP, in terms of its clear and orderly layout (Example 26). The format of NZQA’s report provides a useful starting point for showing its output classes, outputs, performance measures, and targets. However, the report would be clearer if separate columns for performance measures and performance targets were included, to distinguish the measure from the target.

Although there are areas for improvement in the content of the forecast SSP, we consider this example shows a useful format for presenting performance targets across the various dimensions (that is, quantity, quality, and timeliness). Areas for improvement in content include: removing information on processes; better identification of some outputs; and identifying performance criteria where they are currently absent.
