Part 3: Improving performance measurement

Sport and Recreation New Zealand: Improving how it measures its performance.

In this Part, we describe:

  • how SPARC evaluated its work; and
  • the challenges SPARC faced in evaluating its effectiveness.

We then discuss important aspects of what SPARC did to create its performance measurement framework. These included:

  • ensuring robust and reliable data collection;
  • creating standardised measures; and
  • integrating ways to gather information.

Finally, we provide our views on SPARC’s work to improve its performance measurement.

Limited information on overall effectiveness

Since SPARC was set up in 2003, it has had a strong focus on evaluating its performance, building progress reviews into its programmes, and examining how it delivered its work. This evaluation work helped SPARC identify how it could improve its performance. However, the work focused on evaluating SPARC’s processes or performance in delivering work rather than progress with increasing participation or building capability within the sport and recreation sector. This meant that SPARC had limited information on its effectiveness in achieving its broader strategic outcomes. There was a clear need for SPARC to improve how it assessed and evaluated its performance so that it had a better understanding of what it was achieving.

SPARC was already aware of this need at the time of our audit. It was introducing a comprehensive performance management framework to help it demonstrate its achievements and to inform decisions about its work.

How SPARC evaluated its work

SPARC used a mix of formal and informal evaluation activities to assess the value of its work, identify improvements, and better target its work. SPARC’s evaluation activities included strategy reviews, programme evaluations, surveys of sector organisations, surveys of those using SPARC resources, and reviews of its own systems and processes.

It used the results of these activities to refine and better target delivery. For example, an evaluation of a leadership development programme found that it was valued by the participants, identified that the coaching component of the programme had not worked well for participants, and made recommendations to improve the programme design.

Although SPARC had a strong evaluation focus for individual activities, little work was done to combine the results of individual activities into an overall view of progress and increases in participation. Because SPARC’s evaluation activities tended to focus on processes or its performance in delivering programmes, it had only basic information about the effectiveness of SPARC’s activities in increasing participation. SPARC did not have any long-term trend data on participation rates from the groups specified in the Act, and acknowledged that it had limited evidence of its influence in changing participation for the general population.

Challenges for SPARC in evaluating its effectiveness

We recognise that measuring the effect of SPARC’s activities is a complex exercise. Behavioural change – in this case, influencing participation in sport and recreation – is a gradual process, and a long period of time can be needed to see results. Participation rates will also be influenced by factors outside SPARC’s control. Many of SPARC’s activities have indirect links to increasing participation, so it can be difficult to quantify the effect these activities have.

An example of this is SPARC’s work to support and build capability in the sector. Research shows that people look for quality engagements with sports clubs and organisations. Clubs with adequate organisational capability are more likely to provide a positive experience for people and either maintain or grow their membership.

SPARC can measure its work in building organisational capability – for example, how participants experience or value its leadership development programmes or how many organisations use its organisational development tool. However, it is much harder to link improvements in organisational capability to increases in participation.

SPARC acknowledged that its ability to demonstrate the effect of its funding activities was mixed. Its agreements with the organisations it funded contained measures for assessing performance. We were told that the quality of these had been poor (for example, the number of measures bore little resemblance to the amount of funding invested), that measures provided minimal information about actual achievements, and that there was minimal monitoring of whether organisations were achieving their targets.

The complexities involved in quantifying the effect of SPARC’s activities account, to some extent, for the process and performance focus of SPARC’s evaluation. However, this focus meant that there was limited information on the effectiveness of SPARC’s activities in increasing participation in sport and recreation. SPARC recognised that it needed better information to demonstrate the value of its work and account for its performance.

Introducing better performance measurement for well-informed decisions

SPARC wanted to demonstrate measurable achievements from its work and to gather robust information to inform policy, investment decisions, and business development. To do this, SPARC needed an information-gathering framework that would enable it to assess progress and to identify changing needs.

SPARC created a performance measurement framework so that it could assess its achievements using a solid evidence base and make well-informed decisions about its work. In creating this framework, SPARC:

  • linked its work to its strategic goals;
  • identified the information it needed to demonstrate performance against strategic goals and how it could collect this;
  • planned to measure progress at various intervals;
  • proposed to use different ways to measure progress at these various intervals;
  • established how it would set baselines as a starting point for measuring against; and
  • integrated ways to gather information about challenges and issues in the sport and recreation sector and how to address these.

Figure 1 sets out SPARC’s framework for measuring its performance.

Figure 1
Performance measurement framework

SPARC’s strategy: More kids, more adults, more winners

Strategic information: achievement of strategic goals and outcomes, assessed at 4-5 year intervals through surveys selected to provide specific information about individual strategic goals, with earlier survey data establishing baselines.

Operational information: progress towards strategic goals and outcomes, assessed at 1-2 year intervals using information from the sport and recreation sector and measures linked to strategic goals, showing progress towards strategic goals and where changes are needed.

Contextual information: ongoing research and SPARC’s work with the sport and recreation sector, informed by strategic goals, providing information on emerging issues and challenges and what works, to inform strategy, policy, and research and show where changes are needed.

Source: SPARC.

Ensuring robust and reliable data collection

Good practice guidance on performance measurement considers data quality to be the most important factor in creating an effective measurement framework. SPARC had thoroughly considered its information needs. SPARC established what information it needed, how it would collect this information, when the information would be collected, and who was responsible for collecting it.

SPARC identified that it needed different ways to assess whether it had achieved its strategic goals and to assess how well its work was contributing to achieving these goals. These assessments needed to occur at different times. SPARC selected different ways to get information that would enable it to:

  • monitor its strategic goals on a four-yearly basis;
  • track progress towards meeting strategic goals annually; and
  • collect information for business and policy development and investment decisions.

In selecting the ways it would collect information, SPARC considered different options such as national surveys, other agencies’ surveys, market research, applied research, information from sports clubs, data from other government agencies such as Statistics New Zealand, and information from its funding arrangements.

It used criteria to assess how well the different options would meet its needs. Figure 2 describes the criteria that SPARC used. SPARC assessed each option for the quality and relevance of the data it would provide, its availability, how often data could be collected, and the cost-effectiveness of collecting the data. From this assessment, SPARC identified existing ways to collect information that would meet its needs, where it needed to create its own way to get the data it needed, and where alternatives might be needed if there was uncertainty about the ongoing use of a particular option.

Figure 2
Criteria for assessing data collection options

Quality: Will information be robust? For what purposes will it be suitable?

Relevance: Can the information provided be used to measure goals, targets, and objectives in the strategic plan? Does it provide information to track progress, measure performance (internally/externally), and/or provide information on where improvements or changes might be needed?

Availability: Can the tool be used over time and in time periods when information is needed?

Frequency: How often will information be collected?

Cost-effectiveness: Is the data provided worth the investment to collect it?

Source: SPARC.

SPARC identified that it should maintain responsibility for collecting monitoring information so that it could ensure that robust information was available when it needed it.

As SPARC decided how and when it would collect the information it needed, it established where its baseline data would come from. SPARC planned to use data from national surveys (1997/98, 1998/99, 2000/01, and 2007/08) on New Zealanders’ sport and recreational activity for this. Having a baseline was important so that SPARC could assess more current information against this starting point and identify longer-term trends or changes. It could then use this information with the other information it gathered to understand whether its work was achieving the desired results.

In assessing its information needs, SPARC recognised that information from the sport and recreation organisations it funded was a crucial part of its performance measurement framework. SPARC needed consistent information from the organisations it funded and a way of linking this information to its strategic outcomes. To get this, SPARC created standardised measures for use in its funding contracts (see paragraph 3.25).

A challenge for SPARC in getting better information from the organisations it funded was that these organisations did not always have the information technology capability and data collection systems to provide robust and reliable information on participation. At the time of our audit, SPARC was designing a database to improve this. SPARC considered that important factors in designing an appropriate system for data collection were having a person with the right design skills and a good understanding of how the sector worked. At the time of our audit, SPARC’s database design work was still in the early stages. We will maintain an interest in the progress of this work.

Creating standardised measures

SPARC created standardised measures linked to strategic goals so it could compare results between programmes and over time. SPARC did this so it could make well-informed decisions about the results of its investments and understand how well these investments supported progress towards longer-term goals.

SPARC designed standardised outcomes and measures for use in its funding arrangements. SPARC structured its outcomes and measures so that it could see the relationship between inputs, outputs, and programme results, and the link to SPARC’s longer-term goals. Figure 3 illustrates these relationships using one of SPARC’s outcomes for increasing participation.

Figure 3
Example of linking individual measures to strategic goals

Strategic goal: More kids in sport and recreation (by 2015)

Detailed outcome: More young people participating in organised sport through primary schools and secondary schools

Expected change or improvement:
  • Increase in primary-aged students participating in physical education and co-curricular sport
  • Increase in secondary-aged students participating in physical education and co-curricular sport

Inputs: Investment in strategies and programmes to increase participation in schools

Outputs: Strategies and programmes delivered in schools

Measures:
  • X* increase in the number of young people participating in organised sport through primary schools
  • X increase in the number of young people participating in organised sport through secondary schools

* Note: Actual figures agreed with SPARC and funded organisation to take account of individual factors.

Source: SPARC.

This system of linking outcomes and measures supported effective measurement because SPARC identified expected changes or improvements, the target population where change was expected, and the amount of change expected. With this information, SPARC could examine programme efficiency (comparing inputs to outputs), the quality of programmes that were being delivered, and whether the intended programme outcomes had been achieved.

Having a standard system was important for consistency and to allow results to be compared. SPARC created a standardised list of outcomes and a format for measures for staff to use in investment contracts. At the time of our audit, SPARC was introducing these standardised measures. Recently negotiated contracts with regional sports trusts included these standardised measures, and SPARC told us that its contracts with national sports organisations were going through a similar process.

SPARC commented that checking the quality of measures through its routine quality assurance checks and staff training on writing measures were important for ensuring that measurement information requirements in the contracts were robust and would provide the information SPARC needed.

Integrating ways to gather information

SPARC integrated ways to gather information about challenges and issues in the sector. SPARC did this so it could refine its work to better meet sector needs and adapt to changing needs or emerging issues.

Integrating ways to gather information about sector issues was an important part of SPARC’s framework. SPARC recognised that it needed information and evidence to understand emerging sector issues to inform its policy, research, and business development work. Demonstrating the value of sport and recreation activities was also important for SPARC.

SPARC’s research function was a core component of its performance measurement framework. To gather information on the sector, SPARC planned to use its research grants programme, commissioned research, evaluation and environmental scans, and other information sources.

SPARC refocused its research grants programme to get better alignment with its strategic goals. In making decisions about research grants, SPARC was looking for projects that met SPARC’s needs as well as those of the wider sector. SPARC focused this research grants programme on community sport and recreation. Some examples of current research included:

  • identifying drivers for, and barriers to, participation in grassroots football and what these meant for increasing participation in the sport;
  • examining factors influencing participation in outdoor recreational activities and how SPARC could use this information to increase participation in outdoor recreation; and
  • examining how volunteers’ experiences can influence long-term intentions to volunteer.

SPARC intended that the results of its research would feed back into considering ways to better meet sport and recreation needs and the longer-term strategic direction of SPARC.

Our views on work to improve performance measurement

Because SPARC’s performance measurement framework was new, it was too early for us to assess its effectiveness. However, we consider that SPARC has created a comprehensive framework that, when fully established, should provide SPARC with robust information to inform its decision-making and enable SPARC to account for its performance with a solid evidence base.

It was clear that SPARC knew what its information needs were, had considered how to meet these information needs, and was establishing systems to provide the information it needed to evaluate progress.

We identified several elements of SPARC’s framework that should support effective performance measurement. These include:

  • linking activities to strategic outcomes;
  • establishing a base to measure against;
  • ensuring that robust and reliable data is collected;
  • planning for measurement at various intervals (shorter and longer term);
  • creating standardised measures so SPARC can compare results between programmes and over time;
  • using measures that provide information about the relationship between investments and programme results, and the quality of programme results, so that SPARC can assess what has been achieved;
  • incorporating ways of monitoring environmental factors so SPARC can adapt and refine its work in response to changing needs; and
  • considering the cost-effectiveness of measurement activities.

We support SPARC’s efforts to improve its monitoring of progress and consider that this is useful work for SPARC to demonstrate the effectiveness and value for money of its investments. We consider that this information is critically important for SPARC to be accountable for its performance, give a complete and accurate account of how it uses public funds, and demonstrate the contribution its funding investments make to its outcomes.
