Part 5: Monitoring and evaluating the effectiveness of civilian maritime patrols

Effectiveness of arrangements for co-ordinating civilian maritime patrols.

In this Part, we describe:

  • our summary of findings;
  • the systems for monitoring and evaluating the effectiveness of maritime patrols;
  • limitations in those systems;
  • improvements being made to those systems; and
  • our views on monitoring and evaluating the effectiveness of patrols.

Summary of our findings

The NMCC had a mix of formal and informal systems for monitoring and evaluating the effectiveness of patrol co-ordination and maritime patrols. These systems had some limitations that made robust evaluation difficult. Evaluation activities tended to have an operational focus and did not provide a broader strategic perspective on how well maritime patrols were meeting New Zealand's needs. The NMCC had minimal information for evaluating the overall effectiveness of maritime patrols, and more systematic data collection and analysis were needed.

Improvements were already under way. The NMCC and NZDF were introducing systems to provide better information for evaluating patrols' effectiveness, and to support more robust evaluations of how well patrols were meeting needs.

A more comprehensive performance framework (including strategic and annual planning) and using a range of information for evaluating patrol needs and use would support a better understanding of patrols' effectiveness. The NMCC was working to establish this framework.

Because a range of work was under way to improve how the performance of patrols was measured and evaluated, we have not made any further recommendations for improvements. However, we note that having information on patrols' effectiveness is critically important for demonstrating that patrols are adequately meeting New Zealand's needs in the maritime domain. We will maintain an interest in progress with work intended to support better monitoring and evaluation of maritime patrols.

Systems for monitoring and evaluating the effectiveness of maritime patrols

The NMCC's evaluation activities tended to have an operational focus and did not provide a broader strategic perspective on how well maritime patrols were meeting New Zealand's needs.

The NMCC had a mix of formal and informal systems for monitoring and evaluating the effectiveness of patrol co-ordination and maritime patrols. Systems included:

  • monitoring performance against the NMCC's output measures;
  • seeking comments from other interested organisations;
  • reporting to other interested organisations;
  • collecting and analysing patrol data; and
  • discussing patrols at planning and co-ordination meetings, and recording the lessons learned.

Most evaluation activities had an operational focus. The NMCC and NZDF were introducing improvements to provide more regular evaluations of patrols and better information for analysing how well patrols were meeting needs. These improvements are described in paragraphs 5.21-5.29. Before these improvements were introduced, the effectiveness of individual patrols was not regularly evaluated in a meaningful way in cross-agency discussions.

The 2001 Maritime Patrol Review identified that there was little understanding of the strategic effectiveness of maritime patrols. We found little progress in this area. However, the NMCC, NZDF, and government agencies were establishing systems to better evaluate patrols' effectiveness; over time, these systems would improve the wider understanding of patrols' strategic effectiveness.

Limitations in systems for monitoring and evaluating the effectiveness of patrols

Existing measures, outcomes, and patrol information did not always provide meaningful data with which to evaluate effectiveness.

Limitations in measures and outcomes

The NMCC's output measures provided only limited information about the NMCC's performance. They did not provide robust information about how well maritime patrols were meeting the needs of government agencies.

The NMCC had two output measures. One measure counted the allocations of available patrol aircraft and ships; the other measured other organisations' satisfaction with the NMCC's co-ordination.

The NMCC's ability to allocate an aircraft or ship depended on the aircraft or ship's availability, the government agencies having a patrolling task to do, and the suitability of the available aircraft or ship to fulfil that task. Therefore, counting allocations was not a good measure of the NMCC's performance.

The NMCC measured the satisfaction of other interested organisations through an externally co-ordinated annual survey. The survey's response rate was poor (47% in 2008 and 69% in 2009) and the number of organisations surveyed was small, making the survey data unreliable for drawing useful inferences about the NMCC's co-ordination performance. The NMCC told us it was considering how to better measure these organisations' satisfaction with its performance.

Having commonly agreed outcomes and a way of measuring progress against those outcomes is considered a success factor for co-ordination arrangements.16 There were broad outcomes identified in the Governance Framework that had been agreed to by everyone involved, but there was little detail for evaluating performance or progress against those outcomes. There were no measures that provided information on how well maritime patrols were supporting New Zealand's strategic maritime interests.

The Governance Framework outcomes made it clear that the NMCC contributed to the work of government agencies that use maritime patrols for those aspects of their responsibilities occurring in the maritime domain. Because the NMCC is a unit supporting the delivery of other agencies' goals, it seems appropriate that the NMCC has some understanding of what those agencies are trying to achieve with patrols, as a way of focusing effort and informing assessments of patrols' effectiveness. It was difficult for the NMCC to gain this understanding because some of the agencies considered it outside the NMCC's mandate.

The lack of strategic guidance also made it difficult to evaluate progress and effectiveness. Because the maritime patrol strategy was still to be finalised, there were no specific goals or objectives to link performance to.

Limitations in patrol information

The NMCC had a basic system for recording some information on patterns of patrol use by agency and by aircraft or ship. The information that was recorded included:

  • programmed, completed, and cancelled patrols;
  • the aircraft or ship used for the patrol;
  • the agency requesting the patrol;
  • the area covered by the patrol and the number of vessels seen;
  • sea days and aircraft hours;
  • general comments about the patrol, such as reasons why patrols were cancelled; and
  • available and unallocated aircraft or ships.

In 2008/09, the NMCC started recording information on agency patrol requests that could not be met and the reasons why these requests could not be met. We consider this useful information for forming a more comprehensive view of government agencies' patrol needs and use.

There were limitations in patrol information, which made it difficult to form an accurate picture of patrol needs and use. These limitations included:

  • inconsistent recording of data, or changes over time in how data was recorded;
  • concerns that the government agencies might not be making requests for patrols because they knew aircraft or ships were not available or because they wanted to preserve patrol days/hours for future tasks, potentially hiding unmet need; and
  • a lack of measures or baselines to show what might be an appropriate level of patrolling.

We looked at the NMCC's patrol data for three financial years (2006/07, 2007/08, and 2008/09). We were able to examine general patrol trends, such as which government agencies were using patrols and which aircraft or ships were used for patrols. However, the data did not provide a full picture of patrol use, because not all patrol requests were processed through the NMCC.

It was more difficult to use recorded information to see how response patrols were used, reasons why patrol aircraft or ships were not available, and whether there was any unmet need. It was not always easy to track with certainty what happened when planned patrols were cancelled or rescheduled. Information on results from patrols was recorded only occasionally.

Improving systems for monitoring and evaluating the effectiveness of patrols

The NMCC and NZDF were improving systems to provide better information for evaluating patrols' effectiveness, and to support more strategic evaluation of how well patrols were meeting needs.

Incorporating evaluation through the new planning process

The new planning process incorporated evaluation in two ways:

  • It enabled the purpose of each patrol, and what the patrol achieved, to be specified and measured more precisely.
  • The new co-ordination meetings were a forum for government agencies to share lessons learned (the NMCC encouraged regular reporting on patrol activities in this forum).

As we discussed in paragraphs 4.6-4.10, systematic data collection was built into the new planning process to provide meaningful and consistent information about what patrols were achieving. The new planning system would provide information on:

  • risk in relation to EEZ coverage;
  • whether patrol objectives were met;
  • whether a patrol request could not be met because of unavailable aircraft or ships;
  • whether aircraft and ships were available but were not used because they did not fit with the needs for the patrol;
  • whether aircraft or ships used on patrol were the best fit to achieve the patrol objective; and
  • whether a patrol did not occur because of unavailable crew or agency staff.

The NMCC's new system was yet to mature, but over time this information could be analysed to identify trends, gaps, and issues. Having robust data available to support the need for changes, such as more resourcing (staffing, technology, and patrol aircraft or ships) or a different mix of these resources, is important. Based on this information, the NMCC should be in a good position to:

  • show what was working well;
  • identify where there were gaps or capability shortfalls;
  • show where refinements or changes were needed; and
  • share any identified issues with all organisations involved or interested in maritime patrols for their consideration or action.

Working with government agencies to improve patrol performance information

NZDF had existing systems for its own post-patrol evaluations. Operational details and lessons learned were recorded in a database. NZDF was encouraging government agencies to add their own information to this database.

At the time of our audit, the Navy was aiming to get better information about the performance of patrols. It produced an assessment tool to help government agencies define the success of a patrol. The tool was a scoring template made up of a series of questions covering the planning and carrying out of a patrol. The answers to these questions were combined to produce a measurement score. The Navy anticipated that, over time, it would have better information about what was working and why. This would result, in turn, in more effective decisions about how ships were used to support maritime patrol activities. Agencies were providing comments on the scoring template at the time of our audit.
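To illustrate how a scoring template of this kind might work, the sketch below combines weighted answers to planning and execution questions into a single percentage score. The questions, weights, and 0-5 answer scale are our own illustrative assumptions; they are not the Navy's actual template.

```python
# Hypothetical patrol-assessment scoring template. The questions, weights,
# and 0-5 answer scale are illustrative assumptions only.

QUESTIONS = {
    "Were patrol objectives clearly defined during planning?": 2.0,
    "Was the ship the best fit for the patrol task?": 1.5,
    "Was the planned patrol area covered?": 1.5,
    "Were the agency's information needs met?": 2.0,
}

def patrol_score(answers: dict) -> float:
    """Combine 0-5 answers into a weighted percentage score."""
    total_weight = sum(QUESTIONS.values())
    weighted = sum(QUESTIONS[q] * a for q, a in answers.items())
    return round(100 * weighted / (5 * total_weight), 1)

# A patrol answered 4/5 on every question scores 80%.
print(patrol_score({q: 4 for q in QUESTIONS}))
```

Comparing scores across patrols over time, rather than reading any single score in isolation, is what would let the Navy see what was working and why.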

These systems help in evaluating operational activities and allow for improvements in planning and carrying out patrols. Collecting data over time also allows for analysis that can provide a more strategic view of what is or is not working, so changes can be made to provide more effective patrol planning and more effective patrols.

Changing the focus from measurement to effectiveness

The NMCC, NZDF, and the core agencies were shifting their focus from simple output measures (for example, the number of aircraft hours or sea days) to effectiveness (that is, getting the best effect from the patrol hours and days available).

NZDF anticipated that introducing new ships and upgraded aircraft would provide more capacity to meet maritime patrol needs and enable more to be achieved from the patrols that were carried out. Aircraft upgrades would improve technological surveillance capabilities and make it likely that more could be achieved within the flying hours available. Ship availability would increase, and ships would be able to patrol farther and longer, maintaining a presence (which is an important deterrent). Surface and aerial patrols could also be used in a complementary way to get better results from patrol activity.

The NMCC was encouraging government agencies (through its processes and informal discussions) to consider what they were trying to achieve from patrols. The new planning process introduced more meaningful measures of what patrols achieved, rather than just measuring the hours or sea days used for patrols.

Our views on monitoring and evaluating the effectiveness of patrols

We were pleased that the NMCC, NZDF, and the core agencies were putting systems in place to improve how they monitor and evaluate patrols. In our view, having information on the effectiveness of patrols is critically important to show that patrols are adequately meeting New Zealand's needs in the maritime domain, and to identify any issues or gaps that may be limiting the effectiveness of patrols.

We acknowledge that measuring the effectiveness of maritime patrol is not a simple exercise. For example, it is hard to measure the deterrent effect of patrols. If a patrol does not find something, is the deterrent effect working or was the patrol looking in the wrong place? If illegal activities are found, is this evidence of effectiveness or ineffectiveness? Also, it is difficult for government agencies to know whether activities (including illegal activities) are occurring but not being found.

Because of the difficulties associated with evaluating the effectiveness of patrols, we consider that there is value in using a range of information to form a view about the performance of maritime patrols and their broader effectiveness. As well as data on patrol coverage of the EEZ, use, capability, and staffing, information contributing to a broader assessment could include:

  • patrol requests compared with patrols completed;
  • reasons for cancelled patrols and whether an alternative was found;
  • numbers of ships sighted, boarded, and apprehended;
  • actions resulting from patrols, such as prosecutions;
  • costs of patrols; and
  • assessments by government agencies about how well patrols are meeting their needs.
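Several of the indicators listed above can be derived from routine patrol records. The sketch below shows one way such records might be aggregated; the field names and sample records are our own assumptions for demonstration, not the NMCC's actual data model.

```python
# Illustrative aggregation of patrol records into broader indicators.
# Field names and sample data are assumptions, not the NMCC's data model.

from collections import Counter

patrols = [
    {"requested": True, "completed": True,  "cancel_reason": None,      "sightings": 12},
    {"requested": True, "completed": False, "cancel_reason": "weather", "sightings": 0},
    {"requested": True, "completed": True,  "cancel_reason": None,      "sightings": 7},
]

def indicators(records):
    requested = sum(r["requested"] for r in records)
    completed = sum(r["completed"] for r in records)
    return {
        # Patrol requests compared with patrols completed.
        "completion_rate": round(completed / requested, 2) if requested else None,
        # Reasons for cancelled patrols.
        "cancel_reasons": Counter(r["cancel_reason"] for r in records if r["cancel_reason"]),
        # Numbers of vessels sighted.
        "vessels_sighted": sum(r["sightings"] for r in records),
    }

print(indicators(patrols))
```

Tracked over several years, simple indicators like these would give a baseline against which changes in patrol capability or demand could be judged.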

The NMCC had a role in identifying gaps and issues for maritime patrols and was establishing systems that would help in doing this. As a unit supporting the delivery of the goals of other government agencies, the NMCC needed participation from those agencies to maximise the effective use of patrol aircraft or ships and to inform the NMCC's broader evaluations of patrol effectiveness.

Overall, we consider that the evaluation of effectiveness should be more comprehensive. More comprehensive evaluation will be supported by the NMCC completing work already under way and by addressing our recommendations. The specific actions that we consider will support more comprehensive evaluation are:

  • completing the strategic and annual planning content specified in the NMCC's Governance Framework (work in this area was under way);
  • establishing better guidance on what an appropriate level of patrolling might be as a starting point for monitoring and evaluating the use of new and upgraded aircraft and ships (see Recommendation 2);
  • collecting better information through changes introduced with the new patrol planning system to enable better assessment of patrol planning and tasking; and
  • ensuring that collected information enables the identification of knowledge gaps, testing of assumptions, and monitoring of existing requirements, so that unmet needs or future requirements are identified and supported with robust evidence (see Recommendation 6).

We will maintain an interest in progress with work intended to support better monitoring and evaluation of the effectiveness of patrols.

Evaluating the effectiveness of patrols is not an end in itself. Information demonstrating successes as well as aspects to improve can be useful for getting government agencies actively participating in discussions about needs and commitments.

16: State Services Commission (2008), Factors for Successful Coordination – A Framework to Help State Agencies Coordinate Effectively, Wellington, page 13.
