Part 4: Performance reporting

New Zealand Customs Service: Collecting customs revenue.

In this Part, we discuss:

  • the integrity of the Service’s performance measurement information;
  • the measures used to report on the effectiveness and efficiency of customs revenue collection;
  • monitoring compliance; and
  • how the Service compared its performance with other customs services.

The integrity of performance information

We expected the data used to report performance measures in the Service’s accountability documents to be accurate and complete.

Our findings

The customs revenue collection performance measures were a mix of financial and non-financial measures, which we list in the Appendix. The Service obtained figures for some activity and financial measures from information held in CusMod and the FMIS (see rows 1, 2, and 4 to 8 of the Appendix). As we reported in Part 3, this information was accurate and complete.

For other quantitative measures (see rows 3 and 12 to 14 of the Appendix), such as the number of revenue audits carried out, the Service kept statistics. We randomly selected one of these measures, and the Service produced an audit trail showing how the statistics were collected.

Our conclusion

The data the Service used to report performance measures was accurate and complete.

Measuring the effectiveness and efficiency of customs revenue collection

The Public Finance Act 1989 requires a government department’s annual report to contain, among other things, a Statement of Service Performance that includes, for each class of outputs, the standard of delivery performance achieved compared with the forecast standards.

The purpose of measures and standards is to produce information to enable an informed assessment of a department’s performance during the year.

Annual reports should help answer the question “What difference did you make?” We have published Reporting Public Sector Performance1 to stimulate discussion about how public entities measure and report performance. Our report can help in considering measures to report performance.

We expected the Service’s performance measures in its annual report to provide assurance about the customs revenue and about the effectiveness and efficiency of customs revenue collection.

Our findings

The Service prepared an annual report that included a Statement of Service Performance reporting performance against measures and standards compared with the previous year.

The measures can be characterised mainly as “busyness” measures, which are often demand-driven. They described how many activities were carried out and, in some cases, how well. Therefore, they generally did not assess the Service’s performance in revenue collection.

Despite the voluntary compliance regime, which relies on individuals and traders understanding what they need to do to comply with customs legislation, there were no measures about information or education. For example, there were no measures about issuing accurate procedure statements for CCAs or about educational visits to CCA licensees or traders.

In the body of its annual report (but not in the Statement of Service Performance), the Service reported continuing improvements in the cost-efficiency of its revenue collection. In the year ended 30 June 2006, the Service collected about $3,000 for every dollar spent to collect revenue. However, the ratio did not capture all the costs of collecting revenue: it covered only a narrow range of costs within the Revenue collection, accounting and debt management output class, mostly the costs of running the National Credit Control Unit and some costs from the finance department.
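The cost-efficiency figure described above is a simple ratio of revenue collected to the costs of collecting it. The sketch below uses hypothetical figures (the report states only the resulting ratio, not the underlying amounts) to show how such a ratio is computed:

```python
# Illustrative sketch of the cost-efficiency ratio described above.
# The revenue and cost figures are hypothetical, chosen only to
# reproduce a ratio of roughly $3,000 collected per dollar spent;
# the report does not state the underlying amounts.
revenue_collected = 9_000_000_000  # hypothetical customs revenue ($)
collection_costs = 3_000_000       # hypothetical narrow cost base ($)

ratio = revenue_collected / collection_costs
print(f"Revenue collected per dollar spent: ${ratio:,.0f}")
```

Note that the denominator matters: because the Service's reported ratio used only a narrow cost base, a ratio computed this way overstates cost-efficiency compared with one that includes the full costs of revenue collection.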

The Service’s 2006/07 Annual Plan included a project to comprehensively review the output classes it used to report its performance, including its performance measures, and then to review its costing model. This project was in progress during our audit. Although we saw some of the Service’s draft documents, because the contents were subject to change we did not review them.

Our conclusions

The Service’s measures did not enable a reader to assess the Service’s effectiveness or efficiency and did not cover the whole voluntary compliance regime. The measures provided information about some aspects of the voluntary compliance regime but did not give assurance about the whole of the Service’s customs revenue arrangements.

Recommendation 5
We recommend that the New Zealand Customs Service enhance its performance reporting measures to illustrate the contribution of all of its activities – including education, intelligence, and audit – to the voluntary compliance regime.

Monitoring compliance

One of the difficult areas for revenue-collecting agencies is knowing whether all the revenue that should be collected is being collected. Nevertheless, we expected the Service to be able to demonstrate, to the greatest degree practicable, whether compliance with the Act was improving.

We know that not all the customs revenue due is collected. This is because, in any voluntary compliance regime, a balance is struck between the cost-effective collection of revenue and encouraging trade through the efficient flow of goods. The Government accepts that every transaction cannot be audited. However, there is an expectation that other measures will reduce, to the lowest degree practicable, the unpaid customs revenue.

Our findings

We could not assess from the information available whether compliance was improving. For example, there was no analysis of the reasons for customs revenue not being collected and no analysis of trends in reasons for non-compliance.

One of the Service’s strategic goals was to provide increased assurance about the integrity of customs revenue collection. To achieve this goal, in 2006/07 it intended to develop and trial a method for assessing the level of customs revenue collected. It then planned to use that understanding to target ways of reducing the customs revenue not collected. At the time of our audit, this work was in progress. It built on work the Service commissioned in 2004 to “estimate and target so-called revenue gaps”.

Our conclusions

We could not assess from the information available whether compliance with customs revenue requirements was improving, and we recognise that this can be a difficult area for revenue-collecting agencies to address. We are interested in the results of the Service’s project to develop and trial a method for assessing the level of customs revenue collected.

Comparing performance with other customs agencies

We expected the Service to compare its performance with other countries to help it identify areas to improve the effectiveness and efficiency of customs revenue collection.

Our findings

The Service did not undertake formal benchmarking projects or compare key performance measures with other customs agencies. The Service told us that it had not yet found a customs agency similar enough to make comparisons feasible. The Service continued to seek opportunities to undertake formal comparisons.

The Service kept an extensive calendar of international meetings and events, so staff could find out when meetings would be held and who, if anyone, the Service would send. As part of its continuing contact with other customs agencies, the Service was able to compare its performance with those agencies in various ways.

Customs agencies collaborated through the World Customs Organisation (WCO) to improve performance in all aspects of customs. The Service was able to use the WCO’s resources as needed:

  • In 2006, the WCO created WorldCap, an information management database with data from member countries, to support capacity building initiatives and benchmarking, and promote partnerships between members.
  • In 2005, the WCO published a Compendium of Integrity Best Practices that included two examples from New Zealand.

The WCO was directed by the full Council of 169 member countries and the 24-member Policy Commission. The Service was a member of the Policy Commission, and expected increasingly to be looked to for the contribution it could make to capacity building for less-developed member countries and to the WCO at a strategic and policy level.

As an example of this, in January 2007, because of the views the Service expressed at a WCO meeting in India and a book it commissioned from the Institute of Policy Studies,2 the WCO’s Secretary General asked the Service to produce a discussion paper that gave an independent (that is, non-customs) view of:

  • what the border might look like in the 21st century; and
  • how customs agencies would interact with that border and with their main stakeholders.

Our conclusion

Through its international contacts and participation in joint activities, the Service had opportunities to compare its performance with that of other countries. However, there are difficulties in undertaking formal benchmarking between different revenue collection systems.

1: Second edition, January 2002.

2: Andrew Ladley and Nicola White (2006), Conceptualising the Border, Institute of Policy Studies, Wellington.
