Conclusions and Recommendations
- Our Audit and Recommendations
- The Benefit Processing System
- The Extent and Causes of Under- and Over-payments
- Case Managers’ Caseloads
- Checks on Case Managers’ Performance
- The Use of Team Coaches
- Assessing Case Managers’ Performance
- Measuring and Reporting Accuracy
- Uncertainty of Measurement Results
- Identifying the Size of Errors and Fraud
- The Causes of, and Hardship Caused by, Errors
- Consequences of the Ministry’s Structure
- Potential Beneficiaries
In 2002-03, the Ministry of Social Development (the Ministry) was responsible for the administration of nearly $12,000 million of Crown funding for income support for beneficiaries. The Ministry has systems to measure payment accuracy, and the measures are audited and reported annually.
The Ministry’s obligation is to pay benefits correctly on the basis of the information available to it (whether the information was provided by or obtained from the applicant/beneficiary, or was otherwise known to it). It measures accuracy in terms of this obligation.
We think this measure of accuracy is of only limited use in enabling Parliament to assess whether or not the Ministry is doing a good job. The Ministry does not currently collect information in other areas that we think are relevant to the general issue of accuracy. For example, it does not seek to estimate the total cost of payment errors, whether in terms of the amounts of benefit overpaid or underpaid, or in terms of the true cost of “fixing” the errors.
Because of the limitations of the information currently available, we have been unable to form a clear view about whether or not the Ministry is performing well. In saying this, we acknowledge that the Ministry has been meeting the performance targets in its recent Departmental Forecast Reports. We also acknowledge that Ministry staff have competing priorities, and that accuracy is only one dimension of their performance. Nonetheless, given the very large sums of money involved, we are firm in our view that more and better information is required.
Our Audit and Recommendations
In view of the very substantial sums involved and the implications of inaccurate benefit payments for the welfare of both taxpayers and beneficiaries, we considered it necessary to supplement our annual audit with a special audit of the soundness of the Ministry’s systems for benefit administration and accuracy measurement.
We have made 14 recommendations in this report to highlight areas where improvements could be made.
The Benefit Processing System
The Ministry’s computer system, SWIFTT, while old, is sound in terms of accurately processing benefits. However, it has some constraints that limit its usefulness for identifying accuracy. The Ministry has developed some new functions, and there is scope for the system to be enhanced further.
Recommendation 1 (page 36) |
---|
We recommend that – The Ministry continues to explore further enhancements to SWIFTT involving automatic checks, alerts to case managers, compulsory fields to provide explanations of decisions, and the production of management information. |
The Extent and Causes of Under- and Over-payments
Under- or over-payments that arise outside the Ministry’s obligation (for example, because the applicant provides incorrect information) can still have consequences for the Ministry, the beneficiary, or both. There could be benefits in learning more about why they happen, with a view to devising ways to avoid them or to identifying them as soon as possible.
Recommendation 2 (page 40) |
---|
We recommend that – The Ministry continues to explore cost-effective ways of extracting and analysing data about the extent and causes of under- and over-payments, and uses the results of the analysis to devise ways to avoid under- and over-payments or to identify them as soon as possible. |
Case Managers’ Caseloads
The Ministry’s arrangements for allocating resources and devolving responsibility for managing them are consistent with giving managers appropriate responsibility for what they control and making them accountable for results. However, benefit administration would be likely to improve if reliable assessments of the benefits and risks of different approaches to improving accuracy were routinely shared with other regions, rather than remaining as lessons held within particular regions.
Because accuracy results are reported to the national level only annually, any regional approach that turns out to alter accuracy will not come to national management’s attention in a timely way. In the meantime, if the effect is to reduce accuracy, beneficiaries may be disadvantaged for a considerable length of time.
Regions are resourced according to the number of clients. Those regions with a greater than average number of beneficiaries with complex circumstances could be disadvantaged, because the additional work arising from dealing with these beneficiaries could leave them less able than other regions to meet their accuracy target.
Recommendation 3 (page 48) |
---|
We recommend that – The Ministry continues to promote sharing of information among regions and with National Office on approaches to staff deployment and managing caseloads. |
Recommendation 4 (page 48) |
---|
We recommend that – For those regions that have a high number of beneficiaries with complex circumstances, the Ministry explores whether a relatively higher resource allocation to reflect the workload associated with complexity is justified. |
Checks on Case Managers’ Performance
Key Performance Indicators (KPIs) are a central feature of the Ministry’s culture. Case managers have clear, measurable targets to aim for in their day-to-day work.
The checks undertaken at service centres (known as 5+5 checks) provide valuable information on accuracy, and are an effective means of achieving and maintaining accuracy. In those regions that look at the results across service centres, they also provide useful comparative information.
The lack of higher-level comparative analysis of the results – particularly between service centres dealing with similar beneficiary profiles and problems – is a missed opportunity to compare performance and to transfer lessons.
Recommendation 5 (page 50) |
---|
We recommend that – The Ministry investigates the possibility of using aggregated 5+5 data at regional level to draw inferences about regional performance through the use of appropriate statistical techniques such as double sampling. |
The Use of Team Coaches
Team coaches provide important and responsive support and training to the case managers. We consider that not sharing information on how regions use team coaches is (as with the 5+5 checks) a missed opportunity to compare performance and to transfer lessons.
Recommendation 6 (page 52) |
---|
We recommend that – The Ministry continues to promote sharing of information among regions and with National Office on approaches to the use of team coaches. |
Assessing Case Managers’ Performance
The Ministry has a systematic approach to linking individual performance of its case managers to its KPI for processing accuracy. However, the way some regions assess performance creates a risk that staff will not see accuracy as a high priority, potentially leading to an inconsistent level of accuracy between case managers and regions.
Recommendation 7 (page 53) |
---|
We recommend that – The Ministry requires all regions to assess the performance of staff consistently. The chosen method should provide an incentive to staff to accord accuracy an appropriately high priority. |
Measuring and Reporting Accuracy
The Accuracy Reporting Programme (ARP) measures and reports accuracy on the basis of the Ministry’s statutory obligations and Purchase Agreement. It does not, and is not intended to, provide a wider measure of benefits being paid accurately to all who are eligible to apply – which would entail a separate special exercise in data collection and analysis.
Some of the cases selected as part of the ARP sample cannot be assessed as correct or incorrect because files or parts of files are found to be missing. At present the Ministry excludes these cases – known as “unverifiables” – from the sample. This practice could lead to an incorrect estimate of the overall level of accuracy, because cases for which papers cannot be found may well have an error rate that is different from the rate for cases that are well documented.
By changing this practice to include unverifiable cases as errors, there would be a strong incentive for the Ministry to reduce the unverifiables to as low a level as possible. However, there would also be a risk that the results could under-estimate the overall level of accuracy, since it is unlikely that all unverifiable items contain errors.
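The effect of the two treatments can be illustrated with a small worked example. The figures below are purely hypothetical, chosen for illustration only – they are not the Ministry’s actual ARP data:

```python
# Illustrative only: hypothetical figures, not the Ministry's actual ARP data.
sample_size = 1000          # cases drawn for the accuracy check
unverifiable = 50           # files (or parts of files) that cannot be found
errors_in_verified = 19     # errors found among the verifiable cases

verified = sample_size - unverifiable  # 950 cases that could be assessed

# Current practice: unverifiable cases are excluded from the sample.
accuracy_excluding = (verified - errors_in_verified) / verified

# Alternative: unverifiable cases are all counted as errors,
# which gives a lower bound on the accuracy estimate.
accuracy_as_errors = (sample_size - unverifiable - errors_in_verified) / sample_size

print(f"Excluding unverifiables:      {accuracy_excluding:.1%}")
print(f"Counting them all as errors:  {accuracy_as_errors:.1%}")
```

On these assumed figures, the first treatment reports 98.0% accuracy and the second 93.1%. The true rate lies somewhere between the two, because some – but almost certainly not all – of the unverifiable cases will contain errors.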
An alternative approach would be for the Ministry to clearly explain unverifiables and disclose their level in its annual report.
Recommendation 8 (page 63) |
---|
We recommend that – The Ministry explains unverifiables and discloses their level in its annual report. The precise form of the disclosure should be agreed with the Audit Office as part of the annual audit. |
Uncertainty of Measurement Results
The ARP estimates the level of accuracy of the total “population” of benefits on the basis of the accuracy of a relatively small random sample of those benefits, and was not designed as a statistically valid instrument for measuring regional performance. Any use of ARP results in respect of regions therefore needs to be approached with caution. Once broken down to regional level, the samples are small, and the smaller the sample on which an estimate is based, the greater the level of uncertainty of the estimate.
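How quickly that uncertainty grows as samples shrink can be sketched with the standard confidence-interval formula for a proportion. The sample sizes below are illustrative assumptions, not the actual ARP design:

```python
import math

def ci_half_width(p, n, z=1.96):
    """Half-width of an approximate 95% confidence interval for a
    proportion p estimated from a simple random sample of size n
    (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

p = 0.95  # assumed accuracy rate, for illustration
for label, n in [("national sample", 3000), ("regional sub-sample", 250)]:
    hw = ci_half_width(p, n)
    print(f"{label} (n={n}): {p:.0%} \u00b1 {hw:.1%}")
```

On these assumptions, the national estimate is accurate to within roughly ±0.8%, while a regional sub-sample of 250 cases gives an interval more than three times wider (roughly ±2.7%) – which is why regional comparisons based on ARP results need to be treated with caution.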
In our view, the relative levels of accuracy between different parts of the country are important. Regions’ ARP results provide an imperfect measure of accuracy, but, in the absence of something better, it is acceptable for regional managers to use them as one of their operational tools – so long as they are clear about the uncertainties of the data they are using.
Recommendation 9 (page 68) |
---|
We recommend that – The Ministry continues to give the regions their ARP results, but in a form similar to Figure 11 on page 64 – showing each region’s data at a 95% confidence level, and comparative data of other regions. |
Recommendation 10 (page 68) |
---|
We recommend that – The Ministry provides all Regional Commissioners and Regional Operations Managers with training on the nature of sampling error and the appropriate interpretation of statistical estimates that include confidence intervals. |
Identifying the Size of Errors and Fraud
The ARP does not estimate the amount of under- or over-payments. Other available information and our own analysis suggest that the risk of large errors is low. However, it is important to undertake a specific exercise periodically to estimate the amount of over-payments (including fraud). The estimate would also provide the Ministry with a factual basis on which to assess whether the current level of expenditure on Benefit Control ($38.3 million in 2003-04) yields the greatest net benefit.
Recommendation 11 (page 70) |
---|
We recommend that – The Ministry regularly performs a risk-sizing exercise to estimate the amount of over-payments. |
The Causes of, and Hardship Caused by, Errors
A more systematic collection of information on errors, their size and cause, and how they were found would provide useful indicators to help management assess for policy purposes:
- priorities for focusing effort to achieve better accuracy – by identifying processing errors early and by avoiding them; and
- the hardship that such errors might be causing beneficiaries.
Recommendation 12 (page 70) |
---|
We recommend that – The Ministry continues to explore the collection and analysis of information on errors, their size and cause, and how they were found, and to link this work with enhancements to its information technology systems. |
Consequences of the Ministry’s Structure
The Ministry views the Service Delivery and Specialist Services components of its benefit administration separately. Viewing them in this way runs the risk that potential cost/benefit opportunities from examining the various components together will not be identified.
Recommendation 13 (page 76) |
---|
We recommend that – The Ministry treats all its processes for administering benefits as components of an integrated system. It should periodically re-estimate the optimum balance of effort between the different components in order to achieve the most cost-beneficial outcome. The method of estimation should consider all costs and benefits, including those incurred by or realised by beneficiaries, rather than being confined to those of the Ministry. |
Potential Beneficiaries
In our view, the Ministry requires fuller information on the potential population of beneficiaries. Its activities to identify people who are eligible for a benefit but have not applied for it are likely to be more effective if the characteristics of the target population are known.
Recommendation 14 (page 78) |
---|
We recommend that – The Ministry periodically undertakes exercises to estimate the number of people who are potentially eligible for social security assistance but who have not applied. For a supplementary benefit such as the Accommodation Supplement, the exercise would be likely to require information on personal income that might best be obtained from the Inland Revenue Department. The Ministry should investigate the feasibility of undertaking data matching or data extraction exercises that would yield the necessary information. |