Part 3: Monitoring performance and identifying school boards at risk

Ministry of Education: Monitoring and supporting school boards of trustees.

In this Part, we assess the effectiveness of the Ministry's monitoring of board performance. We also assess the effectiveness of the Ministry's systems for identifying boards whose performance poses a risk to the operation of the school or to the welfare or educational performance of its students (boards at risk).

Our expectations

We expected the Ministry to have:

  • clearly defined policies and procedures for monitoring board performance, including criteria for gauging board performance and identifying boards at risk;
  • identified the information that it requires to monitor board performance and ensured that all staff use this information in a consistent way; and
  • ensured that it identifies boards at risk early.

Policies and procedures for monitoring school board performance

The Act provides the Minister and the Secretary for Education (the Secretary) with a range of formal statutory intervention powers to address risks to the operation of individual schools, or to the welfare or educational performance of their students. The Act does not specifically confer on the Ministry a role in monitoring board performance. However, to give effect to its statutory intervention role, the Ministry must monitor board performance to identify situations of risk.

The Ministry receives ongoing management information from schools that could be used to identify declining board performance that might need more in-depth investigation – for example, financial deficits, poor student achievement, increasing staff turnover, and increasing numbers of student suspensions and exclusions.

There are no established rules to measure board performance and establish levels of risk. Professional judgement is required to assess the need for more in-depth investigation and to decide when and what action is needed as a result of the investigation. Therefore we expected that the Ministry would have written policies and procedures to guide staff when they are exercising professional judgement and to encourage consistency in making decisions about required action.

Ministry staff advised us that they were monitoring the performance of boards. However, they were unable to provide us with written policies and procedures that defined this monitoring role. There is no guidance on what criteria or triggers should be used to identify boards at risk.

In the Ministry offices that we visited, Ministry staff:

  • relied on internal and external networks to identify emerging risks;
  • relied heavily on the Education Review Office (ERO) to identify problems with curriculum delivery and board governance; and
  • monitored the financial performance of boards.

However, because there is no guidance, these practices differ between the offices that we visited. For example, there were differences in the extent of the internal and external networks, the frequency of meetings, and the action taken as a result of those meetings. Also, some offices have much closer relationships with training and support providers, which enables those providers to actively offer timely support to boards. One office has a strategy of matching schools with other organisations in the sector, with the Ministry liaising between the two.

The relationship between the offices and the ERO also differs. Some offices meet with the ERO regularly to discuss boards that have potential problems with their performance, while others wait until ERO reports identify a problem.

We discuss the differences in the financial monitoring of boards in paragraphs 3.24 to 3.27, and differences in monitoring analysis in paragraphs 3.28 to 3.31.

Information for monitoring school board performance

The Ministry's School Performance Team has a range of information available on board performance that it can use to assess whether boards are at risk. However, the team has not assessed the available information to establish the most appropriate sources of information for monitoring board performance.

Sources of information currently used

Networks within the Ministry and with external stakeholders are the main sources of information for identifying boards with governance problems.

All the offices that we visited use regular team meetings to share information about board performance.

Most offices actively build relationships with other stakeholders and encourage communication on risk issues. This includes running "schools support informal networks". These networks are made up of representatives of organisations in the wider school sector – for example, principals associations, the New Zealand Teachers Council, the New Zealand Educational Institute, and the Post Primary Teachers Association. Most of the networks meet regularly and provide a source of information about schools that are potentially at risk, and advice on options for providing support for those schools.1

Other external sources of information include complaints from parents and concerns raised by principals and teaching staff.

Ministry staff receive and review the draft and final versions of the reports prepared by the ERO. ERO reports contain an evaluation of the school's specific priorities, the Government's priorities, and whether the school has taken all reasonable steps to meet legal requirements.

Ministry financial advisers monitor the boards' annual financial statements to identify:

  • continuing deficits and negative working capital situations2; and
  • diminishing equity.
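As an illustration only (the report does not set out the Ministry's actual review criteria or thresholds), the checks described above can be sketched as simple tests on a board's annual financial statements. All field names and the two-year "continuing deficit" window below are assumptions, not documented Ministry practice:

```python
# Illustrative sketch of the financial advisers' checks. Field names and the
# two-year "continuing" window are assumptions made for this example.

def financial_risk_flags(statements):
    """statements: list of dicts for consecutive years, oldest first, each
    with 'surplus', 'current_assets', 'current_liabilities', and 'equity'."""
    latest = statements[-1]
    flags = []

    # Continuing deficits: an operating deficit in each of the last two years.
    if len(statements) >= 2 and all(y["surplus"] < 0 for y in statements[-2:]):
        flags.append("continuing operating deficits")

    # Negative working capital: current liabilities exceed current assets,
    # which can leave a board unable to pay its bills.
    if latest["current_liabilities"] > latest["current_assets"]:
        flags.append("negative working capital")

    # Diminishing equity: equity lower than in the previous year.
    if len(statements) >= 2 and latest["equity"] < statements[-2]["equity"]:
        flags.append("diminishing equity")

    return flags

years = [
    {"surplus": -12_000, "current_assets": 80_000,
     "current_liabilities": 95_000, "equity": 240_000},
    {"surplus": -18_000, "current_assets": 60_000,
     "current_liabilities": 98_000, "equity": 220_000},
]
print(financial_risk_flags(years))
# → ['continuing operating deficits', 'negative working capital', 'diminishing equity']
```

A board raising none of these flags would not necessarily be sound; as the report notes, professional judgement is still needed to decide whether further investigation is warranted.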

The Ministry employs monitoring analysts based in regional offices. At the time of our audit, the role of the analysts was being redefined. We reviewed a sample of the analysis done by the monitoring analysts. This included analyses of:

  • National Certificate of Educational Achievement (NCEA) results for individual secondary schools;
  • Ministry data on individual secondary schools;
  • Ministry data on composite and small secondary schools in the Otago and Southland regions; and
  • reading recovery data collected in 2006 compared to data collected in 2007.

Consistent use of information from those sources

There is no consistently used method of collecting and analysing the information from the internal and external networks. The Ministry has several databases and systems, but its use of them varies.

Ministry staff record their contacts with schools in a Regional Contacts Register. Use of the register was recorded in some instances on the Ministry's school files that we reviewed. However, we cannot be sure that the register is being used to record all contacts, so we could not ascertain whether the information collected is being recorded completely and accessed by the relevant staff members.

The Ministry has a database, the Education Review Office Management Information System (ERMIS), for recording data extracted from ERO reports. One of the main reasons that the Ministry set up this database was to allow it to better support boards where it had identified a need, or to follow up on ERO recommendations or actions.

The use of the ERMIS database differs between the offices that we visited. In most cases, the data is entered but not used to analyse trends. Ministry staff use the full ERO report rather than the ERMIS summary. There is little attempt to analyse emerging issues and trends in these reports that might indicate an increase in risk at a school.

The Ministry evaluated ERMIS in June 2004 to check how accurately and consistently Ministry staff used ERMIS as a tool to analyse ERO reports. The evaluation found that ERMIS's original purposes (that is, consistent analysis, and providing appropriate support to schools where needs were identified) had been undermined by significant shifts in the ERO's philosophy, focus, and reporting style. This meant that the categories in ERMIS no longer aligned with the data in ERO reports, which had also resulted in variable interpretation of the information entered into ERMIS.

Financial monitoring

Ministry financial advisers review the annual financial statements of school boards. We were told, and saw evidence, that the financial advisers contact school principals or board chairpersons to discuss poor financial performance.

However, practice varied between the three regions we visited. Each financial adviser had created their own spreadsheet to record information about, and action taken with, the boards they were concerned about. Two financial advisers had been using their spreadsheets for some time, but the third was only just setting one up.

In one region, the adviser automatically visits schools with a negative working capital situation and continues to monitor them quarterly. Where schools have increasing operating deficits and decreasing working capital, the adviser requests and reviews budgets to try to establish the causes of the problems.

This process differs in the other two regions. In one region, the adviser contacts schools if the analysis of the annual financial statements shows operating and working capital deficits. These schools are monitored quarterly. Other schools requiring assistance are identified through requests for additional funding. In the other region, the adviser focuses on building relationships with schools and their financial service providers. The financial adviser has ranked the boards according to risk and works with the schools in the high- and medium-risk categories and monitors those in the low-risk group. The adviser visits schools to discuss issues and to ensure that the schools have a plan in place to deal with those issues.
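As a hypothetical sketch only (the report does not describe how the third region's adviser derives or scores risk rankings), the risk-tiering approach described above might look like the following; the score scale and tier boundaries are invented for illustration:

```python
# Hypothetical sketch of the third region's approach: rank boards by an
# assessed risk score, work actively with high- and medium-risk boards,
# and monitor the low-risk group. Scale and boundaries are invented.

def risk_tier(score):
    """Map an assessed risk score (0-100, higher = riskier) to a tier."""
    if score >= 70:
        return "high"
    if score >= 40:
        return "medium"
    return "low"

def triage(boards):
    """boards: dict of board name -> assessed risk score."""
    actions = {"high": "work with", "medium": "work with", "low": "monitor"}
    return {name: actions[risk_tier(score)] for name, score in boards.items()}

print(triage({"Board A": 82, "Board B": 55, "Board C": 20}))
# → {'Board A': 'work with', 'Board B': 'work with', 'Board C': 'monitor'}
```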

Monitoring analysis

As noted in paragraph 3.18, the role of the monitoring analyst was being redefined at the time of our audit. The monitoring analysis was being carried out on an "as required" basis. There was no overall plan to ensure systematic coverage of schools or risk factors. It was also difficult to establish how the analysis contributed to the monitoring process and assisted Ministry staff to identify boards at risk.

We were told that the reports were generally used as the first stage of analysis to identify potential problems. If necessary, more detailed analysis was carried out to establish the nature of the issues. We noted two instances where this was done.

In one region, after further analysis of the available data, what was thought to be a district-wide problem was isolated to a particular school within the district. Ministry staff were then able to liaise with that particular school about the issue.

In another region, the monitoring analyst had identified secondary and composite schools at potential risk of poor performance. The analysis was based on school performance against six key indicators. The report noted that the identified schools were not necessarily at risk but were most likely to be, and that further information and analysis would be needed to identify and prioritise which of the schools needed help. We would expect Ministry staff to actively use the available information in this way as a matter of course.

Available information not used for monitoring

We noted potentially important sources of information that Ministry staff were not using systematically to monitor board performance, including:

  • the School Support Factor;
  • school charters; and
  • Analysis of Variance reports.

The School Support Factor

The School Support Factor (SSF) is a risk rating derived from a set of factors that research has shown relate to how well a school is performing (for example, information from ERO reports, teacher data, and financial data). The SSF is designed to give an early indication of risk to student achievement and indicate other possible areas of underperformance, such as financial management and governance. However, as the SSF is an indicator of risk, more in-depth analysis is needed to determine whether the issues are significant enough for some form of statutory intervention to take place.
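The report does not describe how the SSF combines its inputs. The following is a hypothetical sketch of how a composite risk rating of this kind might weight normalised factor scores; the factor names, weights, and threshold are all invented for illustration:

```python
# Hypothetical sketch of a composite risk rating like the SSF. The factors,
# weights, and threshold are invented; the Ministry's actual formula is not
# described in this report.

SSF_WEIGHTS = {                  # each factor scored 0.0 (no concern) to 1.0
    "ero_report_score": 0.4,     # concerns raised in ERO reports
    "teacher_turnover": 0.3,     # staffing instability
    "financial_score": 0.3,      # deficits, working capital, equity trend
}

def support_factor(scores):
    """Weighted composite of normalised factor scores, in the range 0.0-1.0."""
    return sum(SSF_WEIGHTS[name] * scores[name] for name in SSF_WEIGHTS)

def needs_further_analysis(scores, threshold=0.5):
    # The SSF is only an indicator of risk: a high rating triggers more
    # in-depth analysis, not a statutory intervention by itself.
    return support_factor(scores) >= threshold

school = {"ero_report_score": 0.8, "teacher_turnover": 0.5, "financial_score": 0.6}
print(round(support_factor(school), 2))   # → 0.65
print(needs_further_analysis(school))     # → True
```

The design point the report makes survives in the sketch: a high rating only prioritises a school for in-depth analysis, since the indicator alone cannot establish whether issues are significant enough for a statutory intervention.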

The SSF was designed so the Ministry's Schooling Improvement Team could identify schools where students are not achieving and set up appropriate statutory interventions to improve student achievement. In a recent policy paper, the Ministry noted there was general agreement that the SSF is a very good measure of a school's performance, but there was substantial doubt that it is a good indicator of poor student achievement.

The Ministry's Demographic and Statistical Analysis Group is improving analysis and use of data to help identify schools potentially at risk. A "flashpoint" project is addressing the timeliness of analysis of new data,3 and the content of a new data set. The data will be drawn from the current SSF factors and additional data, for example, school leaver qualifications, NCEA results, school roll data, and teacher information. The project will also identify what criteria or triggers are the most appropriate to identify schools potentially at risk.

School charters and Analysis of Variance reports

Boards demonstrate understanding of, and compliance with, the National Education Guidelines in two main ways – through the school charter and the Analysis of Variance report in a board's annual report. The Analysis of Variance report is a statement in which schools provide an analysis of any variance between the school's performance and the relevant aims, objectives, directions, priorities, or targets set out in the school charter.

Therefore we expected that Ministry staff would review the quality of these documents – in particular, the performance targets, the measures used, and the achievement of the targets reported in them. However, the charters and the Analysis of Variance reports are not being systematically reviewed to establish whether boards adequately understand and demonstrate their compliance with the National Education Guidelines.

The current Ministry policy is to restrict the review of charters to ensuring that the contents comply with the Act. The review does not look at the quality of the information provided in the charter. Ministry staff will only comment on quality if the board or school management indicate that they would like feedback. The Ministry estimates that about 125 schools asked for feedback in 2007.

The Ministry offices that we visited had implemented this policy in different ways. While some restricted the review to establishing whether charters complied with the Act, others went a step further and suggested improvements to the quality of the charters.

In the offices that gave qualitative feedback, it was generally given verbally to the school principal and concentrated on improving annual target setting and measurement of student achievement against the targets. The Ministry's letter of acknowledgement referred to verbal discussions. We were told by staff doing the reviews that most school principals welcomed this feedback, and that only a small number declined the opportunity to receive feedback.

There is no requirement or policy for Ministry staff to assess the quality of the annual targets and the achievement of these targets. Ministry staff receive this information in the Analysis of Variance report in the board's annual report, but it is not analysed to assess the board's performance in delivering the curriculum.

Ministry staff in one office told us that they assess the quality of the targets and refer back to the Analysis of Variance report to assess what was achieved last year. However, this analysis does not feed into an overall assessment of board performance. There was no formal system to incorporate the analysis into the monitoring function and overall assessment of the performance of the school.

A report by the New Zealand Council for Educational Research (NZCER) noted that secondary school trustees were interested in talking with their local Ministry office about assessment data and school targets. This suggests an opportunity for the Ministry to provide boards with feedback.

Timely identification of school boards at risk

How the Ministry identifies school boards at risk

Ministry staff are identifying some boards at risk and offering them support. However, we are not confident that the Ministry identifies all boards at risk consistently or that it provides timely support. This is because there is no guidance on what indicates that a board is at risk, or on when the Ministry needs to intervene and offer support. Ministry staff rely solely on professional judgement when making these decisions.

During our interviews, staff members were able to tell us of boards they considered to be at risk or likely to become at risk. Some offices maintain “watch lists” of boards they consider to be at risk. One office had attempted to rank the schools according to their assessed risk.

There is little externally verifiable data on the number of boards at risk. While ERO research identified that about 7% of the schools in the ERO reports analysed had aspects of governance that needed significant improvement, only 3.7% of all schools have a current statutory intervention.

Timeliness of identifying school boards at risk

Two of the processes the Ministry uses to monitor board performance are not timely, because they rely on information that is either historical (the financial information) or only available every three to four years (the regular ERO report cycle).

The Ministry's financial advisers' risk analysis focuses on reviewing the annual financial statements. These cover the previous 12 months and might not be received and reviewed by the financial advisers until five months after the end of the calendar year to which they relate. In our view, to identify early the boards that may be in financial difficulty, Ministry staff should supplement the review of the annual financial statements with reviews of other available financial reports – for example, the banking staffing reports.

Ministry staff do not become actively involved in addressing curriculum issues until the ERO recommends a statutory intervention or the board approaches the Ministry seeking assistance. ERO reports are usually done every three or four years. If the report describes problems with school performance, a supplementary review will usually occur 12 months later to establish whether the problems noted in the report have been addressed. This means that it could be up to five years before the Ministry recognises curriculum risks.

For example, at one school it took three and a half years from when the ERO first recognised problems to when the Ministry provided support. In this instance, the November 2006 ERO report said “the board and staff have made limited progress in addressing the recommendations of the 2003 ERO report. Improving student achievement, the quality of teaching and self review remain priorities for the school.” The report said the ERO intended to do a supplementary review in 12 months. The Ministry did not fund an adviser to assist the board until May 2007.

During our file reviews, we identified several instances where we consider it took too long to put a statutory intervention in place to assist boards that were experiencing financial difficulties.

In one instance, a school had been experiencing significant working capital deficits since at least 31 December 2002. The Ministry's financial adviser met with the board chairperson and the school's management about the school's financial performance during 2004 and 2005. However, the extent of the problem was not properly investigated until mid-2006, when the school's auditor alerted the Ministry that the school was not able to meet the payroll costs. A business adviser was then contracted to investigate and report on the financial management and position of the school. The Ministry noted:

The board's systems for managing the school's finances were found to be seriously inadequate. The financial position had deteriorated largely because of weakness at management level, inadequate monitoring and reporting, combined with the board lacking financial acumen.

After discussions with the Ministry, the board requested the appointment of a limited statutory manager to help with the financial management of the school. A limited statutory manager was appointed in July 2007.

In another instance, a school had experienced continuing operating and working capital deficits from December 2001. There was a note on the Ministry's file of the school that "the Board is managing the school's financial position closely and an improved operating result is expected for the 2006 year". However, the result for the 2006 year was a further deterioration of both the operating deficit and the working capital deficit. The Ministry's financial adviser had noted on their spreadsheet in 2006 the continuing working capital deficit, roll decline, the auditor's view that the school was in serious financial difficulty, and that the local Ministry office was working with the school. In our view, this situation warranted putting in place informal support or a statutory intervention.

In a third instance, a school had been experiencing operating deficits and working capital deficits since at least 2002. The Ministry's financial adviser assessed the school as being at high risk. A supplementary review by the ERO near the end of 2006 identified a number of risks to the school's financial management, specifically that:

  • the school had persistent deficits for a number of years and needed to restore its financial viability; and
  • financial deficits had led to a number of course changes and staffing cuts that the ERO considered had adversely affected staff morale and the structure of the junior school.

A limited statutory manager was appointed in June 2007 to resolve complaints about, and employment issues surrounding, the principal and staff, and to rebuild staff morale and community confidence in the operation of the school. In our view, regaining financial sustainability should also have been included in the limited statutory manager's terms of reference.

Our conclusions

The Ministry identifies some boards at risk. However, the lack of policies, procedures, and guidance available for Ministry staff to gauge the risk to board performance means that we cannot be confident that the Ministry identifies all boards at risk and that it provides timely support to these boards. This is because practices for monitoring boards vary between offices.

The Ministry has a range of information available on board performance that it could use more effectively and efficiently to identify boards at risk. Staff are not using current information systems consistently. Defining the Ministry's monitoring role will assist with this.

The Ministry is not systematically reviewing charters and the Analysis of Variance reports to establish whether boards adequately understand and demonstrate compliance with the National Education Guidelines. The Ministry could use this information to identify risks to board performance and the support needed to improve it.

Recommendation 4
We recommend that the Ministry of Education clarify the criteria or triggers for identifying school boards at risk of poor performance, and prepare policies and procedures for monitoring boards to identify as early as possible boards that may be at risk.
Recommendation 5
We recommend that the Ministry of Education identify the information it needs to consistently identify boards at risk and use it in a timely way.
Recommendation 6
We recommend that the Ministry of Education review school charters and Analysis of Variance reports to assess the extent to which school boards are meeting the National Education Guidelines, and use this information to identify areas where boards may need further support.

1: We consider that a school at risk is the same as a board at risk, and vice versa.

2: A negative working capital situation means current liabilities exceed current assets, which can lead to a board being unable to pay its bills.

3: Previously, the SSF was produced several months after particular sets of data were available.
