Part 6: Closing responses and learning from them
6.1
In this Part, we set out our findings about how the Ministry closes a response down and how it goes about capturing learning from it.
Summary
6.2
Introducing the new response system provided the Ministry with a way to capture learning and to identify improvements at the end of a response. However, a lack of accountability, priority setting, and staff continuity has meant that much of the staff time and money invested was wasted, because it did not lead to significant improvement. This led some staff to become sceptical about managers' attitudes to change.
6.3
A different approach is now in place. A new process means that the Senior Leadership Team is required to respond to audits and other reviews, which should increase accountability. There is some evidence of a more open approach to acknowledging mistakes and treating these as learning opportunities. Some staff believe that these changes signal the advent of a learning organisation.
Missed opportunities for continuous improvement
6.4
The Ministry's response system aims to:
- record learning from responses;
- formally review the response system's performance, policies, and procedures; and
- update the response system and plans.
6.5
Before introducing the response system, the Ministry had no formal way to record learning from each response. Most learning from the response to gum leaf skeletoniser, which predated the response system, took place at an individual level. This learning added to a staff member's skill base, but it meant inconsistent learning for future responses and a loss of knowledge if the staff member left their job.
6.6
Introducing the response system was an investment in the right tools and methods to learn and continuously improve. Unfortunately, these tools have not been used to their full potential. Many previous lessons-learned exercises did not lead to significant change. Evidence shows that many recommendations from the 2005 foot and mouth disease simulation were not put into effect fully. We reviewed how often the Knowledge Base content had been revised to see whether it was being kept up to date. We found some revisions, but fewer than we expected. Having the right tools and methods is of little value if they are not used and refined.
6.7
We reviewed the lessons-learned exercises from the kauri dieback response (July 2009), the Psa response (October 2011), and Exercise Taurus 2012 (May 2012). These highlighted many similar failings, including:
- staff being unfamiliar with the response system;
- the Response Strategic Leadership not working as well as it should; and
- poor management of the tension between business-as-usual (BAU) work and response work.
6.8
Feedback from Exercise Taurus 2012 showed that, at first, many staff questioned the value of the exercise. Staff stated that, in the past, managers had not championed and required change and had failed to learn from earlier simulations, exercises, and responses. This lack of learning meant that the Ministry missed earlier opportunities to improve and repeated similar mistakes, which negatively affected staff attitudes.
Earlier difficulties in making improvements
6.9
The response policy dictates that an evaluation or lessons-learned exercise at the end of a response should record any learning and feed this into improvements. We consider that the processes for identifying organisational learning have been ineffective because of a lack of:
- accountability – there was no robust method, within either the biosecurity system or performance management, to translate what was learned into improved systems;
- prioritising – the Ministry failed to fully acknowledge the effect of response work on BAU work, so it had unrealistic expectations of what it could do; and
- continuity – changes to staff and management structures reduced staff continuity, which often meant that work was not finished.
6.10
Failing to implement identified improvements wastes the investment made in the process. For large responses, identifying the lessons learned can be expensive. In the Psa response, the budgeted cost of the lessons-learned debrief sessions was $39,000, which is value for money only if the Ministry works better as a result. Otherwise, it is wasted.
6.11
The Ministry's internal audit team told us that many of its earlier recommendations were not addressed because of work pressure during responses and a lack of accountability.
Introducing more positive management behaviours
6.12
The Ministry reports that, in June 2012, it began to respond to audits and other reviews in a new way, requiring the Senior Leadership Team to agree to an action plan. This strengthened approach to in-house advice creates opportunities for change and recognises the value of such reviews, including the work of the internal audit team.
6.13
Staff told us that changes in managers' behaviour showed an open approach to learning lessons, such as in the 2012 review of how kits containing strawberry seeds and coco peat were wrongly cleared for sale in New Zealand.17 Staff consider this a good example of mistakes leading to a positive outcome and believe that it signals the advent of a learning organisation.
6.14
The overall action plan to manage preparedness, containing the recommendations from Exercise Taurus 2012 and other responses, shows a changed approach. The Ministry says that, every three months, it tracks progress in addressing these recommendations.
6.15
If staff see signs of organisational culture changing, they are more likely to change their own behaviour. This should put the Ministry in a better position for the future.
17: In 2011, thousands of kits containing strawberry seeds and coco peat were wrongly given biosecurity clearance and went on sale nationwide. The product was later recalled and the Ministry of Agriculture and Forestry tested a sample of the seeds.