Part 5: The examination system


5.1
Under the Act, the Board can prescribe, for each class of registration, the minimum standards for registration, which can include requiring a person to have passed an examination set by the Board.

5.2
In 2010, we reported that:

  • the systems that the Board uses to prepare examination questions were not as reliable or robust as the Board believed;
  • there were mistakes in some questions that made them unnecessarily difficult to answer, as well as unanswerable questions in some papers; and
  • the current prescriptions for examinations did not match the listed competencies in Gazette notices about registration.

Summary of progress

5.3
We are satisfied that the Board has addressed the many problems that we identified in 2010. The risk of producing examination papers containing errors has been reduced. New checks put in place mean that it is more likely that errors will be picked up. Adoption of the recommendations from the September 2013 evaluation (see paragraph 5.9) will give the Board a broader range of measures with which to track improvement.

5.4
We made two recommendations on examinations in our 2010 report. Figure 4 summarises the progress against the recommendations, with further detailed findings in paragraphs 5.5 to 5.13.

Figure 4
The Board's progress in addressing recommendations 7 and 8

We recommended that the Plumbers, Gasfitters, and Drainlayers Board:

Recommendation 7: … in preparing questions for any future examinations, ensure that the questions are appropriate for assessment under the Plumbers, Gasfitters, and Drainlayers Act 2006, are able to be answered, are free of mistakes, and do not contain unrealistic scenarios.
  Progress by the end of 2013: Complete
  Relevant paragraphs in this report: 5.5-5.10

Recommendation 8: … review its processes for preparing and moderating questions, and for setting examination papers.
  Progress by the end of 2013: Complete

Examination changes

5.5
The Board changed its approach to examination-setting and moderation in time for the November 2010 examinations. The changes included introducing multiple-choice questions and allowing open-book examinations. Moderation meetings now include a New Zealand Qualifications Authority (NZQA) examination expert. Moderators check to:

  • see that the questions are based on the training material given to those wanting to sit the examination; and
  • ensure that the questions are unambiguous and technically accurate.

5.6
After each examination, any paper where the candidate just missed the 60% pass mark is automatically remarked. Papers are checked for errors in adding up marks, and a selection of papers from each venue is compared to improve consistency across training providers and markers.

5.7
In February 2011, the Board hired consultants to help improve the examination papers further. The consultants concluded that the Board was making progress, but added that:

… there are many opportunities to improve examination questions. The opportunities include editing questions from the literacy and readability perspectives, ensuring that more questions include scenarios to provide context, user testing questions, and providing study skills resources to assist candidates answer multi-choice questions.7

5.8
Quality assurance reports show that the Board has continued to focus on these improvement opportunities during 2011 to 2013.

5.9
In 2013, the Board commissioned a further expert evaluation, which led to the Examination Process Review Report (EPR report) of September 2013. The EPR report analysed examination outcomes since June 2013 and found that:

  • A total of 437 papers were sat, of which 108 did not meet the 60% pass mark requirement (subject to the appeals process); the overall pass rate that these figures imply is set out after this list. Seventeen of those 108 papers were being reconsidered, compared with 17 papers in 2012 and 25 in 2011.
  • No calls were received from candidates immediately after the examinations. Three complaint calls were received after the results were published.
  • For three questions, alternative answers had to be allowed because of problems with the questions, but there were no unanswerable questions in any of the papers.
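
As an indicative check only, and not a figure taken from the EPR report, the numbers in the first bullet point imply an overall pass rate, before appeals and reconsiderations are resolved, of roughly:

\[
\frac{437 - 108}{437} \approx 0.75,
\]

that is, about 75% across all papers sat, which is broadly in line with the 2012 pass rates shown in Figure 5.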

5.10
The EPR report made several recommendations. These included that the Board consider widening its checks on papers that failed to meet the pass mark, to ensure that no candidates were disadvantaged. However, the EPR report concluded that improved pass rates were a good indication of a better-managed process.

5.11
We independently analysed pass rates. Our analysis showed significantly higher pass rates for 2010-2012 than those we had found for 2008 and 2009. However, the percentage of plumbers who pass at the certifying level is lower than the percentages for gasfitters and drainlayers (see Figure 5).

Figure 5
Percentage of applicants who passed examinations, 2008 to 2012

                         2008     2009     2010     2011     2012
                       pass %   pass %   pass %   pass %   pass %
Plumbers                 36.0     47.0     79.0     79.5     75.5
Gasfitters               36.0     77.0     75.0     72.0     66.0
Drainlayers              51.0     54.0     75.0     81.5     88.0
Certified plumbers       20.0     46.0     67.0     66.0     69.5
Certified gasfitters     30.0     47.0     71.0     81.0     83.5
Certified drainlayers       -        -        -    100.0     87.5
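
As a simple illustration of the comparison in paragraph 5.11, a straight average of the rounded figures in Figure 5 for plumbers gives:

\[
\frac{36.0 + 47.0}{2} = 41.5\% \ \text{(2008-2009)}
\qquad
\frac{79.0 + 79.5 + 75.5}{3} = 78.0\% \ \text{(2010-2012)}
\]

The other classes with results in both periods show similar increases.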

5.12
We saw evidence of the Board considering the reasons for the lower pass rates for certifying plumbers at its November 2013 meeting, although it was not clear how the matter would be taken forward. However, we are satisfied that results are scrutinised by the Board, and that pass rates have remained at levels comparable to those in similar industries.

5.13
Discussions continue with NZQA about how to include the Board's examination for the licensed class of registration in National Certificates at level 4. The pace of this review has been slow, but there has been significant change in the way training in trade skills is organised. When the new arrangements are put in place, they should help to ensure that teaching, learning, and assessment in the polytechnics are better aligned with the Board's examinations.


7: Workbase Consulting (February 2011), Examination question analysis report.
