Part 5: The surveillance function

Civil Aviation Authority: Certification and surveillance functions.

In this Part, we assess the CAA’s surveillance function, in particular the extent to which risk assessments (identified in Part 3 of this report) influence the depth and frequency of the surveillance.

Policy in relation to risk and the surveillance function

The industry

The Surveillance Policy requires the CAA’s operational groups to adjust surveillance priorities and methods. For example:

  • surveillance associated with air transport operations and related service providers is to be given priority over surveillance associated with other forms of activity (such as agricultural aircraft operations);
  • the surveillance programme must adopt strategies from the CAA Business Plan that relate to particular groups of operators (such as air transport operators with aircraft in the 2721 to 5670kg group); and
  • information from the Safety Plan (now part of the CAA’s Statement of Intent) and the 6-monthly Aviation Safety Report is intended to lead to longer term shifts in the focus or direction of the surveillance programme.

Individual operators

The Surveillance Policy allows for changes in individual operator risk through varying the depth and frequency of the surveillance.

The Surveillance Policy states that a change in depth is achieved by changing the extent of sampling done during the audit or inspection, or by carrying out additional surveillance – for example, a special purpose audit or inspection.

The extent of sampling depends on the judgement of each inspector. Currently, there is no sampling methodology to guide inspectors in exercising this judgement. The CAA’s Professional Standards Group identified a need for guidance in this area in 2002, and has since implemented the “Surveillance Review Project”, which includes development of and training in sampling practice and methodology.

“Frequency” refers to the number of times the operator is visited in a year. The International Civil Aviation Organisation (ICAO) recommends that –

All significant aspects of [an] operator’s or organisation’s procedures and practices should be evaluated and appropriate inspections conducted at least once every twelve-month period.19

It is current CAA policy to undertake an annual routine audit in all instances. However, in the case of Part 119/135 and 137 operations assessed as low risk, we believe the CAA could undertake a routine audit less frequently than yearly. Such an approach would need to be supported by a robust risk assessment framework and other forms of intervention (e.g. spot checks). We understand that the CAA is now reviewing its procedures to consider the circumstances in which routine audits may be undertaken less frequently than every year.

This would be similar to the approach of the Australian Civil Aviation Safety Authority (CASA), which requires all general aviation operators to be audited at least once in every 3-year certificate lifecycle. Larger passenger-carrying operators and the Certificate of Approval operators20 that maintain them have more frequent scheduled surveillance. High Capacity Regular Public Transport operators21 are audited every 6 months, and Low Capacity Regular Public Transport operators22 and large charter operators every year.23

This scheduled surveillance is supported by risk-based audits of operators that CASA believes are a relatively high risk to aviation safety. They may be triggered by:

  • a high Safety Trend Indicator score;
  • industry intelligence;
  • aviation incidents or accidents; or
  • findings from scheduled surveillance.
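CASA's tiered schedule, as described above, amounts to a simple mapping from operator category to a minimum audit interval, supplemented by risk-based triggers. The following is a hypothetical sketch only; the category names, trigger names, and function are illustrative assumptions based on the figures quoted above, not CASA's actual system:

```python
# Hypothetical sketch of CASA-style tiered audit scheduling.
# Intervals come from the figures quoted in the text; names are assumed.

SCHEDULED_INTERVAL_MONTHS = {
    "high_capacity_rpt": 6,    # High Capacity Regular Public Transport
    "low_capacity_rpt": 12,    # Low Capacity Regular Public Transport
    "large_charter": 12,
    "general_aviation": 36,    # at least once per 3-year certificate cycle
}

RISK_TRIGGERS = {
    "high_safety_trend_indicator",
    "industry_intelligence",
    "incident_or_accident",
    "scheduled_surveillance_finding",
}

def audit_due(category: str, months_since_last_audit: int,
              triggers: set) -> bool:
    """An audit is due when the scheduled interval has elapsed,
    or immediately when any risk trigger is present."""
    if triggers & RISK_TRIGGERS:
        return True
    return months_since_last_audit >= SCHEDULED_INTERVAL_MONTHS[category]

# A general aviation operator 24 months after its last audit is not yet
# due on the schedule, but an accident brings a risk-based audit forward.
print(audit_due("general_aviation", 24, set()))                       # False
print(audit_due("general_aviation", 24, {"incident_or_accident"}))    # True
```

The point of the risk-based layer is that the scheduled interval is a floor, not a ceiling: any of the four triggers overrides the calendar.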

Surveillance in the General Aviation sector

Types of audit

Since our December 2000 audit, operators of aircraft with 2 or more engines who were previously operating under a Transitional Air Operator Certificate have been required to gain Part 119/135 certification by the end of February 2001. Single-engine, fixed-wing, and helicopter operators were required to gain their certification by the end of February 2003.

Certification effectively changed the approach taken by CAA inspectors towards General Aviation sector operators. Inspectors are generally now able to take a systems-based audit approach that checks whether the operators are conforming to their expositions. These expositions set out the instructions, procedures, and information necessary to permit the personnel concerned to perform their duties and responsibilities with an acceptable degree of safety and comply with the Act and relevant CARs.

In practice, inspectors in the General Aviation Group have combined systems-based audit with some observation of activities. We support this hybrid approach because we consider that some observation-based sampling is required to confirm that the systems are operating effectively. The degree of reliance that inspectors consider they are able to place on the management systems should determine the extent of inspection undertaken. Non-compliances or non-conformances identified in the initial sample should determine the extent to which further sampling is done.

Those operators certificated under Part 119/135 are intended to be audited annually, but this is subject to work pressures and staff availability.

Agricultural operators under Part 91/137 are subject to inspection. Although the Surveillance Policy states that such operators are to be visited on a 2-year cycle, in practice they are inspected annually.

Impact of risk on depth and frequency of audits/inspections

Our audit assessed the extent to which assessments of industry risk and individual operator risk affected the surveillance process.

We found that industry risk does influence the spot check programmes that are based on types of operations (for example, frost control, or looking at aircraft areas that are prone to cracks). However, we were not able to establish, and the CAA was not able to demonstrate to us, how increased risk in particular STGs feeds into the routine audit process. For example, despite increasing accident trends in the 2721-5670kg group (which has historically shown the highest level of risk), this group still undergoes the same depth and frequency of audit as the other STGs.

To assess the extent to which individual operator risk influences the surveillance process (routine audits, spot checks and special purpose audits), we selected 5 operators with “very high” Client Risk Assessment ratings and low QI scores (less than the “pass” mark of 65) to establish whether the depth or frequency of the audit was altered to reflect this risk.


In our sample, we found that individual operator risk had little overall effect on the depth of surveillance undertaken. In fact, in 2 instances we were concerned that the depth of the audit or spot check was not sufficient to address the risks noted.

In the first instance, the depth of the routine audit appeared to be determined by the time available rather than risk, in that the January 2003 routine audit report noted–

  • Changes and repairs to the pilot seat are questionable, but there was not time during the audit to pursue this further.
  • Only the helicopter log books were available for inspection and from these it was very difficult to determine that all required maintenance had been performed, or that all maintenance that had been carried out had been recorded correctly. Maintenance requirements that could not be verified as being carried out include those inspections and re-torques required by the maintenance manual chapter 05-20-15 and 05-20-20. From the log book it could be seen that a number of compass calibrations had been carried out, but these could not in all cases be related to other maintenance entries. The additional work records associated with work identified in the log books could well hold the required information and this will be followed up at a later date. No findings have been issued as a result of this inspection as the information may be contained in additional maintenance records.

The report’s summary commented–

Due to the standard of logbook entries, a spot check will be conducted at a later date to determine that all the required maintenance has been carried out and the appropriate record of that maintenance has been compiled. This may result in findings being raised.

In the second instance, the depth of the spot check did not appear to address the risks noted, in that it was reported–

Discussions with the Maintenance Controller…left the Airworthiness Inspector with a feeling that [the organisation’s] approved procedures were not being followed and also poor attitude to CARs requirements with Tech Logs being raised for periods greater than the approved maintenance programme. …the Maintenance Controller was also observed to fit an altimeter without carrying out a leak test.

However, no findings were raised as a result of this spot check.

In both of the above instances, we would have expected CAA inspectors to have done enough work to establish whether the organisation’s approved procedures and the CARs were being followed and, if not, to have raised appropriate findings.


We noted that, although a perceived increase in risk did not affect the depth of the audit, it did increase the frequency of the audit.

For one operator, the number of areas observed by the inspector actually reduced from 2002 to 2003, in that:

  • in August 2002, the QI scores (64% for maintenance and 59% for flight operations) did not constitute a “pass” and the audit resulted in 11 findings against this operator. One of the 10 QI areas (tools, equipment and materials) in the flight operations area was “not observed” during the audit; and
  • by September 2003, the QI scores had improved to 67% for maintenance and 61% for flight operations, but given the results of the previous surveillance, we were surprised that 3 areas of the maintenance side of the organisation (clarity of management systems, control/system effectiveness, corrective/preventive actions), and one area in the flight operations area (tools, equipment and materials), were “not observed” during the audit.

However, we noted that the frequency of surveillance did increase after the August 2002 routine audit, in that:

  • A spot check was undertaken in October 2002 to do a QA system follow-up check and an exposition content and knowledge check, and to go through individual findings from the last audit to check what had actually been done to rectify them. The audit report from the spot check found that ... the company have addressed all the findings issued at the audit; however the knowledge and experience required to ensure QA system compliance will take some time to achieve … the company requires assistance to achieve the necessary standard and will require spot check surveillance to check on progress. This was positive in that the audit had identified a “risk” that was to be followed up by more frequent surveillance.
  • A further spot check was performed in February 2003. The report of the spot check noted that the outstanding issues from the last audit had been discussed with the operator’s Chief Executive – However it was not possible to establish that the issues relating to Quality Assurance had been resolved. We were concerned to note that the depth of the spot check had been affected by the time available to the inspector rather than risk; fuel was stored in a shed some distance from the hangar, but ... Due to a shortage of time this was not inspected.

In relation to another operator, the frequency of surveillance also increased. However, it took 2 routine audits and more than a year for this to happen, in that:

  • A routine audit was completed in July 2002, producing a QI score of 67% for maintenance and 57% for flight operations. Despite the policy that a low QI score should be followed up, the next contact CAA staff had with the organisation was a routine audit in July 2003. The QI score in this audit was still below a “pass” – 62% for maintenance and 62% for flight operations.
  • Additional spot checks have been completed since the July 2003 audit – one in November 2003 and another in January 2004.

Deficiencies in follow-up action

In another case, although the perceived risk in relation to the operator did result in the inspector identifying a need for increased surveillance, the frequency of surveillance did not increase. A routine audit in August 2002 resulted in 7 Finding Notices and a QI score of 64% for maintenance and 62% for flight operations. A spot check was carried out in November 2002 as a follow-up, with the report noting that spot checks on the operation should be carried out when staff were in the area. The next visit was not until August 2003, at which time the company’s exposition was still found to be deficient, but its QI scores had improved to 71% for flight operations and 67% for maintenance.

A final example leaves us concerned about the amount of surveillance of what appears to be a high-risk operator.

In this case, a routine audit had been completed in July 2003 (12 months after certification), which resulted in a QI score of 73% for maintenance (the peer reviewer thought that this was not low enough to reflect the findings recorded in the report) and 59% for flight operations. The routine audit resulted in 10 findings (6 non-compliances with the CARs and 4 non-conformances with the operator’s exposition).

We were concerned when we reviewed this operator’s file that no follow-up action was recorded from the time of the routine audit (July 2003) until a special purpose audit was completed in April 2004.

CAA staff advised us that the following action had been taken:

  • CAA staff had visited the operator in September 2003, and discussions of a general nature took place with an emphasis on compliance issues. They also had discussions with another senior operator who undertook to “have a severe word” to the operator from an industry or mentor perspective and to provide him with sound advice from his peers.
  • Two spot checks (December 2003 and January 2004) were attempted, but the operator was not at the airfield so the inspectors made no contact.
  • A special purpose audit was undertaken in April 2004, which resulted in a reduction of both QI scores (maintenance from 73% to 59% and flight operations from 59% to 52%). This audit and our concerns with it are discussed in paragraphs 5.39-5.42 below.

We have the following concerns about the follow-up action taken by inspectors in relation to this operator:

  • It was left to a member of industry to undertake to “have a severe word” to the operator while CAA staff had discussions of a general nature. We would have expected the CAA staff to have at least strongly reprimanded the operator.
  • The operator’s absence from the airfield on 2 occasions meant that a spot check by inspectors was not undertaken. We would have expected that, in instances where operators are clearly not meeting the required standard and not taking corrective action in relation to findings, every effort would be made to inspect the operation.
  • When the operator still did not take corrective action to address the findings noted during the routine audit (apparently in spite of warnings by both CAA staff and his peers), and the subsequent special purpose audit resulted in a further reduction in the operator’s QI scores, this did not lead to any stronger follow-up action than a further routine audit being completed in July 2004.

It was not until the further routine audit was completed that the operator achieved a pass for the QI scores (66% flight operations and 73% maintenance). However, we note that, at the time of the routine audit, there were findings still outstanding from the special purpose audit, in spite of the operator being given a reminder that the findings needed to be attended to.

The General Manager of the General Aviation Group advised us that the operator would undoubtedly benefit from operating under the watchful eye of a helicopter operator experienced in the discipline of today’s environment. However, the CAA is not in a position to impose this. The General Manager also said that he had counselled the operator on several occasions, including speaking to the operator’s father and mother, and went on to say –

The action taken … has been time consuming, scrupulous and onerous to both operator and CAA but may be taken to be representative of the approach taken by GA Group to operators who cause concern. It will always be our first intention to bring an operator back into compliance rather than impose heavy handed administrative or enforcement action, unless that operator is negligent, unrepentant or dangerously at fault.

In our view, this case raises a question about the level of non-compliance required before a case is referred to the CAA’s Law Enforcement Unit for further action. We consider that the CAA should develop guidelines to determine when instances of non-compliance should be referred for enforcement action.

Special purpose audits

Over the last 5 years, 2 special purpose audits have been directed at Part 119/135 operators. The first, in June 2002, was done in response to:

  • the results of a spot check which revealed deficiencies in the company’s systems;
  • information that had been forwarded to the CAA;
  • CAA statistics; and
  • the company’s previous audit.

The audit covered all aspects of the operation in question and replaced the routine audit scheduled for the following month. Twenty-six Finding Notices were issued as a result of this audit.

The audit was followed by a spot check in August 2002. It focused on following up the findings from the special purpose audit (at the time of the spot check, 2 findings still required action). No findings were noted from the spot check. A routine audit was completed in June 2003 and no Finding Notices were issued.

A second operator (the same operator referred to in paragraphs 5.29-5.34) was subject to a special purpose audit in April 2004. This special purpose audit was completed as a result of a lack of action to clear routine audit findings, a low QI score from the same audit (59%), and a “moderate” Client Risk Assessment.

The special purpose audit included a follow-up of the findings of the previous audit (5 of the 10 findings had still to be closed). Staff changes, facilities, exposition and maintenance planning were also reviewed, the aircraft was inspected, and maintenance records examined.

Although a Finding Notice was issued at the conclusion of the special purpose audit, we were concerned that other observations had been made in the audit report, but had not been verified to determine whether a Finding Notice should have been issued. They included the following –

  • There is no indication that an operational flight check, or a check of the autorotation RPM had been carried out in accordance with the manufacturer’s maintenance manual at the appropriate times. The results of these tests are required to be recorded as specified in Part 91 and 43.
  • A maintenance entry indicates that the fuel low light has been adjusted to come on at 28 litres. The maintenance manual chapter 29-00-00 requires this to be set to illuminate at 35 pounds of fuel in line with the fuel gauge indication.
  • There is no record of the yearly inspections being carried out as specified in chapter 05-20-15 of the maintenance manual.
  • Information relating to the above issues may well be found in the work records associated with the log book entries, but as these were not sighted at the inspection this possibility was not confirmed.

The special purpose audit resulted in a reduction of both QI scores (maintenance from 73% to 59%, and flight operations from 59% to 52%). However, we were surprised that only 9 of the 10 categories of QI were graded during the audit (“corrective and preventive actions” was not). As the special purpose audit was in response to the “risk” associated with the organisation (low QI score, moderate risk, and non-response to findings), the Surveillance Policy requires an increase in the depth of the audit.

Surveillance in the Airline sector

Types of audit

Unlike Part 135 and other operators in the General Aviation sector, Part 119/121 and 125 operators are audited according to a Customised Audit Programme that contains all the audit modules that will be completed during the coming financial year. According to the CAA’s Surveillance Policy, these programmes ... are proposed by CAA and agreed with the operator…

The Surveillance Policy also notes that –

In selecting the modules customised for that operator, account is taken of that organisation’s past safety performance (from data held in the CAA safety database) and the capability of its internal quality assurance function.

The General Manager of the Airline Group told us that the content of the programmes was determined through reference to database material and risk assessments, in addition to the combined experience of the Group’s inspectors. He said that the programmes changed little from year to year, but that for 2004-05 the Group took a “zero-based” audit approach to revise its assignment of modules, with the number of audit hours coming out at about the same as in previous years.

As a result of our file reviews and discussion with an airline operator, we do not believe that all Customised Audit Programmes are the result of consideration of the operator’s past performance or the capability of its quality assurance function, as envisaged by the CAA’s Surveillance Policy. We found instances where programmes were simply rolled over from one year to the next, while others were largely unchanged from year to year, despite a record of sound audit performance. It also appears that little reliance is placed on operators’ own quality assurance functions, regardless of their maturity and sophistication.

We acknowledge that the CAA has international obligations to aviation regulatory bodies to maintain a certain level of monitoring for airlines that operate in their jurisdictions. Nevertheless, our findings relate to the components of the programmes, as well as the level of audit. These findings are supported by the examples below.

Operator 1

This operator told us that their audit programme was “negotiated” with the CAA, but that the term was a misnomer as the CAA sent a programme for the company to agree to, and there was no negotiation over which particular modules were to be completed. Nevertheless, this operator thought that the components of the modules were normally appropriate, but that the programme remained consistent from year to year regardless of their performance in the previous year’s audits. The operator wanted more communication with the CAA, and more analysis of the findings of previous audits so that areas in the next year’s Customised Audit Programme could be better targeted.

We reviewed this operator’s audit files to see whether its Customised Audit Programmes reflected the organisation’s past safety performance (from data held in the CAA safety database) and the capability of its internal quality assurance function, as required by the CAA’s Surveillance Policy.

A CAA audit of this operator’s quality assurance function found that –

The audit programme appeared to cover all the activities of the company, with most of the audits carried out to schedule. Some had been re-scheduled, but it appeared that all was in good control.

As for the internal audits completed by the operator, the CAA concluded –

The reports indicated that the depth of audit was sufficient, and this was verified by the complexity of some of the audit findings.

The CAA audit raised no findings against the operator’s quality assurance function.

In relation to the operator’s safety performance, a special purpose audit had been conducted in the previous year, because the CAA wanted assurance that organisational changes had not affected the operator’s lines of communication. The fact that this audit was done shows that the audit programme has a degree of flexibility, although the operator questioned its value and wondered if the inspectors had had enough time to do a thorough job.

Looking at this operator’s safety performance from QI scores recorded for the recent past, the range was 66.7% to 90%24 for 2001-02 and 65% to 91.1%25 for 2002-03. Despite this level of performance, the content of the operator’s Customised Audit Programme changed little over the period reviewed. In fact, audit hours remained the same for both the 2001-02 and 2002-03 programmes (the programmes were simply rolled over) and then increased slightly for the 2003-04 programme. The operator’s QI score ranged from 68.9% to 90%26 for 2003-04. According to the CAA’s Surveillance Policy, for QI scores of 80 or more, inspectors can consider reducing the depth and frequency of the audit, but that has not happened in the case of this operator.

We also observed during one of the operator’s audit modules that not all elements of the module, as detailed in the Customised Audit Programme, had been completed by the CAA inspectors. Time constraints appear to have meant that some elements of this organisation were audited lightly, with others not being covered at all.

Such constraints could mean that an inspector does not have enough time to test a process in any great detail, and that concerns us. We are not convinced that there is a process by which any unfinished audit work is incorporated into the next module, or by which risk areas are identified during the module so that they can be targeted if time is running short.

In contrast to audits by the General Aviation Group, standard checklists had not been used for this Airline Group audit. During our file reviews we noted the presence of a variety of CAA checklists, depending on the nature of the audit (for example, line operations, or aircraft and log book checks). We consider that the regular use of checklists by Airline Group inspectors would save time and help to ensure consistency.

Operator 2

During correspondence with the CAA about its Customised Audit Programme, this operator noted that–

Your suggested programme is larger than last year’s which we accept for Part 125 elements, but consider the other sections could be streamlined taking into account the time we have been operating under Part 119 and the results of the last few audits.

In response, the CAA stated–

You will probably have noted that we have gone away from speaking of ‘negotiating’ audit programmes. This is because the ‘negotiation’ gave the idea of haggling over time taken. Under the Act the Director has to monitor operators, so this is what has to be done. The idea of estimating hours is for an indication of budgeting for the operator and for CAA – as you say “as a best guess”. It is better than a guess in that it takes into account past experience etc.

Later in the same response, the CAA added –

…I don’t mind what the hours are the modules just have to be completed satisfactorily and effectively. [sic]

This example illustrates some industry frustration at the formation of standard Customised Audit Programmes that do not necessarily reflect previous audit results. We agree that audit programmes should not be negotiated, but they should be developed taking account of previous audit experience and findings.

Impact of risk on depth and frequency of audits/inspections

In a similar exercise to that done for the General Aviation Group, we reviewed operators who had received a “high” risk assessment score (higher than 40%), and QI scores less than the “pass” mark of 65% over a 12-month period. This exercise was designed to see whether audit programmes were adjusted in response to the CAA’s indicators of high risk and low level of confidence that the operators will adhere to the CARs and their own expositions.

The Airline sector differs from the General Aviation sector in that risk assessment and QI scores are calculated more regularly for airlines because of the audit module system that applies to their operations. Over the period reviewed, we noted 3 modules where airline operators scored a QI of less than 65% (the range being 40-62%), while another 3 operators between them had 7 “high” or “very high” risk assessment scores (the range being 43-56%).

Our review found little evidence that routine audits done as part of the Customised Audit Programmes were adjusted, either in frequency or depth, in response to indicators of “potential increased risk” or low QI scores. We would have expected risk areas to be reviewed more frequently, even if this necessitated unscheduled reviews to ensure that deficiencies had been corrected. There was also little evidence that operators scoring low-risk assessments and high QI scores were audited in less depth or less frequently, despite this being CAA policy.

We consider that, when preparing an operator’s audit programme for the forthcoming year, CAA inspectors need to take account of the risk assessments available to them, in addition to drawing on their combined knowledge and experience. We also consider that, if the risk assessment or QI scores change during the year, the programme needs to be altered to reflect any increase in risk. This may mean that unscheduled reviews need to be undertaken, to ensure that the deficiencies have been corrected.
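The adjustment logic we describe can be expressed as a simple decision rule. The following is a minimal sketch, assuming only the thresholds stated in this report (a QI “pass” mark of 65, depth/frequency reduction considered at QI scores of 80 or more, and “high” risk above 40%); the function name and return values are illustrative, not part of any CAA procedure:

```python
# Hypothetical sketch of the programme-adjustment rule implied by the
# CAA's Surveillance Policy as described in this report. Thresholds come
# from the report; everything else is illustrative.

QI_PASS_MARK = 65
QI_REDUCTION_THRESHOLD = 80
HIGH_RISK_THRESHOLD = 40  # Client Risk Assessment score, percent

def adjust_programme(qi_score: float, risk_score: float) -> str:
    """Return the adjustment an inspector should consider for the
    operator's Customised Audit Programme."""
    if qi_score < QI_PASS_MARK or risk_score > HIGH_RISK_THRESHOLD:
        # Low confidence in compliance, or high assessed risk: increase
        # depth/frequency, possibly with unscheduled reviews.
        return "increase"
    if qi_score >= QI_REDUCTION_THRESHOLD:
        # Policy allows inspectors to consider reducing depth and frequency.
        return "consider_reduction"
    return "no_change"

print(adjust_programme(qi_score=59, risk_score=43))  # increase
print(adjust_programme(qi_score=85, risk_score=20))  # consider_reduction
print(adjust_programme(qi_score=70, risk_score=30))  # no_change
```

Our finding, in these terms, is that neither branch of the rule was being exercised in practice: programmes were not adjusted upward for low QI or high risk scores, nor downward for sustained high scores, despite the policy providing for both.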

Recommendation 6
We recommend that the CAA continue with its review of its surveillance function. In undertaking this review and designing a new approach, the CAA should:
  • ensure that the audit process directs resources at the highest-risk operators;
  • direct appropriate activities and interventions at high-risk Safety Target Groups;
  • give priority to the sampling project (a sampling methodology will allow inspectors to make informed decisions on the work necessary to cover the assessed risk);
  • assess where reliance can be placed on operators’ own quality and risk management systems, so that audits can be targeted at higher-risk areas;
  • ensure that the depth and frequency of surveillance is adjusted to reflect operator and operation risk; and
  • develop guidelines to indicate when instances of non-compliance should be referred to the CAA’s Law Enforcement Unit for further action.

19: ICAO document 9734-AN/959, Safety Oversight Manual (Part A).

20: In Australia, a Certificate of Approval is issued to persons and organisations that intend to carry out the design, distribution or maintenance of aircraft, aircraft components, or aircraft materials.

21: In Australia, High Capacity Regular Public Transport refers to aircraft with 38-seat capacity or greater operating regular public transport services.

22: In Australia, Low Capacity Regular Public Transport refers to aircraft with less than 38-seat capacity operating regular public transport services.

23: Aviation Safety Compliance Follow-up Audit, ANAO, Audit Report No.66, 2001-02, page 58.

24: 25 of 34 (73.5%) QI scores given were 80 or over.

25: 26 of 33 (78.8%) QI scores given were 80 or over.

26: 27 of 32 (84.4%) QI scores given were 80 or over.
