Part 5: Is good quality data and information enabling regional services planning?
5.1
In this Part, we look at whether regional services plans are based on good quality data and analysis. A lack of robust data leads to imprecision and inaccuracy. This, in turn, can lead to false assumptions, followed by poor decision-making.
5.2
Our research revealed that there are concerns about health data throughout the health system. Although we did not carry out a system-wide review of data, we found problems where we did look. Based on our limited testing, we share the concerns raised with us by people in the health and disability sector. These concerns were mostly about completeness of data, information technology systems, coding errors, and timeliness.
Why good quality data and information is important
5.3
Good quality data benefits patients, for example, in diagnosis, treatment, and learning from what works and what does not. The aggregation of patient and service data supports improvement in performance, service delivery, and planning. As funding and accountability systems become more complicated, the demand for good quality information – based on valid and reliable data – increases. Good quality data and information provides users and decision-makers with assurances about effectiveness, efficiency, and economy.
What we knew and what we did
5.4
The Review Group's report noted that the health sector has a history of poor execution of information technology projects. Because of this, many information systems are incomplete and inconsistent. This limits their usefulness to support clinical workstreams. Some DHBs are using old and outdated patient management systems. Some DHBs have been unable to access information systems in their regions. The uneven progress has resulted in disjointed systems that contribute to poor-quality data and information. There is a lack of information connectedness between DHBs and the primary and private health sectors.
5.5
In our early fieldwork, people from the Ministry, regional agencies, and DHBs told us that it was challenging to get good quality data to support planning. Except for some national data, there is generally little confidence in the quality of data. In some instances, this meant staff had to rely more on their experience than on the available data.
5.6
We tested the quality of data by:
- auditing patient records in four DHBs;
- looking at two samples of data and information used to support capital planning; and
- reviewing one region's information strategies.
5.7
We audited patient records in four DHBs to test the quality of the raw data available from DHB information systems. Looking at the way source data was recorded, collected, and collated allowed us to see how easy it was to get good quality information to inform planning. We chose a new measure because we were interested in seeing what the data was like without significant and targeted further investment of time and money.
What our work revealed about data quality
5.8
There are recognised flaws in the quality of health-related data when it comes to measuring the quality of the nation's health services. The New Zealand Health Quality and Safety Commission states that:
The availability of data is our biggest challenge, in particular the balance between imperfect but readily available data and high-quality, very specific data which is difficult to collect.
5.9
People in DHBs and regional networks who work with the data available to support regional services planning do not trust its quality. This is because there are significant gaps and limitations in the data. This could limit how effectively regional services are planned.
Our concerns about the quality of data and information
5.10
We found a variety of problems in the samples of data we tested. These problems included:
- discrepancies between source data and reported data;
- a lack of understanding, leading to different interpretations of what should reasonably be recorded;7
- not enough training or support for those responsible for collecting the data and reporting on the indicators;
- underestimating the time required to get data definitions right, even if the clinical events seemed relatively straightforward; and
- people having to collect data manually because it was too difficult to get data from the official computer systems.
5.11
During our fieldwork, we found a widespread awareness of data quality problems and many reasons contributing to those problems, including:
- completeness of data – for example, in one instance, up to 20% of records could have incomplete data, with one or two incomplete fields in about 15% of cases and wrong data in about 5% (this was attributed to busy staff being under pressure);
- information technology systems – including old and unreliable systems that did not talk to each other;
- coding errors – mistakes in coding data or poor record-keeping making the coding task more difficult;
- inpatient referrals – it was more difficult to find out the date of the first specialist appointment or assessment;
- private sector referrals – some referrals from the private sector were missing information or were difficult to find; and
- timeliness – in many instances, there was a direct trade-off between the speed of data being available and its quality.
5.12
We observed the effects of system limitations faced by some of the DHBs. For instance, in one DHB, the system could only show information about individual appointments for a patient rather than their whole period of care. Staff had to access many systems to pull the appropriate data together. In another DHB, some staff could not get information because it was held offline.
5.13
We identified problems with data other than clinical data. For example, we reviewed an early CIC attempt to pull together information for a national asset management plan. We found problems with common definitions and gaps in data. Because information was lacking, that early CIC attempt assumed no changes in where services were located or in the way they were delivered. The private sector's capacity for delivery had to be estimated, because private sector providers do not always give data to the Ministry.
5.14
Based on that finding, we looked into one region's early planning for Assessment, Treatment, and Rehabilitation (AT&R). We chose this because the capital requirements already feature in outline plans for spending. In the region, four DHBs had begun looking at what inpatient beds they needed for AT&R. An ageing population is the main reason given to justify more beds, but working out exactly how many more beds are needed causes some difficulties.
5.15
The difficulties arise because each DHB uses a different definition of AT&R, uses the beds differently, and uses a different method to predict how many beds are needed. The DHBs also make differing assumptions about how patients move across DHB boundaries for care, which could lead to double counting. All of this has a major effect on capital planning, because DHBs could be understating or overstating their requirements.
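To illustrate the double-counting risk, here is a minimal sketch using hypothetical figures and labels (neither the bed numbers nor the DHB names come from the region's plans). It shows how a regional total can be overstated when one DHB keeps cross-boundary patients in its own projection while the neighbouring DHB also counts them as expected inflow.

```python
# Minimal sketch with hypothetical figures: how differing assumptions about
# cross-boundary patient flows can double-count AT&R bed requirements.

# Beds each DHB projects it needs for its own resident population.
resident_bed_need = {"DHB_A": 40, "DHB_B": 30}

# Suppose 6 beds' worth of DHB A residents are actually treated in DHB B.
cross_boundary_flow_a_to_b = 6

# DHB A keeps those patients in its own projection...
dhb_a_plan = resident_bed_need["DHB_A"]                               # 40

# ...while DHB B also adds them as expected inflow from across the boundary.
dhb_b_plan = resident_bed_need["DHB_B"] + cross_boundary_flow_a_to_b  # 36

naive_regional_total = dhb_a_plan + dhb_b_plan                        # 76 beds
consistent_regional_total = sum(resident_bed_need.values())           # 70 beds

print(f"Regional total with double counting: {naive_regional_total}")
print(f"Regional total counted once:          {consistent_regional_total}")
```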
5.16
One of the regional information strategies notes concerns that population health data available to the health sector is poor quality, fragmented, and difficult to get. The strategy says:
Individual practitioners can, after major effort, collect and report on some of the population health information some of the time, but none can take a district wide or regional comprehensive and aggregated view of population health status, trends and determinants of ill health and wellness.
Faster Cancer Treatment indicators
5.17
The Ministry is preparing Faster Cancer Treatment (FCT) indicators, which are important new measures for tracking how quickly cancer patients get treatment. Until now, it has been difficult to measure how long patients wait from the time their doctor suspects they have cancer and refers them to a specialist to the start of their first cancer treatment. There has been no national approach to collecting this information, and DHBs have been collecting and reporting data in different ways. The lack of consistent information has made it difficult to identify where improvements can be made. Decision-makers do not yet rely on the indicators.
5.18
We chose to examine these new measures because we wanted to test the quality of "readily available" data in DHBs' systems. To help to inform the development of the FCT indicators, we looked at whether the information was relevant, understandable, comparable, and reliable.
5.19
The reason for the FCT indicators is highly relevant. The Ministry's website (www.health.govt.nz) states:
Cancer is a major health issue for New Zealanders. One in three New Zealanders will have some experience of cancer, either personally or through a relative or friend. Cancer is the country's leading cause of death (28.9 per cent) and a major cause of hospitalisation. Improving the timeliness of access to services for cancer patients is important. If it takes too long for a patient with suspected cancer to receive treatment this may affect their outcome and cause unnecessary stress for them and their families and whānau.
5.20
The guidance on FCT indicators was difficult to understand, with complicated and ambiguous definitions. Each of the four DHBs whose patient records we audited had interpreted the definitions differently.
5.21
We found various "teething issues" with reliability. Information about cancer treatment timeliness was not comparable, because individual DHBs "started and stopped the clock" at different points. Many versions of the guidance were in circulation, both between and within DHBs. We found discrepancies in the data, as well as missing data. Some DHBs had to access many separate in-house information systems to extract data, but did not always have access to the electronic and paper information systems that they needed to verify dates.
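The comparability problem can be shown with a minimal sketch, assuming hypothetical dates and field names (the record below and its fields are illustrative only, not taken from any DHB system). The same patient journey produces two different wait times depending on where a DHB chooses to start the clock.

```python
from datetime import date

# Minimal sketch with hypothetical dates and field names: why wait times are
# not comparable when DHBs "start the clock" at different points.
record = {
    "referral_received": date(2013, 3, 1),   # referral arrives at the DHB
    "referral_triaged":  date(2013, 3, 8),   # specialist accepts the referral
    "first_treatment":   date(2013, 4, 15),  # first cancer treatment starts
}

# One DHB starts the clock when the referral is received...
wait_from_receipt = (record["first_treatment"] - record["referral_received"]).days

# ...another starts it only when the referral has been triaged.
wait_from_triage = (record["first_treatment"] - record["referral_triaged"]).days

# The same patient yields two different "waits": 45 days and 38 days.
print(wait_from_receipt, wait_from_triage)
```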
5.22
Making the measures more reliable before they could be used as indicators has taken time. A description of the FCT indicators was released in December 2011. More guidance followed in March and October 2012. The Ministry told us that its analysis of the first collection of FCT data from DHBs in mid-2013 showed problems with data quality. This means that the Ministry will need to increase support to those putting the indicators into effect.
Improving data quality
5.23
For information technology to improve service delivery, agreed approaches to clinical and administrative procedures must be in place first. Progress in putting information technology projects into effect is mixed but improving.
5.24
Before regional services planning was introduced, each DHB invested in its own information technology systems. This unco-ordinated investment was sometimes not enough. Now, investing in regional information technology systems means that the quality of data available is improving. However, good information technology systems are only part of the solution. Human action – or inaction – caused many of the factors affecting data quality that we identified. Even so, a good information technology system can prevent some of these errors by defining clearly what entries are expected and by reporting quickly on what appear to be outliers.
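As a minimal sketch of what such checks could look like in practice, the example below uses hypothetical field names and thresholds (none of them come from an actual DHB system). Required fields are defined up front, and records whose dates look implausible are flagged for review rather than silently accepted.

```python
from datetime import date

# Minimal sketch with hypothetical field names and thresholds: entry-level
# checks that define expected fields and flag apparent outliers for review.

REQUIRED_FIELDS = ("nhi", "referral_received", "first_treatment")
MAX_PLAUSIBLE_WAIT_DAYS = 180  # flag, rather than reject, anything beyond this

def validate(record: dict) -> list:
    """Return a list of data quality issues found in one record."""
    issues = []
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            issues.append(f"missing field: {field}")
    start = record.get("referral_received")
    end = record.get("first_treatment")
    if start and end:
        wait = (end - start).days
        if wait < 0:
            issues.append("treatment date is before referral date")
        elif wait > MAX_PLAUSIBLE_WAIT_DAYS:
            issues.append(f"possible outlier: wait of {wait} days")
    return issues

# A record with a 325-day wait is flagged as a possible outlier.
print(validate({"nhi": "ABC1234",
                "referral_received": date(2013, 3, 1),
                "first_treatment": date(2014, 1, 20)}))
```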
5.25
Information needs to be sought after, valued, and in regular use if accuracy is to improve. In our view, when practitioners stop using data, there is no urgency to get it right – and the people producing it might not know it is wrong. We heard about other efforts to improve the accuracy of data, but most of these were time-consuming attempts to "clean up" poor data for use.
5.26
Regional collaboration on information technology projects is improving under regional services planning. The NHITB is showing clear leadership about the direction for information technology investment in the health sector. It has a national plan and a clear set of priorities that have remained stable. This gives more certainty to the sector. The NHITB is aware that it makes demands on a limited pool of money, and that it needs to be clear about how it decides to do things. It is working with DHBs to help with prioritising and to build capability to carry out information technology projects. At the same time, the NHITB shows a determination to keep people focused on what is important.
Recommendation 2
We recommend that the Ministry of Health and district health boards work together to improve the quality of data for planning and reporting, by exploring whether our overall findings on data quality apply to other information collected to inform decision-making.
Recommendation 3
We recommend that the Ministry of Health and district health boards work together to report on how they will improve the quality of data used for planning and reporting.
Recommendation 4
We recommend that the Ministry of Health refine the guidance on Faster Cancer Treatment indicators to remove ambiguity about the definitions.
Recommendation 5
We recommend that the Ministry of Health and district health boards discuss and agree how to apply the definitions of the Faster Cancer Treatment indicators consistently, so that indicators are comparable between district health boards.
7: The lack of understanding covered many aspects, such as what the data was supposed to show, exactly what data needed to be collected and recorded, and for what reasons.