Andrew Morris is a pioneer in harnessing computer data to improve patient care. As dean of medicine and director of the Medical Research Institute at Dundee University, he spearheaded the development of the Scottish Diabetes Research Network which uses informatics to better understand diabetes and its complications.
The system tracks real-time clinical information on all 239,000 people with type 1 and type 2 diabetes in Scotland, with patient records updated daily. The improvements in care seen as a result are spectacular – a 40 per cent reduction in amputations and a 43 per cent reduction in laser treatment for diabetic retinopathy, a cause of blindness, over a seven-year period.
The healthcare sector as a whole has been notoriously backward in integrating information systems and wielding Big Data. But Morris and others have thrown a spotlight on its huge potential to improve the quality, effectiveness and speed of care.
For Morris, now Chief Scientific Officer for Scotland, the diabetes research network exposes a key failing of many modern healthcare systems. Clinics and hospitals are good at episodic care, he explains, but surveillance of chronic disease is weaker because it requires integration of information. “Waste, duplication and harm” result from the lack of integrated information and the failure to share data between the different links in the chain, he says.
Morris’ work is delivering dramatic results because it knits together streams of data on different aspects of diabetes care, allowing clinicians to improve monitoring and spot warning signs of deterioration that simply were not picked up in the past.
Big Data analysis is also being applied to real-time monitoring of patients in intensive care. McLaren Electronic Systems, a builder of control and data systems for motorsport racing, is applying the skills it uses to analyse real-time data on the condition of Formula One cars as they speed around a track to the outputs of clinical monitoring systems in the intensive care unit at Birmingham Children’s Hospital in the UK.
In a trial that began in 2011, McLaren is working with physicians to analyse huge and diverse streams of clinical data in order to prevent cardiac arrests. The goal is to alert staff to potentially life-threatening events earlier by identifying when patterns start to change. Normally, hospital monitors sound an alarm only once a cardiac arrest occurs. The intensive care staff at Birmingham Children’s Hospital now receive a warning up to several minutes beforehand – a crucial aid in helping avert the attack.
“What we are doing is taking the philosophy of looking at Big Data as it comes through and before it is stored to decide if something is about to happen and be able to respond,” says Peter van Manen, managing director of McLaren Electronic Systems. “There’s a better chance of a good outcome if you react sooner.”
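Van Manen’s description – watching streaming data for pattern changes as it arrives, rather than waiting for a hard alarm threshold – can be illustrated with a minimal sketch. The rolling-window statistics and the heart-rate figures below are invented for illustration; McLaren’s actual algorithms are not public.

```python
from collections import deque

def rolling_alert(stream, window=60, threshold=3.0):
    """Flag readings that drift far from the recent baseline.

    Returns the indices of readings whose distance from the rolling mean
    exceeds `threshold` standard deviations -- a crude stand-in for the
    pattern-change detection described above.
    """
    recent = deque(maxlen=window)  # sliding window of recent readings
    alerts = []
    for i, value in enumerate(stream):
        if len(recent) >= 10:  # need a minimal baseline first
            mean = sum(recent) / len(recent)
            var = sum((x - mean) ** 2 for x in recent) / len(recent)
            std = var ** 0.5
            if std > 0 and abs(value - mean) > threshold * std:
                alerts.append(i)
        recent.append(value)
    return alerts

# A stable heart rate around 120 bpm, then a sudden sustained climb:
readings = [120, 121, 119, 120, 122, 121, 120, 119, 121, 120,
            120, 135, 150, 160]
print(rolling_alert(readings))  # → [11, 12, 13]
```

The point of the sketch is the ordering: the deviation is flagged as soon as the pattern changes, well before any fixed "cardiac arrest" threshold would be crossed.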
In the case of the Birmingham intensive care unit, it is clear that minutes matter – life-threatening events have been reduced by 25 per cent thanks to real-time Big Data analysis. The Birmingham trial will take several years to complete. But if it is successful, McLaren will develop and market a system for intensive care units aimed at improving patient care, van Manen said.
Air quality monitoring is another field that is crying out for the application of Big Data, says Sean Beevers, senior lecturer in air quality modelling at King’s College London. “To build good air quality models, we have to utilise Big Data, incorporating more measurements, and so improving predictive capabilities.”
One example involves analysing number plate recognition data collected as vehicles pass into London’s congestion charging zone, to find out exactly which vehicles enter the centre of the city, and cross-referencing this against published vehicle emissions information to give one estimate of pollution levels. This can then be factored into the model alongside actual measurements of emissions made by roadside monitoring equipment (or indeed by citizens, as with the Eye On Earth monitoring system).
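As a rough sketch of that cross-referencing step: count the vehicles of each type seen entering the zone, multiply by a published per-kilometre emission factor, and sum. The vehicle categories, emission factors and counts below are invented for illustration, not real Transport for London or emissions data.

```python
# Hypothetical grams of NOx per vehicle-kilometre, by vehicle category:
emission_factors_g_per_km = {
    "petrol_car": 0.06,
    "diesel_car": 0.50,
    "van": 0.70,
    "bus": 2.50,
}

def estimate_emissions(vehicle_counts, avg_km_in_zone=8.0):
    """Estimate total NOx (kg) from vehicles seen entering the zone."""
    total_g = 0.0
    for category, count in vehicle_counts.items():
        factor = emission_factors_g_per_km.get(category, 0.0)
        total_g += count * factor * avg_km_in_zone
    return total_g / 1000.0  # grams -> kilograms

# Counts as they might come from a day of number plate recognition:
counts = {"petrol_car": 40_000, "diesel_car": 55_000,
          "van": 12_000, "bus": 3_000}
print(round(estimate_emissions(counts), 1))  # → 366.4
```

In the real model such an estimate would be one input among several, weighed against roadside measurements.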
The prime reason to be concerned about air quality is the effect that pollution has on health. Beevers is involved in a major research programme, called Traffic, which aims to better understand the health problems caused by vehicle emissions in London. Among other aspects, this will attempt to understand individual exposure to pollution by using anonymised data collected by Transport for London’s Oyster card electronic payment system. “This will show where people spend time, how they move around, what mode of transport they use, at a spatial and temporal resolution,” Beevers said.
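The exposure idea Beevers outlines – combining where people spend their time with modelled pollution levels – can be sketched in a few lines. Everything below (the areas, the NO2 concentrations, the journey itself) is an invented illustration, not TfL or King’s College data.

```python
# Invented modelled NO2 concentrations (µg/m³) per area:
pollution_ug_m3 = {
    "home_suburb": 18.0,
    "tube": 60.0,
    "city_centre": 45.0,
}

def mean_exposure(journey):
    """Time-weighted average concentration over a day's movements.

    `journey` is a list of (hours, place) pairs, the kind of trace that
    could in principle be reconstructed from anonymised travel data.
    """
    total_hours = sum(hours for hours, _ in journey)
    weighted = sum(hours * pollution_ug_m3[place] for hours, place in journey)
    return weighted / total_hours

day = [(14.0, "home_suburb"), (1.5, "tube"), (8.5, "city_centre")]
print(round(mean_exposure(day), 1))  # → 30.2
```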
But before Big Data becomes a mainstream tool in healthcare, delicate privacy issues must be resolved. “We need to sort out data issues before implementing Big Data,” said Morris. “There must be proper approval for data sharing or it will conspire against us.”
In one move to secure such approval, more than 60 leading healthcare, research and disease advocacy organisations from around the world announced in June the formation of an international alliance dedicated to enabling secure sharing of genomic and clinical data. Each has signed a letter of intent pledging to work together to create a not-for-profit, inclusive, public-private, international, non-governmental organisation, modelled on the World Wide Web Consortium (W3C), to develop a common framework.
Advances in genomics have already begun to change healthcare practice and promise to transform many aspects of medicine over the next decade, noted Ted Bianco, acting director of the Wellcome Trust, one of the organisations involved. “Realising this potential will need the establishment of common ethical and technical standards to allow genomic data to be shared safely and confidentially amongst researchers,” Bianco said.
The cost of sequencing a complete genome has fallen to around $1,000, and ever-increasing numbers of people are making their genetic and clinical data available for research and clinical use. However, setting up a system that can handle the huge amounts of data involved, and ensure privacy, is beyond the scope of a single oversight body.
At present, it is generally not possible to predict which changes in DNA sequence lead to clinical consequences. “Only by comparing each personal genome sequence to a large repository of other such data can robust patterns and relationships be identified,” said Tom Hudson, chairman of the executive committee of the International Cancer Genome Consortium, and president of the Ontario Institute for Cancer Research in Canada.
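The comparison Hudson describes can be caricatured in a few lines: a personal set of variants is checked against a repository of other genomes to see how common each one is, since a variant shared by many healthy genomes is less likely to be the culprit. The variant identifiers and the tiny "repository" below are invented placeholders, not real genomic data.

```python
from collections import Counter

# Invented repository: each entry is the variant set of one other genome.
repository = [
    {"chr1:12345A>G", "chr2:88C>T"},
    {"chr1:12345A>G", "chr7:555G>A"},
    {"chr2:88C>T", "chr7:555G>A"},
    {"chr1:12345A>G"},
]

def variant_frequencies(personal_variants, repo):
    """Fraction of repository genomes carrying each of this person's variants."""
    counts = Counter()
    for genome in repo:
        counts.update(personal_variants & genome)  # shared variants only
    return {v: counts[v] / len(repo) for v in personal_variants}

person = {"chr1:12345A>G", "chr9:42T>C"}
freqs = variant_frequencies(person, repository)
# chr1:12345A>G is common (3 of 4 genomes); chr9:42T>C is seen nowhere else,
# which is the kind of signal that would prompt closer clinical scrutiny.
```

The real task differs from this toy only in scale – millions of variants against millions of genomes – which is exactly why a shared international framework is needed.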
The stakes are high: “If we get it right we can create new opportunities to define diagnostic categories, streamline clinical trials, and match patients to therapy. We want to make sure this is done in a global manner, and with the highest standards for ethics and privacy,” Hudson said.