IDG Contributor Network: A speedy recovery: the key to good outcomes as health care’s dependence on data deepens
It may have been slower than other industries to catch on, but the health care sector has developed a voracious appetite for data. Digital transformation topped the agenda at this year’s Healthcare Information and Management Systems Society (HIMSS) conference in Florida, and big data analytics in health care is on track to be worth more than $34 billion globally within the next five years, possibly sooner.
Electronic health records are becoming increasingly important for enabling interdisciplinary collaboration, speeding up communication on patient cases, and raising the quality of care. Enhanced measurement and reporting have become critical for financial management and regulatory compliance, and for protecting organizations from negligence claims and fraud. More strategically, big data is spurring new innovation, from smart patient apps to complex diagnostics driven by machine learning. Because of their ability to crunch big data and build knowledge at speed, computers could soon take over from clinicians in identifying patient conditions, rather than doctors relying solely on their clinical experience to determine what’s wrong.
But as health care providers come to rely ever more heavily on their IT systems, their vulnerability to data outages grows in step. If a planned surgery cannot go ahead because clinicians are unable to look up case information, lab results, or digital images, the patient’s life may be put at risk.
Symptoms of bigger issues
Even loss of access to administrative systems can be devastating. The chaos inflicted on the UK National Health Service in May by an international cyberattack, which took down 48 of the 248 NHS trusts in England, gave a glimpse of health care’s susceptibility to paralysis if key systems become inaccessible, even for a short time. In the NHS’s case, out-of-date security settings were to blame for leaving systems at risk. But no one is immune to system downtime, as the recent outage at British Airways showed: it grounded much of the airline’s fleet for days, at great cost, not to mention severe disruption for passengers.
Although disastrous events like these instill fear in CIOs, they can, and should, also serve as a catalyst for positive action. The sensible approach is to design data systems for failure: for the times when, like patients, they are not firing on all cylinders. Even with the best intentions, the biggest budgets, and the most robust data center facilities in the world, the law of averages says something will go wrong at some point. It is far better to plan for that than to assume an indefinitely healthy prognosis.
If the worst happens, and critical systems go down, recovery is rarely a matter of switching over to backup infrastructure and data—particularly if we’re talking about live records and information, which are currently in use and being continuously updated. Just think of the real-time monitoring of the vital signs of patients in intensive care units.
If a contingency data set exists in another location (as it should), the chances are that the original and the backup copy will be out of sync much of the time because of the ongoing activity on those records. In the event of an outage, the degree to which the data is out of step will have a direct bearing on the organization’s speed of recovery.
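To make that concrete, here is a minimal sketch, in Python, of how the gap between a live system and its backup translates into a window of updates that would have to be reconciled after a failover. The timestamps and the five-second recovery point objective are illustrative assumptions, not figures from any particular system.

from datetime import datetime, timedelta

# Hypothetical figures for illustration: the last update applied at the
# primary site and the last update confirmed at the backup site.
last_write_primary = datetime(2017, 7, 1, 14, 30, 12)
last_write_backup = datetime(2017, 7, 1, 14, 28, 47)

# Replication lag: how far the backup copy trails the live records.
lag = last_write_primary - last_write_backup
print(f"Backup trails the primary by {lag.total_seconds():.0f} seconds")

# A near-zero tolerance for downtime implies the lag must stay within a
# tight recovery point objective (RPO); five seconds here is an assumption.
rpo = timedelta(seconds=5)
if lag > rpo:
    print("Replication lag exceeds the recovery point objective")

In this example, every record written during that 85-second window exists only at the primary site; if the primary failed now, those updates would have to be re-entered or reconciled before clinicians could trust the data again.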
To ensure continuous care and patient safety, health care organizations need the fastest possible recovery time. But how many organizations have identified and catered for this near-zero tolerance for downtime in their contingency provisions?
Emergency protocol
The issue must be addressed as data becomes an integral part of medical progress. Already, data is not just a key to better operational and clinical decisions but also an intrinsic part of treatments, for example in processing the data that enables real-time control of movement in paralyzed patients. Eventually, these computer-assisted treatments will also come to rely on external servers, because local devices are unlikely to have the computing power to process all the data. They, too, will need live data backups to ensure the continuity and safety of treatment.
On a broader scale, data looks set to become pivotal to new business models (for example, determining private health care charges based on patient outcomes, otherwise known as “value-based medicine”).
While technology companies will be pulling out all the stops to keep up with these grander plans, maintaining live data continuity is already possible. So that’s one potential barrier to progress that can be checked off the list.