
Saving Lives Using Big Data

You might not think data quality can be the difference between life and death, but that's exactly the case when the data in question is that of a major hospital.

When you are admitted to a hospital or visit a healthcare provider, important information is collected about you, including what conditions you have, what drugs you have been given and what procedures you have undergone. If you end up in hospital numerous times, the information collected on previous visits can be integral to improving your care and your outcome – but only if it is used productively.

Unfortunately, using data productively becomes difficult when patient numbers are ever increasing and data systems remain stuck in the 20th century. That's why Seattle Children's Hospital recently implemented IBM Big Data technology to help enhance its care and streamline the way staff find and receive vital patient information.

Previously, analysts spent days pulling data from disparate sources and manually compiling the data into spreadsheets and databases for reporting. The process was cumbersome, with even simple queries taking minutes or hours to complete.

Now those growing volumes of data have been integrated with the hospital's electronic medical record and billing systems, resulting in faster analytics and reporting that have helped the hospital improve operations, adherence to safety protocols and patient care.

“Some queries that were running in five minutes are now taking only four seconds,” says Wendy Soethe, Enterprise Data Warehouse Manager at Seattle Children’s Hospital. “Now we can deliver the right data to answer the right question at the right time, and power users can propagate their insights throughout the hospital, so the data reaches more people.”

I cannot help but think that Big Data in hospitals helped kill off future series of House MD and could explain why the show ended in 2012. It's hard to fill 42 minutes of procedural hospital drama if a computer diagnoses the problem in less than a second.

Someone on the DVD Talk forum once summed up every episode of House:

1) House is an a**hole

2) Person comes in with terrible disease

3) House is an a**hole

4) Person turns critical, House can't figure it out

5) House is an a**hole

6) House miraculously figures it out at the last second, putting his job, career, and livelihood on the line on a hunch he explains to no one else

7) People revel in House's brilliance

8) House is an a**hole, wishes he was dead.

Now that IBM has introduced Big Data to hospitals, we have to rewrite the script.

1) House is an a**hole.

2) Person comes in with terrible disease.

3) Computer receives input and monitors the patient's condition.

4) Person turns critical, House can't figure it out

5) Computer checks symptoms against every known medical text, study and treatment plan in under two seconds.

6) Computer tells medical staff what is wrong with the person and how to fix it.

7) People revel in Big Data's brilliance

8) House is an a**hole, wishes Big Data was dead.

It kind of takes all the fun out of hospital drama, like giving GPS navigation to contestants on The Amazing Race or putting a McDonald's outlet on the next Survivor island.

I remember being at a Q&A a couple of years back on using InfoSphere Streams for neonatal care in Ontario. Every medical machine hooked up to a newborn produced data in a different format, and the machines did not talk to each other. Streams had to read the machine data in real time, correlate it and detect early signs of trouble, such as a change in the heartbeat. It could detect a new infection before there were any visible symptoms, so staff could treat it early.
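For the curious, the general pattern is easy to sketch. Below is a toy Python illustration – not the actual Streams application – of correlating a live vital-sign feed against a rolling baseline and flagging a sustained deviation early. The window sizes and thresholds are invented for the example, not clinical values.

```python
from collections import deque
from statistics import mean, stdev

class HeartRateMonitor:
    """Toy sliding-window detector: flags sustained deviation from a rolling baseline.
    Purely illustrative -- window sizes and thresholds are made up, not clinical values."""

    def __init__(self, window=120, threshold=3.0, min_consecutive=5):
        self.window = deque(maxlen=window)   # last N readings (e.g. one per second)
        self.threshold = threshold           # deviations beyond this many std devs count as abnormal
        self.min_consecutive = min_consecutive
        self.consecutive = 0

    def ingest(self, bpm):
        """Process one heart-rate reading; return True if an alert should be raised."""
        alert = False
        if len(self.window) >= 30:           # need some history before judging
            baseline, spread = mean(self.window), stdev(self.window)
            if spread > 0 and abs(bpm - baseline) > self.threshold * spread:
                self.consecutive += 1
                alert = self.consecutive >= self.min_consecutive
            else:
                self.consecutive = 0
        self.window.append(bpm)
        return alert

# Example: feed a stream of readings and alert on a sustained change.
monitor = HeartRateMonitor()
for reading in [132, 131, 133, 130, 132] * 10 + [155, 158, 160, 162, 165, 168]:
    if monitor.ingest(reading):
        print("Early-warning alert: sustained heart-rate deviation detected")
        break
```

The real system correlated many signals at once, but the idea is the same: act on the data as it streams in, rather than after the fact.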

Most companies I work with have the same problem – though not as life-threatening. They have a number of systems that were built by independent vendors and do not talk to each other. The success of their Big Data initiatives depends on how they bring that data together in one place and find the correlations, trends, problems and opportunities within it. I would argue that it is now easier and cheaper to build this type of analytics platform using real-time feeds rather than overnight batches, because real-time technology (like Streams or database replication) and data warehouse methods (like Data Vault) tend to be more robust and less complex than old-school data warehousing.
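To make "real-time feeds instead of overnight batches" a little more concrete, here is a minimal, hypothetical sketch of landing events from a change feed into Data Vault-style hub and satellite tables as they arrive. The table names, event shape and SQLite store are invented for illustration; a real platform would use whatever replication or streaming tooling fits your stack.

```python
import sqlite3
from datetime import datetime, timezone
from hashlib import sha256

# Hypothetical central store -- in practice this would be your warehouse platform.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE hub_patient (patient_hk TEXT PRIMARY KEY, patient_id TEXT, load_ts TEXT, source TEXT);
    CREATE TABLE sat_patient_vitals (patient_hk TEXT, load_ts TEXT, heart_rate INTEGER, temperature REAL, source TEXT);
""")

def land_event(event):
    """Land one event from a real-time feed into Data Vault-style hub + satellite tables.
    The event shape ({'patient_id', 'heart_rate', 'temperature', 'source'}) is invented for this example."""
    hk = sha256(event["patient_id"].encode()).hexdigest()   # hash key for the business key
    now = datetime.now(timezone.utc).isoformat()
    db.execute("INSERT OR IGNORE INTO hub_patient VALUES (?, ?, ?, ?)",
               (hk, event["patient_id"], now, event["source"]))
    db.execute("INSERT INTO sat_patient_vitals VALUES (?, ?, ?, ?, ?)",
               (hk, now, event["heart_rate"], event["temperature"], event["source"]))
    db.commit()

# Events arrive continuously from monitoring or replication feeds instead of an overnight batch.
land_event({"patient_id": "MRN-0042", "heart_rate": 131, "temperature": 36.9, "source": "ward_monitor"})
land_event({"patient_id": "MRN-0042", "heart_rate": 164, "temperature": 38.4, "source": "ward_monitor"})
```

Because hubs, links and satellites are insert-only, each new event simply lands as another row – there is no fragile overnight transformation to re-run when something changes upstream.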

To ensure your important data assets are managed accurately and effectively, click here to find out more about our Certus Data Quality Framework.

 
