
Evidence Based Health Informatics: Replacing Hype With Science




The first reference to EBHI I could find was by RB Haynes MD PhD in 1996, but he was discussing using HIT to improve EBM, not the other way around





Association or Cause and Effect?

We need to set the bar higher

Combat hype wherever and whenever it arises

We need to adhere to better reporting standards, such as CONSORT and STARE-HI

We need to study all HIT implementations, even if it is not a formal study. “If you can’t measure it, you can’t improve it” (Lord Kelvin)

Studies need to consider potential negative results

We need to study commercial HIT and not just home-grown HIT


Learning Objectives

Disclaimer

Quotes worth pondering

“Technology is not the destination, it is the transportation”

Dr. Charles Safran AMIA 2008

“The great tragedy of science—the slaying of a beautiful hypothesis by an ugly fact ”

Thomas Huxley 1870

  • I have no conflicts of interest to report

  • I am approaching this subject after many years of practicing clinical medicine and more than a decade of also doing clinical research and teaching Health Informatics

  • References are at the end of the presentation

After completion participants should be able to:

  • Enumerate several reasons Health Informatics needs to be evidence based
  • Describe elements of evidence based medicine (EBM) that should be used to evaluate health information technology (HIT)
  • Describe unique challenges to evaluating HIT
  • List possible consequences of a non-evidence based approach to Health Informatics
  • List strategies to improve an evidence based approach

Background for Evidence Based Health Informatics (EBHI)

Quote About Medical Research

“Huge sums of money are spent annually on research seriously flawed through inappropriate designs, unrepresentative small samples, incorrect analysis and faulty interpretation. Errors are so varied that an entire book on the topic is not comprehensive……we need less research, better research and research done for the right reasons”

Douglas G. Altman 1994

“Uncritical adoption of new systems based on the pressures of technological push continue to discredit policy makers……there are great opportunities for researchers interested in evaluation to fill the vacuum left by informatics practitioners who are too busy writing their next line of code”

Frank Sullivan MD PhD 2001

  • The concept of EBHI is very new

  • Elske Ammenwerth PhD, an Austrian Health Informatician defined EBHI as the:

“conscientious, explicit and judicious use of current best evidence when making decisions about information technology in healthcare” (2006) (1)

  • She has written multiple articles on the importance of EBHI

The scandal of poor medical research. BMJ 1994;308:283-4

What is Health Informatics? J Health Serv Res Policy. 2001 Oct;6(4):251-4.

http://www.elske-ammenwerth.de/index_engl.htm

We should not be surprised by backlash (23-24)

Overview

The implication is that widespread HIT implementation will:

  • Reduce the overall cost of healthcare
  • Improve multiple inefficiencies in the healthcare system
  • Improve the quality of healthcare, which in turn improves patient safety

Are we prepared to measure and prove that these interventions do what they were intended to do after spending billions for HITECH programs?*

  • In the past decade health information technology (HIT) has received an inordinate amount of press, attention and money from:
  • Academic institutions
  • IT vendors
  • US Federal Government
  • We should be asking
  • Is this attention justified?
  • What is the evidence for benefit?
  • Can EBM tools be used to improve the situation?

* $12.3 billion for meaningful use March 2013

Areas Where Evidence of Benefit is Lacking, Controversial or Mixed:

Gartner’s Hype Cycle for Emerging Technologies (2011)

Where is health information technology on the curve?

Why Health Informatics is often not evidence based.

At least three reasons

1. Early hype by almost everyone (the techno-enthusiasts)

  • Academia (to include the IOM)
  • The HIT industry (to include HIMSS)
  • The Federal Government (to include ONC)

2. There have been serious research flaws and issues unique to health information technology evaluations

3. Failure to anticipate the unintended consequences of HIT or “E-iatrogenesis” (Weiner 2007) (2)

  • Electronic Health Records (CPOE and CDSS)
  • Electronic Prescribing (reducing medication errors)
  • Telemedicine (eICUs and home telemonitoring)
  • Chronic Disease electronic registries (improving patient outcomes)
  • Patient portals and personal health records
  • Health information exchange
  • Mobile technology (improving outcomes)
  • Barcoding (reducing medication errors)

Example of Academic Hype

2005 Rand study sponsored by Cerner and GE (3)

But here is the response by RAND in 2013 (4)

2005 Article from the Center for Information Technology Leadership (5)

Another early sentinel article 1998 (6)

Examples of HIT Vendor Hype

After CPOE implementation there was a drop from 10.7 to 4.9 unintercepted medication errors (preventable and potential ADEs) per 1,000 bed-days (reported as a 55% relative risk reduction), but preventable ADEs dropped only from 4.7 to 3.9 (p = .37). Research issues:

  • Before/after study; i.e. not randomized
  • CPOE adoption with homegrown EHR at The Brigham much easier than commercial EHR at non-medical center
  • This article was widely publicized and soon became gospel

Articles such as these undoubtedly influenced future government spending on HIT

Bates DW. Effect of computerized physician order entry and a team intervention on prevention of serious medication errors. JAMA 1998;280(15):1311-16
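The headline figures above can be checked with a few lines of arithmetic; the widely quoted relative risk reduction falls directly out of the two rates (the `rrr` helper below is mine, not from the study):

```python
# Reproducing the headline arithmetic from the Bates 1998 CPOE study.
# Rates are events per 1,000 bed-days, as quoted above.
def rrr(before, after):
    """Relative risk reduction between two event rates."""
    return (before - after) / before

print(f"Unintercepted errors: {rrr(10.7, 4.9):.0%}")  # prints 54% (reported as 55%)
print(f"Preventable ADEs:     {rrr(4.7, 3.9):.0%}")   # prints 17% (p = .37, not significant)
```

Note that the memorable 55% belongs to the combined endpoint; the drop in preventable ADEs alone (4.7 to 3.9) was not statistically significant.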

Evidence Based Health Informatics

Robert Hoyt MD FACP

Director, Medical Informatics

University of West Florida

Pensacola, Florida

Unintended Consequences

Progression of paper silos of health information to electronic silos

  • Particularly if health information exchange (HIE) fails

Physicians who do not purchase a certified EHR and adhere to meaningful use will be fined in the future

  • Will that be fair if EHR use is not found to improve quality or reduce costs?
  • How many purchased an EHR they are unhappy with just to avoid future penalties?

30. Inventory of Evaluation Publications http://evaldb.umit.at

31. Nykanen P. Guideline for good evaluation practice in health informatics (GEP-HI). IJMI 2011;80:815-827

32. Health Information Technology Toolkit (AHRQ Publication No. 09-0083-EF) June 2009

33. Talmon J. STARE-HI--statement on reporting of evaluation studies in Health Informatics. IJMI 2009;78:1-9

Example of US Government Hype

Unintended Consequences of HIT

Another example of vendor HIT hype (7)

Unintended Consequences

Unintended consequences of HIT

But this is what Francis Collins MD PhD, NIH Director, said at the 2012 mHealth Summit (8)

  • Good 2011 review article by Sittig DF & Singh H (16). Error categories:
  • Technology is unavailable (e.g. Internet is down)
  • Technology malfunctions (e.g. software produces errors)
  • Technology functions but there is human error (e.g. e-prescribing is working but a clinician inputs a drug dose incorrectly)

Those most pertinent to Health Informatics:

  • Medical device alarm hazards (alert fatigue)
  • Med administration errors with infusion pumps (non-integrated system)
  • Patient/data mismatches in EHR
  • Medical device interoperability failures with health IT systems
  • Distraction while using mobile devices (e.g. text messaging and accessing social media on the job)
  • Potential harm
  • The Joint Commission issued a Sentinel Event alert in 2008 noting that 25% of medication errors involved some aspect of computer technology
  • Alert fatigue may cause warnings about drugs and abnormal lab results to be ignored (17)
  • Potential increase in healthcare costs
  • Upcoding with EHRs (18)
  • More ordering of tests (19)
  • Long term cost of EHRs post-HITECH and system failures (20)
  • Dec 2012 Survey sponsored by Qualcomm Life and HIMSS
  • Data was derived from healthcare execs’ opinions, not clinicians’
  • Survey confirmed mHealth popularity but not effect on patient outcomes or any other significant outcome

Top 10 Health Technology Hazards for 2013

ECRI 41(11) November 2012 (21)

25. NYC hospitals may base doctor's pay on performance. Advisory Board briefs. Jan 14 2013. www.advisory.com

26. Kern LM. Accuracy of electronically reported "meaningful use" clinical quality measures. Ann Int Med. 2013;158:77-83

27. Ryan AM. Small physician practices in New York needed sustained help to realize gains in quality from use of electronic health records. Health Affairs. 2013;32(1):53-62

28. Black AD. The impact of eHealth on the quality and safety of health care: a systematic overview. PLoS Medicine. 2011;8(1) www.plosmedicine.org

29. Ammenwerth E. Vision and challenges of evidence based health informatics: a case study of a CPOE meta-analysis. IJMI 2010;79:e-83-88


Unintended Consequences

Okay. What is the way forward?

Research Methodological Issues

Methodological issues: internal validity

CBO: Evidence on the Costs and Benefits of Health Information Technology May 2008 (9)

EHR system audit log (Practice Fusion). A possible time-motion measure

Use more rigorous research methods

  • Internal Validity: were published studies of the highest quality? Only a minority were randomized controlled trials (RCTs) (10)
  • External Validity: did published articles represent results that could have been achieved by average US medical practices and hospitals? The literature suggests no (9)
  • A prospective randomized controlled trial is the gold standard for proving cause and effect but:

---HIT interventions are highly complex and difficult to control for all variables (another slide)

---Randomization and blinding can be difficult

---There were no universal guidelines for HIT interventions until STARE-HI was developed

  • Many informatics studies are observational in nature which limits their validity

Note: association does not equal cause and effect. "Firetrucks do not cause fires" even though one is associated with the other

“For providers and hospitals that are not part of integrated systems, however, the benefits of health IT are not as easy to capture, and perhaps not coincidentally, those physicians and facilities have adopted EHRs at a much slower rate. Office-based physicians in particular may see no benefit if they purchase such a product—and may even suffer financial harm”

Combat hype with science. Under promise and over deliver

  • Apply EBM tools to every lay (gray literature) and medical article. Read the original article. Be critical and objective
  • Respond to articles, editorials and blogs by enthusiasts, researchers and the federal government. Play an active role in combating hype
  • Seek objective resources

  • Study international systematic reviews to understand the scope of the challenge for EBHI (28)
  • Study the known methodological research challenges pointed out by others (11)(29)
  • Take advantage of a web-based database on health informatics interventions, 1500+ studies archived (30)
  • Data breaches are on the rise
  • Most breaches involve electronic devices, which have far greater potential to expose records than paper (22)

19. McCormick D. Giving office-based physicians electronic access to patients' prior imaging and lab results did not deter ordering of tests. Health Affairs.2012;31(3):488-496

20. Adler-Milstein J. A survey analysis suggests that electronic health records will yield revenue gains for some practices and losses for many. Health Affairs. 2013;32(3):562-570

21. Top 10 Health technology hazards for 2013. Vol 41 (11). November 2012 www.ecri.org

22. Scandlen G. The HIT scam. Healthblog. Jan 23 2013. http://healthblog.ncpa.org/the-hit-scam

23. Abelson R. In second look, few savings from digital health records. Jan 10 2013. www.nytimes.com

Use More Rigorous Research Methods

What are the consequences of not being evidence based?

Use More Rigorous Research Methods

Complexity of HIT Interventions (techno-social) (11)

Conclusions

1. Follow international HIT research guidelines

  • Good Evaluation Practice in Health Informatics (GEP-HI) (31)
  • Health Information Technology Toolkit (AHRQ Publication No. 09-0083-EF) June 2009 (32)
  • Statement on Reporting of Evaluation Studies in Health Informatics (STARE-HI) IJMI (33)

2. Use confidence intervals and effect size where appropriate

3. Cluster randomize if individual randomization not possible

4. Better total cost/ROI research

5. Look for new ways to objectify the evaluation of HIT (another slide)
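Point 2 above (confidence intervals and effect size) can be sketched with the standard library alone. The denominators and event counts below are invented for illustration, chosen so the rates echo the 4.7 vs 3.9 per 1,000 preventable-ADE figures quoted earlier:

```python
import math

def risk_diff_ci(e1, n1, e2, n2, z=1.96):
    """Difference between two event rates, with a 95% CI (normal approximation)."""
    p1, p2 = e1 / n1, e2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Hypothetical counts: 47 events in 10,000 bed-days before, 39 after
diff, lo, hi = risk_diff_ci(47, 10_000, 39, 10_000)
print(f"diff = {diff * 1000:.1f} per 1,000; 95% CI ({lo * 1000:.2f}, {hi * 1000:.2f})")
# The interval spans zero, consistent with the non-significant p = .37 quoted
# earlier -- but unlike a bare p value, the CI also shows the plausible range
# of effect sizes, which is what readers of HIT studies actually need.
```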

  • Poor general perception, confidence and trust by all parties
  • HIT "Winter"
  • Confused and inconsistent public policy
  • Increased, rather than decreased healthcare costs
  • Possible lower quality and less efficient care
  • "Perfect Storm": HITECH funds spent with no clear cut benefit, sagging interest from all parties and no further funding

Our field would benefit from uniform definitions:

  • Why not one field, Health Informatics, and retire Medical, Clinical, Biomedical and Nursing Informatics?
  • We need standardized definitions of important terms such as medication errors and adverse drug events (ADEs)

9. Orszag P. Evidence on the costs and benefits of health information technology. CBO July 24 2008. www.cbo.gov

10. de Keizer NF. The quality of evidence in health informatics: how did the quality of healthcare IT evaluation publications develop from 1982 to 2005? IJMI 2008;77:41-49

11. Shcherbatykh I. Methodologic issues in health informatics trials: the complexities of complex interventions. JAMIA 2008;15:575-580

12. Ammenwerth E. A viewpoint on evidence based health informatics, based on a pilot survey on evaluation studies in health care informatics. JAMIA. 2007;14(3);368-371

13. Slawson D. Becoming a medical information master: feeling good about not knowing everything. J Fam Pract 1994;38:505-513

14. Krishna S. An analysis of MEDLINE indexing of medical informatics literature. Proc. AMIA Symp 1998; 1030

15. Chaudry B. Systematic review: impact of health information technology on quality, efficiency and cost of medical care. Ann Int Med. 2006;144(10):742-752

16. Sittig D. Defining health information technology errors: new developments since To Err Is Human. Arch Int Med. 2011;171(14):1279-1282

17. Singh H. Information overload and missed test results in electronic health record-based settings. JAMA. 2013. March 4. Online First.

18. Schulte F. How doctors and hospitals have collected billions in questionable Medicare fees. www.publicintegrity.org September 15 2012

  • We are still in our infancy when it comes to understanding the benefits, challenges and overall impact of HIT. We hope both the technology and research will improve in the future
  • We need to combat hype with science
  • We need better studies of major commercial HIT implementations, in real-world scenarios or they will not be funded by the federal government or civilian healthcare organizations
  • We need to standardize meaningful and measurable outcomes related to HIT

Policy Confusion in NYC January 2013? (25)

References

1. Ammenwerth E. Is there sufficient evidence for evidence-based informatics? 2006. www.gmds2006.de/Abstracts/49.pdf

2. Weiner J. "e-Iatrogenesis": The most critical unintended consequence of CPOE and other HIT. JAMIA 2007;14:387-388

3. Hillestad R. Can electronic medical record systems transform healthcare? Potential health benefits, savings and cost. Health Affairs. Sept 2005. content.healthaffairs.org/content/24/5/1103.full

4. Kellermann A. What will it take to achieve the as yet unfulfilled promises of health information technology? Health Affairs. 2013;32(1):63-68

5. Walker J. The value of health care information exchange and interoperability. Health Affairs. 2005; 19 January. Online

6. Bates D. The effect of CPOE and a team intervention on prevention of serious medication errors. JAMA. 1998;280(15):1311-6

7. 2nd Annual HIMSS Mobile Technology Survey. Dec 3 2012. www.himss.org

8. Collins F. Keynote speaker. mHealth Summit. December 2012

Yet, at the very same time

  • A Jan 2013 Annals of Internal Medicine article from Weill Cornell points out how difficult accurate quality-measure reporting with EHRs is (26)
  • A Jan 2013 Health Affairs article from Weill Cornell (different authors) reported that merely reporting via an EHR didn't guarantee improvement in quality measures; only a few improved, and only with extensive technical support (27)

Methodological issues: internal validity

  • Usually process or surrogate endpoints are studied, e.g. a drop in HbA1c levels or diastolic BP, rather than true clinical endpoints, e.g. a reduction in MIs or CVAs

  • Using an EBM analogy (work by Slawson and Shaughnessy), we often measure disease oriented evidence that matters (DOEM) and not patient oriented evidence that matters (POEM) (13)
  • Other internal validity issues:

---Endpoints are usually short term and don’t reflect actual patient outcomes.

---Sample size tends to be small which can lead to type I or II errors

---What is a clinically significant effect? A 20% improvement or just a significant p value?

---You must also measure any negative impact of an intervention

  • Biases: were published studies subject to multiple biases?

---One survey found that 1/3 of informatics-related articles were not published so "negative publication bias" is likely (12)

  • PubMed searches of the medical literature related to Health Informatics are often difficult due to MeSH limitations
  • Statistically significant does not equal clinically significant
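The last bullet is easy to demonstrate: with a large enough sample, a clinically trivial difference becomes "statistically significant". A toy two-proportion z-test makes the point (all numbers invented for illustration):

```python
import math

def z_for_diff(p1, p2, n):
    """Two-proportion z statistic with n subjects per arm (pooled, normal approx.)."""
    p = (p1 + p2) / 2
    se = math.sqrt(2 * p * (1 - p) / n)
    return (p1 - p2) / se

# A half-percentage-point difference in some outcome (50.5% vs 50.0%):
print(round(z_for_diff(0.505, 0.500, 1_000), 2))      # 0.22 -- nowhere near significant
print(round(z_for_diff(0.505, 0.500, 1_000_000), 2))  # 7.07 -- wildly "significant"
```

Same trivial effect, opposite statistical verdicts: significance tracks sample size, not clinical importance.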

Methodological issues: external validity

% Generalizability of Research Papers Presented at 1997 AMIA Conference (14)

“Four benchmark institutions have demonstrated the efficacy of health information technologies in improving quality and efficiency. Whether and how other institutions can achieve similar benefits, and at what costs, are unclear”

Regardless, this article was extensively quoted as showing the benefits of HIT

  • Most early medical articles touting the advantages of, e.g., CPOE came from medical centers with home-grown HIT, impressive IT support, larger budgets and a track record of innovation (e.g. Vanderbilt)

  • Contrast that with healthcare systems that have commercial EHRs, limited leadership, small IT budgets and physician resistance. Your typical small rural hospital might fit this description

Chaudhry B et al. Systematic review: impact of health information technology on quality, efficiency, and costs of medical care. Ann Int Med 2006;144:742-752 (15)

Methodological issues: qualitative research?

  • Without a doubt there are technologies that save time and are very popular but are hard to quantify. Here are just a few examples:

---Smartphone apps: e.g. Epocrates

---Patient portals

---Voice recognition

---E-visits

---PACS

  • We need rigorous qualitative methods to study convenience, workflow, etc.
  • Helpful to have mixed methods research that combines qualitative with quantitative approach

Literature since 2005 is full of examples of harm related to HIT


Top 10 Health Technology Hazards for 2013

ECRI 41(11) November 2012


  • Air embolism hazards
  • Adult technologies for peds patients
  • Inadequate reprocessing of endo/surgical instruments
  • Distraction while using mobile devices
  • Surgical fires
  • Alarm hazards
  • Med administration errors with infusion pumps
  • Excessive X-ray exposure
  • Patient/data mismatches in EHR
  • Device interoperability failures with ISs

  • Dec 2012 survey sponsored by Qualcomm Life and HIMSS
  • Data was derived from healthcare execs, not clinicians
  • Survey confirmed mHealth popularity but not impact on outcomes
  • Mobile technology maturity rated as 3.3 on a 1-7 scale


What about Privacy and Security?


More rigorous research studies:

  • Statement on Reporting Evaluation Studies in Health Informatics (STARE-HI)
  • CONSORT for RCTs
  • Guidelines for non-RCTs
  • Clinical Adoption Framework
  • Apply EBM tools to every lay and medical article. Request the original article. Be critical and objective
  • Look for new ways to measure HIT






