Thursday, May 30, 2019

How Learning Health Systems Learn: Lessons from the Field

AHRQ asked healthcare systems how they became learning health systems. Their experiences are below.
A growing number of healthcare organizations are developing their capacity to become learning health systems. In a learning health system, internal data and experience are systematically integrated with external evidence, and that knowledge is put into practice. As a result, patients get higher quality, safer, more efficient care, and healthcare delivery organizations become better places to work.
The path toward becoming a learning health system is just beginning to take shape. Organizations vary in how they are approaching the systematic use of evidence in the care of patients, and some are further along in this development than others.
AHRQ conducted site visits at a few leading organizations to learn about the steps they have taken to move evidence to the frontlines of care. The purpose was to gather information that could be shared with other organizations involved in the journey toward becoming a learning health system and to identify ways in which AHRQ could support this process.
What emerged from these conversations was the clear sense that these organizations have established some valuable building blocks in their journey toward becoming learning health systems. All of them would characterize themselves as having further to go than they have already come, but they are all committed to the systematic use of evidence to improve performance. While each organization is growing its capacity to generate, adopt, and apply evidence, they all also identified ways they believe that AHRQ could accelerate the process and improve healthcare delivery and patients’ outcomes.
No one organization had fully woven evidence into all aspects of its operations, but together they demonstrated different approaches to how this could be done. These organizations are, to varying degrees, playing roles as evidence generators, evidence curators, evidence adopters, evidence disseminators, and evidence managers.

Evidence Generators

Each organization, to a varying degree, has a history of employing individuals who conduct research to generate new knowledge to inform practice. Until recently, that research may or may not have focused on the patient population served by the investigator's own organization. What is changing is that organizations are paying investigators to focus their attention on their own healthcare delivery system. Based on what was learned through the site visits, organizations are starting with a handful of investigators, not necessarily on a full-time basis. Interview comments indicated that it is in the interest of the organization to encourage investigators to continue to seek extramural funds to expand this work. There was also a sense that, as investigators establish their reputations, it is in their interest to demonstrate that they can compete successfully for grant funding. In most cases, these in-house funded researchers were developing research questions informed by the experience of being embedded within a clinical service area, such as cardiology, and by the expressed priorities of senior managers within the healthcare organization. Projects that tended to gain the most traction were those that clinical service leaders and health system managers prioritized as meaningful and researchers regarded as innovative. These projects often began with resources available from the health system and sometimes expanded with the availability of extramural funds.
Support for embedded researchers within healthcare systems comes on the heels of major investments by these organizations in electronic health records. Use of computers and electronic data is not entirely new, but what had been for some organizations a collection of isolated electronic data systems is rapidly becoming a single integrated system built on a common platform across different levels and sites of care. This supports care management across different levels of service and provides researchers an opportunity to identify patient groups, characterize care patterns across levels of service, evaluate costs, and determine health outcomes. Analysis of the observational data remains a somewhat labor-intensive activity at this point, with little in the way of standardized reporting tools within or across health systems. In some healthcare organizations, there is a developing capacity not only to analyze the observational data but also to conduct experiments (pragmatic trials) or quasi-experiments (stepped-wedge evaluations) in which the information system is often the backbone for assigning patients to different intervention arms and may be used to collect outcomes on clinical events such as hospitalizations. For example, at Kaiser Permanente Southern California (KPSC), investigators are conducting a pragmatic randomized trial in which the population of patients with chronic obstructive pulmonary disease was identified in the electronic records and then randomized to standard care versus a home-based physical activity coaching program for a 12-month period. Outcomes including hospitalization, emergency department visits, lab studies of metabolic markers, and death are being assessed for more than 1,600 patients in the two arms through the electronic health record system.1 Recognizing the potential power of this information not only for individual patient care but also for population health management, healthcare organizations are seeking ways to generate knowledge from their data systems that can inform quality improvement and cost-efficient delivery of services.
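
To make the mechanics concrete, the following is a minimal Python sketch of how an EHR-backed pragmatic trial of this kind might identify a cohort and allocate patients 1:1 to study arms. It is an illustration only, not KPSC's system; the table schema ("patient_id", "icd10" columns) and the diagnosis codes are assumptions.

import random
import pandas as pd

# Illustrative ICD-10 codes for COPD; a real trial would use a validated cohort definition.
COPD_CODES = {"J44.0", "J44.1", "J44.9"}

def identify_copd_cohort(diagnoses: pd.DataFrame) -> list:
    """Return unique patient IDs with a qualifying COPD diagnosis code.
    Assumes columns 'patient_id' and 'icd10' (hypothetical schema)."""
    mask = diagnoses["icd10"].isin(COPD_CODES)
    return sorted(diagnoses.loc[mask, "patient_id"].unique())

def randomize_1_to_1(patient_ids: list, seed: int = 2015) -> dict:
    """Shuffle the cohort and split it evenly into the two study arms."""
    rng = random.Random(seed)
    ids = list(patient_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {**{pid: "standard_care" for pid in ids[:half]},
            **{pid: "activity_coaching" for pid in ids[half:]}}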

Evidence Curators

Separate from their role as evidence generators, some healthcare organizations are taking responsibility for curating evidence in the published literature. Historically, this was something individual physicians or groups of physicians were assumed to do to maintain their clinical expertise. However, healthcare organizations are stepping into this role, in part because they perceive their clinicians are overwhelmed by attempts to stay on top of the rapidly accumulating knowledge base. In one case, a healthcare organization employed staff with expertise in identifying and systematically summarizing research literature on clinical topics in response to requests from clinical leaders within the organization. Healthcare managers at this organization also asked these staff to perform periodic surveillance of published research to identify potential treatment approaches that would be discussed with the appropriate clinical leaders to determine their relevance for the organization's practices. Organizations that also have a role as an insurer may use evidence curation to address requests by patients and clinicians for coverage of a treatment that the health plan has not yet determined to be a covered benefit.
Dr. Michael Kanter and colleagues at KPSC developed a program called E-SCOPE (Evidence Scanning for Clinical, Operational, and Practice Efficiencies) to expedite the adoption of findings from rigorous research studies.2 The program supports a senior evidence specialist at KPSC who actively scans the literature to identify high-quality randomized controlled trials and systematic reviews of randomized controlled trials. The focus is not only on drugs and devices but also on care practices. Each quarter, the senior evidence specialist screens approximately 1,000 newly published trials and typically selects about 150 studies that meet predetermined criteria for study quality; feasibility of implementation within the KPSC system; and improving health outcomes, affordability, efficiency, or utilization. E-SCOPE's regional quality leaders then review these candidates and choose about 30 to 50 studies to share with the appropriate physician and operational managers across practice sites. These individuals then discuss the most promising opportunities while considering the expected costs and benefits. Once there is consensus among physician and operational managers to move forward, two implementation project managers are tasked with pulling together a multidisciplinary team to assist with implementing the practice change. Implementation in a clinical service area is designed to occur across all practice locations. Process outcomes are monitored to assess how effectively the practice change has been scaled. Over a 4-year period, KPSC implemented 30 practice changes, including 20 that had not been used previously and 10 that were underused.
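
As an illustration of the screening funnel just described, here is a hedged Python sketch. The Study fields and selection logic are assumptions standing in for E-SCOPE's actual predetermined criteria and deliberative review, not KPSC's implementation.

from dataclasses import dataclass

@dataclass
class Study:
    title: str
    is_rct_or_systematic_review: bool  # study design criterion described above
    meets_quality_criteria: bool       # predetermined study-quality criteria
    feasible_in_system: bool           # implementable within the delivery system
    improves_value: bool               # outcomes, affordability, efficiency, or utilization

def evidence_specialist_screen(new_trials: list) -> list:
    """First pass: roughly 1,000 newly published studies per quarter
    reduced to about 150 candidates that meet all criteria."""
    return [s for s in new_trials
            if s.is_rct_or_systematic_review
            and s.meets_quality_criteria
            and s.feasible_in_system
            and s.improves_value]

def quality_leader_review(candidates: list, target: int = 40) -> list:
    """Second pass: regional quality leaders keep roughly 30 to 50 studies
    to share with physician and operational managers (placeholder logic)."""
    return candidates[:target]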

Evidence Adopters

Based on knowledge generated within the health system or curated from evidence generated elsewhere, some healthcare organizations are taking the additional step of systematically adopting evidence at an organizational level. One way they are doing this is by creating system-wide guidelines for clinical practices for which evidence is available. This is a new role for healthcare organizations, which have generally left the choice and use of practice guidelines to individual clinicians. Clinicians tend to make use of practice guidelines produced by their own specialty. This can sometimes result in inconsistent guidelines for the same clinical problem across different specialties within an organization. When organizations attempt to establish system-wide guidelines, they bring together clinical leaders from involved specialties and engage them in a discussion of the evidence to establish consensus on practice recommendations applicable to all clinicians within the organization. Guidelines based on evidence that are established at an organizational level have the potential to harmonize differences across specialties and thereby reduce variation in practice that does not contribute to high-quality care.
Another way organizations are supporting the adoption of evidence is by providing their clinicians with information on their practice variation. This is typically done within a clinical service area with a focus on a common or a set of common clinical practices within the subgroup of relevant clinicians. In some cases, organizations are not only providing the data to various groups of clinicians but also are financially rewarding them to meet and discuss the results on a periodic basis with their colleagues. This is an opportunity for peers to review data on their own practice, to consider the relevant evidence, and to provide feedback that can contribute to adoption of evidence-based practices and the reduction of practice variation that does not contribute to quality.
Jeff Weilburg, M.D., Medical Director of the Mass General Physicians Organization (MGPO), which represents the physicians and associated clinicians at the Massachusetts General Hospital in Boston, MA, led the development of the MGPO Variation Analysis and Reporting (VAR) service. The reports show clinicians how they compare to like colleagues regarding use of clinical services, such as imaging, laboratory studies, medications, and emergency department visits. The reports include measures of the probability a service is provided (e.g., the rate at which a primary care physician [PCP] orders any imaging for patients in their panel during a 6-month period) and of intensity (e.g., the number of imaging orders submitted for patients with any imaging ordered), as well as performance on normed measures such as the appropriateness of imaging orders and patient communication. The reports are based on the output of statistical models that include patient, doctor, and other adjusters derived from the electronic medical record.
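
A minimal Python sketch of the two utilization measures described above follows. It assumes hypothetical order-level and panel tables and omits the patient- and doctor-level risk adjustment that the actual VAR models apply.

import pandas as pd

def imaging_probability_and_intensity(orders: pd.DataFrame, panel: pd.DataFrame):
    """orders: one row per imaging order, columns 'patient_id' and 'pcp_id'.
    panel: one row per empaneled patient, columns 'patient_id' and 'pcp_id'.
    Both cover the same 6-month reporting period (hypothetical schema)."""
    counts = orders.groupby("patient_id").size().rename("n_orders").reset_index()
    merged = panel.merge(counts, on="patient_id", how="left").fillna({"n_orders": 0})
    grouped = merged.groupby("pcp_id")["n_orders"]
    # Probability: share of the PCP's panel with any imaging ordered.
    probability = grouped.apply(lambda s: (s > 0).mean())
    # Intensity: mean number of orders among patients with any imaging ordered.
    intensity = grouped.apply(lambda s: s[s > 0].mean())
    return probability, intensity
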
VAR initially produced blinded reports to PCPs about the appropriateness and volume of high-cost imaging (computed tomography, magnetic resonance, nuclear, positron emission tomography).3 Over time, the reports became unblinded and the scope increased to include a wide range of services: the rates at which common laboratory studies such as complete blood counts, basic metabolic panels, and thyroid-stimulating hormone tests are used, and rates of emergency department use by patients in the PCP's panel. Pharmacy use was reported as the total cost of medications prescribed by the PCP, as well as the rates at which the PCP prescribed generics. Over the past 3 years, the report added the "Doctor Communication" scores derived from the Consumer Assessment of Healthcare Providers and Systems data for each PCP. PCPs (n=270) with patient panels totaling 170,000 receive individual-level reports that are risk-adjusted over the entire practice. PCPs located at 1 of 22 clinical sites receive information on their individual performance benchmarked against the other PCPs at their clinical location.
VAR has now expanded to include Specialty Care Physicians (SCPs) in the reporting. The SCP reports are based on the "functional panel" composed of all patients to whom the SCP provided outpatient visits during the reporting period. Whereas PCPs are grouped by their practices, SCPs are divided into clinical subgroups within their departments. For example, neurologists are subgrouped into those seeing patients with memory disorders, epilepsy, stroke, neoplasm, etc.
The first time VAR reports were distributed to all MGPO doctors, including primary care physicians (n=270) and specialists (n=1,937), clinicians received supplements averaging $833 per clinician per year for opening and reviewing their personal report and completing a survey regarding their reaction to the report. More than 2,000 doctors viewed the reports at least once. Survey responses indicated that more than 80 percent believed the reports to be sensible and useful for encouraging organizational change and supporting clinical quality improvement goals.
The imaging appropriateness component of VAR has also been incorporated into the MGPO credentialing process. Clinicians with persistently high rates of inappropriate imaging use risk not being re-credentialed; those with high rates are given an opportunity to engage with their practice medical directors to make improvements relative to evidence-based guidelines before re-credentialing is jeopardized.

Evidence Disseminators

In addition to providing guidelines and data to support evidence-based practice, some healthcare organizations are actively promoting use of evidence through clinical decision support (CDS) and provider payment incentives. CDS is typically integrated into electronic health record systems and is prompted when clinicians are making relevant diagnostic, testing, or treatment decisions. For example, an organization might embed CDS within its electronic health record system to encourage evidence-based strategies at the time a clinician is attempting to order an imaging study. CDS can function as a purely informational tool, or it can require clinicians to take additional steps if they wish to pursue a testing or treatment approach that does not conform to what the CDS recommends.
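
The Python sketch below illustrates the distinction drawn above between a purely informational CDS prompt and a hard stop that requires additional steps. The order fields and the appropriateness rule shown are hypothetical, not any EHR vendor's logic.

from dataclasses import dataclass

@dataclass
class ImagingOrder:
    study: str       # e.g., "lumbar spine MRI"
    indication: str  # clinician-entered reason for the study

# Hypothetical stand-in for an evidence-based appropriateness rule set.
LOW_VALUE_INDICATIONS = {"uncomplicated low back pain, symptoms under 6 weeks"}

def guideline_concordant(order: ImagingOrder) -> bool:
    return order.indication not in LOW_VALUE_INDICATIONS

def cds_check(order: ImagingOrder, mode: str = "informational") -> dict:
    """Decide what the EHR should do when the order is placed."""
    if guideline_concordant(order):
        return {"allow": True, "message": None}
    if mode == "informational":
        # Informational mode: show the evidence but let the order proceed.
        return {"allow": True,
                "message": "Evidence suggests this study may not be indicated."}
    # Hard-stop mode: require documentation or peer review before proceeding.
    return {"allow": False,
            "message": "Additional justification is required to place this order."}
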
Organizations that have a financial interest in managing costs for a population of patients may go a step further by tying financial incentives for clinicians, such as payment bonuses or opportunities for shared savings, to their efficient management of resources as well as their adherence to evidence-based quality metrics.
Will Shrank, M.D., Chief Medical Officer at the University of Pittsburgh Medical Center (UPMC) Health Plan, has pursued opportunities to use physician payment incentives to promote quality improvement based on evidence within the UPMC integrated delivery system. The current model offers clinicians a share of the joint savings (shared savings) when they meet financial and quality targets. The UPMC Health Plan Shared Savings program began in July 2011 with one primary care practice partner implementing a shared savings payment arrangement in one product line. The program has grown to include 37 shared savings partners, ranging from large multispecialty practices to solo practices and covering more than 500,000 health plan members. An important part of the strategy is to align the financial incentives across payers and health plan product lines.
To date, UPMC Health Plan has moved into risk-based contracting with its PCPs and is in the process of bringing specialists into value-based payment arrangements. For example, UPMC oncologists have been participating since 2016 in the Center for Medicare and Medicaid Innovation (CMMI) demonstration called the Oncology Care Model. The model is targeted toward Medicare fee-for-service beneficiaries receiving chemotherapy. The demonstration is testing whether payment for the provision of enhanced services, such as patient navigation, care plans, 24/7 patient access, and treatment that is consistent with nationally recognized clinical guidelines, results in improved patient outcomes and financial savings. In 2017, UPMC Health Plan extended CMMI's approach with UPMC oncologists to members across all lines of business (Medicare, Medicaid, Medicare Special Needs Plans, commercial, etc.) by providing payments for the enhanced services and offering financial bonuses based on a common set of core quality measures. UPMC Health Plan pays UPMC oncologists an infrastructure support fee of $960 per member per year. This fee is paid in two installments: $720 at episode trigger and the remainder on achievement of quality performance metrics over the year. To receive the full payment, UPMC oncology practices must meet or exceed an all-cause hospital admission measure and at least five of six other quality measure targets: (1) all-cause emergency department visits, (2) number of visits where pain intensity is quantified, (3) number of visits where a plan to address pain is documented, (4) active screening and treatment plan for depression, (5) chemotherapy in the last 14 days of life, and (6) percentage of patients receiving 3 or more days of hospice prior to death. These quality targets are based on the oncologists' own evidence-based guidelines. Within the first 6 months, UPMC Health Plan observed significant reductions in costs and improvements in five of the six measures. There was no meaningful change in the hospice-related measure. UPMC Health Plan is working to better integrate UPMC oncology practices with palliative care services.
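
As a worked example of the payment split and quality gate described above (the function and variable names are illustrative, not UPMC Health Plan's):

INFRASTRUCTURE_FEE = 960                               # dollars per member per year
AT_EPISODE_TRIGGER = 720                               # first installment, paid when the episode begins
AT_QUALITY = INFRASTRUCTURE_FEE - AT_EPISODE_TRIGGER   # remaining 240 tied to quality performance

def quality_installment(admission_target_met: bool, other_targets_met: int) -> int:
    """The remainder is paid only if the all-cause hospital admission target
    is met and at least five of the six other quality measure targets are met."""
    if admission_target_met and other_targets_met >= 5:
        return AT_QUALITY
    return 0

# Example: a practice that meets the admission target and five of the six other measures.
total_payment = AT_EPISODE_TRIGGER + quality_installment(True, 5)  # 720 + 240 = 960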

Evidence Managers

Organizations that have a financial interest in managing costs for a population of patients are also in some cases applying evidence outside of the care setting. One application is in purchasing decisions for supplies and equipment, where evidence on effectiveness and cost can be considered as a way to choose among options to maximize value. In cases in which healthcare organizations are not accountable for their costs, the choice of medical equipment is often left to the clinicians who use it. For example, different surgeons at the same healthcare organization who perform joint replacements may choose to use different medical devices. They may be influenced more by their experience with particular devices than by evidence on comparative effectiveness and costs of the various options. Healthcare organizations that are accountable for their costs are in a position to review evidence on the effectiveness of the various options, to discuss the evidence and implications of any limits on purchasing choices with affected clinicians to ensure quality is not compromised, and to use their purchasing power to obtain the best value for their patient population.

Shared Needs

Healthcare organizations are exploring and developing their capacity to become learning health systems that are able to generate, adopt, and apply evidence to support quality improvement and high-value care. However, not all organizations have the resources to invest in this transformation, and even those that do report that they could benefit from federal support to catalyze the effort. Specifically, healthcare organizations are seeking information on the strategies other organizations find most valuable in becoming learning health systems. They also stated that a set of performance metrics that would allow them to evaluate their progress over time and to benchmark it against other healthcare organizations would be valuable for their self-monitoring and planning. Finally, they expressed interest in a new research investment strategy aligned with the workflow, rapid decision-making timeline, and iterative process of testing innovation within a healthcare organization.

Competing Demands

Healthcare organizations in the process of becoming, or considering how to become, a learning health system face competing demands for their attention and resources. The pace at which healthcare organizations will make progress in generating and applying evidence will depend on whether they perceive a return on their investments in becoming a learning health system, the availability of internal and external resources to help them make this transformation, and the external pressures on them to be accountable for managing the cost and quality of patient care. While all of the organizations visited have a track record of publishing and sharing knowledge, there was an acknowledgement that the capacity to participate in knowledge generation could come into conflict with the organization's business model. Publishing takes time, and some of what is learned has the potential to offer the organization a financial advantage that could be jeopardized through public dissemination. Public investment in helping health systems generate new knowledge could come with requirements that ensure learning is shared publicly to offer a benefit for all.

Acknowledgment

This brief was prepared for AHRQ by Dr. Andrew Bindman as part of an Intergovernmental Personnel Act agreement with AHRQ. The final report was submitted November 6, 2019. 

1. Nguyen HQ, Bailey A, Coleman KJ, et al. Patient-centered physical activity coaching in COPD (Walk On!): a study protocol for a pragmatic randomized controlled trial. Contemporary Clinical Trials. October 24, 2015.
2. Schottinger J, Whittaker J, Kanter MH. A Model for Implementing Evidence-Based Practice More Quickly. NEJM Catalyst. January 26, 2017.
3. Weilburg JB, Sistrom CL, Rosenthal DI, et al. Utilization Management of High-Cost Imaging in an Outpatient Setting in a Large Stable Patient and Provider Cohort Over 7 Years. Radiology 2017;284(3):766-776.
Page last reviewed May 2019
Page originally created April 2019
Internet Citation: How Learning Health Systems Learn: Lessons from the Field. Content last reviewed May 2019. Agency for Healthcare Research and Quality, Rockville, MD. http://www.ahrq.gov/learning-health-systems/how-lhs-learn.html
