Strategies to Integrate Genomic Medicine into Clinical Care: Evidence from the IGNITE Network

The complexity of genomic medicine can be streamlined by implementing some form of clinical decision support (CDS) to guide clinicians in how to use and interpret personalized data; however, it is not yet clear which strategies are best suited for this purpose. In this study, we used implementation science to identify common strategies for applying provider-based CDS interventions across six genomic medicine clinical research projects funded by an NIH consortium. Each project’s strategies were elicited via a structured survey derived from a typology of implementation strategies, the Expert Recommendations for Implementing Change (ERIC), and follow-up interviews guided by both implementation strategy reporting criteria and a planning framework, RE-AIM, to obtain more detail about implementation strategies and desired outcomes. We found that, on average, the three pharmacogenomics implementation projects used more strategies than the disease-focused projects. Overall, projects had four implementation strategies in common; however, operationalization of each differed in accordance with each study’s implementation outcomes. These four common strategies may be important for precision medicine program implementation, and pharmacogenomics programs may require more strategies to integrate into clinical care. Understanding how and why these strategies were successfully employed could be useful for others implementing genomic or precision medicine programs in different contexts.


Introduction
Precision medicine represents a new, emerging paradigm for healthcare that tailors treatments to individuals on the basis of characteristics including biological, behavioral, and demographic data. The emergence of precision medicine as a viable approach to healthcare, compared to the traditional one-size-fits-all approach, follows in large part from technological advances, such as sequencing the human genome and harnessing big datasets. Because of the size, complexity, and novelty of the information needed to practice precision medicine, implementation must include tools to help clinicians and patients interpret and act on the information [1]. These tools include clinical decision support (CDS), i.e., "guidelines, prompts, and assists" that deliver information at the point of healthcare delivery [2] (p. 2). Typically, CDS tools are integrated with the electronic health record (EHR) to provide just-in-time prompts for clinicians or information for patients [3]. In particular, they have proven efficacious for translating genomic medicine into clinical care [4,5].
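To make the idea of a just-in-time prompt concrete, the sketch below shows what a minimal pharmacogenomic CDS rule check could look like. It is purely illustrative: the drug-gene pairs, phenotype labels, and alert wording are simplified assumptions for this example, not the logic of any production CDS or of any IGNITE project.

```python
# Illustrative sketch only: a toy pharmacogenomic CDS check that fires
# when a clinician orders a drug for which the patient's stored,
# genotype-derived phenotype suggests a problem. The drug-gene pairs
# and alert text below are simplified assumptions, not clinical advice.

ALERTS = {
    # (drug, metabolizer phenotype) -> alert text shown at order entry
    ("clopidogrel", "CYP2C19 poor metabolizer"):
        "Reduced activation expected; consider an alternative antiplatelet.",
    ("codeine", "CYP2D6 ultrarapid metabolizer"):
        "Risk of toxicity from rapid conversion to morphine; avoid codeine.",
}

def cds_check(drug_order: str, patient_phenotypes: list[str]) -> list[str]:
    """Return just-in-time alerts triggered by this order, if any."""
    return [
        text
        for (drug, phenotype), text in ALERTS.items()
        if drug == drug_order.lower() and phenotype in patient_phenotypes
    ]

alerts = cds_check("Clopidogrel", ["CYP2C19 poor metabolizer"])
```

In a real EHR integration, such a check would typically fire at order entry and draw phenotypes from structured genomic results rather than an in-memory table.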
However, there is little understanding of how to implement interventions that include CDS for interpreting and using genomic information. Although genomic discoveries have advanced exponentially since the Human Genome Project sequenced the complete human genome over 10 years ago, to date, little research has focused on best practices for translating discoveries into routine care [6,7]. Barriers to translation center on a lack of coordinated and systematic processes to educate stakeholders about genomic medicine innovations and on challenges in integrating those innovations with existing platforms [8]. Implementation science, the scientific study of methods to promote uptake of innovations in real-world settings, can provide guidance on selecting strategies for translating genomic medicine innovations into clinical practice [9]. Unlike quality improvement, which focuses on specific problems within specific settings, implementation science aims to produce generalizable knowledge about ways to improve healthcare delivery. As such, implementation research starts with an underutilized evidence-based practice and focuses on processes to deliver that practice, providing a frame for defining, measuring, and reproducing strategies to improve use of the clinical practice (the "how") in different contexts [10,11].
To better understand the processes used to implement genomic medicine-focused CDS, we conducted an in-depth evaluation of implementation strategies across a network focused on implementing genomic medicine, called Implementing Genomics In Practice (IGNITE). Each implementation included a CDS intervention to prompt and support providers to consider genomic information in clinical care [12]. We used implementation science to better understand implementation processes, as well as to identify and describe the common implementation strategies related to each project's context and implementation outcomes. The approach and results of this work offer an implementation science-based frame for guiding and evaluating clinical implementations of genomic interventions.

Settings
The IGNITE network consisted of six diverse genomic medicine demonstration projects led by academic medical centers allied with community healthcare systems that varied in their goals and approach. Previous publications have described the projects in detail [12][13][14]. In short, three projects implemented different types of pharmacogenomics (PGx) CDS interventions in the EHR (INGENIOUS: Indiana Genomics Implementation, an Opportunity for the Underserved, Indiana University; Genomic Medicine Implementation: the Personalized Medicine Program (PMP), University of Florida; Integrated, Individualized, and Intelligent Prescribing (I3P) Network, Vanderbilt University). Three projects had a disease focus (PDMP: the Personalized Diabetes Medicine Program at the University of Maryland School of Medicine, to identify individuals with monogenic subtypes of common disease; the GUARDD Study: Genetic Testing to Understand and Address Renal Disease Disparities, Icahn School of Medicine at Mount Sinai, to proactively identify patients at risk for chronic disease; Implementation, Adoption, and Utility of Family History in Diverse Care Settings, Duke University, to implement a patient-facing, web-based family health history-based risk assessment tool integrated with the EHR). All projects implemented CDS tools in the Epic EHR, with two projects, INGENIOUS and I3P, additionally including homegrown EHRs in some affiliated health systems.

Frameworks
Theoretical frameworks in implementation science offer common terms and definitions to identify and explain complex phenomena experienced across diverse contexts [15]. One highly cited implementation science framework, Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM), offers dimensions for explicitly reporting key aspects of translating evidence-based interventions into diverse settings, including Reach (R), the number, proportion, or representativeness of individuals willing to participate, and Effectiveness (E), the impact on outcomes, including potential negative effects of the intervention [16,17]. For example, Wu and colleagues (2019) used RE-AIM to illustrate that diverse healthcare settings could successfully implement a new computerized family health history screening tool, although the odds of completing the screening decreased with male sex and minority race [18]. Proctor and colleagues (2013) additionally published guidance for reporting implementation strategies, recommending that authors describe specific dimensions such as the stakeholders involved (actors) or when they used the strategy (temporality), with an eye toward measurement and reproducibility [19]. Powell and colleagues (2015) further developed a taxonomy of evidence-based implementation strategies through the Expert Recommendations for Implementing Change (ERIC) project. This taxonomy includes 73 implementation strategies, organized according to nine domains. Prior work has identified ERIC implementation strategies used to meet common barriers to genomic medicine implementation [20]. While the ERIC provides a useful compendium, the full list of strategies, originally developed within the context of mental health research and practice, has yet to be evaluated in conjunction with Proctor's detailed reporting criteria in the context of genomic medicine implementation.

Procedures
Three implementation scientists on the research team worked with project teams to systematically elicit information about implementation strategies and outcomes in two phases. The IRB approved study procedures.
During the first phase, they developed a web-based, self-administered, 15 min structured survey to gather information on implementation strategies used at sites. The format was based on a previously published survey of implementation strategies, in which questions about strategies were organized by nine domains, or clusters, grouping conceptually related strategies together (e.g., using evaluative and iterative strategies) [21,22]. The implementation strategies came from the ERIC taxonomy of 73 evidence-based strategies organized into nine domains. This survey of IGNITE projects queried the use of 72 of the 73 ERIC implementation strategies, excluding a question about the strategy of "developing academic partnerships" because that strategy was integral to the consortium as a whole. The survey asked about use of each cluster of strategies generally (e.g., "During IGNITE I, did your project use any of these evaluative and iterative strategies to implement your innovation at any of your project sites?") and then about specific strategies within each cluster, with the response options of yes, no, and not sure (see Supplementary Materials Files S1 for survey questions). The survey was programmed in Qualtrics (Provo, UT, USA), and a link was emailed to project coordinators at each site for completion. Survey results were analyzed using Microsoft Excel (Redmond, WA, USA).
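The survey tallies were produced in Excel; purely as a hypothetical sketch of the underlying computation, counting how many strategies each project endorsed could be done as follows (the project names, strategy labels, and responses here are invented, not IGNITE data).

```python
# Hypothetical sketch of the survey tally (the actual analysis was done
# in Excel): count, per project, how many of the queried ERIC strategies
# received a "yes" response. All data below are invented for illustration.

def count_endorsed(responses: dict[str, dict[str, str]]) -> dict[str, int]:
    """responses maps project -> {strategy: 'yes' | 'no' | 'not sure'}."""
    return {
        project: sum(1 for answer in answers.values() if answer == "yes")
        for project, answers in responses.items()
    }

example = {
    "Project A": {"audit and provide feedback": "yes",
                  "identify early adopters": "yes",
                  "alter incentive structures": "no"},
    "Project B": {"audit and provide feedback": "not sure",
                  "identify early adopters": "yes",
                  "alter incentive structures": "no"},
}
counts = count_endorsed(example)  # {"Project A": 2, "Project B": 1}
```

Only "yes" responses count toward a project's total, mirroring the survey's yes/no/not sure response options.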
Common strategies employed by all six projects were identified for follow-up in a second phase. The implementation scientists conducted 30-45 min phone-based qualitative interviews with project coordinators and one principal investigator, with the exception of two projects, in which the principal investigator (PI) responded directly by email. These interviews included questions about implementation outcomes as specified by the RE-AIM planning framework and detailed information about how strategies were deployed, as specified by the "reporting dimensions of implementation strategies" guidance (actor, temporality, action, justification, target) [16,19]. The respondents received the list of questions approximately 1 week before the phone interview and had the opportunity to add more information later by email and telephone (Supplementary Materials Files S1). NVivo 12 software (Melbourne, Australia) was used to manage, organize, and query the qualitative data for analysis.

Variety of Implementation Strategies Used across the Network
On average, IGNITE projects implemented 32 ERIC strategies. The number of strategies used by each project varied, ranging from 11 to 47 (Figure 1). Each of the three PGx projects used over 40 strategies, while the three disease-focused projects used 11-29 strategies (see Supplementary Materials Files S2 for results).

Common Implementation Strategies Found among Diverse Implementation Projects
Despite the diversity of project goals and approaches, four strategies from three clusters were used across all six projects ( Figure 2): (1) developing strategies to obtain and use stakeholder feedback (cluster-using evaluative and iterative strategies), (2) identifying early adopters (cluster-developing stakeholder interrelationships), (3) conducting educational meetings (cluster-training and educating stakeholders), and (4) having an expert meet with clinicians to educate them (cluster-training and educating stakeholders).

Implementation Strategy 1: Obtaining and Using Stakeholder Feedback (e.g., from Patients, Families, or Providers) to Evaluate and Iteratively Develop the Genomic Program
All projects reported obtaining some form of feedback from stakeholders (Table 1). In all cases, experts were involved; however, the sources of feedback varied (researchers, administrators, community advisory boards, clinicians, and patients). Generally, the projects obtained stakeholder feedback before the project started and continued throughout, although not necessarily systematically, with the exception of GUARDD, which organized standing Stakeholder Advisory Board meetings. Actions for obtaining stakeholder feedback included a mix of informal and formal steps. For example, Implementation, Adoption, and Utility of Family History in Diverse Care Settings conducted pre-implementation meetings with all clinics and formal assessments with providers throughout, and GUARDD held meetings with its Stakeholder Advisory Board, while others informally asked for feedback during existing meetings with providers. All projects justified using stakeholder feedback as a way to ensure that the project would work at the implementing site, mostly a function of the PI's prior experience with multi-site and community-based research, such as knowing with whom to engage to ensure buy-in for testing the project. The targets of change from obtaining feedback varied, including understanding leverage points for implementing genomic medicine within healthcare systems, seeking ways to bolster recruitment and retention, and improving provider knowledge about genomic data.
Implementation Strategy 2: Identifying Early Adopters (Site Champions) to Support the Genomic Program
All projects, prior to implementation, identified champions, i.e., individuals committed to supporting and promoting implementation of the practice, to help obtain buy-in and enroll participants (Table 2) [23]. Typically, projects did not employ specific, prescribed steps to identify champions, with the exception of Implementation, Adoption, and Utility of Family History in Diverse Care Settings, in which the local PIs were each asked to identify a champion at their site. Otherwise, champions spread the word through educational meetings or helped to inform the projects by working with the PI. Each project largely had a pragmatic reason for using this implementation strategy in that site champions would bring attention to the project among providers or offer access to others for support. This implementation strategy mostly targeted provider knowledge and skill to, in turn, change clinical processes to include the genomic information.

Implementation Strategies 3 and 4: Conducting Educational Meetings and Having an Expert Meet with Clinicians to Train or Educate Providers to Deliver the Genomic Program
We present the two strategies from the training and educating stakeholders cluster together in one table (Table 3) because, although they are discrete strategies according to the ERIC typology, results indicated that they went hand-in-hand for these genomic medicine implementations. Strategies used for "training and educating stakeholders" involved PIs, along with subject experts, presenting information about their project and protocol to clinicians who would likely be involved with implementation. The Implementation, Adoption, and Utility of Family History in Diverse Care Settings approach differed slightly in that the project crossed clinical areas and the PI had expertise in use of the web-based family health history tool. Projects generally used these strategies during pre-implementation, with the exception of PMP, which used them throughout the study on an ad hoc basis. Research teams did not report formal steps for employing these strategies, with meetings set as needed to educate clinicians or, in the case of the GUARDD study, integrated with regular, standing provider meetings. Generally, projects used these strategies for pragmatic reasons: to make sure that clinicians who would be integral to trial implementation understood and accepted the innovations, protocols, and evidence. Experts were used to engage directly with clinicians by providing first-hand experiences (PMP) and to educate peers about the empirical evidence behind the project. Across the board, the target was to change provider knowledge about the content area and bring providers into the fold with the study protocol.
Table 3. Specification of "conducting educational meetings" and "having an expert meet with clinicians" implementation strategies by IGNITE 1 genomic medicine implementation projects.

Implementation Outcomes
All projects focused on patient-level outcomes to evaluate implementation. Table 4 describes outcomes according to the RE-AIM dimensions of Reach, Adoption, and Effectiveness.
Table 4. Implementation outcomes and strategies of IGNITE 1 genomic medicine projects.

Table 4 columns: Genomic Medicine Project; Implementation Outcomes (Reach, Adoption, Effectiveness); Implementation Strategies. Guiding questions for the RE-AIM dimensions: Reach: Who actually was exposed to the service, and who is or was intended to benefit from your genomic service? Adoption: Where is or was the program applied, and who applied it? Effectiveness: What is or was the most important benefit you are or were trying to achieve? Were there negative outcomes?

Discussion
Although we identified common implementation strategies, the detailed reporting criteria revealed different manifestations of the strategies across the projects. For example, all projects employed a strategy for "obtaining and using stakeholder feedback"; however, each project described how they uniquely employed this strategy, including using pre-implementation meetings with clinicians, a stakeholder advisory board reflecting the clinician and patient population [25], involvement of a Clinical and Translational Science Institute, inclusion of patients, and weekly meetings with a multidisciplinary team. Additionally, the strategy for "identifying early adopters" differed across projects, for example, with the Implementation, Adoption, and Utility of Family History in Diverse Care Settings identifying site champions at each clinic who would implement the project versus GUARDD using the project team as champions to increase awareness among providers that they would enroll patients into the study, test them, and return genetic test results. Education strategies varied as well, with, for example, PMP using pharmacists to educate providers via case studies and INGENIOUS training clinicians as part of the study team. This variability in the use of common implementation strategies makes sense when considered alongside each project's RE-AIM implementation outcomes; each project had a different target for adoption (e.g., four major, diverse healthcare systems in the US versus 15 neighborhood-based clinics in one region) or effectiveness (e.g., feasibility of implementing genetic risk assessment in diverse settings vs. improved individual outcomes through genetic testing). This study underscores the importance of defining mechanisms, i.e., precise descriptions of processes or events through which implementation strategies affect implementation outcomes, in describing and evaluating implementation endeavors in general [26].
A previous analysis identified different strategies used by a number of IGNITE projects to meet specific implementation barriers [8]. These included strategies to integrate genomic data into the EHR and to engage participants in genomic medicine projects. The present study adds to that prior work by identifying implementation strategies used as part of overall project plans, rather than in response to specific barriers encountered during the course of implementation. Additionally, the earlier query was conducted while the projects were ongoing, whereas this one was conducted after external funding ended, allowing project coordinators to reflect on core implementation strategies. Differences could also reflect a need for better refinement of the ERIC typology. Perry et al. (2019) also applied the ERIC taxonomy in conjunction with the Proctor criteria, in the context of cardiac prevention in primary care, and suggested revisions to refine the taxonomy, including suggestions to combine strategies, just as we did in this report with the education strategies in Table 3 [27]. Despite differences between the two analyses of IGNITE implementation strategies, both found common use of educational strategies to improve clinician knowledge and beliefs. Although the ERIC taxonomy does apply to different health-related areas, further work could focus on developing a version of the ERIC taxonomy specifically for genomic medicine implementation.
The three IGNITE PGx projects each reported using a greater number of implementation strategies than the three disease-focused projects. This difference may reflect the more extensive infrastructure needed to integrate PGx into routine care, for example, financial billing strategies such as new clinic codes or performance indicators such as turnaround time [28][29][30][31]. Additionally, PGx implementation may require more strategies for training or educating providers on how to interpret and use information for the range of drug-gene pairs included than disease-focused projects require [28]. In contrast, the GUARDD project, which relied on the project team rather than providers to return results to patients, used the fewest implementation strategies. It could be that this difference between PGx and disease-focused projects dissipates when implementing outside of a funded demonstration project. While this study of the IGNITE consortium focused on common, core strategies, future work could further identify and compare strategies by type of genomic medicine implementation.
This study had several limitations. IGNITE genomic demonstration projects received federal research funding support and, thus, may not represent experiences of those seeking to implement genomic medicine interventions without this kind of support. In addition, reports of strategies used reflect the recall of project coordinators and principal investigators. There might have been other implementation strategies used during the course of project implementation. Additionally, these findings reflect implementation experiences from within US healthcare institutions. As such, there may be different strategies used when initiated by a governmental entity. This kind of approach to specify implementation strategies according to published criteria and definitions could be used to compare implementation by national or regional healthcare systems around the world. However, regardless of these limitations, this paper helps to build the evidence base of strategies for implementing genomic medicine.

Conclusions
Implementing a genomic medicine service is a daunting task, and this study yields three key lessons to help guide others interested in implementation. Firstly, genomic medicine projects will end up using a variety of strategies tailored to the environment and practice, with the number of strategies among these demonstration projects ranging from 11 to 47. Secondly, the four strategies highlighted in this analysis can serve as a manageable starting point for future implementation. Thirdly, systematic PGx programs, in which patients' genotypes are made available in the EHR to preemptively guide prescribing, can be complicated to implement; for example, IGNITE PGx projects used more implementation strategies than disease-focused ones. Although this study was not designed to identify which strategies are more critical to implement than others for specific practices or desired endpoints, further work can identify the necessity and sufficiency of particular strategies within specific contexts.