Article

Understanding the Role of External Facilitation to Drive Quality Improvement for Stroke Care in Hospitals

by Tharshanah Thayabaranathan 1,*, Nadine E. Andrew 1,2, Rohan Grimley 1,3,4, Enna Stroil-Salama 5, Brenda Grabsch 6, Kelvin Hill 7, Greg Cadigan 3, Tara Purvis 1, Sandy Middleton 8, Monique F. Kilkenny 1,6, Dominique A. Cadilhac 1,6 and on behalf of the Stroke123 Investigators and AuSCR Consortium
1 School of Clinical Sciences at Monash Health, Monash University, Clayton, VIC 3168, Australia
2 Peninsula Clinical School, Central Clinical School, Monash University, Frankston, VIC 3199, Australia
3 Queensland State-Wide Stroke Clinical Network, Brisbane, QLD 4000, Australia
4 Sunshine Coast Clinical School, Griffith University, Birtinya, QLD 4575, Australia
5 Metro South Research, Metro South Health, Brisbane, QLD 4102, Australia
6 Stroke Division, The Florey Institute of Neuroscience and Mental Health, Heidelberg, VIC 3052, Australia
7 Stroke Foundation, Melbourne, VIC 3000, Australia
8 Nursing Research Institute, St Vincent’s Health Network Sydney, St Vincent’s Hospital Melbourne, Australia and Australian Catholic University, Sydney, NSW 2010, Australia
* Author to whom correspondence should be addressed.
The members of the collaborative group are listed in the Supplementary Material.
Healthcare 2021, 9(9), 1095; https://doi.org/10.3390/healthcare9091095
Submission received: 20 July 2021 / Revised: 18 August 2021 / Accepted: 20 August 2021 / Published: 25 August 2021

Abstract:
The use of external facilitation within the context of multicomponent quality improvement interventions (mQI) is growing. We aimed to evaluate the influence of external facilitation for improving the quality of acute stroke care. Clinicians from hospitals participating in mQI (Queensland, Australia) as part of the Stroke123 study were supported by external facilitators in a single, on-site workshop to review hospital performance against eight clinical processes of care (PoCs) collected in the Australian Stroke Clinical Registry (AuSCR) and develop an action plan. Remote support (i.e., telephone/email) after the workshop was provided. As part of a process evaluation for Stroke123, we recorded the number and mode of contacts between clinicians and facilitators; type of support provided; and frequency of self-directed, hospital-level stroke registry data reviews. Analysis: We measured the association between amount/type of external facilitation, (i) development of action plans, and (ii) adherence to PoCs before and after the intervention using AuSCR data from 2010 to 2015. In total, 14/19 hospitals developed an action plan. There was no significant difference in amount or type of external facilitator support provided between hospitals that did, and did not, develop an action plan. There was no relationship between the amount of external facilitation and change in adherence to PoCs. Most (95%) hospitals accessed stroke registry performance data. In the Stroke123 study, the amount or type of external facilitation did not influence action plan development, and the amount of support did not influence the changes achieved in adherence to PoCs. Remote support may not add value for mQI.

1. Introduction

Despite widely available evidence supporting clinical interventions that improve health outcomes for patients hospitalized for acute stroke, adherence to these recommended interventions is often suboptimal [1,2]. In Australia, there is evidence that adherence to recommended clinical processes of care (PoCs) varies between hospitals that treat patients with acute stroke [3]. Factors that may contribute to this variation include the ability to keep up to date with current evidence, clinicians’ values and attitudes, willingness to change practice, and time constraints [4,5]. Such discrepancies are concerning given that variability in acute patient care has been shown to adversely affect patient outcomes, and it results in ineffective use of health care resources [6].
Grimshaw et al. provided an overview of the effectiveness of professional behavior change strategies available to support quality improvement (QI) activities at a hospital level [7]. These strategies have been categorized by the Cochrane Effective Practice and Organization of Care Group [8] and include techniques such as educational outreach visits; audit and feedback; interprofessional collaboration; reminders; and tailored interventions. However, the most effective implementation strategies may be those that utilize a multicomponent approach [5,9]. The authors of a more recent review offered no compelling evidence on what is an effective strategy to change clinician behavior in acute care settings [10]. Therefore, understanding what components of a QI intervention are effective in changing clinicians’ behaviors to reflect best evidence is challenging.
Although many factors may influence the translation of evidence into practice, active facilitation in some form is often required to elicit change in behavior and, thus, practice [4]. Facilitation is simply a technique by which one person makes things easier for others by aligning the available evidence to the particular context [11]. This may be provided by either internal (i.e., inside the local hospital) and/or external (i.e., outside of the local hospital) facilitators [12]. Facilitation is also a key element of the integrated Promoting Action on Research Implementation in Health Services (PARiHS) framework, which has proved to be a useful practical and conceptual tool for many researchers and clinicians in quality improvement programs [11]. Facilitators are individuals with the specific roles, skills, and knowledge to help clinicians, teams, and organizations deliver evidence-based care by prompting change in their professional behavior [13]. They have a key role in helping individuals and teams understand what, and how, they need to change practice. Some key roles of external facilitators are the following: identify, engage, and connect stakeholders; facilitate collaboration, including the development of action plans; support communication and information sharing; provide updates on evidence-based strategies; and evaluate practice change efforts [14]. However, very little is known about the type or amount of external facilitation that is required to achieve changes in practice.
The Stroke123 study was a pragmatic, real-world before and after registry-based study designed to assess the effectiveness of financial incentives (implemented to improve access to stroke units) coupled with a multicomponent quality improvement (mQI) intervention [15]. Collectively, the financial incentives program and the mQI were the intervention tested in the Stroke123 study. The mQI, known as the Enhanced StrokeLink program, included audit and feedback processes, education, and action plan development through a single onsite workshop provided by an external facilitator, with ongoing remote support (telephone/email) after the workshop (see Online Supplement Figure S1 and Table S1). Overall, there was an 18% improvement in the change in median composite score (i.e., summarized in a single measure as the proportion of all needed care that was given) for adherence to ≤8 PoCs measured in the Australian Stroke Clinical Registry (AuSCR) across the study periods (95% CI, 12–24%). In secondary outcome analyses, the Enhanced StrokeLink program was associated with a non-statistically significant additive increase in the primary composite score (4%) and a 3% additive increase in the composite score limited to the PoCs included in action plans across study periods [15]. In Stroke123, not all hospitals developed action plans during the workshop. We were also unsure whether the additional support provided outside of the workshops increased the success of this mQI, and we sought to explore these issues further using data obtained as part of the process evaluation for Stroke123.
The objective of this study was to provide process evaluation information in relation to the external facilitator role as part of the mQI component of the Stroke123 intervention. Specifically, we sought to evaluate (i) the influence of external facilitation on whether or not action plans were used for improving the quality of acute stroke care as part of the mQI, and (ii) to assess the association between amount/type of external facilitation and adherence to the PoCs specified within the action plans before and after the mQI.

2. Materials and Methods

2.1. Study Design

This paper presents a process evaluation of the mQI intervention (the Enhanced StrokeLink program), which was delivered within a pre–post observational study with audit and feedback.

2.2. Population

Staff from public hospitals providing acute stroke care located in the state of Queensland, Australia, that collect data using the AuSCR [16] were eligible to participate in the mQI intervention (N = 23). The program was delivered between March 2014 and November 2014. Ethics approval was obtained from all participating hospitals, with the lead ethics committee in Queensland being the Metro South Human Research Ethics Committee (HREC/13/QPAH/31). This study was carried out in accordance with the Declaration of Helsinki. The AuSCR has ethics approval to use an opt-out method of consent for collection of patient clinical data.

2.3. QI Intervention Description

A detailed description of the overall Stroke123 study and data sources has been previously reported [15,16]. In a previous publication, the Alberta Context Tool was used to describe aspects of organizational context that affected the delivery of care in these hospitals prior to the workshops being delivered by the facilitator. This preliminary research revealed important insights about the role of context, including that culture and social capital were the main aspects of organizational context affecting the delivery of evidence-based care, rather than differences in perceptions of context based on clinician background or the location of the stroke service (e.g., metropolitan or rural location) [17].
The mQI workshop component was based on the Plan–Do–Study–Act (PDSA) method [18], where feedback was provided to clinicians on their hospital performance, and action plans were developed to improve adherence to clinical guideline recommendations. The workshop, conducted once at each hospital, was run by an external facilitator employed by the Stroke Foundation who had a clinical background and a range of skills, including quality improvement methodology [15,19]. As part of the audit and feedback processes, the AuSCR data were used to (1) identify practice gaps, (2) focus the development and implementation of action plans on areas the local staff at these hospitals wanted to work on, and (3) provide a reliable method to routinely monitor performance (i.e., adherence to acute stroke PoCs), since the PoC data are collected on consecutive admissions. At the face-to-face workshop (scheduled for ≈3 h duration), the external facilitators educated the participating hospital staff on the current best practice recommendations associated with the PoCs collected in AuSCR. They also highlighted local gaps in practice using the AuSCR data, identified barriers to adherence to these PoCs, and suggested customized, site-specific strategies to address the identified barriers and improve practice. Time permitting, the hospital staff then discussed and agreed on a focused action plan addressing the PoCs they had selected to improve [16]. Additional support was also provided by external facilitators from the Queensland State-wide Stroke Clinical Network (i.e., leads) and the AuSCR (i.e., project coordinator). Queensland State-wide Stroke Clinical Network clinical leads provided education on a range of topics, and they also promoted networking and shared learning with other stroke clinicians throughout the state.
AuSCR project coordinators assisted in the extraction and interpretation of the PoC data, at the time of performance gap identification, action plan design, and implementation. All external facilitators provided ongoing peer support when required, via telephone, face-to-face, and/or email after the initial workshop.

2.4. Data Collection

In order to capture the amount and mode of facilitator support for the Stroke123 process evaluation, a specifically designed data collection tool was developed to enable the prospective and standardized collection of activity data from the external facilitators. Items recorded included the frequency, duration, and mechanism of external facilitation episodes (Table 1). The tool included pre-specified categories to ensure standardized data capture.
The amount of external facilitation was determined by the frequency and duration of professional behavior change support provided to clinicians, the mode of support delivery, and the time spent delivering support (including the workshop). These parameters were used to assess the influence of external facilitation on the effect achieved from action plans in the current study.
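To make the measurement concrete, an episode log of this kind can be sketched as structured records aggregated per hospital. This is an illustrative sketch only; the field names and category values are hypothetical and do not reproduce the actual data collection tool.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class FacilitationEpisode:
    """One logged episode of external facilitator support (hypothetical schema)."""
    hospital_id: str
    mode: str          # e.g., "workshop", "face-to-face", "telephone", "email"
    activity: str      # e.g., "educational outreach", "audit review", "reminder"
    duration_min: int  # time spent delivering support, in minutes

def total_support_minutes(episodes):
    """Sum contact time per hospital across all modes (including the workshop)."""
    totals = defaultdict(int)
    for ep in episodes:
        totals[ep.hospital_id] += ep.duration_min
    return dict(totals)

# Hypothetical log entries; emails are costed at an assumed average of 15 min.
log = [
    FacilitationEpisode("H01", "workshop", "educational outreach", 180),
    FacilitationEpisode("H01", "telephone", "reminder", 20),
    FacilitationEpisode("H02", "email", "audit review", 15),
]
print(total_support_minutes(log))  # {'H01': 200, 'H02': 15}
```

Aggregating per hospital in this way yields the "amount of external facilitation" measure used in the analyses.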

2.5. Clinical Indicator Data to Assess Quality Improvement Effects

As part of developing and implementing action plans, there were eight acute stroke PoCs collected in AuSCR within Queensland available as options for quality improvement. These included: treatment in a stroke unit; for acute ischemic stroke, use of intravenous thrombolysis, aspirin within 48 h, and prescription of antiplatelet/other antithrombotic medication at discharge; early patient mobilization; use of a swallow screen/assessment before feeding; prescription of antihypertensive medication at discharge; and use of a discharge care plan if the patient was discharged to the community [15].
AuSCR data (from 2010 to 2015) were used as a proxy to assess the successful implementation of action plans by comparing changes in the mean composite score (see below) before and after the development of action plans as part of the mQI program.

2.6. Calculation of Composite Score

The composite score is calculated as the number of patients with the documented measure (i.e., acute stroke PoCs) divided by the number of patients eligible for the measure. Where applicable, eligibility criteria for a PoC are specified, and the decision rules around eligibility are the same as in the Stroke123 results paper [15]. For example, adherence to the administration of aspirin within 48 h is relevant only for patients presenting with ischemic stroke or transient ischemic attack. The percent adherence was calculated for each PoC for the participating hospitals. Details of how the composite score was calculated are provided in the Stroke123 results paper [15]. For this study, the absolute change in composite scores between the pre-intervention and post-intervention phases was calculated and stratified by whether hospitals developed an action plan and by the amount and mode of facilitator support.
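As an illustration, an opportunity-based composite score of this kind (the proportion of all needed care that was given) can be computed as follows. The patient encoding here is an assumption made for the sketch, not the AuSCR data schema.

```python
def composite_score(patients):
    """Composite score: PoCs received / PoCs eligible, pooled across patients.
    Each patient is a dict mapping PoC name -> True (received) or False
    (eligible but not received); ineligible PoCs are simply omitted."""
    eligible = sum(len(p) for p in patients)
    received = sum(sum(p.values()) for p in patients)
    return received / eligible

# Hypothetical pre/post cohorts; patient 2 is ineligible for aspirin,
# so that PoC is omitted from their record.
pre = [{"stroke_unit": True, "aspirin_48h": False},
       {"stroke_unit": False}]
post = [{"stroke_unit": True, "aspirin_48h": True},
        {"stroke_unit": True}]

absolute_change = composite_score(post) - composite_score(pre)
print(round(absolute_change, 3))  # 0.667
```

The stratified analysis then compares this absolute change between hospitals with and without action plans, and across levels of facilitator support.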

2.7. Data Analyses

Descriptive analyses were undertaken for the discrete and continuous data used for this study. Accurate durations for writing emails were infeasible to obtain, so we assumed that each email took an average of 15 min. Normality tests were conducted to check the distribution of each continuous variable. The Wilcoxon rank-sum test was conducted to explore the association between the development of an action plan or not and the amount of external facilitation provided for each hospital during the intervention period. Means and standard deviations were calculated to derive Hedges’ g, an effect size indicating the standardized difference between two means that is suitable when sample sizes are <20. Hedges’ g statistics of 0.20 to 0.49 are considered small standardized effect sizes, 0.50 to 0.79 are considered medium standardized effect sizes, and 0.80 or greater is considered a large standardized effect size [20].
A p-value of <0.05 was considered statistically significant. Data were analyzed using Stata (Stata Statistical Software, Release 12.0; Stata Corporation, College Station, TX, USA).
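For reference, Hedges’ g is Cohen’s d multiplied by a small-sample correction factor. The following is a minimal sketch of the standard formula, not code from the study; the contact-time values are hypothetical.

```python
import math

def hedges_g(sample_a, sample_b):
    """Hedges' g: standardized mean difference with a small-sample
    correction, recommended when group sizes are below ~20."""
    n1, n2 = len(sample_a), len(sample_b)
    m1, m2 = sum(sample_a) / n1, sum(sample_b) / n2
    # Sample variances (denominator n - 1)
    v1 = sum((x - m1) ** 2 for x in sample_a) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in sample_b) / (n2 - 1)
    # Pooled standard deviation across both groups
    s_pooled = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled         # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)  # small-sample correction factor
    return j * d

# Hypothetical facilitator contact time (hours) for two groups of hospitals.
g = hedges_g([12, 15, 18, 20, 25], [10, 11, 13, 14, 12])
print(round(g, 2))  # 1.47
```

Against the thresholds above, |g| of 0.20–0.49 would be read as a small, 0.50–0.79 as a medium, and ≥0.80 as a large standardized effect size.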

3. Results

Staff from 19 of 23 eligible Queensland hospitals agreed to participate in the mQI intervention (urban hospitals: 55%; rural hospitals: 45%; 16% were teaching hospitals). During the intervention delivery phase (March 2014 to November 2014), nine different external facilitators provided support to the 19 hospitals. The most frequent activities provided by the facilitators were educational outreach (42%), which was followed by interprofessional collaboration (30%), review of audit data (11%), and reminders (6%) (Figure 1).
During the workshops, 14/19 (74%) hospitals developed action plans. Five out of 19 hospitals did not develop an action plan following the workshop: the reasons included already having developed a QI plan and local staff changes. Among the 14 hospitals that developed an action plan, most hospitals selected eight processes of care to work on. Throughout the intervention, 18/19 (95%) hospitals accessed their AuSCR data for performance monitoring (overall number of times AuSCR online data reports were accessed: median (Q1, Q3): 12 (7, 18)).
The number of times external facilitators provided support and the mode of contact during the intervention period is shown in Table 2. There was a clinically relevant, but not statistically significant, greater amount of external facilitator support in hospitals that developed an action plan compared to hospitals that did not. This pattern held overall as well as within subgroups of mode of contact (Table 2, p = 0.308). Based on Hedges’ g, the effect size of external facilitation provided face-to-face only was medium; by telephone, small; and for total contact time across all modes, medium.
In 9 of the 14 hospitals that developed an action plan, adherence to the PoCs, as measured by mean composite scores, improved post-intervention (Figure 2 and Figure 3). Moreover, 4 of the 5 hospitals that did not develop an action plan also showed an increase in adherence to these PoCs as determined by the mean composite scores (Figure 2).
There was no relationship between the amount of external facilitation and the absolute change in composite scores (Figure 4 and Figure 5). However, it appears that in some hospitals the support provided by external facilitators may have had a positive impact on adherence to the acute stroke processes of care.

4. Discussion

This is one of the few studies in which the influence of external facilitators has been assessed for stroke based on the PDSA model. In this process evaluation study, we had a specific focus on the mQI component of the Stroke123 intervention (financial incentives plus mQI) that was previously assessed as part of a controlled before and after study and found to be highly effective [15]. In the current study, we found that facilitators provided, on average, seven hours more contact time to staff at hospitals that developed an action plan. We did not find a statistically significant association between the amount or mode of external facilitation and the development of action plans or their success in improving process of care adherence. All hospitals improved their performance relative to historical performance, and those with action plans appeared to do better with the receipt of more facilitator time. Perhaps external facilitation should not be viewed as an individual component but rather as one of many integral components of the PDSA model and our PARiHS framework-based mQI intervention that together contributed to improving adherence to these acute stroke PoCs, as demonstrated in the Stroke123 study [15]. Since access to financial incentives as part of the Stroke123 intervention was available to all hospitals, it is unlikely to have differentially affected our assessment of the association between external facilitation and action planning presented in this study.
Few mQI studies include process evaluations with details of the amount and mode of facilitation provided. It is important to evaluate the processes of a study so that other researchers can reliably implement or build on the current intervention. Based on the Template for Intervention Description and Replication (TIDieR) checklist, some of the key features to report include duration, dose or intensity, and mode of contact, as these can influence the effectiveness and potential replicability of the program [21]. All hospitals received a single face-to-face workshop with standardized components. However, the final element, the action plan, was not always delivered due to time constraints. This provided an opportunity to assess whether action plans make a difference in addition to the amount and mode of support provided by external facilitators within the context of acute hospital care settings. The results, based on Hedges’ g, suggest that although support from external facilitators may be useful, much is unknown about the ideal amount or mode of facilitation required to support change. Since all hospitals received a workshop, it appears that additional remote support may not add value for this type of mQI.
To some extent, our results may reflect reverse causation whereby these correlations may have been due to the following: (1) not targeting an achievable or manageable number of processes of care may have led to inefficient efforts and greater facilitator time; and (2) hospitals with significant barriers to improvement, or less effective internal facilitation, may have had a greater need for external assistance whilst having a reduced ability to impact change. It is possible that external facilitation is necessary, but the amount of external facilitation provided may depend on the number of issues needed to be addressed, staff availabilities, organizational context, and resources available per hospital [22]. Increasing amounts of external facilitation may not necessarily result in improved outcomes. It is also important to note the complexity of strategies chosen for improvement. Some changes take longer to implement than others and, in some circumstances, although a specific strategy may be implemented, it might take a longer period to see change in adherence as measured by the AuSCR data.
Several different modes of delivery were provided by different external facilitators. The presence of an external facilitator who provides support (face-to-face and by telephone) may ensure that action plans are developed and their implementation is followed through to support change in practice. An evaluation of the workshops and facilitation within the ‘Get With The Guidelines Stroke’ program showed improved adherence to all primary performance measures, i.e., improved quality of care [23].
In this study, we noted that throughout the mQI, staff from most hospitals accessed the AuSCR data consistently for performance monitoring. This observation indicates that data from the AuSCR are important and can be used by staff to assess their hospital’s performance and make relevant changes where necessary as part of their routine QI practice. A limitation is that we were unable to directly quantify access to the data reports available for hospital staff to download at any time from the AuSCR and include this information in our analysis.
The strengths of this study are the number and variety of hospitals that participated (e.g., teaching, urban, and rural hospitals), offering a broad cross-section of hospitals that provide stroke care similar to other parts of Australia or other countries. Additionally, a standardized data collection tool was used to collect data from all external facilitators. All external facilitators had a nursing or allied health background and completed an extensive standardized training program designed and delivered by the Stroke Foundation in order to prepare them for their role [15]. Limitations of this study include that five hospitals were unable to develop an action plan as part of their face-to-face workshop. Evidence from complementary research suggests that future approaches to PDSA model-based studies and action planning may need to be undertaken in more than one workshop or stage. For example, a two-step process may be beneficial, whereby the external facilitators initially educate and identify the local barriers and enablers at the hospital, followed by a second meeting that is completely focused on action planning [24]. Another limitation is that the contact data were self-reported and only reflected the facilitators’ perspectives; they did not include those of the hospital staff receiving the support. In addition, no information on internal facilitation at a hospital level, if any, was available. Although the external facilitators provided data on a fortnightly basis most of the time, on some occasions there was a two-month delay; hence, there may have been potential for recall bias. Nevertheless, the external facilitators kept personal diaries and email records, which provide objective data to ensure that any recall bias was minimized. In-depth data on the content of the professional behavior change support provided were not available.
Our research presented here offers a case study based on the context of stroke, which may or may not be applicable to other settings. Nevertheless, our findings highlight important insights into how much facilitator time may be required to assist clinicians, teams, or organizations to address variation in evidence-based care by prompting change in their professional behavior based on Plan–Do–Study–Act methods relevant to their local hospital setting. Our study provides useful data to guide the design of future research, especially with respect to the investment in external facilitators. In future studies, there is a need to evaluate the effectiveness of external facilitation from the perspective of those receiving the support (i.e., clinicians and hospital staff) to directly assess the impact on adherence to care processes and patient outcomes. With a sample size of 19 hospitals, our study was potentially underpowered to detect statistically significant differences for the primary aim of this study. Our results nevertheless provide much food for thought and help to advance the field.

5. Conclusions

We were unable to demonstrate a significant relationship between the amount or mode of contact of external facilitation and the development of action plans or change in adherence to acute stroke processes of care. External facilitation cannot be viewed as an individual component but may be an integral component of a complex mQI intervention. Further work is required to understand the optimal amount and mode of contact of external facilitation to enable efficient design of mQI interventions. Additional remote support may not add value to this type of mQI.

Supplementary Materials

The following are available online at https://www.mdpi.com/article/10.3390/healthcare9091095/s1. Figure S1: Overview of the Enhanced StrokeLink Program. AuSCR: Australian Stroke Clinical Registry; QSSCN: Queensland State-wide Stroke Clinical Network. Pre-intervention: control with baseline audit and feedback via StrokeLink (January 2010 to June 2012) and addition of financial incentives (July 2012 to March 2014); Intervention: financial incentives and addition of the Enhanced StrokeLink program (March 2014 to November 2014); Post-intervention period (November 2014 to December 2015), as outlined in Figure S1. Table S1: Example Enhanced StrokeLink Action Plan.

Author Contributions

D.A.C. conceived the study design and developed the study protocol. T.T. collected external facilitation data. T.T. and N.E.A. were involved with manuscript development and data analysis, with input from D.A.C. R.G., E.S.-S., B.G., K.H., G.C., T.P., S.M. and M.F.K. provided input into manuscript revisions, and all authors read and approved the final version. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the National Health and Medical Research Council (NHMRC ID: 1034415) in Australia. TT was supported by the Australian Government Research Training Program Scholarship. DAC was supported by a fellowship from the National Health and Medical Research Council (1063761 co-funded by Heart Foundation). NEA and MFK were supported by a NHMRC Early Career Fellowship (1072053 and 1109426, respectively). The AuSCR also received restricted education grants from Allergan Australia, Ipsen, and Boehringer Ingelheim, as well as several donations from consumers.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Metro South Human Research Ethics Committee (HREC/13/QPAH/31).

Informed Consent Statement

The AuSCR has ethics approval to use an opt-out method of consent for collection of patient clinical data.

Data Availability Statement

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Acknowledgments

We acknowledge the assistance of Megan Reyneke who was involved with the development of the data collection tool. We also thank Libby Dunstan for her input to the earlier version of the manuscript. We thank the support of external facilitators from the various programs who provided data to use for this case study and other members of the Stroke123 working group.

Conflicts of Interest

The authors declare the following potential conflicts of interest with respect to the research, authorship, and publication of this article: D.A.C. is the current Data Custodian for the Australian Stroke Clinical Registry (AuSCR). R.G. is the clinical lead for the Queensland State-wide Stroke Clinical Network and a member of the Stroke Foundation Clinical Council. K.H. manages the Stroke Foundation’s National Stroke Audit and Stroke Clinical Guidelines programs. S.M., D.A.C., G.C., K.H. and R.G. are members of the AuSCR Steering or Management committees. S.M. was formerly a member of the NHMRC Research Committee, appointed after this grant was awarded. The other authors report no conflicts.

References

  1. Vestergaard, A.S.; Ehlers, L.H. A Health Economic Evaluation of Stroke Prevention in Atrial Fibrillation: Guideline Adherence versus the Observed Treatment Strategy Prior to 2012 in Denmark. Pharmacoeconomics 2015, 33, 967–979. [Google Scholar] [CrossRef]
  2. Brien, E.C.; Wu, J.; Zhao, X.; Schulte, P.J.; Fonarow, G.C.; Hernandez, A.F.; Schwamm, L.H.; Peterson, E.D.; Bhatt, D.L.; Smith, E.E.; et al. Healthcare Resource Availability, Quality of Care, and Acute Ischemic Stroke Outcomes. J. Am. Heart Assoc. 2017, 6, e003813. [Google Scholar] [CrossRef] [Green Version]
  3. Cadilhac, D.A.; Lannin, N.A.; Anderson, C.S.; Kim, J.; Andrew, N.; Kilkenny, M.; Shehata, S.; Grabsch, B.; Levi, C.; Faux, S.; et al. The Australian Stroke Clinical Registry Annual Report 2015; The Florey Institute of Neuroscience and Mental Health, 2016; Report No 7, pages 42. [Google Scholar]
  4. Grimshaw, J.M.; Eccles, M.P.; Walker, A.E.; Thomas, R.E. Changing physicians’ behavior: What works and thoughts on getting more things to work. J. Contin. Educ. Health Prof. 2002, 22, 237–243. [Google Scholar] [CrossRef]
  5. Forsner, T.; Hansson, J.; Brommels, M.; Wistedt, A.Å.; Forsell, Y. Implementing clinical guidelines in psychiatry: A qualitative study of perceived facilitators and barriers. BMC Psychiatry 2010, 10, 1–10. [Google Scholar] [CrossRef] [Green Version]
  6. Cadilhac, D.A.; Carter, R.C.; Thrift, A.G.; Dewey, H.M. Why invest in a national public health program for stroke?: An example using Australian data to estimate the potential benefits and cost implications. Health Policy 2007, 83, 287–294. [Google Scholar] [CrossRef]
  7. Grimshaw, J.M.; Eccles, M.P.; Lavis, J.N.; Hill, S.J.; Squires, J.E. Knowledge translation of research findings. Implement. Sci. 2012, 7, 1–17. [Google Scholar] [CrossRef]
  8. Cochrane Effective Practice and Organization of Care Group. Data Collection Checklist. 2002. Available online: http://www.epoc.cochrane.org/ (accessed on 25 April 2016).
  9. Grimshaw, J.; Eccles, M.; Tetroe, J. Implementing clinical guidelines: Current evidence and future implications. J. Contin. Educ. Health Prof. 2004, 24, S31–S37. [Google Scholar] [CrossRef] [PubMed]
  10. Squires, J.E.; Sullivan, K.; Eccles, M.P.; Worswick, J.; Grimshaw, J.M. Are multifaceted interventions more effective than single-component interventions in changing health-care professionals’ behaviours? An overview of systematic reviews. Implement. Sci. 2014, 9, 152. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  11. Kitson, A.L.; Rycroft-Malone, J.; Harvey, G.; McCormack, B.; Seers, K.; Titchen, A. Evaluating the successful implementation of evidence into practice using the PARiHS framework: Theoretical and practical challenges. Implement. Sci. 2008, 3, 1. [Google Scholar] [CrossRef]
  12. Gozdzik, A. Applying the PARiHS framework in a knowledge dissemination initiative. CANNT J. 2013, 3, 48–50. [Google Scholar]
  13. Stetler, C.B.; Legro, M.W.; Rycroft-Malone, J.; Bowman, C.; Curran, G.; Guihan, M.; Hagedorn, H.; Pineros, S.; Wallace, C.M. Role of “external facilitation” in implementation of research findings: A qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implement. Sci. 2006, 1, 1–15. [Google Scholar] [CrossRef] [PubMed]
  14. Bornbaum, C.C.; Kornas, K.; Peirson, L.; Rosella, L.C. Exploring the function and effectiveness of knowledge brokers as facilitators of knowledge translation in health-related settings: A systematic review and thematic analysis. Implement. Sci. 2015, 10, 162. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  15. Cadilhac, D.A.; Grimley, R.; Kilkenny, M.F.; Andrew, N.E.; Lannin, N.A.; Hill, K.; Grabsch, B.; Levi, C.R.; Thrift, A.G.; Faux, S.G.; et al. Multicenter, Prospective, Controlled, Before-and-After, Quality Improvement Study (Stroke123) of Acute Stroke Care. Stroke 2019, 50, 1525–1530. [Google Scholar] [CrossRef] [PubMed]
  16. Cadilhac, D.A.; Andrew, N.E.; Kilkenny, M.F.; Hill, K.; Grabsch, B.; Lannin, N.A.; Thrift, A.G.; Anderson, C.S.; Donnan, G.A.; Middleton, S.; et al. Improving quality and outcomes of stroke care in hospitals: Protocol and statistical analysis plan for the Stroke123 implementation study. Int. J. Stroke 2018, 13, 96–106. [Google Scholar] [CrossRef] [PubMed]
  17. Andrew, N.E.; Middleton, S.; Grimley, R.; Anderson, C.S.; Donnan, G.A.; Lannin, N.A.; Stroil-Salama, E.; Grabsch, B.; Kilkenny, M.F.; Squires, J.E.; et al. Hospital organizational context and delivery of evidence-based stroke care: A cross-sectional study. Implement. Sci. 2019, 14, 6. [Google Scholar] [CrossRef] [Green Version]
  18. Speroff, T.; O’Connor, G.T. Study Designs for PDSA Quality Improvement Research. Qual. Manag. Healthc. 2004, 13, 17–32. [Google Scholar] [CrossRef] [Green Version]
  19. Donnelly, P.; Kirk, P. Use the PDSA model for effective change management. Educ. Prim. Care 2015, 26, 279–281. [Google Scholar] [CrossRef]
  20. Cohen, J. Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; Academic Press: New York, NY, USA, 1988. [Google Scholar]
  21. Hoffmann, T.C.; Glasziou, P.P.; Boutron, I.; Milne, R.; Perera, R.; Moher, D.; Altman, D.G.; Barbour, V.; Macdonald, H.; Johnston, M.; et al. Better reporting of interventions: Template for intervention description and replication (TIDieR) checklist and guide. Br. Med. J. 2014, 348, g1687. [Google Scholar] [CrossRef] [Green Version]
  22. Bidassie, B.; Williams, L.S.; Woodward-Hagg, H.; Matthias, M.S.; Damush, T.M. Key components of external facilitation in an acute stroke quality improvement collaborative in the Veterans Health Administration. Implement. Sci. 2015, 10, 1–9. [Google Scholar] [CrossRef] [Green Version]
  23. Schwamm, L.H.; Fonarow, G.C.; Reeves, M.J.; Pan, W.; Frankel, M.R.; Smith, E.E.; Ellrodt, G.; Cannon, C.P.; Liang, L.; Peterson, E.; et al. Get With the Guidelines–Stroke Is Associated With Sustained Improvement in Care for Patients Hospitalized With Acute Stroke or Transient Ischemic Attack. Circulation 2009, 119, 107–115. [Google Scholar] [CrossRef] [Green Version]
  24. Cadilhac, D.A.; Andrew, N.E.; Stroil Salama, E.; Hill, K.; Middleton, S.; Horton, E.; Meade, I.; Kuhle, S.; Nelson, M.R.; Grimley, R.; et al. Improving discharge care: The potential of a new organisational intervention to improve discharge after hospitalisation for acute stroke, a controlled before–after pilot study. BMJ Open 2017, 7, e016010. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Proportion of professional behavior change interventions (categorized according to the Cochrane Effective Practice and Organization of Care Group [8]) provided by the external facilitators.
Figure 2. Change in mean composite score from ≤8 acute stroke processes of care collected in all QLD hospitals participating in the multicomponent quality improvement program (Enhanced StrokeLink). The support provided by external facilitators to each hospital during the Enhanced StrokeLink period is shown in hours (h). AuSCR data were missing for one hospital.
Figure 3. Change in composite score from acute stroke processes of care nominated by hospitals in their action plans (includes only hospitals that developed an action plan, and only the processes of care nominated in those plans).
Figure 4. Absolute change in composite score from ≤8 acute stroke processes of care collected in all QLD hospitals vs. hours of support provided by external facilitators to each hospital during the multicomponent quality improvement program (Enhanced StrokeLink). AuSCR data were missing for one hospital.
Figure 5. Absolute change in composite score from acute stroke processes of care nominated by hospitals in their action plans vs. hours of support provided by external facilitators to each hospital during the multicomponent quality improvement program (Enhanced StrokeLink).
Table 1. Information collected from external facilitators.
Type of Information
Number of contacts the external facilitators had with the relevant clinicians and hospital staff
Mode of contact (e.g., telephone, face-to-face, email)
Contact time in minutes for telephone and face-to-face contacts
Professional behavior change support type provided (e.g., reminders, educational outreach) *
Action plan initiated by staff at participating hospitals (Yes/No)
Hospitals accessing their AuSCR data by downloading online live reports (Yes/No)
* Defined according to the classifications of the Cochrane Effective Practice and Organization of Care Group [8].
Table 2. Median and mean hours of external facilitator support and development of action plans.
| Mode of Contact | Measure | All Hospitals (N = 19) | Hospitals That Developed an Action Plan (N = 14) | Hospitals That Did Not Develop an Action Plan (N = 5) | p Value | Hedges’ g |
|---|---|---|---|---|---|---|
| Face-to-face | Median (Q1, Q3) | 20 (10, 24) | 21 (10, 24) | 10 (10, 12) | 0.243 | N/A |
| Face-to-face | Mean (±SD) | 19 (11) | 20 (11) | 14 (10) | 0.300 | 0.557 |
| Telephone | Median (Q1, Q3) | 5 (3, 7) | 5 (3, 7) | 3 (3, 6) | 0.676 | N/A |
| Telephone | Mean (±SD) | 5 (2) | 5 (2) | 4 (3) | 0.411 | 0.440 |
| Email | Median (Q1, Q3) | 6 (4, 9) | 7 (4, 9) | 5 (5, 8) | 0.817 | N/A |
| Email | Mean (±SD) | 7 (4) | 7 (4) | 7 (3) | 0.859 | 0 |
| Total | Median (Q1, Q3) | 22 (15, 29) | 29 (24, 38) | 20 (16, 31) | 0.308 | N/A |
| Total | Mean (±SD) | 30 (14) | 32 (15) | 25 (10) | 0.350 | 0.501 |
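The effect sizes in Table 2 can be reproduced as the difference in group means divided by the pooled standard deviation. A minimal Python sketch, using the rounded group summaries reported in the table; the uncorrected (pooled-SD) formulation appears to match the reported values, so the small-sample (Hedges) correction factor is deliberately omitted here:

```python
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference: (m1 - m2) / pooled SD.

    The small-sample correction factor is not applied, which is
    consistent with the values reported in Table 2.
    """
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Face-to-face contact hours: action-plan hospitals (n = 14, mean 20, SD 11)
# vs. hospitals without an action plan (n = 5, mean 14, SD 10)
print(round(hedges_g(20, 11, 14, 14, 10, 5), 3))  # 0.557
```

Applying the same function to the telephone and total rows (means 5 vs. 4 and 32 vs. 25) reproduces the reported 0.440 and 0.501, respectively.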
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Cite as: Thayabaranathan, T.; Andrew, N.E.; Grimley, R.; Stroil-Salama, E.; Grabsch, B.; Hill, K.; Cadigan, G.; Purvis, T.; Middleton, S.; Kilkenny, M.F.; et al. Understanding the Role of External Facilitation to Drive Quality Improvement for Stroke Care in Hospitals. Healthcare 2021, 9, 1095. https://doi.org/10.3390/healthcare9091095

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
