Article

Framework for a Modular Emergency Departments Registry: A Case Study of the Tasmanian Emergency Care Outcomes Registry (TECOR)

1 Royal Hobart Hospital, Tasmanian Health Service, Hobart 7001, Australia
2 Tasmanian School of Medicine, University of Tasmania, Hobart 7001, Australia
3 Tasmanian Emergency Medicine Research Institute, Hobart 7001, Australia
4 Menzies Institute for Medical Research, University of Tasmania, Hobart 7001, Australia
* Author to whom correspondence should be addressed.
Hospitals 2025, 2(3), 18; https://doi.org/10.3390/hospitals2030018
Submission received: 3 July 2025 / Revised: 21 July 2025 / Accepted: 22 July 2025 / Published: 23 July 2025

Abstract

Background: The emergency department (ED) often represents the entry point to care for patients who require urgent medical attention or have no alternative for medical treatment. This has implications for scope of practice and how quality of care is measured. A diverse array of methodologies has been developed to evaluate the quality of clinical care, broadly including quality improvement (QI), quality assurance (QA), observational research (OR) and clinical quality registries (CQRs). Considering the overlap between QI, QA, OR and CQRs, we conceptualized a modular framework for TECOR to effectively and efficiently streamline clinical quality evaluations. Streamlining is both appropriate and justified as it reduces redundancy, enhances clarity and optimizes resource utilization, thereby allowing clinicians to focus on delivering high-quality patient care without being overwhelmed by excessive data and procedural complexities. The objective of this study is to describe the process for designing a modular framework for ED CQRs using TECOR as a case study. Methods: We performed a scoping audit of all quality projects performed in our ED over a 1-year period (1 January 2021 to 31 December 2021), as well as data mapping and categorical formulation of key themes from the TECOR dataset with clinical data sources. Both processes then informed the design of TECOR. Results: The audit identified 29 quality projects, comprising 12 QI projects, 5 CQRs and 12 OR projects. Data mapping identified that clinical information was fragmented across 11 distinct data sources. Through thematic analysis during data mapping, we identified three extraction techniques: self-extractable, manual entry and on request.
Conclusions: The modular framework for TECOR aims to provide an efficient, streamlined approach that caters to all aspects of clinical quality evaluation, enabling a higher throughput of clinician-led quality evaluations and improvements. TECOR is also an essential component in the development of a learning health system to drive evidence-based practice and will be the subject of future research.

1. Introduction

The emergency department (ED) often represents the entry point to care when patients feel that they require urgent medical attention [1]. Patients frequently present with undifferentiated conditions, ranging from those who are generally well but experiencing minor ailments with no appropriate alternative care services, to individuals who are critically ill and require immediate medical intervention [2]. Whereas the role of many specialties is to manage a specific scope of disease and symptomatology, the immediate role of the ED is to determine the criticality of patients, provide resuscitation and stabilization and redirect care when needed. Furthermore, a diagnosis is not always apparent or required [3]. Although diagnosis can help guide patient care, it may not be known with immediacy given delays in biochemical and/or radiological reporting, or if the disease process has not met the threshold for a clinical diagnosis (such as very early appendicitis) [4]. It is therefore not uncommon for patients to leave the ED, either to be admitted or discharged, without a clear diagnosis. This adds a layer of complexity to measuring quality of care in the ED [2].
The characteristics of this often undifferentiated patient population, combined with the continuous availability of care regardless of time or day, have significant implications for staffing, the scope of practice and how quality of care is defined for the ED [5]. The unscheduled nature of presentations, combined with an increasingly under-resourced primary care and preventative health sector, has also contributed to rising patient volumes with EDs managing significantly more patients than any other hospital unit, amplifying the impact of any deviation from quality care [6].

1.1. Methods for Evaluating Clinical Quality

Quality of care is often based on the principle that consumers receive the care they need and comprises two components: effectiveness and access [7]. At a managerial level, the focus is often on access, whereas at the clinical level, the focus is on effectiveness. Clinical quality is therefore defined as the extent to which care delivers its intended outcome or results in a desired process, in response to need [7]. A range of methodologies has been developed to evaluate the quality of healthcare delivery, identify deficiencies in clinical practice and inform targeted interventions at both local and system-wide levels [5,8,9]. These approaches encompass quality improvement (QI), quality assurance (QA), observational research (OR) and clinical quality registries (CQRs), each contributing distinct yet complementary perspectives on care quality [10,11,12,13,14]. While these tools differ in their primary objectives—ranging from iterative process enhancement (QI) and compliance monitoring (QA) to empirical investigation (OR) and longitudinal outcome tracking (CQRs)—they share a foundational reliance on systematically collected data to guide analysis and decision-making. Notably, there is considerable conceptual and operational overlap among these modalities, underscoring the integrative nature of data-driven quality strategies in contemporary healthcare systems (Table 1) [15].
Quality improvement in healthcare refers to the systematic, data-driven efforts to enhance patient outcomes, optimize clinical processes and reduce variability in care delivery [16]. Grounded in established methodologies such as Plan-Do-Study-Act cycles and Lean Six Sigma, QI initiatives aim to identify inefficiencies, implement evidence-based interventions and monitor performance metrics to ensure sustained progress [16,17,18]. These efforts are typically time-sensitive with priority areas set by the organization [18].
In contrast, quality assurance encompasses the formal processes and standards designed to ensure that clinical services consistently meet predefined benchmarks of safety, efficacy and regulatory compliance [19]. Unlike QI, which focuses on iterative change and innovation, QA emphasizes adherence to established protocols, systematic audits and performance evaluations to maintain reliability and accountability across care delivery systems [19]. QA frameworks often involve accreditation bodies, peer review mechanisms and standardized reporting tools to detect deviations, enforce corrective actions and uphold institutional integrity [19]. In emergency medicine, QA is essential for safeguarding patient safety in high-pressure environments where consistency is critical and quality evaluations are under-resourced [20].
Observational research in healthcare involves the systematic assessment of clinical phenomena without experimental manipulation, allowing investigators to examine real-world practices, outcomes and associations within naturalistic settings [21,22,23]. This methodology is particularly valuable for studying populations where randomized controlled trials may be impractical or unethical [21]. Observational research designs such as cohort, case–control and cross-sectional studies enable the evaluation of exposures and interventions across diverse clinical environments [24]. In emergency medicine, observational research provides critical insights into patient flow, diagnostic accuracy and treatment variability, informing both QI and policy development whilst preserving ecological validity [2,3,24,25,26].
Clinical quality registries are structured data systems designed to systematically collect, analyze and report on health outcomes and care processes for defined patient populations [9,27,28]. By capturing real-world data across institutions and care settings, CQRs enable benchmarking, identify unwarranted variations in practice and support continuous QI and health system accountability [2,3]. These registries often integrate risk-adjusted outcome measures and facilitate feedback loops to clinicians and administrators, promoting evidence-informed decision-making and targeted interventions [29,30,31]. For many disease-specific or specialty-specific CQRs, quality measures are distilled from clinical guidelines created through both consensus and evidence [30,32,33,34,35]. As such, for most CQRs, an agreed dataset is defined at inception and remains largely unaltered unless significant changes are made to the quality measures through amendments to guidelines or consensus statements.
The most salient distinction among healthcare quality evaluation methods lies in the trade-off between methodological limitations and the effort required for implementation (Figure 1) [15]. For QI, where tighter evaluation timeframes exist, more limitations are expected. Rather than simply accepting these limitations, it is important to identify a minimum threshold for meaningful evaluation to ensure that even though less effort may be involved, the intentions are not wasted through uninterpretable findings [12,13]. QA has standards set by an external body and therefore will require more effort to ensure that evaluations meet these standards [19]. Finally, given the goal of generating generalizable knowledge with a greater influence and impact, OR methodology aims to measure, adjust and minimize bias, and therefore the effort required can be significantly more than for QI and QA [15].

1.2. ED CQRs

Our scoping review on ED clinical quality registries is the only review reported in the literature and revealed a wide range of objectives and scope [2]. We identified 27 ED CQRs, with 6 having a general scope and 21 that were condition- or population-specific. The reported registry periods ranged from 31 days to 4018 days (median 365 days, IQR 181–1309 days) with six registries still ongoing. None of the ED CQRs used an adaptive or modular approach within their respective frameworks.
The Tasmanian Emergency Care Outcomes Registry (TECOR) was created as a quality tool to enhance the provision of care by all four EDs in Tasmania, Australia. The study protocol for TECOR has previously been published [3]. The study protocol includes discussion around the reporting of key data points such as mortality and readmissions. Considering the extensive scope of ED care, the diverse population it serves and the overlap between QI, QA, OR and CQRs, we reimagined the concept of CQRs and adopted a modular design to effectively and efficiently streamline these processes.

1.3. Objectives

The objective of this study is to describe the process for designing a modular framework for ED CQRs using TECOR as a case study.

2. Materials and Methods

To understand the best framework for an ED CQR, we performed a scoping audit of all quality projects performed in the ED, as well as data mapping and categorization of key themes from the TECOR dataset against hospital databases. Our ED is part of a single health service operating four emergency departments across the state of Tasmania, Australia [3].

2.1. Audit of Quality Assurance, Quality Improvement and Observational Research in Tasmanian EDs

To understand the range of quality projects performed by our ED, we accessed our annual internal report. Only high-quality projects are recorded in this report. To ensure we captured the breadth of quality projects, we supplemented this audit with a search of emails, newsletters and conference abstracts for additional quality initiatives from our ED over a 1-year period (1 January 2021 to 31 December 2021). Duplicate projects were excluded, as were projects led by other departments. For context, the annual census for our mixed ED is approximately 70,000 presentations. Based on the Modified Monash (MM) Model, we are classified as a regional service (MM 2) [36]. These quality projects were categorized as QI, OR or CQRs based on their methodological approach. We did not include QA in our quality audit as QA data collection and reporting is performed independently of the ED.

2.2. Methods for Data Mapping

The proposed dataset for TECOR, published in the protocol paper, was mapped to hospital data sources [3]. It is important to note that the digital environment from which TECOR collates data is based on aging IT systems: routinely collected administrative data and ED notes are captured in electronic form, but all other clinical notes, patient observations and prescriptions are handwritten and scanned as images into a unified electronic record. We used Research Electronic Data Capture (REDCap) as a cost-effective system for TECOR [37,38]. The financial benefits and wide adoption of REDCap were also considered during this selection process [37].
A conceptual design mapping the TECOR datapoints to known hospital data systems provides guidance on where to find data through a three-step mapping approach. In step one, we mapped each of the TECOR datapoints to the data system(s) in which it is held (traditionally known as the source of truth). In step two, one reviewer inductively developed a coding framework (first-order codes) for extractability based on how the data may be procured from the source of truth. The coding framework was then verified by another reviewer, with discrepancies resolved through discussion. A third author reconciled any remaining differences. In step three, the TECOR datapoints and their sources of truth were each designated a category of extractability using the first-order codes.
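The three steps above can be sketched in code. The datapoint names, system names and category assignments below are illustrative placeholders, not the actual TECOR dataset; the sketch only shows the shape of the mapping artefact the process produces.

```python
from collections import defaultdict

# Step 1 (illustrative): each datapoint mapped to its source-of-truth system(s).
SOURCE_OF_TRUTH = {
    "arrival_time": ["ED Dashboard"],
    "triage_category": ["ED Dashboard"],
    "discharge_diagnosis": ["Digital Medical Records"],
    "pathology_result": ["Pathology Clinical Information System"],
}

# Step 2: the agreed first-order codes for extractability.
EXTRACTABILITY_CODES = {"self-extractable", "manual entry", "on request"}

# Step 3 (illustrative): each datapoint designated a category of extractability.
EXTRACTABILITY = {
    "arrival_time": "self-extractable",
    "triage_category": "self-extractable",
    "discharge_diagnosis": "manual entry",
    "pathology_result": "on request",
}

def group_by_extractability(extractability):
    """Group datapoints by how they can be procured, to plan collection effort."""
    groups = defaultdict(list)
    for datapoint, code in extractability.items():
        if code not in EXTRACTABILITY_CODES:
            raise ValueError(f"unknown first-order code: {code}")
        groups[code].append(datapoint)
    return dict(groups)

groups = group_by_extractability(EXTRACTABILITY)
```

Grouping the mapped datapoints this way makes the downstream workload explicit: self-extractable items can be automated, manual-entry items need staffing, and on-request items need custodian lead time.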
Exemption from ethics review for the mapping process was provided by the Tasmanian Department of Health Low Risk Human Research Ethics Committee on 6 January 2025. TECOR is approved by the University of Tasmania Human Research Ethics Committee (HREA30260, 26 February 2024), is publicly registered with the Australian New Zealand Clinical Trials Registry (ACTRN12624000278538) and conforms with the Australian Commission on Safety and Quality in Health Care CQRs framework.

3. Results

3.1. Scoping Audit of Quality Initiatives

A range of quality initiatives was identified to understand the breadth of auditing occurring for clinical care in our ED (Table 2). During the audit period, 29 projects were identified: 12 QI projects, 5 CQRs and 12 OR projects. All projects required an initial data search to identify patient populations for potential inclusion, with a further manual filter using medical records for final inclusion and exclusion.

3.2. Data Mapping

Data mapping to the TECOR dataset is represented in Table 3. All data points were available and stored in 11 different systems. These included (1) Health Executive Analytics Reporting Tool (HEART): Emergency Department Dashboard and Admissions Dashboard (QlikTech Australia Pty Ltd., Sydney, Australia); (2) i.Patient Manager (Dedalus Australia Pty Ltd., Brisbane, Australia); (3) TrakCare (InterSystems Australia Pty Ltd., Sydney, Australia); (4) Communications Code Call Database (manual entry spreadsheet); (5) MedTasker (Nimblic Pty Ltd., Melbourne, Australia); (6) Digital Medical Records (InfoMedix Pty Ltd., Melbourne, Australia); (7) IntelliSpace Picture Archiving and Communication System (Philips Electronics Australia Ltd., Melbourne, Australia); (8) Pathology Clinical Information System (Kestral Pty Ltd., Heidelberg, Australia); (9) eReferrals (HealthCare Software Pty Ltd., Hobart, Australia); (10) Hospital Clinical Suite (HealthCare Software Pty Ltd., Hobart, Australia); and (11) Safety and Reporting Learning System (RLDatix Australia Pty Ltd., Melbourne, Australia).
Through the mapping process, as well as experientially through the quality project audit (Table 2), our research team categorized data extractability into three categories (Table 4): self-extractable, manual entry and on request. Self-extractable data was defined as data that one or more designated staff members employed by the ED could extract in a format that could be manipulated for importing into TECOR. Manual entry, on the other hand, was defined as the need to collect individual datapoints from the source of truth whilst using TECOR to support scaled administrative data. On request extraction was defined similarly to self-extractable in that the data is extracted in a format that can be manipulated for importing into TECOR; however, extraction cannot be performed by a staff member employed by the ED and a data request must instead be made to the data custodian.

4. Discussion

The ED environment, characterized by its undifferentiated and transient patient population, high staffing requirements, and the persistent challenge of overcrowding, complicates the implementation of a framework adhering to the traditional concept of CQRs [2]. The high-volume nature of the ED has influenced the practice of emergency medicine, where it is widely acknowledged that a higher degree of risk is accepted, and shared with the patient (shared decision-making), compared to non-generalist specialties [39]. Furthermore, the literature identifies several significant barriers to conducting quality evaluations, including confusion arising from disparate data sources and issues related to data extractability, complexities in navigating ethical and governance procedures, and ambiguity regarding whether projects fall under the jurisdiction of QI or OR governance structures [14,40]. Consequently, clinicians often experience intimidation or frustration with existing systems, particularly due to the steep learning curves associated with conducting quality evaluations [41]. To mitigate these challenges and take advantage of the unique qualities of EDs, we determined that an ED CQR should enable high-frequency interrogation of clinical quality whilst identifying and rationalizing the limitations and bias associated with this approach. This concept of balancing project completion with limitations is fundamental to differentiating the spectrum spanning QI to OR (Figure 1) [23]. Our comprehensive scoping audit of quality projects reinforced the observation that processes were consistently being replicated across each initiative. This repetition underscored the need for an optimized and efficient approach to enhance overall project effectiveness. As such, we designed TECOR principally to facilitate high-frequency QI through a modular design with the capability of performing OR (Figure 2).

4.1. Project Setup and the TECOR Protocol Repository

Fundamental to any quality evaluation is the protocol or plan that outlines the objectives and methods [42]. Most QI project protocols are not published, as their contextual limitations prevent reproducibility [16]. When these limitations are acknowledged, the publication of such protocols can still offer valuable contributions to the broader healthcare community by disseminating innovative approaches to QI and novel methodologies [43,44]. To uphold transparency and reproducibility, systematic reviews are expected to have their protocols published prior to the commencement of the search strategy [45]. We aim to replicate this process by developing a standardized study protocol template for projects conducted through TECOR. These protocols will then be archived in our TECOR Protocol Repository and made openly accessible [46,47].

4.2. Data Management and Security

From our audit of quality projects, it became evident that data management and security were inconsistent when they did not fall within the review process for research ethics or governance [48,49]. Through the implementation of TECOR, we therefore ensure that our data management practices, encompassing data collection, storage, and access procedures, consistently comply with ethical standards, governance policies, and legislative requirements. The TECOR framework also prompts researchers to consider and identify the need for additional ethical and governance approvals, initiating these processes when projects fall outside the scope of the original approvals.
Figure 2. The TECOR framework illustrating a modular design approach to reimagine clinical quality registries for EDs [43,50].

4.3. Data Feasibility

Emergency care data is foundational to all dimensions of evaluating clinical quality [2]. Our data mapping exercise reflects the broader systemic challenges of decentralized health data and interoperability [51,52,53]. The increasing adoption of large-scale electronic medical records and their integration into data warehouses or data lakes is expected to facilitate increasingly streamlined and efficient data integration and analysis in the future [54,55,56]. Given the substantial financial and logistical investments required, it is likely that many EDs will continue to experience the data fragmentation demonstrated in our study for the foreseeable future. Our data mapping framework therefore provides a reproducible, systematic checklist with which other EDs can map datapoints to their local data sources and determine whether the requisite data is self-extractable, manual entry, on request or not available.
One aspect of feasibility that our mapping exercise does not address is data linkage with other systems, specifically pre-hospital care. Incorporating ambulance and primary healthcare data would significantly enhance the comprehensiveness of the framework. As both these systems are governed outside our health service, this is an important area to explore once the internal aspects are functioning as core business. Furthermore, data that are not currently collected, such as patient-reported experience measures and staff wellbeing, are important aspects to consider advocating for and integrating once available.
Limitations and bias represent a fundamental point of difference between QI and OR (Figure 1). Constrained timeframes are common for QI projects, such that limitations may not be fully considered. The TECOR framework uses Kaji et al. as a cognitive aid to ensure a consistent evaluation of the limitations and bias within projects [57].

4.4. Quality Module Setup

Once data feasibility is acceptable, a quality module is created in TECOR to collect additional data as outlined in the quality protocol. This is based on the concept of supplemental data capture, which is designed to add new data elements or instruments to enrich or extend an original dataset without necessarily altering the study design [37,38]. The core component of TECOR is the administrative dataset of all patients presenting to the ED [3]. Each module adds a branch for patients who meet the inclusion criteria of its quality project study protocol, with additional datapoints to capture and analyze. This removes the need to repeatedly acquire and link the core component data for each quality improvement project.
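The branching described above can be illustrated with a minimal sketch: a hypothetical module enrols only those core-registry patients who meet its inclusion criteria and attaches its supplemental datapoints. All field, record and module names are invented for the example, and TECOR itself implements this pattern within REDCap rather than in standalone code.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class QualityModule:
    """A quality module branching off the core administrative dataset."""
    name: str
    inclusion: Callable[[dict], bool]  # predicate over a core record
    extra_fields: List[str]            # supplemental datapoints this module adds

    def enrol(self, core_records):
        """Return new module records for core patients meeting the inclusion criteria."""
        return [
            {**record, **{field: None for field in self.extra_fields}}
            for record in core_records
            if self.inclusion(record)
        ]

# Core administrative dataset: every ED presentation (simplified, invented data).
core = [
    {"id": 1, "triage": 2, "complaint": "chest pain"},
    {"id": 2, "triage": 4, "complaint": "ankle sprain"},
]

# Hypothetical module: only chest-pain presentations get the extra datapoints.
chest_pain = QualityModule(
    name="chest-pain pathway",
    inclusion=lambda r: r["complaint"] == "chest pain",
    extra_fields=["troponin_time", "ecg_time"],
)
enrolled = chest_pain.enrol(core)
```

Because each module copies from, rather than mutates, the core records, the administrative dataset remains a single shared source for every project, which is the efficiency the modular design is aiming for.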

4.5. Reporting

A structured framework for reporting quality evaluations ensures clarity, transparency and rigor. For observational research, the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) guidelines served as the template for new quality modules [50]. For OR that uses routinely collected data, the Reporting of studies Conducted using Observational Routinely collected Data (RECORD) guidelines are encouraged [58]. For QI projects, the Standards for Quality Improvement Reporting Excellence (SQUIRE 2.0) guidelines are required [43]. Emphasis on reporting will also include the exploration of equity in care delivery, such as differences in access, timeliness and outcomes across socioeconomic, minority and vulnerable populations.

4.6. Integrating TECOR in a Learning Health System

Originally conceptualized 15 years ago, the Learning Health System (LHS) represents an evolution of the earlier evidence-brokering framework known as ‘knowledge translation and exchange’ [59]. Driven by the growing accessibility of routinely collected clinical data, the proliferation of LHSs underscores the foundational role of data in generating evidence and informing decision-making within these systems [60]. Whereas TECOR supports the acquisition of data, the role of an LHS is to provide a structured and governed approach both to recognizing priority quality improvement and research areas and to implementing findings into a business-as-usual model [59].

4.7. Future Directions

The current focus of TECOR is retrospective and based on individual hypotheses and quality projects. Integration with artificial intelligence and other emerging technologies is an area of great interest as the legislative requirements for patient data privacy begin to accommodate these technologies. Real-time data monitoring represents another significant area of interest. However, a notable challenge within a broad specialty such as emergency medicine lies in discerning the most pertinent metrics to monitor without overwhelming clinicians with an excessive volume of data.
The framework itself serves to provide a structured approach to a modular design. Adoption by other EDs is feasible given the generalizable nature of the framework, while acknowledging common barriers such as local digital infrastructure, data governance and data privacy laws.

4.8. Limitations

Whilst the proposed framework for a modular emergency department registry has its strengths, there are some limitations, as with any framework. First, the data mapping exercise is limited to datapoints and data collection available within the health service. Data collected, stored and accessed outside the organization require additional data sharing agreements and data linkage, which is an area of future work. Similarly, data not collected within the service, such as staff wellbeing, cannot be mapped and is also an area of future work. In addition, the data collected and reported on are purely retrospective, with the attendant limitations. Limitations associated with data collection, such as manual data entry, inter-rater reliability and documentation quality, also need to be considered and are expanded on in the original TECOR protocol [3]. Equally, we appreciate that setup costs are not insubstantial, even for a registry such as TECOR, and this is another area to explore and cost. Finally, our article has not discussed the challenges of implementing this framework, including education, staff training and engagement.

5. Conclusions

We reimagined the traditional CQR model and propose a modular framework tailored for emergency departments, designed to accommodate the distinct scope of practice, patient demographics and risk tolerance inherent to this setting. The modular framework for TECOR aims to provide an efficient, streamlined approach that caters to all aspects of quality evaluation, enabling a higher throughput of clinician-led quality evaluations including QI and OR. A rich and structured data repository is also foundational to the development of a learning health system to drive evidence-based practice and will be the subject of future research.

Author Contributions

Conceptualization, V.T.; methodology, V.T. and G.B.; formal analysis, V.T. and G.B.; investigation, V.T. and G.B.; resources, V.T., S.P., and G.B.; data curation, V.T. and G.B.; writing—original draft preparation, V.T.; writing—review and editing, V.T., L.T., S.P. and G.B.; visualization, V.T.; supervision, V.T.; project administration, G.B.; funding acquisition, V.T., L.T., S.P. and G.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Medical Research Future Fund (MRFF), grant number MRF2018041.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki. TECOR has been approved by the University of Tasmania Human Research Ethics Committee (HREA30260, 26 February 2024). The registry has also been publicly registered with the Australian New Zealand Clinical Trials Registry (ACTRN12624000278538). The data mapping component of this study was exempt from ethics review by the Tasmanian Department of Health Low Risk Human Research Ethics Committee on 6 January 2025.

Informed Consent Statement

Patient consent was waived, as approved by the University of Tasmania Human Research Ethics Committee, given that the information published will be de-identified and not re-identifiable, and informed consent or opt-out consent for the number of patients expected is not feasible.

Data Availability Statement

The data presented in this study are available on request from the corresponding author and subject to local privacy laws.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Abbreviations

The following abbreviations are used in this manuscript:
ED: Emergency Department
QI: Quality Improvement
QA: Quality Assurance
CQR: Clinical Quality Registry
OR: Observational Research
TECOR: Tasmanian Emergency Care Outcomes Registry
MM: Modified Monash
REDCap: Research Electronic Data Capture
STROBE: Strengthening the Reporting of Observational Studies in Epidemiology
RECORD: Reporting of studies Conducted using Observational Routinely collected Data
SQUIRE: Standards for Quality Improvement Reporting Excellence

References

  1. Lin, M.P.; Sharma, D.; Venkatesh, A.; Epstein, S.K.; Janke, A.; Genes, N.; Mehrotra, A.; Augustine, J.; Malcolm, B.; Goyal, P.; et al. The Clinical Emergency Data Registry: Structure, Use, and Limitations for Research. Ann. Emerg. Med. 2024, 83, 467–476.
  2. Tran, V.; Barrington, G.; Page, S. Emergency Department Clinical Quality Registries: A Scoping Review. Healthcare 2025, 13, 1022.
  3. Tran, V.; Barrington, G.; Page, S. The Tasmanian Emergency Care Outcomes Registry (TECOR) Protocol. Emerg. Care Med. 2024, 1, 153–164.
  4. Lowthian, J.; Cameron, P. Improving Timeliness While Improving the Quality of Emergency Department Care. Emerg. Med. Australas. 2012, 24, 219–221.
  5. Mostafa, R.; El-Atawi, K. Strategies to Measure and Improve Emergency Department Performance: A Review. Cureus 2024, 16, e52879.
  6. Sartini, M.; Carbone, A.; Demartini, A.; Giribone, L.; Oliva, M.; Spagnolo, A.M.; Cremonesi, P.; Canale, F.; Cristina, M.L. Overcrowding in Emergency Department: Causes, Consequences, and Solutions—A Narrative Review. Healthcare 2022, 10, 1625.
  7. Campbell, S.M.; Roland, M.O.; Buetow, S.A. Defining Quality of Care. Soc. Sci. Med. 2000, 51, 1611–1625.
  8. Soldatenkova, A.; Calabrese, A.; Ghiron, N.L.; Tiburzi, L. Emergency Department Performance Assessment Using Administrative Data: A Managerial Framework. PLoS ONE 2023, 18, e0293401.
  9. Craig, S.; O’Reilly, G.M.; Egerton-Warburton, D.; Jones, P.; Than, M.P.; Tran, V.; Taniar, D.; Moore, K.; Alvandi, A.; Tuxen-Vu, J.; et al. Making the Most of What We Have: What Does the Future Hold for Emergency Department Data? Emerg. Med. Australas. 2024, 36, 795–798.
  10. Al-Surimi, K. Research versus Quality Improvement in Healthcare. Glob. J. Qual. Saf. Healthc. 2018, 1, 25–27.
  11. Margolis, P.; Provost, L.P.; Schoettker, P.J.; Britto, M.T. Quality Improvement, Clinical Research, and Quality Improvement Research—Opportunities for Integration. Pediatr. Clin. N. Am. 2009, 56, 831–841.
  12. Doezema, D.; Hauswald, M. Quality Improvement or Research: A Distinction without a Difference? IRB Ethics Hum. Res. 2002, 24, 9–12.
  13. Hirschhorn, L.R.; Ramaswamy, R.; Devnani, M.; Wandersman, A.; Simpson, L.A.; Garcia-Elorrio, E. Research versus Practice in Quality Improvement? Understanding How We Can Bridge the Gap. Int. J. Qual. Health Care 2018, 30, 24–28.
  14. Cooper, J.A.; McNair, L. How to Distinguish Research from Quality Improvement. J. Empir. Res. Hum. Res. Ethics 2015, 10, 209–210.
  15. Solberg, L.I.; Mosser, G.; McDonald, S. The Three Faces of Performance Measurement: Improvement, Accountability, and Research. Jt. Comm. J. Qual. Improv. 1997, 23, 135–147.
  16. Jones, B.; Vaux, E.; Olsson-Brown, A. How to Get Started in Quality Improvement. BMJ 2019, 364, k5408.
  17. Zimmermann, G.d.S.; Siqueira, L.D.; Bohomol, E. Lean Six Sigma Methodology Application in Health Care Settings: An Integrative Review. Rev. Bras. Enferm. 2020, 73, e20190861.
  18. Coughlin, K.; Posencheg, M.A. Common Quality Improvement Methodologies Including the Model for Improvement, Lean, and Six Sigma. Clin. Perinatol. 2023, 50, 285–306.
  19. Hussein, M.; Pavlova, M.; Ghalwash, M.; Groot, W. The Impact of Hospital Accreditation on the Quality of Healthcare: A Systematic Literature Review. BMC Health Serv. Res. 2021, 21, 1057.
  20. Hansen, K.; Boyle, A.; Holroyd, B.; Phillips, G.; Benger, J.; Chartier, L.B.; Lecky, F.; Vaillancourt, S.; Cameron, P.; Waligora, G.; et al. Updated Framework on Quality and Safety in Emergency Medicine. Emerg. Med. J. 2020, 37, 437–442.
  21. Benson, K.; Hartz, A.J. A Comparison of Observational Studies and Randomized, Controlled Trials. N. Engl. J. Med. 2000, 342, 1878–1886.
  22. Concato, J.; Shah, N.; Horwitz, R.I. Randomized, Controlled Trials, Observational Studies, and the Hierarchy of Research Designs. N. Engl. J. Med. 2000, 342, 1887–1892.
  23. Vandenbroucke, J.P. Observational Research and Evidence-Based Medicine: What Should We Teach Young Physicians? J. Clin. Epidemiol. 1998, 51, 467–472.
  24. Mann, C.J. Observational Research Methods. Research Design II: Cohort, Cross Sectional, and Case-Control Studies. Emerg. Med. J. 2003, 20, 54.
  25. Thurlow, L.E.; Dam, P.J.V.; Prior, S.J.; Tran, V. How Tasmanian Emergency Departments ‘Choose Wisely’ When Investigating Suspected Pulmonary Embolism. Healthcare 2023, 11, 1599.
  26. Tran, V.; Whitfield, J.; Askaroff, N.; Barrington, G. Procedural Sedation and Analgesia in an Australian Emergency Department: Results of the First 3 Months of a Procedural Sedation Registry. Anesthesia Res. 2024, 1, 157–167.
  27. Evans, S.M.; Scott, I.A.; Johnson, N.P.; Cameron, P.A.; McNeil, J.J. Development of Clinical-quality Registries in Australia: The Way Forward. Med. J. Aust. 2011, 194, 360–363.
  28. Hoque, D.M.E.; Kumari, V.; Hoque, M.; Ruseckaite, R.; Romero, L.; Evans, S.M. Impact of Clinical Registries on Quality of Patient Care and Clinical Outcomes: A Systematic Review. PLoS ONE 2017, 12, e0183667.
  29. Hoque, D.M.E.; Kumari, V.; Ruseckaite, R.; Romero, L.; Evans, S.M. Impact of Clinical Registries on Quality of Patient Care and Health Outcomes: Protocol for a Systematic Review. BMJ Open 2016, 6, e010654.
  30. Larsson, S.; Lawyer, P.; Garellick, G.; Lindahl, B.; Lundström, M. Use of 13 Disease Registries in 5 Countries Demonstrates the Potential to Use Outcome Data to Improve Health Care’s Value. Health Aff. 2017, 31, 220–227.
  31. Saberwal, G. Clinical Trial Registries: The Good, and the Not so Good. J. Biosci. 2024, 49, 90.
  32. Gong, J.; Singer, Y.; Cleland, H.; Wood, F.; Cameron, P.; Tracy, L.M.; Gabbe, B.J. Driving Improved Burns Care and Patient Outcomes through Clinical Registry Data: A Review of Quality Indicators in the Burns Registry of Australia and New Zealand. Burns 2021, 47, 14–24.
  33. Cadilhac, D.A.; Kim, J.; Lannin, N.A.; Kapral, M.K.; Schwamm, L.H.; Dennis, M.S.; Norrving, B.; Meretoja, A. National Stroke Registries for Monitoring and Improving the Quality of Hospital Care: A Systematic Review. Int. J. Stroke 2015, 11, 28–40.
  34. Cameron, P.A.; Fitzgerald, M.C.; Curtis, K.; McKie, E.; Gabbe, B.; Earnest, A.; Christey, G.; Clarke, C.; Crozier, J.; Dinh, M.; et al. Overview of Major Traumatic Injury in Australia: Implications for Trauma System Design. Injury 2020, 51, 114–121.
  35. Etkin, C.D.; Springer, B.D. The American Joint Replacement Registry—The First 5 Years. Arthroplast. Today 2017, 3, 67–69.
  36. Versace, V.L.; Skinner, T.C.; Bourke, L.; Harvey, P.; Barnett, T. National Analysis of the Modified Monash Model, Population Distribution and a Socio-economic Index to Inform Rural Health Workforce Planning. Aust. J. Rural Health 2021, 29, 801–810.
  37. Harris, P.A.; Taylor, R.; Minor, B.L.; Elliott, V.; Fernandez, M.; O’Neal, L.; McLeod, L.; Delacqua, G.; Delacqua, F.; Kirby, J.; et al. The REDCap Consortium: Building an International Community of Software Platform Partners. J. Biomed. Inform. 2019, 95, 103208.
  38. Harris, P.A.; Taylor, R.; Thielke, R.; Payne, J.; Gonzalez, N.; Conde, J.G. Research Electronic Data Capture (REDCap)—A Metadata-Driven Methodology and Workflow Process for Providing Translational Research Informatics Support. J. Biomed. Inform. 2009, 42, 377–381.
  39. Ubbink, D.T.; Matthijssen, M.; Lemrini, S.; van Etten-Jamaludin, F.S.; Bloemers, F.W. Systematic Review of Barriers, Facilitators, and Tools to Promote Shared Decision Making in the Emergency Department. Acad. Emerg. Med. 2024, 31, 1037–1049.
  40. Cohen, I.; Diao, Z.; Goyal, P.; Gupta, A.; Hawk, K.; Malcom, B.; Malicki, C.; Sharma, D.; Sweeney, B.; Weiner, S.G.; et al. Mapping Emergency Medicine Data to the Observational Medical Outcomes Partnership Common Data Model: A Gap Analysis of the American College of Emergency Physicians Clinical Emergency Data Registry. J. Am. Coll. Emerg. 2025, 6, 100016.
  41. Carbonell, C.; Adegbulugbe, A.; Cheung, W.; Ruff, P. Barriers and Challenges to Implementing a Quality Improvement Program: Political and Administrative Challenges. JCO Glob. Oncol. 2024, 10, e2300455.
  42. Chan, A.-W.; Boutron, I.; Hopewell, S.; Moher, D.; Schulz, K.F.; Collins, G.S.; Tunn, R.; Aggarwal, R.; Berkwits, M.; Berlin, J.A.; et al. SPIRIT 2025 Statement: Updated Guideline for Protocols of Randomised Trials. BMJ 2025, 389, e081477.
  43. Ogrinc, G.; Davies, L.; Goodman, D.; Batalden, P.; Davidoff, F.; Stevens, D. SQUIRE 2.0 (Standards for QUality Improvement Reporting Excellence): Revised Publication Guidelines from a Detailed Consensus Process. BMJ Qual. Saf. 2016, 25, 986–992.
  44. Kennedy, M.; Barnsteiner, J. The Quality of Clinician and Student Quality Improvement Reports: An Analysis of 8 Years of Submissions. J. Nurs. Sch. 2024, 56, 836–842.
  45. Tawfik, G.M.; Giang, H.T.N.; Ghozy, S.; Altibi, A.M.; Kandil, H.; Le, H.-H.; Eid, P.S.; Radwan, I.; Makram, O.M.; Hien, T.T.T.; et al. Protocol Registration Issues of Systematic Review and Meta-Analysis Studies: A Survey of Global Researchers. BMC Med. Res. Methodol. 2020, 20, 213.
  46. Mačiulienė, M. Beyond Open Access: Conceptualizing Open Science for Knowledge Co-Creation. Front. Commun. 2022, 7, 907745.
  47. Thibault, R.T.; Amaral, O.B.; Argolo, F.; Bandrowski, A.E.; Davidson, A.R.; Drude, N.I. Open Science 2.0: Towards a Truly Collaborative Research Ecosystem. PLoS Biol. 2023, 21, e3002362.
  48. Pool, J.; Akhlaghpour, S.; Fatehi, F.; Burton-Jones, A. A Systematic Analysis of Failures in Protecting Personal Health Data: A Scoping Review. Int. J. Inf. Manag. 2024, 74, 102719.
  49. Au, S.; Murray, E. Data Management for Quality Improvement: How to Collect and Manage Data. AACN Adv. Crit. Care 2021, 32, 213–218.
  50. Von Elm, E.; Altman, D.G.; Egger, M.; Pocock, S.J.; Gøtzsche, P.C.; Vandenbroucke, J.P.; Initiative, S. Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statement: Guidelines for Reporting Observational Studies. BMJ 2007, 335, 806.
  51. Torab-Miandoab, A.; Samad-Soltani, T.; Jodati, A.; Rezaei-Hachesu, P. Interoperability of Heterogeneous Health Information Systems: A Systematic Literature Review. BMC Med. Inform. Decis. Mak. 2023, 23, 18.
  52. Azarm-Daigle, M.; Kuziemsky, C.; Peyton, L. A Review of Cross Organizational Healthcare Data Sharing. Procedia Comput. Sci. 2015, 63, 425–432.
  53. Sonkamble, R.G.; Phansalkar, S.P.; Potdar, V.M.; Bongale, A.M. Survey of Interoperability in Electronic Health Records Management and Proposed Blockchain Based Framework: MyBlockEHR. IEEE Access 2021, 9, 158367–158401.
  54. AbuHalimeh, A. Improving Data Quality in Clinical Research Informatics Tools. Front. Big Data 2022, 5, 871897.
  55. Chishtie, J.; Sapiro, N.; Wiebe, N.; Rabatach, L.; Lorenzetti, D.; Leung, A.A.; Rabi, D.; Quan, H.; Eastwood, C.A. Use of Epic Electronic Health Record System for Health Care Research: Scoping Review. J. Med. Internet Res. 2023, 25, e51003.
  56. Gavrilov, G.; Vlahu-Gjorgievska, E.; Trajkovik, V. Healthcare Data Warehouse System Supporting Cross-Border Interoperability. Health Inform. J. 2020, 26, 1321–1332.
  57. Kaji, A.H.; Schriger, D.; Green, S. Looking Through the Retrospectoscope: Reducing Bias in Emergency Medicine Chart Review Studies. Ann. Emerg. Med. 2014, 64, 292–298.
  58. Benchimol, E.I.; Smeeth, L.; Guttmann, A.; Harron, K.; Moher, D.; Petersen, I.; Sørensen, H.T.; von Elm, E.; Langan, S.M.; Committee, R.W. The REporting of Studies Conducted Using Observational Routinely-Collected Health Data (RECORD) Statement. PLoS Med. 2015, 12, e1001885.
  59. Reid, R.J.; Wodchis, W.P.; Kuluski, K.; Lee-Foon, N.K.; Lavis, J.N.; Rosella, L.C.; Desveaux, L. Actioning the Learning Health System: An Applied Framework for Integrating Research into Health Systems. SSM-Health Syst. 2024, 2, 100010.
  60. Teede, H.; Cadilhac, D.A.; Purvis, T.; Kilkenny, M.F.; Campbell, B.C.V.; English, C.; Johnson, A.; Callander, E.; Grimley, R.S.; Levi, C.; et al. Learning Together for Better Health Using an Evidence-Based Learning Health System Framework: A Case Study in Stroke. BMC Med. 2024, 22, 198.
Figure 1. A graphical depiction of the relationship between effort and limitations for various healthcare quality evaluation methods.
Table 1. Characteristics of clinical quality evaluation methods.
Characteristic | Quality Improvement | Quality Assurance | Observational Research | Clinical Quality Registry
Goal | Improvement | Compliance | New knowledge | All
Instigator | Self-determinant | Regulated standards | Self-determinant | Co-designed
Action Approach | Proactive | Reactive | Proactive | Proactive
Timeline | Defined | Continuous | Defined | Continuous
Sample Size | Just enough | Just enough | Just in case | As much as possible
Bias | Accept | Minimize | Measure/Adjust | Measure/Adjust
Ethical Authority | Organization | Regulator | Ethics Committee | Ethics Committee
Governance | Organization | Regulator | Researcher(s) | Registry owner(s)
Effort | Low–Medium | Medium | Medium–High | High
Table 2. Quality projects performed in the ED and quality method used.
Method | Project
QI | Adherence to the Chest Pain Pathway at the Royal Hobart Hospital ED
QI | Alcohol and Drug Presentations in Youth and Adults
CQR | Australia and New Zealand Emergency Department Airway Registry
CQR | Australia and New Zealand Hip Fracture Registry
CQR | Australia and New Zealand Trauma Registry
CQR | Australian Stroke Clinical Registry
OR | Does prescribing of immediate release oxycodone by emergency medicine physicians result in persistence of Schedule 8 opioids following discharge?
OR | Emergency department presentations with a mental health diagnosis in Australia, by jurisdiction and by sex, 2004–05 to 2016–17
CQR | Emergency Drugs Network of Australia
OR | Epidemiology and clinical features of emergency department patients with suspected and confirmed COVID-19: A multisite report from the COVED Quality Improvement Project for July 2020 (COVED 3)
OR | Epidemiology and clinical features of emergency department patients with suspected COVID-19: Insights from Australia’s ‘second wave’ (COVED-4)
OR | Evaluating the use of computed tomography pulmonary angiography in Tasmanian emergency departments
QI | Geriatric trauma management
OR | Is there a ‘weekend effect’ in subacute and minor injuries at a mixed tertiary ED?
QI 1 | Medical Emergency Team Call Database
OR | Medical management of blood pressure and heart rate in acute type B aortic dissections: A single quaternary centre perspective
OR | Pain Management and Sedation in Paediatric Ileocolic Intussusception: A Global, Multicenter, Retrospective Study
QI | Pain relief for major burns patients in ED: Experience at a tertiary burns referral centre and literature review
QI | Pelvic pain management in the ED
QI 1 | Procedural Sedation in Emergency Departments
OR | Ruptured abdominal aortic aneurysms: a study of prevalence, associated comorbidities, intervention techniques and mortality
QI | Sepsis: time to antibiotics and pathway audit
QI | Syncope clinical quality audit
QI | Trauma primary surveys
OR | Trend of emergency department presentations with a mental health diagnosis in Australia by diagnostic group, 2004–05 to 2016–17
OR | Trend of emergency department presentations with a mental health diagnosis in Australia by jurisdiction and by sex, 2004–05 to 2016–17
OR | Trends of emergency department presentations with a mental health diagnosis by age, Australia, 2004–05 to 2016–17: A secondary data analysis
QI | Validation of Alvarado Score in Male Paediatric population
QI | Whole body CT scan ordering in Trauma
QI: Quality Improvement; OR: Observational Research; CQR: Clinical Quality Registry. 1 These quality improvement initiatives use a clinical quality registry methodology at a local level.
Table 3. Data mapping of TECOR minimum dataset and additional data.
Health Executive Analytics Reporting Tool (HEART): Emergency Department Dashboard by QlikTech Australia Pty Ltd. (Data extraction: Self-extractable)
Age at presentation 1
Sex 1
Australian Postcode 1
Remoteness classification 1
Country of birth 1
Indigenous status 1
Hospital 1
Mode of arrival 1
Date, time of presentation 1
Date, time of triage 1
ED end date, time 1
Service episode length (total min) 1
Type of visit 1
Triage category 1
ED wait time 1
Disposition 1
Date seen by medical officer 1
Time seen by medical officer 1
ED stay—urgency related group major diagnostic block 1
ED stay—principal diagnosis 1
ED stay—diagnosis classification type 1
ED stay—physical departure date 1
ED stay—physical departure time 1
Patient—compensable status 1
ED stay—additional diagnosis, code 1
Universal Record Number 1
Emergency Attendance ID 1
ED triage/presenting complaint 1
Admission time, date 1
Episode of care—source of funding, patient funding source 1
Emergency Department Capacity

Health Executive Analytics Reporting Tool (HEART): Admitted Patient Dashboard by QlikTech Australia Pty Ltd. (Data extraction: Self-extractable)
Episode of admitted patient care
Admission date, time 1
Separation date, time, mode
Admission urgency status
Days of hospital-in-the-home care
Hospital Capacity
Episode of admitted patient care
Length of stay in intensive care unit, total hours
Episode of care
Principal diagnosis, code
Secondary diagnosis and beyond, code

i.Patient Manager (IPM) by Dedalus (Australia) Pty Ltd. (Data extraction: Manual entry)
Bed request time, date
Bed available time, date
Admission/Discharge/Transfer time

TrakCare by InterSystems Australia Pty Ltd. (Data extraction: On Request)
Pathology testing requested time, date and type
Imaging testing requested time, date and type

Communications Code Call Database 2 (Data extraction: Manual entry)
Code Black, Red, Yellow, Purple, Brown, Blue, Care Call date, time, location
Code Stroke, STEMI, Trauma Call date, time, location

MedTasker by Nimblic Pty Ltd. (Data extraction: Self-extractable)
Referred by, position, specialty, time, date
Referred to, position, specialty, time, date
Consultation, position, specialty, time, date

Digital Medical Records by InfoMedix Pty Ltd. (Data extraction: Manual entry)
Clinical observations
Blood products given time and date
Procedures performed
Medication(s) prescribed
Clinical progress notes
Goals of Care
Radiology, Endoscopy result(s)
Echocardiogram, Cardiac angiogram result(s)

IntelliSpace Picture Archiving and Communication System by Philips Electronics Australia Ltd. (Data extraction: On Request)
Radiological imaging modality, date, time

Pathology Clinical Information System by Kestral Pty Ltd. (Data extraction: On Request)
Pathology test(s) and blood product(s) result(s)

eReferrals by HealthCare Software Pty Ltd. (Data extraction: On Request)
Outpatient referral(s)

Hospital Clinical Suite by HealthCare Software Pty Ltd. (Data extraction: On Request)
Discharge Prescription(s)

Safety and Reporting Learning System (SLRS) by RLDatix Australia Pty Ltd. (Data extraction: On Request)
Incident and safety event reporting

1 Minimum dataset. 2 Manual entry spreadsheets.
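The source-to-extraction mapping in Table 3 can be made machine-readable. The sketch below is purely illustrative (the dictionary keys abbreviate the source names from Table 3, and nothing here reflects TECOR's actual tooling): it records each data source's extraction technique and groups sources by technique, the kind of lookup a registry coordinator might use when planning an extraction run.

```python
# Illustrative only: Table 3's data sources keyed to their extraction
# technique, with a helper that groups sources under each technique.
from collections import defaultdict

EXTRACTION_BY_SOURCE = {
    "HEART: Emergency Department Dashboard": "Self-extractable",
    "HEART: Admitted Patient Dashboard": "Self-extractable",
    "i.Patient Manager (IPM)": "Manual entry",
    "TrakCare": "On Request",
    "Communications Code Call Database": "Manual entry",
    "MedTasker": "Self-extractable",
    "Digital Medical Records": "Manual entry",
    "IntelliSpace PACS": "On Request",
    "Pathology Clinical Information System": "On Request",
    "eReferrals": "On Request",
    "Hospital Clinical Suite": "On Request",
    "Safety and Reporting Learning System (SLRS)": "On Request",
}

def sources_by_technique(mapping: dict[str, str]) -> dict[str, list[str]]:
    """Group data sources under each extraction technique."""
    grouped: dict[str, list[str]] = defaultdict(list)
    for source, technique in mapping.items():
        grouped[technique].append(source)
    return dict(grouped)
```

Grouping this way makes it easy to see, for example, which sources can feed TECOR without a request to a data custodian.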
Table 4. Coding framework for data extractability.
Code | Definition
Self-extractable | Staff member(s) employed by the ED are ABLE to extract data to import to TECOR
On Request | Staff member(s) employed by the ED are UNABLE to extract data but are able to request data to import to TECOR
Manual entry | Datapoints must be entered manually from the source of truth
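The Table 4 coding framework reduces to a two-question decision: can ED staff extract the data themselves, and if not, can they request it? As a hypothetical sketch (the function name and boolean parameters are our own, not part of TECOR), the logic could be expressed as:

```python
def extractability_code(staff_can_extract: bool, staff_can_request: bool) -> str:
    """Classify a datapoint per the Table 4 coding framework.

    Illustrative only: parameters capture whether ED-employed staff can
    extract the data directly, or must request it; anything else falls
    back to manual entry from the source of truth.
    """
    if staff_can_extract:
        return "Self-extractable"
    if staff_can_request:
        return "On Request"
    return "Manual entry"
```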
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Tran, V.; Thurlow, L.; Page, S.; Barrington, G. Framework for a Modular Emergency Departments Registry: A Case Study of the Tasmanian Emergency Care Outcomes Registry (TECOR). Hospitals 2025, 2, 18. https://doi.org/10.3390/hospitals2030018

