Search Results (600)

Search Parameters:
Keywords = ED management

22 pages, 1254 KiB  
Systematic Review
How Do the Psychological Functions of Eating Disorder Behaviours Compare with Self-Harm? A Systematic Qualitative Evidence Synthesis
by Faye Ambler, Andrew J. Hill, Thomas A. Willis, Benjamin Gregory, Samia Mujahid, Daniel Romeu and Cathy Brennan
Healthcare 2025, 13(15), 1914; https://doi.org/10.3390/healthcare13151914 - 5 Aug 2025
Abstract
Background: Eating disorders (EDs) and self-harm (SH) are both associated with distress, poor psychosocial functioning, and increased risk of mortality. Much of the literature discusses the complex interplay between SH and ED behaviours where co-occurrence is common. The onset of both is typically seen during teenage years into early adulthood. A better understanding of the functions of these behaviours is needed to guide effective prevention and treatment, particularly during the crucial developmental years. An earlier review has explored the functions of self-harm, but an equivalent review for eating disorder behaviours does not appear to have been completed. Objectives: This evidence synthesis had two objectives. First, to identify and synthesise published first-hand accounts of the reasons why people engage in eating disorder behaviours with the view to develop a broad theoretical framework of functions. Second, to draw comparisons between the functions of eating disorder behaviours and self-harm. Methods: A qualitative evidence synthesis reporting first-hand accounts of the reasons for engaging in eating disorder behaviours. A ‘best fit’ framework synthesis, using the a priori framework from the review of self-harm functions, was undertaken with thematic analysis to categorise responses. Results: Following a systematic search and rigorous screening process, 144 studies were included in the final review. The most commonly reported functions of eating disorder behaviours were distress management (affect regulation) and interpersonal influence. This review identified significant overlap in functions between self-harm and eating disorder behaviours. Gender identity, responding to food insecurity, to delay growing up and responding to weight, shape, and body ideals were identified as functions more salient to eating disorder behaviours. Similarly, some self-harm functions were not identified in the eating disorder literature. These were experimenting, averting suicide, personal language, and exploring/maintaining boundaries. Conclusions: This evidence synthesis identified a prominent overlap between psychological functions of eating disorder behaviours and self-harm, specifically in relation to distress management (affect regulation). Despite clear overlap in certain areas, some functions were found to be distinct to each behaviour. The implications for delivering and adapting targeted interventions are discussed. Full article

10 pages, 209 KiB  
Article
“Hangry” in Forensic Psychiatry? Analysis of the Relationship Between Eating Disorders and Aggressive Behavior in Patients with Substance Use Disorders
by Judith Streb, Tinatin Deisenhofer, Samira Schneider, Victoria Peters and Manuela Dudeck
Brain Sci. 2025, 15(8), 836; https://doi.org/10.3390/brainsci15080836 - 4 Aug 2025
Abstract
Background/Objectives: Substance use disorders and eating disorders frequently co-occur and are both associated with increased aggression. As a result, individuals with these conditions are overrepresented in prison populations. The present study investigated whether symptoms of eating disorders in male forensic psychiatric inpatients with substance use disorders are associated with heightened aggression. To this end, various forms of aggressive behavior—including spontaneous and reactive aggression, excitability, and violent offenses—were analyzed. Methods: Fifty-six male patients from two forensic psychiatric hospitals in Germany participated in the study. Symptoms of eating disorders were evaluated with the German version of the Eating Disorder Examination Questionnaire (EDE-Q), and aggression was measured with the Short Questionnaire for the Assessment of Aggression Factors (K-FAF) and by considering the violent index offense. Data were analyzed by generalized linear models, with age and body mass index (BMI) included as covariates. Results: Higher EDE-Q scores significantly predicted increased spontaneous aggression and excitability. However, no significant association was found between eating disorder symptoms and reactive aggression or the likelihood of a violent index offense. Age and BMI did not significantly influence any aggression subscales. Conclusions: The findings suggest that in patients with substance use disorder, eating disorder symptoms may be linked to heightened internalized forms of aggression. These results support the clinical relevance of screening for eating disorder symptoms in forensic psychiatric settings and integrating dietary interventions into therapeutic efforts to manage aggression. Full article
(This article belongs to the Special Issue Substance Abuse in the Psychiatric Population)
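
The analysis described above is a generalized linear model of aggression scores on eating-disorder symptoms with age and BMI as covariates. The sketch below is a minimal, hedged illustration of that design on simulated data; the variable names (edeq_total, spontaneous_aggression, age, bmi) are assumptions, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the forensic inpatient sample (n = 56 in the study);
# column names are illustrative assumptions.
rng = np.random.default_rng(0)
n = 56
df = pd.DataFrame({
    "edeq_total": rng.uniform(0, 4, n),        # EDE-Q global score
    "age": rng.normal(38, 10, n),
    "bmi": rng.normal(26, 4, n),
})
df["spontaneous_aggression"] = 5 + 2 * df["edeq_total"] + rng.normal(0, 2, n)

# Gaussian GLM of an aggression subscale on eating-disorder symptoms,
# adjusting for age and BMI as covariates, mirroring the reported design.
model = smf.glm("spontaneous_aggression ~ edeq_total + age + bmi", data=df).fit()
print(model.summary())
```
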
14 pages, 626 KiB  
Article
Mapping Clinical Questions to the Nursing Interventions Classification: An Evidence-Based Needs Assessment in Emergency and Intensive Care Nursing Practice in South Korea
by Jaeyong Yoo
Healthcare 2025, 13(15), 1892; https://doi.org/10.3390/healthcare13151892 - 2 Aug 2025
Viewed by 254
Abstract
Background/Objectives: Evidence-based nursing practice (EBNP) is essential in high-acuity settings such as intensive care units (ICUs) and emergency departments (EDs), where nurses are frequently required to make time-critical, high-stakes clinical decisions that directly influence patient safety and outcomes. Despite its recognized importance, the implementation of EBNP remains inconsistent, with frontline nurses often facing barriers to accessing and applying current evidence. Methods: This descriptive, cross-sectional study systematically mapped and prioritized clinical questions generated by ICU and ED nurses at a tertiary hospital in South Korea. Using open-ended questionnaires, 204 clinical questions were collected from 112 nurses. Each question was coded and classified according to the Nursing Interventions Classification (NIC) taxonomy (8th edition) through a structured cross-mapping methodology. Inter-rater reliability was assessed using Cohen’s kappa coefficient. Results: The majority of clinical questions (56.9%) were mapped to the Physiological: Complex domain, with infection control, ventilator management, and tissue perfusion management identified as the most frequent areas of inquiry. Patient safety was the second most common domain (21.6%). Notably, no clinical questions were mapped to the Family or Community domains, highlighting a gap in holistic and transitional care considerations. The mapping process demonstrated high inter-rater reliability (κ = 0.85, 95% CI: 0.80–0.89). Conclusions: Frontline nurses in high-acuity environments predominantly seek evidence related to complex physiological interventions and patient safety, while holistic and community-oriented care remain underrepresented in clinical inquiry. Utilizing the NIC taxonomy for systematic mapping establishes a reliable framework to identify evidence gaps and support targeted interventions in nursing practice. Regular protocol evaluation, alignment of continuing education with empirically identified priorities, and the integration of concise evidence summaries into clinical workflows are recommended to enhance EBNP implementation. Future research should expand to multicenter and interdisciplinary settings, incorporate advanced technologies such as artificial intelligence for automated mapping, and assess the long-term impact of evidence-based interventions on patient outcomes. Full article
(This article belongs to the Section Nursing)
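
The reported inter-rater reliability (κ = 0.85, 95% CI: 0.80–0.89) comes from two coders assigning NIC domains to the same questions. A small sketch of how such agreement can be computed with Cohen's kappa and a percentile bootstrap follows; the rater labels below are illustrative, not the study's data.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Illustrative NIC domain codes assigned by two raters to the same 200 questions.
rater_a = np.array(["physiological_complex", "safety", "physiological_complex", "behavioral"] * 50)
rater_b = np.array(["physiological_complex", "safety", "physiological_basic", "behavioral"] * 50)

kappa = cohen_kappa_score(rater_a, rater_b)

# Percentile bootstrap for a rough 95% confidence interval around kappa.
rng = np.random.default_rng(0)
n = len(rater_a)
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(cohen_kappa_score(rater_a[idx], rater_b[idx]))
print(f"kappa = {kappa:.2f}, 95% CI {np.percentile(boot, 2.5):.2f}-{np.percentile(boot, 97.5):.2f}")
```
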

22 pages, 1724 KiB  
Article
Development and Clinical Interpretation of an Explainable AI Model for Predicting Patient Pathways in the Emergency Department: A Retrospective Study
by Émilien Arnaud, Pedro Antonio Moreno-Sanchez, Mahmoud Elbattah, Christine Ammirati, Mark van Gils, Gilles Dequen and Daniel Aiham Ghazali
Appl. Sci. 2025, 15(15), 8449; https://doi.org/10.3390/app15158449 - 30 Jul 2025
Viewed by 348
Abstract
Background: Overcrowded emergency departments (EDs) create significant challenges for patient management and hospital efficiency. In response, Amiens Picardy University Hospital (APUH) developed the “Prediction of the Patient Pathway in the Emergency Department” (3P-U) model to enhance patient flow management. Objectives: To develop and clinically validate an explainable artificial intelligence (XAI) model for hospital admission predictions, using structured triage data, and demonstrate its real-world applicability in the ED setting. Methods: Our retrospective, single-center study involved 351,019 patients consulting in APUH’s EDs between 2015 and 2018. Various models (including a cross-validation artificial neural network (ANN), a k-nearest neighbors (KNN) model, a logistic regression (LR) model, and a random forest (RF) model) were trained and assessed for performance with regard to the area under the receiver operating characteristic curve (AUROC). The best model was validated internally with a test set, and the F1 score was used to determine the best threshold for recall, precision, and accuracy. XAI techniques, such as Shapley additive explanations (SHAP) and partial dependence plots (PDP) were employed, and the clinical explanations were evaluated by emergency physicians. Results: The ANN gave the best performance during the training stage, with an AUROC of 83.1% (SD: 0.2%) for the test set; it surpassed the RF (AUROC: 71.6%, SD: 0.1%), KNN (AUROC: 67.2%, SD: 0.2%), and LR (AUROC: 71.5%, SD: 0.2%) models. In an internal validation, the ANN’s AUROC was 83.2%. The best F1 score (0.67) determined that 0.35 was the optimal threshold; the corresponding recall, precision, and accuracy were 75.7%, 59.7%, and 75.3%, respectively. The SHAP and PDP XAI techniques (as assessed by emergency physicians) highlighted patient age, heart rate, and presentation with multiple injuries as the features that most specifically influenced the admission from the ED to a hospital ward. These insights are being used in bed allocation and patient prioritization, directly improving ED operations. Conclusions: The 3P-U model demonstrates practical utility by reducing ED crowding and enhancing decision-making processes at APUH. Its transparency and physician validation foster trust, facilitating its adoption in clinical practice and offering a replicable framework for other hospitals to optimize patient flow. Full article
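
As a rough illustration of the workflow the abstract describes (comparing candidate classifiers by AUROC, then choosing a probability threshold by F1), here is a hedged sketch on synthetic data. The real 3P-U model is trained on ~351,000 structured triage records; the features, model sizes, and an MLP standing in for the ANN are assumptions, and the KNN arm is omitted for brevity.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, f1_score

# Synthetic stand-in for triage data (imbalanced admission outcome).
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.7], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "ANN": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0),
    "LR": LogisticRegression(max_iter=1000),
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, m in models.items():
    proba = m.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(name, "AUROC:", round(roc_auc_score(y_te, proba), 3))

# Pick the admission threshold that maximises F1 for the best model (ANN here).
proba = models["ANN"].predict_proba(X_te)[:, 1]
thresholds = np.linspace(0.05, 0.95, 19)
best = max(thresholds, key=lambda t: f1_score(y_te, (proba >= t).astype(int)))
print("best threshold by F1:", round(best, 2))
```
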

18 pages, 3277 KiB  
Article
A Clinical Prediction Model for Personalised Emergency Department Discharge Decisions for Residential Care Facility Residents Post-Fall
by Gigi Guan, Kadison Michel, Charlie Corke and Geetha Ranmuthugala
J. Pers. Med. 2025, 15(8), 332; https://doi.org/10.3390/jpm15080332 - 30 Jul 2025
Viewed by 177
Abstract
Introduction: Falls are the leading cause of Emergency Department (ED) presentations among residents from residential aged care facilities (RACFs). While most current studies focus on post-fall evaluations and fall prevention, limited research has been conducted on decision-making in post-fall management. Objective: To develop and internally validate a model that can predict the likelihood of RACF residents being discharged from the ED after being presented for a fall. Methods: The study sample was obtained from a previous study conducted in Shepparton, Victoria, Australia. Consecutive samples were selected from January 2023 to November 2023. Participants aged 65 and over were included in this study. Results: A total of 261 fall presentations were initially identified. One patient with Australasian Triage Scale category 1 was excluded to avoid overfitting, leaving 260 presentations for analysis. Two logistic regression models were developed using prehospital and ED variables. The ED predictor model variables included duration of ED stay, injury severity, and the presence of an advance care directive (ACD). It demonstrated excellent discrimination (AUROC = 0.83; 95% CI: 0.79–0.89) compared to the prehospital model (AUROC = 0.77, 95% CI: 0.72–0.83). A simplified four-variable Discharge Eligibility after Fall in Elderly Residents (DEFER) score was derived from the prehospital model. The score achieved an AUROC of 0.76 (95% CI: 0.71–0.82). At a cut-off score of ≥5, the DEFER score exhibited a sensitivity of 79.7%, a specificity of 60.3%, a diagnostic odds ratio of 5.96, and a positive predictive value of 85.0%. Conclusions: The DEFER score is the first validated discharge prediction model for residents of RACFs who present to the ED after a fall. Importantly, the DEFER score advances personalised medicine in emergency care by integrating patient-specific factors, such as ACDs, to guide individualised discharge decisions for post-fall residents from RACFs. Full article
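
The abstract reports sensitivity, specificity, a diagnostic odds ratio, and a positive predictive value at a DEFER cut-off of ≥5. The sketch below shows how such figures follow from a 2×2 table; the counts are made up (chosen only to land near the reported values) and are not the study's data.

```python
# Hypothetical 2x2 table at a DEFER score cut-off of >= 5 (illustrative counts only):
#                  discharged   not discharged
# score >= 5          tp = 160         fp = 28
# score <  5          fn = 41          tn = 43
tp, fp, fn, tn = 160, 28, 41, 43

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
dor = (tp * tn) / (fp * fn)          # diagnostic odds ratio

print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, "
      f"PPV {ppv:.1%}, DOR {dor:.2f}")
```
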

31 pages, 1317 KiB  
Article
Privacy-Preserving Clinical Decision Support for Emergency Triage Using LLMs: System Architecture and Real-World Evaluation
by Alper Karamanlıoğlu, Berkan Demirel, Onur Tural, Osman Tufan Doğan and Ferda Nur Alpaslan
Appl. Sci. 2025, 15(15), 8412; https://doi.org/10.3390/app15158412 - 29 Jul 2025
Viewed by 346
Abstract
This study presents a next-generation clinical decision-support architecture for Clinical Decision Support Systems (CDSS) focused on emergency triage. By integrating Large Language Models (LLMs), Federated Learning (FL), and low-latency streaming analytics within a modular, privacy-preserving framework, the system addresses key deployment challenges in high-stakes clinical settings. Unlike traditional models, the architecture processes both structured (vitals, labs) and unstructured (clinical notes) data to enable context-aware reasoning with clinically acceptable latency at the point of care. It leverages big data infrastructure for large-scale EHR management and incorporates digital twin concepts for live patient monitoring. Federated training allows institutions to collaboratively improve models without sharing raw data, ensuring compliance with GDPR/HIPAA, and FAIR principles. Privacy is further protected through differential privacy, secure aggregation, and inference isolation. We evaluate the system through two studies: (1) a benchmark of 750+ USMLE-style questions validating the medical reasoning of fine-tuned LLMs; and (2) a real-world case study (n = 132, 75.8% first-pass agreement) using de-identified MIMIC-III data to assess triage accuracy and responsiveness. The system demonstrated clinically acceptable latency and promising alignment with expert judgment on reviewed cases. The infectious disease triage case demonstrates low-latency recognition of sepsis-like presentations in the ED. This work offers a scalable, audit-compliant, and clinician-validated blueprint for CDSS, enabling low-latency triage and extensibility across specialties. Full article
(This article belongs to the Special Issue Large Language Models: Transforming E-health)
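
To make the federated-training idea concrete, here is a minimal federated-averaging sketch in plain NumPy, assuming each hospital returns locally trained weight arrays plus its dataset size. It is a generic FedAvg illustration, not the paper's actual system, which additionally layers differential privacy and secure aggregation on top.

```python
import numpy as np

def fed_avg(client_weights, client_sizes):
    """Weighted average of per-client model weights (lists of ndarrays),
    weighted by local dataset size, as in vanilla FedAvg."""
    total = sum(client_sizes)
    n_layers = len(client_weights[0])
    return [
        sum(w[layer] * (n / total) for w, n in zip(client_weights, client_sizes))
        for layer in range(n_layers)
    ]

# Three hypothetical hospitals with differently sized local datasets.
rng = np.random.default_rng(1)
clients = [[rng.normal(size=(4, 4)), rng.normal(size=4)] for _ in range(3)]
global_weights = fed_avg(clients, client_sizes=[500, 1200, 300])
print([w.shape for w in global_weights])
```
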

16 pages, 646 KiB  
Article
Psychometric Properties of the Diabetes Eating Problem Survey—Revised in Arab Adolescents with Type 1 Diabetes: A Cross-Cultural Validation Study
by Abdullah M. Alguwaihes, Shuliweeh Alenezi, Renad Almutawa, Rema Almutawa, Elaf Almusahel, Metib S. Alotaibi, Mohammed E. Al-Sofiani and Abdulmajeed AlSubaihin
Behav. Sci. 2025, 15(8), 1026; https://doi.org/10.3390/bs15081026 - 29 Jul 2025
Viewed by 311
Abstract
Objectives: The objective of this manuscript is to translate, adapt, and validate an Arabic version of the Diabetes Eating Problem Survey—Revised (DEPS-R) questionnaire to assess disordered eating behaviors (DEBs) in adolescents with T1D in Saudi Arabia. Additionally, the study sought to estimate the prevalence of DEBs and analyze its associations with glycemic control and diabetes-related complications. Methods: A cross-cultural validation study was conducted following the COSMIN guidelines. The DEPS-R questionnaire was translated into Arabic through forward and backward translation involving expert panels, including psychiatrists, diabetologists, and linguists. A sample of 409 people with type 1 diabetes (PwT1D) (58.4% females) aged 12–20 years was recruited from outpatient diabetes clinics in the five main regions of Saudi Arabia. Participants completed the Arabic DEPS-R and the validated Arabic version of the SCOFF questionnaire. Sociodemographic, anthropometric, and biochemical data were collected, and statistical analyses, including confirmatory factor analysis (CFA) and internal consistency tests, were conducted. Results: The Arabic DEPS-R exhibits strong internal consistency (Cronbach’s alpha = 0.829) and high test–retest reliability (ICC = 0.861), with a CFA supporting a three-factor structure, namely body weight perception, disordered eating behaviors (DEBs), and bulimic tendencies. Notably, higher DEPS-R scores are significantly linked to elevated HbA1c levels, increased BMI, and more frequent insulin use. Alarmingly, 52.8% of participants show high-risk DEB, which is directly associated with poor glycemic control (HbA1c ≥ 8.1%) and a heightened risk of diabetic ketoacidosis (DKA). Conclusions: The Arabic DEPS-R is a valid and reliable tool for screening DEBs among Saudi adolescents with T1D. Findings underscore the necessity for early identification and intervention to mitigate the impact of EDs on diabetes management and overall health outcomes. Full article
(This article belongs to the Section Child and Adolescent Psychiatry)
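
As a pointer to how the internal-consistency figure (Cronbach's alpha = 0.829) can be computed, here is a small Cronbach's alpha sketch on a hypothetical item-response matrix for the 16-item DEPS-R; the study's confirmatory factor analysis and test–retest ICC are omitted.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scale scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 0-5 Likert responses from 409 participants to 16 DEPS-R items,
# simulated around a shared latent factor so the items correlate.
rng = np.random.default_rng(42)
latent = rng.normal(size=(409, 1))
responses = np.clip(np.round(2.5 + latent + rng.normal(scale=0.8, size=(409, 16))), 0, 5)
print(f"alpha = {cronbach_alpha(responses):.3f}")
```
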

17 pages, 540 KiB  
Article
Kalemia Significantly Influences Clinical Outcomes in Patients with Severe Traumatic Brain Injury (TBI)
by Bharti Sharma, Munirah Hasan, Usha S. Govindarajulu, George Agriantonis, Navin D. Bhatia, Jasmine Dave, Juan Mestre, Shalini Arora, Saad Bhatti, Zahra Shafaee, Suganda Phalakornkul, Kate Twelker and Jennifer Whittington
Diagnostics 2025, 15(15), 1878; https://doi.org/10.3390/diagnostics15151878 - 26 Jul 2025
Viewed by 309
Abstract
Objective: Potassium levels (KLs) influence clinical outcomes in severe traumatic brain injury (TBI). This study investigates the relationship between KLs and clinical outcomes to improve prognosis and guide management. Method: A retrospective study was conducted at a level 1 trauma center in Queens, New York, from January 2020 to December 2023. Patients with an AIS score of 3 or higher were included. KLs were measured at the time of hospital admission, ICU admission, ICU discharge, hospital discharge, and death, if applicable. Clinical outcomes such as age, race, length of hospital stay (H LOS), ICU length of stay (ICU LOS), ventilation days (VDs), Glasgow Coma Scale (GCS), and mortality were assessed. Results: KLs were categorized into five groups: extreme hypokalemia (<2.5 mEq/L), hypokalemia (2.6–3.5 mEq/L), normokalemia (3.5–5.2 mEq/L), hyperkalemia (5.2–7.0 mEq/L), and extreme hyperkalemia (>7.0 mEq/L). Significant correlations were observed between KLs at hospital admission and age (p = 0.0113), race (p = 0.003), and H LOS (p = 0.079). ICU KLs showed positive correlations with AIS head score (p = 0.038), ISS (p = 7.84 × 10−6), and GCS (p = 2.6 × 10−6). ICU KLs were also associated with LOS in the Emergency Department (ED) (p = 6.875 × 10−6) and ICU (p = 1.34 × 10−21), as well as VDs (p = 7.19 × 10−7). ICU discharge KLs correlated with ISS (p = 2.316 × 10−3), GCS (p = 2.201 × 10−3), ED LOS (p = 3.163 × 10−4), and VDs (p = 7.44 × 10−4). KLs at discharge were linked with mortality (p < 0.0001) and H LOS (p = 0.0091). Additionally, KLs at the time of death were correlated with ISS (p = 0.01965), GCS (p = 0.01219), ED LOS (p = 0.00594), ICU LOS (p = 0.049), VDs (p = 0.00005), and mortality (p < 0.0001). Conclusions: Potassium imbalances, especially hypokalemia, significantly affect outcomes in severe TBI patients. Monitoring and managing KLs may improve prognosis. Full article
(This article belongs to the Special Issue Diagnostics in the Emergency and Critical Care Medicine)
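
A brief sketch of the kalemia grouping described in the Results, using pandas.cut with the reported cut-points (right-closed bins so the slightly overlapping published boundaries resolve cleanly); the column name and sample values are assumptions for illustration, not the authors' code or data.

```python
import pandas as pd

# Reported categories: extreme hypokalemia (<2.5), hypokalemia (2.6-3.5),
# normokalemia (3.5-5.2), hyperkalemia (5.2-7.0), extreme hyperkalemia (>7.0) mEq/L.
bins = [-float("inf"), 2.5, 3.5, 5.2, 7.0, float("inf")]
labels = ["extreme hypokalemia", "hypokalemia", "normokalemia",
          "hyperkalemia", "extreme hyperkalemia"]

admission_k = pd.Series([2.1, 3.2, 4.0, 5.6, 7.4, 4.4], name="admission_K_mEq_L")
groups = pd.cut(admission_k, bins=bins, labels=labels)
print(pd.concat([admission_k, groups.rename("kalemia_group")], axis=1))
```
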

17 pages, 2548 KiB  
Article
Enhancing Multi-Step Reservoir Inflow Forecasting: A Time-Variant Encoder–Decoder Approach
by Ming Fan, Dan Lu and Sudershan Gangrade
Geosciences 2025, 15(8), 279; https://doi.org/10.3390/geosciences15080279 - 24 Jul 2025
Viewed by 266
Abstract
Accurate reservoir inflow forecasting is vital for effective water resource management. Reliable forecasts enable operators to optimize storage and release strategies to meet competing sectoral demands—such as water supply, irrigation, and hydropower scheduling—while also mitigating flood and drought risks. To address this need, in this study, we propose a novel time-variant encoder–decoder (ED) model designed specifically to improve multi-step reservoir inflow forecasting, enabling accurate predictions of reservoir inflows up to seven days ahead. Unlike conventional ED-LSTM and recursive ED-LSTM models, which use fixed encoder parameters or recursively propagate predictions, our model incorporates an adaptive encoder structure that dynamically adjusts to evolving conditions at each forecast horizon. Additionally, we introduce the Expected Baseline Integrated Gradients (EB-IGs) method for variable importance analysis, enhancing the interpretability of inflow forecasts by incorporating multiple baselines to capture a broader range of hydrometeorological conditions. The proposed methods are demonstrated at several diverse reservoirs across the United States. Our results show that they outperform traditional methods, particularly at longer lead times, while also offering insights into the key drivers of inflow forecasting. These advancements contribute to enhanced reservoir management through improved forecasting accuracy and practical decision-making insights under complex hydroclimatic conditions. Full article
(This article belongs to the Special Issue AI and Machine Learning in Hydrogeology)
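
For readers unfamiliar with encoder–decoder forecasters, here is a minimal PyTorch sketch of a multi-step (7-day) inflow model in which each lead time gets its own output head. It is a simplified stand-in for the time-variant idea, not the authors' exact architecture; the layer sizes, feature count, and class name are assumptions.

```python
import torch
import torch.nn as nn

class TimeVariantED(nn.Module):
    """Minimal sketch: encoder-decoder LSTM whose decoder uses a separate
    linear head per forecast step, so each horizon has its own mapping."""
    def __init__(self, n_features, hidden=64, horizon=7):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.heads = nn.ModuleList([nn.Linear(hidden, 1) for _ in range(horizon)])
        self.horizon = horizon

    def forward(self, x):                       # x: (batch, lookback, n_features)
        _, (h, c) = self.encoder(x)             # summarize the lookback window
        dec_in = h[-1].unsqueeze(1).repeat(1, self.horizon, 1)
        dec_out, _ = self.decoder(dec_in, (h, c))
        # apply the step-specific head at each lead time
        preds = [self.heads[t](dec_out[:, t]) for t in range(self.horizon)]
        return torch.cat(preds, dim=1)          # (batch, horizon)

model = TimeVariantED(n_features=5)
y_hat = model(torch.randn(8, 30, 5))            # 8 samples, 30-day lookback
print(y_hat.shape)                              # torch.Size([8, 7])
```
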

23 pages, 7168 KiB  
Article
Enhancing Soil Phosphorus Availability in Intercropping Systems: Roles of Plant Growth Regulators
by Chunhua Gao, Weilin Kong, Fengtao Zhao, Feiyan Ju, Ping Liu, Zongxin Li, Kaichang Liu and Haijun Zhao
Agronomy 2025, 15(7), 1748; https://doi.org/10.3390/agronomy15071748 - 20 Jul 2025
Viewed by 322
Abstract
Plant growth regulators (PGRs) enhance crop stress resistance, but their roles in microbial-mediated phosphorus cycling within intercropping systems are unclear. Thus, we conducted a two-year field study using corn (Zea mays L. cv. Denghai 605) and soybean (Glycine max L. cv. Hedou 22) in fluvisols and luvisols soil according to the World Reference Base for Soil Resources (WRB) standard. Under a 4-row corn and 6-row soybean strip intercropping system, three treatments were applied: a water control (CK), and two plant growth regulators—T1 (EC: ethephon [300 mg/L] + cycocel [2 g/L]) and T2 (ED: ethephon [300 mg/L] + 2-Diethyl aminoethyl hexanoate [10 mg/L]). Foliar applications were administered at the V7 stage (seventh leaf) of intercropped corn plants to assess how foliar-applied PGRs (T1/T2) modulated the soil phosphorus availability, microbial communities, and functional genes in maize intercropping systems. PGRs increased the soil organic phosphorus and available phosphorus contents, and alkaline phosphatase activity, but not total phosphorus. PGRs decreased α-diversity in fluvisols soil but increased it in luvisols soil. The major taxa changed from Actinobacteria (CK) to Proteobacteria (T1) and Saccharibacteria (T2) in fluvisols soil, and from Actinobacteria/Gemmatimonadetes (CK) to Saccharibacteria (T1) and Acidobacteria (T2) in luvisols soil. Functional gene dynamics indicated soil-specific regulation: fluvisols soil harbored more phoD (organic phosphorus mineralization) and relA (polyphosphate degradation) genes, whereas the phnP gene dominated in luvisols soil. T1 stimulated organic phosphorus mineralization and inorganic phosphorus solubilization in fluvisols soil, upregulating the associated regulatory genes, and T2 enhanced polyphosphate synthesis and transport gene expression in luvisols soil. Proteobacteria, Nitrospirae, and Chloroflexi were positively correlated with organic phosphorus mineralization and polyphosphate cycling genes, whereas Bacteroidetes and Verrucomicrobia correlated with available potassium (AP), total phosphorus (TP), and alkaline phosphatase (ALP) activity. Thus, PGRs activated soil phosphorus by restructuring soil type-dependent microbial functional networks, connecting PGR-induced shifts with microbial phosphorus cycling mechanisms. These findings facilitate the targeted use of PGRs to optimize microbial-driven phosphorus efficiency in strategies for sustainable phosphorus management in diverse agricultural soils. Full article
(This article belongs to the Section Innovative Cropping Systems)

18 pages, 11724 KiB  
Article
Hydrogen–Rock Interactions in Carbonate and Siliceous Reservoirs: A Petrophysical Perspective
by Rami Doukeh, Iuliana Veronica Ghețiu, Timur Vasile Chiș, Doru Bogdan Stoica, Gheorghe Brănoiu, Ibrahim Naim Ramadan, Ștefan Alexandru Gavrilă, Marius Gabriel Petrescu and Rami Harkouss
Appl. Sci. 2025, 15(14), 7957; https://doi.org/10.3390/app15147957 - 17 Jul 2025
Viewed by 769
Abstract
Underground hydrogen storage (UHS) in carbonate and siliceous formations presents a promising solution for managing intermittent renewable energy. However, experimental data on hydrogen–rock interactions under representative subsurface conditions remain limited. This study systematically investigates mineralogical and petrophysical alterations in dolomite, calcite-rich limestone, and quartz-dominant siliceous cores subjected to high-pressure hydrogen (100 bar, 70 °C, 100 days). Distinct from prior research focused on diffraction peak shifts, our analysis prioritizes quantitative changes in mineral concentration (%) as a direct metric of reactivity and structural integrity, offering more robust insights into long-term storage viability. Hydrogen exposure induced significant dolomite dissolution, evidenced by reduced crystalline content (from 12.20% to 10.53%) and accessory phase loss, indicative of partial decarbonation and ankerite-like formation via cation exchange. Conversely, limestone exhibited more pronounced carbonate reduction (vaterite from 6.05% to 4.82% and calcite from 2.35% to 0%), signaling high reactivity, mineral instability, and potential pore clogging from secondary precipitation. In contrast, quartz-rich cores demonstrated exceptional chemical inertness, maintaining consistent mineral concentrations. Furthermore, Brunauer–Emmett–Teller (BET) surface area and Barrett–Joyner–Halenda (BJH) pore distribution analyses revealed enhanced porosity and permeability in dolomite (pore volume increased >10×), while calcite showed declining properties and quartz showed negligible changes. SEM-EDS supported these trends, detailing Fe migration and textural evolution in dolomite, microfissuring in calcite, and structural preservation in quartz. This research establishes a unique experimental framework for understanding hydrogen–rock interactions under reservoir-relevant conditions. It provides crucial insights into mineralogical compatibility and structural resilience for UHS, identifying dolomite as a highly promising host and highlighting calcitic rocks’ limitations for long-term hydrogen containment. Full article
(This article belongs to the Topic Exploitation and Underground Storage of Oil and Gas)

16 pages, 1855 KiB  
Article
Clinical and Imaging Characteristics to Discriminate Between Complicated and Uncomplicated Acute Cholecystitis: A Regression Model and Decision Tree Analysis
by Yu Chen, Ning Kuo, Hui-An Lin, Chun-Chieh Chao, Suhwon Lee, Cheng-Han Tsai, Sheng-Feng Lin and Sen-Kuang Hou
Diagnostics 2025, 15(14), 1777; https://doi.org/10.3390/diagnostics15141777 - 14 Jul 2025
Viewed by 300
Abstract
Background: Acute complicated cholecystitis (ACC) is associated with prolonged hospitalization, increased morbidity, and higher mortality. However, objective imaging-based criteria to guide early clinical decision-making remain limited. This study aimed to develop a predictive scoring system integrating clinical characteristics, laboratory biomarkers, and computed tomography (CT) findings to facilitate the early identification of ACC in the emergency department (ED). Methods: We conducted a retrospective study at an urban tertiary care center in Taiwan, screening 729 patients who presented to the ED with suspected cholecystitis between 1 January 2018 and 31 December 2020. Eligible patients included adults (≥18 years) with a confirmed diagnosis of acute cholecystitis based on the Tokyo Guidelines 2018 (TG18) and who were subsequently admitted for further management. Exclusion criteria included (a) the absence of contrast-enhanced CT imaging, (b) no hospital admission, (c) alternative final diagnosis, and (d) incomplete clinical data. A total of 390 patients met the inclusion criteria. Demographic data, laboratory results, and CT imaging features were analyzed. Logistic regression and decision tree analyses were used to construct predictive models. Results: Among the 390 included patients, 170 had mild, 170 had moderate, and 50 had severe cholecystitis. Key predictors of ACC included gangrenous changes, gallbladder wall attenuation > 80 Hounsfield units, CRP > 3 mg/dL, and WBC > 11,000/μL. A novel scoring system incorporating these variables demonstrated good diagnostic performance, with an area under the curve (AUC) of 0.775 and an optimal cutoff score of ≥2 points. Decision tree analysis similarly identified these four predictors as critical determinants in stratifying disease severity. Conclusions: This CT- and biomarker-based scoring system, alongside a decision tree model, provides a practical and robust tool for the early identification of complicated cholecystitis in the ED. Its implementation may enhance diagnostic accuracy and support timely clinical intervention. Full article
(This article belongs to the Section Medical Imaging and Theranostics)
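
As a rough illustration of the decision-tree arm of the analysis, here is a shallow sklearn tree on the four predictors named in the abstract (gangrenous changes, gallbladder wall attenuation > 80 HU, CRP > 3 mg/dL, WBC > 11,000/μL). The data are simulated, not the study cohort, and the coefficients used to generate the labels are arbitrary.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(7)
n = 390
# Simulated binary predictors (1 = present) and a complicated-cholecystitis label.
X = rng.integers(0, 2, size=(n, 4))
logit = -2.0 + 1.2 * X[:, 0] + 1.0 * X[:, 1] + 0.8 * X[:, 2] + 0.6 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=[
    "gangrenous_changes", "wall_HU_gt_80", "CRP_gt_3", "WBC_gt_11k"]))
```
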

12 pages, 295 KiB  
Article
Implementation of Telemedicine for Patients Referred to Emergency Medical Services
by Francesca Cortellaro, Lucia Taurino, Marzia Delorenzo, Paolo Pausilli, Valeria Ilardo, Andrea Duca, Giuseppe Stirparo, Giorgio Costantino, Filippo Galbiati, Ernesto Contro, Guido Bertolini, Lorenzo Fenech and Giuseppe Maria Sechi
Epidemiologia 2025, 6(3), 36; https://doi.org/10.3390/epidemiologia6030036 - 11 Jul 2025
Viewed by 381
Abstract
Background: The surge in the use of Pre-hospital Emergency Medical Systems (EMS) and Emergency Departments (ED) has become a pressing issue worldwide after the COVID-19 pandemic. To address this challenge, we developed an experimental and innovative care pathway supported by telemedicine. The aim of this study is to describe the activity of the Integrated Medical Center (CMI): a new telemedicine-based care model for patients referring to the Emergency Medical System. Methods: A prospective observational study was conducted from January 2022 to December 2022. The CMI was established to manage patients referring to the Emergency Medical System. Results: From January to December 2022, a total of 8680 calls were managed by the CMI, with an average of 24 calls per day. In total, 6243 patients (71.9%) were managed without ED access, of whom 4884 (78.2%) were managed through telemedicine evaluation only and 1359 (21.8%) with telemedicine evaluation plus dispatch of the Home Rapid Response Team (HRRT). The population treated by the HRRT was older on average. The mean satisfaction score was 9.1/10. Conclusions: Telemedicine evaluation allowed for remote assessments, treatment prescriptions, and teleconsultation for the HRRT and was associated with high patient satisfaction. This model could be useful in future pandemics for managing patients with non-urgent illnesses at home, preventing hospital admissions for potentially infectious patients, and thereby reducing in-hospital transmission. Full article

12 pages, 486 KiB  
Article
Five-Year Retrospective Analysis of Traumatic and Non-Traumatic Pneumothorax in 2797 Patients
by Ayhan Tabur and Alper Tabur
Healthcare 2025, 13(14), 1660; https://doi.org/10.3390/healthcare13141660 - 10 Jul 2025
Viewed by 333
Abstract
Objectives: Pneumothorax is a critical condition frequently encountered in emergency departments (EDs), with spontaneous pneumothorax (SP) and traumatic pneumothorax (TP) presenting distinct clinical challenges. This study aimed to evaluate the epidemiological characteristics, clinical outcomes, and treatment strategies for SP and TP across different age groups and provide insights for optimizing emergency management protocols. Methods: This retrospective cohort study analyzed 2797 cases of pneumothorax over five years (2018–2023) at a tertiary care center. Patients were stratified by age (18–39, 40–64, and >65 years) and pneumothorax type (SP vs. TP). Data on demographics, clinical presentation, treatment, hospital stay, recurrence, and complications were extracted from medical records. Comparative statistical analyses were also conducted. Results: The mean age of patients with SP was 32.5 ± 14.7 years, whereas patients with TP were older (37.8 ± 16.2 years, p < 0.001). Male predominance was observed in both groups: 2085 (87.0%) in the SP group and 368 (92.0%) in the TP group (p = 0.01). The right lung was more frequently affected in the SP (64.2%) and TP (56.0%) groups (p < 0.001). Age-related differences were evident in both groups of patients. In the SP group, younger patients (18–39 years) represented the majority of cases, whereas older patients (≥65 years) were more likely to present with SSP and required more invasive management (p < 0.01). In the TP group, younger patients often had pneumothorax due to high-energy trauma, whereas older individuals developed pneumothorax due to falls or iatrogenic causes (p < 0.01). SP predominantly affected younger patients, with a history of smoking and male predominance associated with younger age (p < 0.01). TP is more frequent in older patients, often because of falls or iatrogenic injuries. Management strategies varied by age group; younger patients were often managed conservatively, whereas older patients underwent more invasive procedures (p < 0.01). Surgical intervention was more common in younger patients in the TP group, whereas conservative management was more frequent in elderly patients (p < 0.01). The clinical outcomes differed significantly, with older patients having longer hospital stays and higher rates of persistent air leaks (p < 0.01). Recurrence was more common in younger patients with SP, whereas TP recurrence rates were lower across all age groups (p < 0.01). No significant differences were observed in re-expansion pulmonary edema, empyema, or mortality rates between the age groups, suggesting that age alone was not an independent predictor of these complications when adjusted for pneumothorax severity and management strategy (p = 0.22). Conclusions: Age, pneumothorax subtype, and underlying pulmonary comorbidities were identified as key predictors of clinical outcomes. Advanced age, secondary spontaneous pneumothorax, and COPD were independently associated with recurrence, prolonged hospitalization, and in-hospital mortality, respectively. These findings highlight the need for risk-adapted management strategies to improve triaging and treatment decisions for spontaneous and traumatic pneumothorax. Full article

14 pages, 802 KiB  
Article
Clinical Impact of Red Blood Cell Transfusion Location on Gastrointestinal Bleeding Outcomes: Emergency Department vs. Inpatient Unit
by Mehmet Toprak, Harun Yildirim, Ertan Sönmez, Murtaza Kaya, Ali Halici, Abdil Coskun and Mehmed Ulu
Healthcare 2025, 13(14), 1656; https://doi.org/10.3390/healthcare13141656 - 9 Jul 2025
Viewed by 278
Abstract
Background: Gastrointestinal (GI) bleeding is a common and potentially life-threatening condition frequently encountered in emergency departments (EDs). The optimal strategy for red blood cell suspension (RBCS) transfusion, including timing and location, remains unclear. This study aimed to evaluate the impact of transfusion location (ED vs. inpatient units) on mortality and hospital stay in patients with GI bleeding. Methods: A cross-sectional descriptive study was conducted in the ED of a tertiary care hospital. Patients admitted with GI bleeding between 1 June 2021, and 1 June 2023, who received RBCS transfusion were included. Data on demographics, laboratory parameters, transfusion details, and clinical outcomes were collected from the hospital information system. Logistic regression was used to identify mortality predictors. Results: A total of 244 patients were included. Patients transfused in the ED had a significantly shorter hospital stay compared to those transfused in inpatient units. However, mortality did not differ between the groups. Logistic regression identified age, albumin, hemoglobin, creatinine, and hospital stay as independent mortality predictors, while transfusion location was not significant. Conclusions: Early RBCS transfusion in the ED may reduce hospital stay but does not significantly impact mortality. Identifying mortality-associated factors is crucial for optimizing patient management. Further prospective studies are needed to clarify the role of transfusion location in GI bleeding outcomes. Full article
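
The mortality analysis described above is a multivariable logistic regression over candidate predictors (age, albumin, hemoglobin, creatinine, hospital stay, transfusion location). The hedged sketch below fits such a model with statsmodels and reports odds ratios on simulated data; the variable names and coefficients are assumptions, not the authors' dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the GI-bleeding cohort (n = 244 in the study).
rng = np.random.default_rng(1)
n = 244
df = pd.DataFrame({
    "age": rng.normal(68, 14, n),
    "albumin": rng.normal(3.4, 0.6, n),
    "hemoglobin": rng.normal(8.5, 1.8, n),
    "creatinine": rng.normal(1.3, 0.5, n),
    "hospital_stay_days": rng.gamma(2.0, 3.0, n),
})
logit_p = -2 + 0.05 * df["age"] - 0.8 * df["albumin"] + 0.1 * df["hospital_stay_days"]
df["died"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("died ~ age + albumin + hemoglobin + creatinine + hospital_stay_days",
                  data=df).fit(disp=0)

# Odds ratios with 95% confidence intervals for each candidate predictor.
ci = model.conf_int()
print(pd.DataFrame({"OR": np.exp(model.params),
                    "CI_low": np.exp(ci[0]), "CI_high": np.exp(ci[1])}).round(2))
```
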
