Search Results (184)

Search Parameters:
Keywords = ambulatory/standards

14 pages, 3367 KB  
Review
Assessment and Treatment of Varus Foot Deformity in Children with Cerebral Palsy: A Review
by Robert M. Kay and Susan A. Rethlefsen
J. Clin. Med. 2026, 15(3), 1147; https://doi.org/10.3390/jcm15031147 - 2 Feb 2026
Viewed by 38
Abstract
Cerebral palsy (CP) is a developmental disability caused by injury to the fetal or infant brain, affecting 1.6 to 3.7 per 1000 live births worldwide. Ambulatory patients with cerebral palsy experience various gait problems, for which they seek treatment from medical professionals. Varus foot deformities are among the most problematic for patients. Varus foot deformity is characterized by the inner border of the foot being tilted upward and the hindfoot inward. This positioning increases weight-bearing pressure on the lateral (outside) aspect of the foot, and often under the fifth metatarsal head, when walking. As such, varus foot deformity can contribute to in-toeing, make shoe and brace wearing difficult and painful, compromise gait stability, and sometimes lead to metatarsal fractures. Current knowledge of CP etiology and classifications, as well as principles and advances in assessment and treatment decision-making for varus foot deformities, is outlined in this narrative review. In younger children with flexible deformities, non-operative interventions such as bracing, botulinum toxin injection, and serial casting are effective. The literature and expert consensus suggest that, if possible, surgery should be delayed until after the age of 8 years. When surgery is indicated, soft tissue procedures are used for flexible deformities; for rigid deformities, bone surgery is needed in addition to the soft tissue procedures. Careful pre-operative foot assessment is needed, including assessment of deformity flexibility and range of motion, X-rays, and, if possible, computerized gait analysis. Strategies are presented for thorough assessment when gait analysis is not available or feasible. Research reports of surgical outcomes for soft tissue and bony correction are positive but should be interpreted with caution. The quality of evidence on surgical outcomes is compromised by varying research designs and outcome measures, with few studies including measures of function or patient-reported outcomes. It is recommended that surgical outcomes be assessed using standardized assessment tools, such as the Foot Posture Index, whose validity and reliability have been established. Recent advances in 3D kinematic foot model development and musculoskeletal modeling have the potential to greatly improve surgical outcomes for patients with CP.
(This article belongs to the Special Issue Cerebral Palsy: Recent Advances in Clinical Management)

14 pages, 9609 KB  
Review
Routine Echocardiographic Assessment in LVAD Patients—A Structured Approach to Acquisition and Interpretation
by Nicolas Merke, Felix Schoenrath, Evgenij Potapov and Jan Knierim
J. Cardiovasc. Dev. Dis. 2026, 13(2), 70; https://doi.org/10.3390/jcdd13020070 - 30 Jan 2026
Viewed by 585
Abstract
Durable left ventricular assist devices (LVADs) are an established and highly effective therapy for patients with advanced heart failure. Ongoing technological improvements and structured follow-up programs have significantly enhanced device durability, reduced complications, and improved long-term survival. Consequently, a growing number of patients with LVAD support require long-term outpatient care and increasingly present to both specialized and non-specialized hospitals, including for admissions unrelated to heart failure. In this context, echocardiography plays a central role. It is essential not only for routine follow-up at dedicated LVAD clinics but also for the assessment of cardiac status during inpatient admissions for extracardiac conditions. However, echocardiographic evaluation in LVAD patients is technically demanding and requires a solid understanding of LVAD physiology, device–heart interactions, and the specific hemodynamic conditions of continuous-flow support. Without this knowledge, standard echocardiographic parameters may be misleading. This review provides sonographers and cardiologists with a practical, clinically oriented framework for routine transthoracic echocardiography in patients with durable LVAD support. We summarize key principles of LVAD hemodynamics, discuss interpretation of LVAD console parameters, propose a standardized imaging protocol, and outline a structured approach to common echocardiographic findings in routine ambulatory and inpatient settings.

10 pages, 557 KB  
Article
Effect of the Transradial Approach on Wrist Function in Diagnostic Cerebral Angiography
by Julian Kifmann, Michael Braun, Johannes Steinhart, Nico Sollmann, Christopher Kloth, Maria Pedro, Jens Dreyhaupt, Meinrad Beer, Bernd Schmitz and Johannes Rosskopf
Healthcare 2026, 14(2), 254; https://doi.org/10.3390/healthcare14020254 - 20 Jan 2026
Viewed by 183
Abstract
Background: With increasing demand for ambulatory catheter angiography, interest in the transradial approach for diagnostic cerebral procedures has grown markedly. This study aimed to evaluate the effect of the transradial approach for catheter-based diagnostic cerebral procedures on wrist function. Methods: Wrist function was quantified by the Patient-Rated Wrist Evaluation (PRWE) questionnaire. The PRWE score ranged from 0 to 100, with 0 indicating no functional impairment. Association analyses with demographic and clinical parameters were performed using univariate logistic regression models. Results: A total of 88 patients underwent ambulatory diagnostic cerebral angiography during the 15-month observation period; of these, 40 (45%) participated in a telephone interview. Overall, 47.5% (n = 19) of patients reported no wrist impairment (PRWE = 0) after the transradial approach. The remaining 52.5% (n = 21) showed a mean PRWE score of 21.3 ± 22.5 (standard deviation), with a median value of 11.0 and a range from 1.0 to 87.0. Interestingly, univariate logistic regression models revealed a trend towards an association between the dichotomized PRWE score and body mass index (p = 0.051). No associations were found with age, sex, prior neurosurgical status, total procedure duration, dose area product, fluoroscopy time, or dominant hand (p > 0.05). Conclusions: Following transradial cerebral catheter angiography, 52.5% of patients reported some degree of wrist impairment at follow-up; whether this represents procedure-related deterioration cannot be determined without baseline values.

16 pages, 371 KB  
Article
Under-Detection of Depressive Symptoms in Older Ambulatory Patients with Post-COVID Syndrome
by Natalya Rakhalskaya, Nurlan Jainakbayev, Maria Kostousova, Timur Tastaibek, Almagul Mansharipova and Saida Yeshimbetova
Psychiatry Int. 2026, 7(1), 21; https://doi.org/10.3390/psychiatryint7010021 - 15 Jan 2026
Viewed by 172
Abstract
Background: Depressive symptoms are frequent sequelae of COVID-19 and may remain unrecognized in older outpatients, particularly those with post-COVID syndrome. The objective of the current study was to assess the under-detection of depressive symptoms in older ambulatory patients and to examine its relationship with post-COVID syndrome status. Methods: We conducted an observational outpatient cohort study of adults aged 60–89 years with prior SARS-CoV-2 infection (N = 85), recruited at two city polyclinics. Depressive symptoms were assessed through three detection channels: spontaneous complaint during the visit, a standardized direct question about current depressive symptoms, and the 15-item Geriatric Depression Scale (GDS-15). Agreement between complaint and direct question was evaluated using Cohen’s κ and McNemar’s test. Screening performance of complaint and direct question was assessed against GDS-15 thresholds (≥5; sensitivity analysis ≥ 6). Associations between post-COVID syndrome status and binary depressive-symptom indicators were expressed as risk ratios (RRs). Results: Spontaneous complaints missed a substantial proportion of cases: among complaint-negative patients, 18.3% (15/82) reported depressive symptoms on the direct question (κ = 0.149; McNemar p = 0.00052). Against GDS-15 ≥ 5, complaint sensitivity was 10.3% with specificity 100.0% (F1 = 0.19), whereas the direct question showed higher sensitivity (34.5%) with specificity 87.5% (F1 = 0.43). Using the alternative threshold GDS-15 ≥ 6, complaint sensitivity was 15.0% with specificity 100.0% (F1 = 0.26), and direct question sensitivity was 45.0% with specificity 87.7% (F1 = 0.49). A positive response to the direct question was more frequent in patients with post-COVID syndrome than in controls (RR = 2.70 (1.04–7.00)); stratified estimates suggested higher RRs in patients ≤ 75 years (RR = 4.55 (1.08–19.10)) and in women (RR = 2.67 (1.04–6.83)), with limited precision due to sparse events. Conclusions: In older post-COVID outpatients, reliance on spontaneous complaints leads to marked under-detection of GDS-15 screen-positive depressive symptoms. A standardized direct question improves initial case-finding but does not replace a validated screening scale; a stepped approach (brief direct question followed by a scale when indicated) may be warranted.
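Editor's note: the screening-performance figures in this abstract (sensitivity, specificity, F1) all follow from a 2×2 confusion table comparing each detection channel against the GDS-15 reference. A minimal sketch of the computation; the counts used below are not taken from the paper but are back-calculated to be consistent with the reported direct-question results (34.5% sensitivity, 87.5% specificity, F1 = 0.43, given 29 screen-positives among N = 85):

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and F1 from a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)          # true-positive rate
    specificity = tn / (tn + fp)          # true-negative rate
    precision = tp / (tp + fp)            # positive predictive value
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return sensitivity, specificity, f1

# Hypothetical counts reconstructed from the reported percentages;
# the actual 2x2 table is not given in the abstract.
sens, spec, f1 = screening_metrics(tp=10, fp=7, fn=19, tn=49)
print(round(sens, 3), round(spec, 3), round(f1, 2))  # 0.345 0.875 0.43
```

Note how the high-specificity, low-sensitivity pattern of spontaneous complaints (few false positives, many missed cases) drives its low F1 despite perfect specificity.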

27 pages, 1331 KB  
Study Protocol
Application of Telemedicine and Artificial Intelligence in Outpatient Cardiology Care: TeleAI-CVD Study (Design)
by Stefan Toth, Marianna Barbierik Vachalcova, Kamil Barbierik, Adriana Jarolimkova, Pavol Fulop, Mariana Dvoroznakova, Dominik Pella and Tibor Poruban
Diagnostics 2026, 16(1), 145; https://doi.org/10.3390/diagnostics16010145 - 1 Jan 2026
Viewed by 623
Abstract
Background/Objectives: Cardiovascular (CV) diseases remain the leading cause of morbidity and mortality across Europe. Despite substantial progress in prevention, diagnostics, and therapeutics, outpatient cardiology care continues to face systemic challenges, including limited consultation time, workforce constraints, and incomplete clinical information at the point of care. The primary objective of this study is threefold. First, to evaluate whether AI-enhanced telemedicine improves clinical control of hypertension, dyslipidemia, and heart failure compared to standard ambulatory care. Second, to assess the impact on physician workflow efficiency and documentation burden through AI-assisted clinical documentation. Third, to determine patient satisfaction and safety profiles of integrated telemedicine–AI systems. Clinical control will be measured by a composite endpoint of disease-specific targets assessed at the 12-month follow-up visit. Methods: The TeleAI-CVD Concept Study aims to evaluate the integration of telemedicine and artificial intelligence (AI) to enhance the efficiency, quality, and individualization of cardiovascular disease management in the ambulatory setting. Within this framework, AI-driven tools will be employed to collect structured clinical histories and current symptomatology from patients prior to outpatient visits using digital questionnaires and conversational interfaces. Results: The obtained data, combined with telemonitoring metrics, laboratory parameters, and existing clinical records, will be synthesized to support clinical decision-making. Conclusions: This approach is expected to streamline consultations, increase diagnostic accuracy, and enable personalized, data-driven care through continuous evaluation of patient trajectories. The anticipated outcomes of the TeleAI-CVD study include the development of optimized, AI-assisted management protocols for cardiology patients, a reduction in unnecessary in-person visits through effective telemedicine-based follow-up, and accelerated attainment of therapeutic targets. Ultimately, this concept seeks to redefine the paradigm of outpatient cardiovascular care by embedding advanced digital technologies within routine clinical workflows.

20 pages, 2451 KB  
Article
Toward Embedded Multi-Level Classification of 12-Lead ECG Signal Quality Using Spectrograms and CNNs
by Francisco David Pérez Reynoso, Jorge Alberto Soto Cajiga, Luis Alberto Gordillo Roblero and Paola Andrea Niño Suárez
Appl. Sci. 2025, 15(24), 12976; https://doi.org/10.3390/app152412976 - 9 Dec 2025
Viewed by 892
Abstract
This study presents an open and replicable methodology for multi-lead ECG signal quality assessment (SQA), implemented on a 12-lead embedded acquisition platform. Signal quality is a critical software component for diagnostic reliability and compliance with international standards such as IEC 60601-2-27 (clinical ECG monitors), IEC 60601-2-47 (ambulatory ECG systems), and IEC 62304 (software life cycle for medical devices), which define the essential engineering requirements and functional performance for medical devices. Unlike proprietary SQA algorithms embedded in closed commercial systems such as Philips DXL™, the proposed method provides a transparent and auditable framework that enables independent validation and supports adaptation for research and clinical prototyping. Our approach combines convolutional neural networks (CNNs) with FFT-derived spectrograms to perform four-level signal quality classification (High, Medium, Low, and Unidentifiable), achieving up to 95.67% accuracy on the test set and confirming the robustness of the CNN-based spectrogram classification model. The algorithm has been validated on a custom dataset generated using the Fluke PS420™ hardware simulator, enabling controlled replication of signal artifacts for software-level evaluation. Designed for execution on resource-constrained embedded platforms, the system integrates real-time preprocessing and wireless transmission, demonstrating its feasibility for deployment in mobile or decentralized ECG monitoring solutions. These results establish a software validation proof-of-concept that goes beyond algorithmic performance, addressing regulatory expectations such as those outlined in the FDA’s Good Machine Learning Practice (GMLP). While clinical validation remains pending, this work contributes a standards-aligned methodology to democratize advanced SQA functionality and support future regulatory-compliant development of embedded ECG systems.
(This article belongs to the Special Issue AI-Based Biomedical Signal Processing—2nd Edition)

12 pages, 754 KB  
Article
Time to Death and Nursing Home Admission in Older Adults with Hip Fracture: A Retrospective Cohort Study
by Yoichi Ito, Norio Yamamoto, Yosuke Tomita, Kotaro Adachi, Masaaki Konishi and Kunihiko Miyazawa
J. Clin. Med. 2025, 14(23), 8603; https://doi.org/10.3390/jcm14238603 - 4 Dec 2025
Viewed by 892
Abstract
Background: Hip fractures in older adults are sentinel events linked to high mortality and functional decline. Few studies have quantified long-term survival probabilities, standardized mortality ratios (SMRs), and risks of new nursing home admission alongside patient-related predictors. Methods: We retrospectively analyzed 355 patients aged ≥ 60 years who underwent hip fracture surgery at a general hospital in Japan (2020–2024). Primary outcomes were mortality and new nursing home admission. Survival probabilities and remaining life expectancy were estimated, and SMRs were calculated using age- and sex-matched national data. Cox regression identified independent predictors. Results: Mean age was 84 years; 76% were female. Mortality probabilities at 1, 2, and 3 years were 23%, 41%, and 60%, respectively; SMRs consistently exceeded 9. Median remaining life expectancy was 260 days. New nursing home admissions occurred in 42%, with cumulative probabilities of 16%, 27%, and 35% at 1, 2, and 3 years, respectively, showing a rapid rise within 9 months. Independent predictors of mortality were delayed surgery, higher Charlson Comorbidity Index, and low Geriatric Nutritional Risk Index. Older age and failure to regain ambulatory ability at 3 months predicted institutionalization. Conclusions: Older adults with hip fractures face persistently high mortality and institutionalization risks, comparable to advanced malignancies or neurodegenerative diseases. Surgical timing, comorbidities, nutrition, and functional recovery critically influence prognosis and should guide perioperative care and discharge planning.
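Editor's note: the standardized mortality ratio reported in this abstract is the count of observed cohort deaths divided by the deaths expected if the cohort had died at the age- and sex-matched reference-population rates. A minimal sketch of that calculation; the strata, person-years, and rates below are entirely hypothetical (the abstract does not report them), chosen only so the result lands in the "exceeds 9" range the study describes:

```python
def smr(observed_deaths, strata):
    """Standardized mortality ratio: observed deaths divided by the
    deaths expected at reference-population rates.
    strata: list of (person_years, reference_rate_per_person_year)
    pairs, one per age/sex stratum."""
    expected = sum(py * rate for py, rate in strata)
    return observed_deaths / expected

# Hypothetical strata (not the study's data): person-years at risk
# paired with matched national mortality rates per person-year.
ratio = smr(observed_deaths=45, strata=[(120, 0.02), (80, 0.03)])
print(round(ratio, 1))  # 45 / 4.8 = 9.4
```

An SMR of 1.0 would mean the cohort died at the national background rate; values above 9, as reported here, indicate a roughly ninefold excess.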

15 pages, 322 KB  
Review
Comprehensive Overview of Current Pleural Drainage Practice: A Tactical Guide for Surgeons and Clinicians
by Paolo Albino Ferrari, Cosimo Bruno Salis, Elisabetta Pusceddu, Massimiliano Santoru, Gianluca Canu, Antonio Ferrari, Alessandro Giuseppe Fois and Antonio Maccio
Surgeries 2025, 6(4), 108; https://doi.org/10.3390/surgeries6040108 - 2 Dec 2025
Viewed by 1212
Abstract
Introduction: Chest drainage is central to thoracic surgery, pleural medicine, and emergency care, yet practice remains heterogeneous in tube caliber, access, suction, device selection, and removal thresholds. This narrative review aims to synthesize evidence and translate it into guidance. Materials and Methods: We performed a narrative review with PRISMA-modeled transparency. Using backward citation from recent comprehensive overviews, we included randomized trials, meta-analyses, guidelines/consensus statements, and high-quality observational studies. We extracted data on indications, technique, tube size, analog versus digital drainage, suction versus water-seal drainage, removal criteria, and key pleural conditions. Due to heterogeneity in device generations, suction targets, and outcomes, we synthesized the findings qualitatively according to converged evidence. Results: After lung resection, single-drain strategies, early use of water-seal, and standardized removal at ≤300–500 mL/day reduce pain and length of stay without increasing the need for reintervention; digital systems support objective removal using sustained low-flow thresholds (approximately 20–40 mL/min). Small-bore (≤14 Fr) Seldinger catheters perform comparably to larger tubes for secondary and primary pneumothorax and enable ambulatory pathways. In trauma, small-bore approaches can match large-bore drainage in stable patients when paired with surveillance and early escalation of care. For pleural infection, image-guided drainage, combined with fibrinolytics or surgery, is key. Indwelling pleural catheters provide relief comparable to talc in dyspnea associated with malignant effusions in patients with non-expandable lungs. Complications are mitigated by ultrasound guidance and avoiding abrupt high suction after chronic collapse; however, these strategies must be balanced against risks of malposition, occlusion or retained collections, prolonged air leaks, and device complexity, which demand protocolized escalation and team training. Conclusions: Practice coalesces around three pillars—right tube, right system, proper criteria. Adopt standardized pathways, device-agnostic thresholds, and volume or airflow criteria. Trials should harmonize “seal” definitions and validate telemetry-informed removal strategies.

13 pages, 1624 KB  
Article
Efficacy of Injectable Calcium Composite Bone Substitute Augmentation for Osteoporotic Intertrochanteric Fractures: A Prospective, Non-Randomized Controlled Study
by Chae Hun Lee, Hyoung Tae Kim, Hong Moon Sohn, Gwui Cheol Kim, Eun Ju Jin and Suenghwan Jo
J. Clin. Med. 2025, 14(23), 8536; https://doi.org/10.3390/jcm14238536 - 1 Dec 2025
Cited by 1 | Viewed by 393
Abstract
Background/Objectives: Femoral intertrochanteric fractures (ITFs) in older adults are associated with a substantial risk of mechanical failure after fixation, which can lead to persistent pain, delayed mobilization, and increased mortality. Injectable calcium composite bone substitute (ICCBS) augmentation has been proposed as a strategy to enhance construct stability and promote bone healing, but clinical evidence remains limited. The purpose of this study was to evaluate the efficacy of ICCBS in the management of osteoporotic ITFs. Methods: We conducted a multicenter, prospective, non-randomized controlled study of patients undergoing surgical fixation for osteoporotic ITFs using proximal femoral nails. Patients who consented to augmentation received ICCBS, while the control group underwent standard fixation alone. Demographic and injury-related variables were documented, and outcome data were prospectively collected. The primary outcome was time to radiographic bone union, while secondary outcomes included functional recovery (pain and ambulatory status) and complications, including fixation failure. Results: The mean time to radiographic bone union did not differ significantly between groups (p = 0.28). However, patients receiving ICCBS augmentation reported significantly lower postoperative pain scores up to 6 weeks and demonstrated reduced lag screw sliding and varus collapse at the time of bone union. There were no significant differences in complication rates, fixation failure, or ambulatory status at last follow-up between the two groups. Conclusions: ICCBS augmentation may improve early postoperative pain, construct stability, and functional recovery in patients with osteoporotic ITFs, although its effect on fracture healing and long-term outcomes remains uncertain. Further high-quality randomized trials are warranted to confirm these findings.
(This article belongs to the Section Orthopedics)

21 pages, 1858 KB  
Review
Idealized Framework for Assisting Pharmacovigilance Reporting in an Ambulatory Primary Care and Chronic Disease Management Clinic
by Patrick J. Silva, Sara L. Rogers, Zoya Hassan-Toufique, Jian Tao, Scott A. Bruce, Paula K. Shireman and Kenneth S. Ramos
Pharmacoepidemiology 2025, 4(4), 26; https://doi.org/10.3390/pharma4040026 - 21 Nov 2025
Viewed by 839
Abstract
Pharmacovigilance approaches have conventionally focused on the use of epidemiological data to detect emergent adverse drug reactions (ADRs). Recent advances in the use and availability of real-world data have expanded opportunities to detect ADR signals in medical records. We provide a limited review of pharmacovigilance practices and tools we have specifically considered implementing into our comprehensive medication management clinic and associated research programs. Use of pharmacogenomic variants has proven useful only on a limited scale as such data are reliant on low-dimensional approaches matching variants to drugs, often with small effect sizes. As such, most ADRs go unrecognized, undocumented, and unactionable. We posit that an idealized pharmacovigilance framework that relies on artificial-intelligence-assisted reporting with adjudication by pharmacovigilance experts and new models of ambulatory pharmaceutical practice would establish the following attributes: (1) all metadata relating to medication use would be available in the medical record in computable and interoperable data models, (2) digital surveillance tools would detect most ADR events with attributed pharmacological contributions, (3) all events would be characterized using standard adjudication rubrics, and (4) all events would iteratively inform an ADR knowledgebase and improve models to advance detection and prediction of ADR during the course of patient care with a focus on having the necessary tools for clinicians to prevent ADRs. This review provides a limited and focused framework for more systematic documentation of ADRs and tactics to mitigate the idiopathic nature of most ADRs.

10 pages, 1380 KB  
Article
TUS-EPIC: Thoracic Ultrasonography for Exclusion of Iatrogenic Pneumothorax in Post Transbronchial Lung Cryobiopsy—A Safe Alternative to Chest X-Ray
by Ismael Matus, Sameer Akhtar and Vamsi Matta
J. Respir. 2025, 5(4), 18; https://doi.org/10.3390/jor5040018 - 5 Nov 2025
Viewed by 727
Abstract
Background: The incidence of iatrogenic pneumothorax (IPTX) following transbronchial lung cryobiopsy (TBLCB) ranges from 1.4% to 20.2%. While chest X-ray (CXR) is the standard imaging modality to exclude IPTX, thoracic ultrasound (TUS) has demonstrated superior accuracy in detecting pneumothorax across various contexts. This study evaluates TUS as a reliable alternative to routine CXR for ruling out IPTX after TBLCB. Methods: A retrospective observational study included 51 patients undergoing ambulatory TBLCB. Pre- and post-TBLCB TUS were performed. CXR was reserved for cases where TUS findings were inconclusive (absence of sliding lung [SL] and seashore sign [SS] in any lung zones) or if patients exhibited symptoms or signs of IPTX. Results: TUS findings were concordant in 44 (86.1%) patients, of whom 42 (95.5%) did not require CXR. Two patients (4.5%) with symptomatic IPTX were identified and managed. Among the seven patients (13.7%) requiring CXR due to inconclusive TUS or symptoms, five (71.4%) were negative for IPTX, and two (28.6%) had asymptomatic IPTX. Conclusion: Our TUS protocol effectively ruled out clinically significant IPTX, eliminating routine CXR in 95.5% of patients. TUS is a safe alternative to CXR post-TBLCB, with CXR reserved for inconclusive TUS findings or symptomatic cases.

10 pages, 250 KB  
Article
Validity of Empatica E4 Wristband for Detection of Autonomic Dysfunction Compared to Established Laboratory Testing
by Jenny Stritzelberger, Marie Kirmse, Matthias C. Borutta, Stephanie Gollwitzer, Caroline Reindl, Tamara M. Welte, Hajo M. Hamer and Julia Koehn
Diagnostics 2025, 15(20), 2604; https://doi.org/10.3390/diagnostics15202604 - 16 Oct 2025
Viewed by 2672
Abstract
Background: Heart rate variability (HRV) is a well-established marker of autonomic nervous system (ANS) activity. It is also an important tool for investigating cardiovascular and neurological health. Changes in HRV have been associated with epilepsy and sudden unexpected death in epilepsy (SUDEP), conditions in which autonomic dysregulation is believed to play a significant role. HRV is traditionally measured using electrocardiography (ECG) under standardized laboratory conditions. Recently, however, wearable devices such as the Empatica E4 wristband have emerged as promising tools for continuous, noninvasive HRV monitoring in real-life, ambulatory, and clinical settings where laboratory infrastructure may be lacking. Methods: We evaluated the validity and clinical utility of the Empatica E4 wristband in two cohorts. In the first cohort of healthy controls (n = 29), we compared HRV measures obtained with the E4 against those obtained with a gold-standard laboratory ECG device under seated rest and metronomic breathing conditions. In persons with epilepsy (PWE, n = 42), we assessed HRV across wake and sleep states, as well as during exposure to sodium channel blockers. This was done to determine whether the device could detect physiologically and clinically meaningful changes in autonomic nervous system (ANS) function. Results: In healthy participants, the Empatica E4 provided heart rate (HR), root mean square of successive R-R intervals (RMSSD), and standard deviation of all interbeat intervals (SDNN) values that were strongly correlated with laboratory measurements. Both devices detected the expected increase in RMSSD during metronomic breathing; however, the E4 consistently reported higher absolute values than the ECG. 
In PWE, the E4 reliably captured parasympathetic activation during sleep and detected a significant reduction in HRV in patients taking sodium channel blockers, demonstrating its sensitivity to clinically relevant autonomic changes. Conclusions: The Empatica E4 wristband is valid for measuring HRV in research and clinical contexts and can detect physiologically meaningful modulations of ANS activity. While HRV metrics were robust, other signals, such as electrodermal activity and temperature, were less reliable. These results highlight the potential of wearable devices as practical alternatives to laboratory-based autonomic testing, especially in emergency and resource-limited settings, and emphasize their importance in risk assessment in epilepsy care.
(This article belongs to the Special Issue Emergency Medicine: Diagnostic Insights)
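The time-domain HRV metrics compared in the study above (RMSSD and SDNN) have standard definitions over a series of interbeat (RR) intervals. A minimal sketch of both, assuming RR intervals are given in milliseconds and that the population form of the standard deviation is used (conventions vary between analysis packages):

```python
import math
from statistics import pstdev

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def sdnn(rr_ms):
    """Standard deviation of all RR intervals (ms), population form."""
    return pstdev(rr_ms)

# Toy RR series in milliseconds, for illustration only
rr = [800, 810, 790, 820, 805]
print(round(rmssd(rr), 2), round(sdnn(rr), 2))
```

Device-to-device offsets like the one reported (E4 values higher than ECG) would appear here as a systematic shift in the `rr_ms` input, which is why validation studies compare correlated trends rather than absolute agreement alone.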
12 pages, 775 KB  
Article
Assessment of Fine Motor Abilities Among Children with Spinal Muscular Atrophy Treated with Nusinersen Using a New Touchscreen Application: A Pilot Study
by Inbal Klemm, Alexandra Danial-Saad, Alexis R. Karlin, Rya Nassar-Yassien, Iuliana Eshel, Hagit Levine, Tamar Steinberg and Sharon Aharoni
Children 2025, 12(10), 1378; https://doi.org/10.3390/children12101378 - 12 Oct 2025
Viewed by 691
Abstract
Background/Objectives: Spinal Muscular Atrophy (SMA) is a genetic neurodegenerative disease characterized by severe muscle weakness and atrophy. Advances in disease-modifying therapies have dramatically changed the natural history of SMA and the outcome measures that are used to assess the clinical response to therapy. Standard assessment methods for SMA are limited in their ability to detect minor changes in fine motor abilities and in patients’ daily functions. The aim of this pilot study was to evaluate the feasibility and preliminary use of the Touchscreen-Assessment Tool (TATOO) alongside standardized tools to detect changes in upper extremity motor function among individuals with SMA receiving nusinersen therapy. Methods: Thirteen individuals with genetically confirmed SMA, aged 6–23 years, eight with SMA type 2 and five with SMA type 3, participated. The patients continued the maintenance dosing of nusinersen during the study period. They were evaluated at the onset of the study, then twice more, at intervals of at least six months. Upper extremity functional assessments were performed via the TATOO and standardized tools: the Hand Grip Dynamometer (HGD), Pinch Dynamometer (PD), Revised Upper Limb Module (RULM), and Nine-Hole Peg Test (NHPT). Results: Significant changes in fine motor function were detected using the TATOO together with other standardized tools. Participants demonstrated notable improvements in hand grip strength and fine motor performance, as measured by the NHPT. RULM changes were not statistically significant for the total study group, particularly among ambulatory patients with SMA type 3. The TATOO provided detailed metrics and revealed improvements in accuracy and speed across various tasks. However, given the small sample size, the lack of a control group, and the lack of baseline assessment before receiving therapy, these findings should be considered preliminary and exploratory. 
Conclusions: The findings suggest that the TATOO, alongside traditional assessment tools, offers a sensitive measure of fine motor function changes in patients with SMA. This study highlights the potential of touchscreen-based assessments to address gaps in current outcome measures and emphasizes the need for larger, multicenter studies that will include pre-treatment baseline and control data.
14 pages, 549 KB  
Article
Sleep Posture and Autonomic Nervous System Activity Across Age and Sex in a Clinical Cohort: Analysis of a Nationwide Ambulatory ECG Database
by Emi Yuda and Junichiro Hayano
Sensors 2025, 25(19), 5982; https://doi.org/10.3390/s25195982 - 26 Sep 2025
Viewed by 1576
Abstract
Sleep posture has received limited attention in studies of autonomic nervous system (ANS) activity during sleep, particularly in clinical populations. We analyzed data from 130,885 individuals (56.1% female) in the Allostatic State Mapping by Ambulatory ECG Repository (ALLSTAR), a nationwide Japanese database of 24 h Holter ECG recordings obtained for clinical purposes. Sleep posture was classified as supine, right lateral, left lateral, or prone using triaxial accelerometer data. Heart rate variability (HRV) indices—including heart rate (HR), standard deviation of RR intervals (SDRR), high-frequency (HF), low-frequency (LF), very low-frequency (VLF) components, cyclic variation in heart rate (CVHR), and HF spectral power concentration index (Hsi)—were calculated for each posture and stratified by age and sex. HR was consistently lowest in the left lateral posture and highest in the right lateral posture across most age groups. Other HRV indices also showed consistent laterality, although the effect sizes were generally small. Posture distribution differed slightly by estimated sleep apnea severity, but the effect size was negligible (η2 = 0.0013). These findings highlight sleep posture as a statistically significant and independent factor influencing ANS activity during sleep, though the magnitude of differences should be interpreted in the context of their clinical relevance.
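Posture classification from a triaxial accelerometer, as used in the study above, typically reduces to reading the gravity direction from a smoothed acceleration vector. A minimal sketch under an assumed axis convention (x toward the wearer's left side, z out of the chest; the paper's actual device axes and thresholds are not specified here, and the body-long axis y is ignored on the assumption the person is lying down):

```python
def classify_posture(ax, az):
    """Classify recumbent posture from a smoothed gravity vector (in g).

    Assumed axis convention (hypothetical, device-dependent):
    x points toward the wearer's left side, z points out of the chest.
    Gravity dominates the smoothed signal, so the largest component
    tells us which way the torso is facing.
    """
    if abs(az) >= abs(ax):
        # Gravity mostly along the chest axis: lying face-up or face-down
        return "supine" if az > 0 else "prone"
    # Gravity mostly along the left-right axis: lying on a side
    return "left lateral" if ax > 0 else "right lateral"

# Gravity almost entirely out of the chest -> face-up
print(classify_posture(0.05, 0.98))
```

In practice the raw signal would be low-pass filtered over tens of seconds first, so that body movements do not flip the classification; the comparison logic itself stays the same.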
15 pages, 957 KB  
Article
Effectiveness of a Nutritional Intervention in Patients with Chronic Heart Failure at Risk of Malnutrition: A Prespecified Subanalysis of the PACMAN-HF Trial
by Carolina Ortiz-Cortés, Purificación Rey-Sánchez, Paula Gómez-Turégano, Ramón Bover-Freire, Julián F. Calderón-García, Jose Javier Gómez-Barrado and Sergio Rico-Martín
Nutrients 2025, 17(17), 2899; https://doi.org/10.3390/nu17172899 - 8 Sep 2025
Viewed by 1389
Abstract
Background and objectives: Nutritional disorders are common in patients with heart failure (HF) and are associated with reduced functional capacity and poor prognosis. In this study, we evaluated the prognostic, nutritional and functional impact of a structured nutritional intervention in patients with chronic HF at risk of malnutrition. Methods: This is a prespecified subanalysis of the randomized controlled trial Prognostic And Clinical iMpAct of a Nutritional intervention in patients with chronic HF (PACMAN-HF). Ambulatory patients with chronic HF at risk of malnutrition were identified using the Mini Nutritional Assessment (MNA) questionnaire and randomized to receive either an individualised nutritional intervention (intervention group) or standard care (control group). We evaluated the frequency of malnutrition risk and the impact of the intervention on clinical outcomes, defined as a composite of all-cause mortality or time to first HF hospitalisation, as well as nutritional status and functional capacity at 3- and 12-month follow-ups. Results: A total of 225 patients were screened. Of these, 72 (32%) were identified as being at risk of malnutrition and 64 (28.4%) met the inclusion criteria and were randomized (31 in the intervention group and 33 in the control group). There were no significant differences between the groups in terms of all-cause mortality or time to first HF hospitalisation (HR = 0.34 [0.11–1.09]; p = 0.072). At 12 months, the intervention group demonstrated a significant improvement in functional capacity, with an increase of 31.3 metres in the 6-minute walk test (6MWT) (p = 0.002), whereas no significant change was observed in the control group. Nutritional status improved significantly in the intervention group (MNA score +4.12, p < 0.001) and declined in the control group (−1.15, p = 0.029). At 12 months, body mass index, tricipital skinfold thickness, arm circumference, and serum albumin levels increased in the intervention group. 
Conclusions: A structured and individualised nutritional intervention significantly improved nutritional status and functional capacity over 12 months, although it did not impact major clinical outcomes.
(This article belongs to the Section Clinical Nutrition)