Search Results (28,410)

Search Parameters:
Keywords = real-world

32 pages, 2577 KB  
Article
ETGB-SEF: Entmax-TabNet Gradient Boosting Stacked Ensemble Framework for Disease Stage Prediction
by Bowen Yang and Wenying He
Symmetry 2026, 18(5), 779; https://doi.org/10.3390/sym18050779 - 1 May 2026
Abstract
Disease staging is a critical component of clinical diagnosis, treatment, and prognosis assessment. However, structured clinical data typically exhibit high-dimensional, nonlinear feature interactions; stage-specific dominant features; and threshold-based discontinuities. These characteristics make it challenging for a single model to achieve both global feature modeling capability and local discriminative power, thereby limiting further improvements in prediction accuracy. To address this limitation, we propose a novel deep ensemble learning framework, ETGB-SEF (Entmax-TabNet Gradient Boosting Stacked Ensemble Framework), for multiclass disease staging. First, at the base model level, Entmax-1.5 replaces Sparsemax in TabNet, thereby enabling an adjustable sparse feature selection mechanism that enhances the ability to model weakly correlated clinical features while preserving interpretability. Second, at the model-fusion level, a stacked ensemble architecture in the probability space is developed. This architecture integrates the modified TabNet with Gradient Boosting Decision Trees (GBDT) in a complementary way, enabling the former to capture global nonlinear semantic dependencies while the latter captures threshold-based discriminative boundaries among clinical features. Extensive experiments on real-world datasets demonstrate that the proposed method consistently outperforms existing state-of-the-art approaches.
(This article belongs to the Section Computer)
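The probability-space stacking described above can be sketched with generic components. In the snippet below, a gradient-boosting model and a small neural network stand in for GBDT and the Entmax-modified TabNet (neither TabNet nor Entmax-1.5 is reproduced), the meta-learner is trained on out-of-fold class probabilities, and the dataset is a placeholder; treat this as a minimal sketch, not the authors' implementation.

```python
# Minimal probability-space stacking sketch (stand-ins for TabNet / GBDT).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict, train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)  # placeholder for structured clinical data
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

bases = [GradientBoostingClassifier(random_state=0),       # GBDT stand-in
         MLPClassifier(max_iter=2000, random_state=0)]     # TabNet stand-in

# Out-of-fold class probabilities keep the meta-learner from overfitting
# to base-model training predictions.
meta_train = np.hstack([cross_val_predict(m, X_tr, y_tr, cv=5,
                                          method="predict_proba")
                        for m in bases])
meta = LogisticRegression(max_iter=1000).fit(meta_train, y_tr)

for m in bases:
    m.fit(X_tr, y_tr)
meta_test = np.hstack([m.predict_proba(X_te) for m in bases])
print("stacked accuracy:", meta.score(meta_test, y_te))
```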
38 pages, 2680 KB  
Article
EKEO: An Enhanced Kangaroo Escape Optimizer with Balanced Search for Global Optimization and Engineering Design
by Xuemei Zhu, Weijie Guo, Yang Shen, Jingchun Guo, Shirong Li and Zhiqiang Chang
Biomimetics 2026, 11(5), 308; https://doi.org/10.3390/biomimetics11050308 - 1 May 2026
Abstract
The Kangaroo Escape Optimizer (KEO) is a recently proposed biomimetic metaheuristic inspired by the adaptive escape strategies of kangaroos in predator–prey interactions. Although effective, population-based algorithms such as KEO may suffer from premature convergence and loss of population diversity when addressing complex, multimodal, and constrained optimization problems. This paper proposes an Enhanced Kangaroo Escape Optimizer (EKEO) that integrates Differential Evolution Mutation (DEM) and Quasi-Oppositional Learning (QOL) to address fundamental limitations in exploration–exploitation balance. From a biomimetic perspective, DEM mimics the refined high-frequency muscular adjustments of a kangaroo during close-range evasion, enabling local refinement around promising solutions, while QOL emulates the animal’s sudden directional changes and scanning behavior to preserve population diversity and escape local optima. Their principled integration yields a robust optimization framework, and the findings suggest a generalizable design principle for biomimetic hybrid metaheuristics: coupling directed exploitation with diversity-preserving exploration leads to reliable high-performance optimization. The performance of EKEO is rigorously evaluated in two phases. First, its optimization accuracy and convergence speed are benchmarked against 11 state-of-the-art and classical metaheuristics on 23 classical benchmark functions and the CEC 2019 test suite. Second, its practical applicability and constraint-handling effectiveness are validated on four real-world engineering design problems: step-cone pulley, gear system, tubular column, and pressure vessel design. The experimental results are supported by comprehensive statistical analyses (including Wilcoxon rank-sum tests) and convergence curves, showing that EKEO consistently outperforms its competitors in solution quality, convergence speed, and robustness. These findings establish EKEO as a competitive, reliable, and versatile biomimetic optimization tool for complex continuous and constrained engineering optimization problems.
(This article belongs to the Section Biological Optimisation and Management)
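KEO itself is not specified in this listing, but the two operators EKEO adds are standard in the metaheuristics literature and easy to sketch. The snippet below implements the textbook DE/rand/1 mutation and the quasi-opposite point; all parameter values and bounds are illustrative assumptions, not the paper's settings.

```python
# Sketch of the two operators EKEO adds: DE/rand/1 mutation (directed
# exploitation) and quasi-opposition (diversity preservation).
import numpy as np

rng = np.random.default_rng(0)

def de_mutation(pop, F=0.5):
    """DE/rand/1: v_i = x_r1 + F * (x_r2 - x_r3), with r1, r2, r3 distinct."""
    n = len(pop)
    out = np.empty_like(pop)
    for i in range(n):
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i], 3,
                                replace=False)
        out[i] = pop[r1] + F * (pop[r2] - pop[r3])
    return out

def quasi_opposition(pop, lb, ub):
    """Quasi-opposite point: uniform draw between the domain centre
    c = (lb + ub) / 2 and the opposite point lb + ub - x."""
    c = (lb + ub) / 2.0
    opp = lb + ub - pop
    lo, hi = np.minimum(c, opp), np.maximum(c, opp)
    return rng.uniform(lo, hi)

pop = rng.uniform(-5, 5, size=(20, 2))      # toy population
trial = de_mutation(pop)                    # exploitation step
qpop = quasi_opposition(pop, -5.0, 5.0)     # exploration step
# A full EKEO loop would keep the fittest of each (x, trial, quasi-opposite).
```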
35 pages, 1539 KB  
Review
Circular Economy Integration in Healthcare Waste Management, a Zero-Waste Paradigm: A Review
by Thobile Zikhathile, Harrison Atagana, Joseph Bwapwa and Taurai Mutanda
Recycling 2026, 11(5), 83; https://doi.org/10.3390/recycling11050083 - 1 May 2026
Abstract
Healthcare waste management is a growing environmental and economic challenge due to increasing waste volumes, hazardous materials, and continued reliance on linear disposal methods such as incineration and landfilling. This review aims to examine how circular economy and zero-waste approaches can be applied to healthcare waste management to improve sustainability, resource efficiency, and system performance. A structured narrative review was conducted using peer-reviewed literature obtained from prominent scientific databases, concentrating on circular strategies, zero-waste initiatives, digital technologies, and policy frameworks relevant to healthcare waste systems. The reviewed studies indicate that practices such as improved waste segregation, recycling and material recovery, reusable product design, digital waste tracking, and Extended Producer Responsibility can significantly reduce waste generation, lower environmental impacts, and achieve cost savings, while maintaining infection control and patient safety. However, the review also identifies key barriers to implementation, including regulatory complexity, limited infrastructure, financial constraints, and weak coordination among stakeholders. The novelty of this review lies in its integrated analysis of circular economy and zero-waste strategies through the lens of digital enablement, offering a systems-based framework for transforming healthcare waste management beyond incremental improvements. The findings highlight that successful circular healthcare waste management requires strong institutional leadership, supportive policies, and the integration of digital technologies to enable monitoring, traceability, and decision-making. This review enhances the comprehension of how circular economy principles can facilitate the transition from linear to sustainable healthcare waste systems and provides guidance for policymakers, healthcare managers, and researchers. Future research should focus on evaluating real-world implementation, advancing recyclable and reusable medical materials, and developing standardised indicators to measure circular performance in healthcare settings.
36 pages, 4746 KB  
Review
Polymer–Graphene Composites for Electrochemical Sensing: A Comprehensive Review of Functionalization Pathways and Sustainable Design Strategies
by Domingo César Carrascal-Hernández, Andrea Ramos-Hernández, Nataly J. Galán-Freyle, Daniel Insuasty and Maximiliano Méndez-López
Polymers 2026, 18(9), 1120; https://doi.org/10.3390/polym18091120 - 1 May 2026
Abstract
Environmental pollution constitutes an increasingly complex global challenge, largely driven by industrial expansion and the consequent release of toxic species such as Cd²⁺, Pb²⁺, Cu²⁺, Hg²⁺, Fe³⁺, As³⁺, and Rh³⁺ into natural ecosystems. These contaminants pose significant risks to environmental integrity and public health, motivating the development of analytical technologies capable of sensitive, selective, and reliable detection. In this context, graphene-based electrochemical sensors have emerged as versatile platforms for monitoring a broad range of analytes, particularly in environmental applications involving heavy-metal detection. The intrinsic physicochemical properties of graphene derivatives have enabled low detection limits, rapid response times, and tunable selectivity. Despite these analytical advances, critical challenges persist regarding operational stability in complex matrices, inter-batch reproducibility, and robustness to interfering species; these continue to hinder large-scale deployment and real-world applicability and necessitate innovative strategies for functionalization and molecular recognition. This review article establishes a comparative framework based on functionalization strategies (covalent, non-covalent, and hybrid), the chemical nature of graphene (GO, rGO, and doping), and various types of polymers (conductors and insulators), using metrics such as the limit of detection (LOD), linear range, working potential, stability, and interferences, within a bibliometric analysis following the PRISMA 2020 methodology. This comparative framework enables analysis and explanation of performance trends and the generation of design and functionalization recommendations for versatile applications, including criteria for reproducibility and sustainability.
21 pages, 479 KB  
Article
On Simple EM Acceleration Schemes Suitable for Mixture Modelling with High Overlap Between Components
by Branislav Panić, Jernej Klemenc, Marko Nagode and Simon Oman
Mathematics 2026, 14(9), 1543; https://doi.org/10.3390/math14091543 - 1 May 2026
Abstract
The Expectation-Maximisation (EM) algorithm is widely used for maximum likelihood estimation in incomplete data problems such as mixture modelling, but it often converges slowly, particularly when mixture components overlap substantially. This study presents a comprehensive empirical evaluation of simple EM acceleration schemes for Gaussian mixture models, comparing linear (STEM), quadratic (SQUAREM), and greedy (line search, golden section) methods across 240 simulated mixture configurations spanning three dimensionalities, four component counts, five overlap levels, and four sample sizes. A key contribution is the first systematic comparison of the three acceleration parameter estimates (α₁, α₂, α₃) in the mixture modelling context: we show that only α₃, which is derived as the geometric mean estimate of α₁ and α₂, provides genuine acceleration, while α₁ and α₂ consistently increase iteration counts by 50–110% relative to α₃, effectively acting as deceleration. With α₃, SQUAREM reduces iterations by up to 48% with negligible computational overhead, while greedy methods achieve similar iteration reductions but at 50–110% greater wall-clock time due to repeated log-likelihood evaluations. Crucially, acceleration does not degrade parameter estimation quality under any tested combination of initialisation, overlap, dimensionality, or number of components. We further examine the interaction between acceleration and initialisation, finding that k-means benefits most from acceleration (up to 50% time savings), while the REBMIX (Rough-Enhanced-Bayes MIXture estimation) algorithm benefits least as it already starts near the optimum. Among REBMIX configurations, histogram preprocessing with the outliers mode traversing strategy offers the best trade-off between quality and computational cost. The findings are validated on a real-world Backblaze hard drive failure dataset, confirming the practical utility of EM acceleration. All methods are implemented in the free and open-source R package rebmix, accompanied by full source code.
(This article belongs to the Section E1: Mathematics and Computer Science)
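The geometric-mean step length highlighted above is the well-known SQUAREM "S3" choice: with two EM updates θ₁ = F(θ₀) and θ₂ = F(θ₁), set r = θ₁ − θ₀, v = (θ₂ − θ₁) − r, and α₃ = −√(rᵀr / vᵀv), then propose θ' = θ₀ − 2α₃r + α₃²v. Below is a minimal sketch assuming some EM fixed-point map em_step is supplied; the toy map at the bottom merely illustrates the mechanics on a non-EM fixed point.

```python
# SQUAREM-style accelerated step with the geometric-mean step length
# (alpha_3 in the paper's notation); `em_step` is an assumed EM map.
import numpy as np

def squarem_step(theta0, em_step):
    theta1 = em_step(theta0)
    theta2 = em_step(theta1)
    r = theta1 - theta0                  # first EM increment
    v = (theta2 - theta1) - r            # curvature of the EM map
    if not np.any(v):                    # already at a fixed point
        return theta2
    alpha = -np.sqrt(r @ r / (v @ v))    # geometric-mean step length alpha_3
    theta_new = theta0 - 2 * alpha * r + alpha**2 * v
    return em_step(theta_new)            # stabilising EM step, as in SQUAREM

# Toy fixed-point map x <- (x + c/x) / 2, which converges to sqrt(c):
c = 4.0
f = lambda x: (x + c / x) / 2
x = np.array([10.0])
for _ in range(5):
    x = squarem_step(x, f)
print(x)  # approaches [2.]
```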
26 pages, 36319 KB  
Article
Monitoring Spatiotemporal Evolution of Dynamic Fields via Sensor Network Datastream: A Decentralized Event-Driven Approach
by Roger Cesarié Ntankouo Njila, Mir Abolfazl Mostafavi, Jean Brodeur and Sonia Rivest
ISPRS Int. J. Geo-Inf. 2026, 15(5), 194; https://doi.org/10.3390/ijgi15050194 - 1 May 2026
Abstract
Sensor data are increasingly used in monitoring spatiotemporal phenomena for diverse applications such as flood management, urban traffic, air quality control, and forest fire management. Real-time modelling and representation of such evolving phenomena is fundamental for efficient and near-real-time decision-making processes. In addition to simple and local alerts about changes occurring over time at a given location, as is the case in Sensor Event Service (SES), the decision-making process may require more global spatial information, such as knowing whether the monitored phenomenon is expanding or contracting around a given spot or moving from one spot to another, especially for non-punctual spatial features. For such cases, spatiotemporal information should be computed over the whole set of distributed data from which the geometry of monitored phenomena can be assessed. This paper proposes an event-driven, fuzzy rule-based, decentralized spatial reasoning approach to compute spatiotemporal changes occurring in vague-shape phenomena from distributed sensor data streams. Inferring local and partial spatial changes at individual nodes precedes the computation of the developing changes that the monitored phenomenon undergoes over the whole area covered by the sensor network. In this approach, we propose a Fuzzy-Extended Spatiotemporal Change Pattern (FESTCP) to compute spatiotemporal changes in fuzzy regions. To evaluate our method, simulated case studies of ambient air pollution in Quebec City are carried out. The results reveal that the proposed method can provide satisfactory information about spatiotemporal changes in real-world phenomena monitored by a sensor network for a real-time decision-making process.
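The paper's FESTCP operators are not reproduced in this listing, but the core idea of inferring global expansion or contraction of a fuzzy region from per-node membership degrees can be illustrated with a toy sketch. The fuzzy-cardinality test and the eps threshold below are assumptions for illustration, not the authors' formulation; movement could similarly be detected from a membership-weighted centroid shift.

```python
# Toy sketch: classify how a vague-shape phenomenon evolves between two
# time steps from per-node fuzzy memberships (names are illustrative).
import numpy as np

def classify_change(mu_t0, mu_t1, eps=0.05):
    """mu_*: membership degree in [0, 1] of each sensor node."""
    extent0, extent1 = mu_t0.sum(), mu_t1.sum()   # fuzzy cardinality
    if extent1 > extent0 * (1 + eps):
        return "expanding"
    if extent1 < extent0 * (1 - eps):
        return "contracting"
    return "stable"

mu_t0 = np.array([0.9, 0.7, 0.2, 0.0, 0.1])   # memberships at time t0
mu_t1 = np.array([0.9, 0.8, 0.6, 0.3, 0.2])   # memberships at time t1
print(classify_change(mu_t0, mu_t1))          # -> expanding
```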
15 pages, 3605 KB  
Article
Cytotoxic Potential of Environmentally Relevant PVC Micro- and Nanoplastics of Varied Size, Shape, and Surface Degradation
by Phyo Bo Bo Aung, Yuya Haga, Sota Manabe, Wakaba Idehara, Mii Hokaku, Yuto Motoyama, Ayaha Mori, Kazuma Higashisaka and Yasuo Tsutsumi
Microplastics 2026, 5(2), 83; https://doi.org/10.3390/microplastics5020083 - 1 May 2026
Abstract
Microplastics (MPs), i.e., plastic particles <5 mm, and nanoplastics (NPs), i.e., plastic particles <1 µm, are widespread in the environment. MPs and NPs (MNPs) have also been detected in human tissues. Environmental MNPs exhibit diverse physicochemical properties such as size, shape, and surface degradation. However, most experimental studies have used pristine MNPs, which poorly represent real-world conditions, and only a limited number of studies have focused on preparing environmentally relevant MNPs. Therefore, we focused on the key physicochemical properties of MNPs, particularly their shape, size, and surface degradation, using polyvinyl chloride (PVC) as the model polymer. In this study, fragment and spherical PVC-MNPs were utilized, and surface degradation was introduced through exposure to vacuum ultraviolet (VUV) radiation at a wavelength of 172 nm. Attenuated Total Reflectance-Fourier Transform Infrared (ATR-FTIR) analysis revealed the formation of additional carbonyl groups after VUV exposure. We investigated the cytotoxic effects of the degraded and non-degraded PVC-MNPs on A549, Caco-2, and THP-1 cells. The results indicated that the degraded PVC-MNPs induced stronger cytotoxic effects than the non-degraded ones. Notably, the degraded PVC-NPs induced stronger cytotoxicity than the degraded PVC-MPs. These findings highlight the potential health risks associated with environmental MNPs.
35 pages, 4097 KB  
Article
A Privacy-Preserving Quadratic Optimisation with Additive Homomorphic Encryption in Cyber-Physical Systems
by Ying He, Yang Pu, Rui Ye and Zhenyong Zhang
Mathematics 2026, 14(9), 1540; https://doi.org/10.3390/math14091540 - 1 May 2026
Abstract
In this paper, we propose a secure protocol to compute the quadratic optimisation problem under a three-party outsourcing architecture in the scenario of cyber-physical systems. To enable real-world implementation, we propose an encoding framework that uses a fixed-point expression and a truncated-mapping scheme to map real numbers into multiple data blocks, improving the protocol’s efficiency. Based on this, we define the recovery operations for decryption, addition, and multiplication. Considering computations involving three parties to solve the quadratic optimisation problem, we thoroughly analyse privacy issues during the interaction process. Then, a secure protocol is developed by designing privacy-preserving addition, multiplication, and comparison protocols based on the additive homomorphic encryption scheme. The data blowup and “0”-privacy leakage problems are addressed specifically for the gradient descent process by designing a secure addition protocol for block data and a secure comparison protocol. The efficiency and security of the proposed protocol are formally analysed in depth. Finally, through intensive experiments, we demonstrate the efficiency and security of our protocol.
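As a rough illustration of fixed-point encoding over an additive homomorphic scheme (the paper's block decomposition and truncated mapping are not reproduced), the sketch below uses the Paillier implementation from the python-paillier (phe) package; SCALE is an arbitrary precision choice, not the paper's parameter.

```python
# Sketch: fixed-point encoding of reals over additively homomorphic
# Paillier encryption (via the `phe` package).
from phe import paillier

SCALE = 2 ** 16                       # fixed-point fractional precision

def encode(x):                        # real -> integer fixed point
    return int(round(x * SCALE))

def decode(n):                        # integer fixed point -> real
    return n / SCALE

pub, priv = paillier.generate_paillier_keypair(n_length=1024)

a, b = 3.14159, -1.25
ca = pub.encrypt(encode(a))
cb = pub.encrypt(encode(b))

# Additive homomorphism: Enc(a) + Enc(b) decrypts to a + b.
print(decode(priv.decrypt(ca + cb)))   # ~ 1.89159
# Plaintext-scalar multiply: Enc(a) * k decrypts to k * a.
print(decode(priv.decrypt(ca * 3)))    # ~ 9.42477
```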
25 pages, 3725 KB  
Article
Handcrafted Versus Deep Feature Extraction Methods for MRI-Based Multiple Sclerosis Diagnosis
by Samah Yahia, Tahani Bouchrika and Wided Bouchelligua
Diagnostics 2026, 16(9), 1379; https://doi.org/10.3390/diagnostics16091379 - 1 May 2026
Abstract
Background: Despite significant advances in medical image analysis, automated diagnosis of Multiple Sclerosis (MS) from magnetic resonance imaging (MRI) remains challenging due to the complexity of 3D brain data and the variability of lesion appearance. Objective: In this work, we propose an efficient and optimized feature extraction framework for automated MS diagnosis using FLAIR, T1-, and T2-weighted MRI. The approach enhances Decimal Descriptor Patterns (DDP) by integrating local gradient information, producing a 3D texture representation that is more discriminative and expressive. Methods: The study is divided into two main parts: (i) detection of MS, and (ii) assessment of disease progression in affected patients. In each part, features are extracted from the relevant MRI data and classified using multiple classical machine learning classifiers, including Linear Discriminant Analysis (LDA), Support Vector Machines (SVM), k-Nearest Neighbors (KNN), and Logistic Regression. Furthermore, the performance of the proposed handcrafted feature-based approach was compared to features extracted using a deep learning-based vision–language model (VLM), specifically CLIP (Contrastive Language–Image Pretraining), enabling a clear comparison of their performance. To assess robustness and generalizability, two complementary validation strategies were adopted: (i) controlled experiments on the BrainWeb dataset under varying T1/T2 contrast conditions, and (ii) validation on a real-world FLAIR MRI dataset, reflecting clinically relevant lesion visibility. Results: Gradient-DDP features achieve the best overall performance for MS progression, reaching up to 97% accuracy on T2-weighted MRI with SVM, while LDA and Logistic Regression also remain strong with accuracies around 83–96% on T2. For binary MS detection, the proposed method attains near-perfect results, with up to 99% accuracy on FLAIR (SVM/KNN) and 98% on T2-weighted images with SVM, while other classifiers also maintain high performance above 90%. Conclusions: Gradient-DDP provides strong consistency and transparency, offering an interpretable link between texture patterns and diagnostic outcomes. While VLM features perform well when lesion patterns are clearly defined (e.g., in T2), Gradient-DDP demonstrates greater robustness in more challenging modalities such as FLAIR, where deep representations may be less stable.
(This article belongs to the Special Issue Diagnostic Imaging in Multiple Sclerosis)
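Gradient-DDP itself is not specified in this listing, so the sketch below only illustrates the general handcrafted-texture pipeline that the paper compares against deep features: a gradient-magnitude histogram (a crude stand-in for Gradient-DDP, not the authors' descriptor) feeding a classical SVM, on synthetic placeholder volumes.

```python
# Generic handcrafted-feature pipeline: 3D gradient-magnitude histograms
# (illustrative stand-in for Gradient-DDP) classified with an SVM.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def gradient_hist(vol, bins=32):
    """Histogram of gradient magnitudes of a 3D volume as a feature vector."""
    gx, gy, gz = np.gradient(vol.astype(float))
    mag = np.sqrt(gx**2 + gy**2 + gz**2)
    h, _ = np.histogram(mag, bins=bins, range=(0, mag.max() + 1e-9))
    return h / h.sum()

rng = np.random.default_rng(0)
vols = rng.random((40, 16, 16, 16))                        # placeholder "MRI"
vols[20:] += 0.5 * (rng.random((20, 16, 16, 16)) > 0.95)   # "lesion" texture
y = np.repeat([0, 1], 20)

X = np.stack([gradient_hist(v) for v in vols])
print(cross_val_score(SVC(), X, y, cv=5).mean())
```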
50 pages, 2629 KB  
Article
An Enhanced Black-Winged Kite Algorithm with Multiple Strategies for Global Optimization and Constrained Engineering Applications
by Chengtao Du, Jinzhong Zhang and Jie Fang
Biomimetics 2026, 11(5), 309; https://doi.org/10.3390/biomimetics11050309 - 1 May 2026
Abstract
The black-winged kite algorithm (BKA) integrates the Cauchy mutation strategy and the leader selection strategy to simulate the high-altitude circling exploration, fixed-point diving attack, and group cooperative migration of black-winged kites in approximating the global optimal solution. The BKA nevertheless suffers from slow convergence, limited calculation precision, and insufficient population diversity. To strengthen convergence and computational practicability, an enhanced BKA with multiple strategies (MSBKA) is advocated for global optimization and constrained engineering applications. The objective is to systematically verify its advancement and competitiveness and accurately attain the global optimal solution. The ranking-based differential mutation strengthens population information interaction, accelerates convergence, restrains premature convergence, diminishes homogenization competition, promotes exploration and exploitation, intensifies elite individual guidance, reduces ineffective iterations, and materializes orderly population renewal. The simplex method executes the local refinement operations of reflection, expansion, compression, and contraction, strengthens local mining efficiency, ameliorates solution accuracy, abates parameter sensitivity, eschews local optimal traps, accelerates accurate convergence, and preserves the optimal individual potential. The elite opposition-based learning strategy fabricates reverse solutions, expands the detection space, shortens the convergence process, elevates the quality of initial and iterative solutions, boosts population diversity, guides the search direction, and relieves premature convergence. The MSBKA utilizes deficiency orientation, strategy adaptation, and collaborative search to meet the realistic demands of high precision, high efficiency, and strong constraint adaptation, surmounting the static trade-off dilemma by replacing random perturbation with a strongly directional escape mechanism that sustains directional exploration and covers the blind spots of solution exploitation. Twenty-three benchmark functions and six real-world engineering designs are employed to authenticate theoretical superiority and engineering practicability. The experimental results demonstrate that the MSBKA offers strong practicability and reliability, strengthening information interaction, restraining search stagnation, diminishing convergence oscillation and fluctuation, facilitating globalized discovery and localized extraction, expediting convergence, improving solution precision, and consolidating stability and robustness.
(This article belongs to the Section Biological Optimisation and Management)
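Of the three MSBKA strategies, elite opposition-based learning is the most compact to sketch: each individual is reflected through the dynamic bounds of the current elite subset, x̄ = k·(lo + hi) − x, a standard formulation in the opposition-based learning literature. The snippet below is an illustrative sketch under that definition; elite_frac and the greedy-selection note are assumptions, not the paper's parameters.

```python
# Sketch of elite opposition-based learning: reflect each individual
# through the dynamic bounds of the current elite subset.
import numpy as np

rng = np.random.default_rng(0)

def elite_opposition(pop, fitness, elite_frac=0.2):
    n_elite = max(1, int(len(pop) * elite_frac))
    elite = pop[np.argsort(fitness)[:n_elite]]       # best individuals
    lo, hi = elite.min(axis=0), elite.max(axis=0)    # dynamic elite bounds
    k = rng.random((len(pop), 1))                    # random reflection factor
    return k * (lo + hi) - pop                       # opposite solutions

pop = rng.uniform(-5, 5, size=(10, 3))
fit = (pop ** 2).sum(axis=1)                         # sphere test function
opp = elite_opposition(pop, fit)
# A full loop would clip out-of-range components back into [lb, ub] and
# greedily keep the fitter of each (x, opposite-x) pair.
```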
13 pages, 1369 KB  
Article
Real-World Experience with Brolucizumab in Treatment-Naïve nAMD with Low Baseline Visual Acuity: Short-Term Outcomes from a Prospective Single-Institution Study
by Arsim Hajdari and Valdet Uka
Life 2026, 16(5), 754; https://doi.org/10.3390/life16050754 - 1 May 2026
Abstract
Background: Neovascular age-related macular degeneration (nAMD) is a progressive chronic disease that represents a major cause of irreversible vision loss worldwide. In this study we aim to assess the short-term functional and anatomical outcomes of brolucizumab therapy in treatment-naïve patients with nAMD presenting with low baseline visual acuity in a single-institution setting. Methods: This is a prospective non-randomized study that included 154 treatment-naïve eyes with low baseline visual acuity. We measured visual outcomes (BCVA, logMAR) and structural outcomes (CST, μm). We also stratified the study population into age subgroups to evaluate any possible trend between outcome changes and age differences. BCVA and CST were measured at baseline, at each consecutive month (months 1, 2, and 3) of the loading phase, and at the final timepoint (6 months). Intraocular pressure (IOP) before and after injection, as well as the incidence of serious adverse events, were monitored throughout the study. Results: Mean BCVA improved by 0.41 logMAR (+20 ETDRS letters) after the first injection, 0.65 logMAR (+32 letters) after the second, and reached a maximum improvement of 0.80 logMAR after the third injection. The greatest BCVA improvement was seen in younger patients (<50 years), with mean BCVA decreasing from approximately 1.0 logMAR at baseline to around 0.3–0.4 logMAR at the final measurement. Mean CST declined by 45.5 μm after the first injection, 78.5 μm after the second, 117.8 μm after the third, and 143.6 μm at the final timepoint, indicating a pronounced anatomical response to intravitreal brolucizumab therapy. Conclusions: This study demonstrates that brolucizumab therapy provides significant short-term anatomical and functional improvements in treatment-naïve patients with nAMD and poor baseline visual acuity. Baseline visual acuity, treatment-naïve status, and patient age appear to be key determinants of visual gain.
(This article belongs to the Section Medical Research)
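The letter gains quoted alongside the logMAR changes follow the standard ETDRS conversion of 5 letters per 0.1 logMAR line (i.e., 50 letters per 1.0 logMAR), which reproduces the figures above:

\[ \Delta\text{letters} \approx 50 \times \Delta\text{logMAR}, \qquad 50 \times 0.41 \approx 20, \qquad 50 \times 0.65 \approx 32. \]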
34 pages, 746 KB  
Review
Governing Privacy-Preserving Face Recognition in Transport Infrastructures: A Comprehensive Review
by Eva María Benito Sanz, Alba Gonzalo Primo, Gaurav Choudhary and Nicola Dragoni
Sensors 2026, 26(9), 2832; https://doi.org/10.3390/s26092832 - 1 May 2026
Abstract
Face recognition technologies are increasingly deployed in transport infrastructures to improve efficiency and security, but they raise significant privacy and data protection concerns. This study reviews how privacy-preserving face recognition techniques can address these challenges in real-world settings. Using a systematic literature review approach, the paper analyses research across technical, operational, and governance perspectives. The findings show that while advanced methods such as encryption, federated learning, and de-identification can reduce data exposure, they are rarely implemented in operational systems, which tend to prioritize performance and scalability. At the same time, governance-focused studies emphasize issues such as proportionality, accountability, and fundamental rights, often without clear links to technical solutions. Overall, the review highlights a fragmented landscape and a gap between research and practice, underscoring the need for integrated approaches that align privacy-preserving techniques with practical deployment constraints and regulatory requirements.
13 pages, 3017 KB  
Article
Familial Short QT Syndrome: Phenotypic Variability and Challenges in Risk Stratification
by Paula Bouzón, Alberto Alen, María Salgado, Francisco González-Urbistondo, Lorena María Vega-Prado, Eliecer Coto, José Julián Rodríguez-Reguero, Juan Gomez, Barbara Fernández-Barrio, Pablo Avanzas and Rebeca Lorca
J. Clin. Med. 2026, 15(9), 3461; https://doi.org/10.3390/jcm15093461 - 1 May 2026
Abstract
Background: Short QT syndrome (SQTS) is a rare inherited cardiac channelopathy associated with high risk of atrial and ventricular arrhythmias and sudden cardiac death (SCD). Data on its natural history, genotype–phenotype correlations, and risk stratification remain limited. We aimed to evaluate all families with a confirmed diagnosis of SQTS identified at our National Referral Center through a descriptive case series, thereby contributing additional real-world data on this rare condition. Methods: A retrospective review was conducted of all families evaluated for suspected SQTS between 2011 and 2025 at the Inherited Cardiac Diseases Unit. Diagnosis was based on 2022 ESC guidelines (QTc ≤320 ms or ≤360 ms plus supportive features), clinical evaluation, and genetic testing. Families meeting diagnostic criteria were included for detailed phenotypic and genotypic characterization and longitudinal follow-up. Results: Among all patients assessed, two families met the criteria for SQTS. One family with three phenotype-positive individuals was gene-elusive. This family had a history of SCD and the proband presented atrial fibrillation. The second family carried a pathogenic KCNJ2 variant (p.Asp172Asn). However, only the proband fulfilled ECG criteria for SQTS (phenotype-positive) and there was no family history of SCD. No patients were treated with pharmacological therapy for QT prolongation. All affected individuals showed stable QT intervals (none <320 ms) and there were no malignant arrhythmic events during follow-up. Conclusions: These two families illustrate the wide phenotypic spectrum of SQTS and underscore the difficulty of risk stratification in asymptomatic individuals. The rarity of the disease, variable penetrance, and absence of robust prospective data hinder evidence-based management. Systematic registry participation and longitudinal studies are essential to refine risk prediction and therapeutic strategies.
(This article belongs to the Special Issue Clinical Updates on Cardiac Arrhythmias)
32 pages, 4484 KB  
Article
BCough: Design and Evaluation of a Bone-Conduction-Embedded AI Platform for On-Device Cough Detection
by Mayur Sanap, Joseph de la Viesca, Aadesh Shah, Sameer Dalal, Jack Twiddy, Michael Daniele and Edgar J. Lobaton
Electronics 2026, 15(9), 1912; https://doi.org/10.3390/electronics15091912 - 1 May 2026
Abstract
Continuous cough monitoring provides valuable insights into respiratory health; however, conventional air-conduction microphones are highly susceptible to ambient noise and raise privacy concerns. This work presents BCough, a bone-conduction-based embedded AI platform for on-device cough detection, designed and evaluated on the MAX78000 neural accelerator. The system employs a skin-contact bone-conduction sensor worn on the chest to capture body vibrations transmitted through bone and tissue, detecting cough events while minimizing environmental interference. The custom prototype integrates a bone-conduction microphone, a synchronized 6-axis IMU, power management circuitry, and an embedded neural accelerator to support real-time inference and future multimodal extensions. A compact 8-bit quantized convolutional neural network was optimized for deployment on the MAX78000 and evaluated using leave-one-subject-out cross-validation on one-second cough and non-cough segments derived from a corpus of 19,955 labeled events collected from five participants under controlled conditions. The deployed model achieved 0.80 Macro-F1, 0.81 balanced accuracy, 0.74 cough F1, and 0.89 AUC, with 15–16 ms inference latency and approximately 20 μJ energy per inference on chip. These results demonstrate the feasibility of low-power, privacy-preserving, bone-conduction cough detection on embedded AI hardware within an initial five-participant study. The current design is a benchtop prototype; the findings should therefore be interpreted as an initial feasibility assessment rather than evidence of robust performance across diverse users and real-world conditions. Future work will extend this platform toward miniaturized wearable implementations combining bone-conduction and inertial sensing for continuous multimodal respiratory monitoring.
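The leave-one-subject-out protocol described above is straightforward to reproduce with scikit-learn's LeaveOneGroupOut. In this sketch the features, labels, and subject ids are synthetic, and a random forest stands in for the paper's quantized CNN; only the evaluation structure mirrors the study.

```python
# Leave-one-subject-out evaluation sketch: each fold holds out all
# segments from one participant, as in the BCough protocol.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
X = rng.random((500, 20))             # placeholder acoustic features
y = rng.integers(0, 2, 500)           # cough / non-cough labels
subjects = rng.integers(0, 5, 500)    # 5 participants, as in the study

scores = []
for tr, te in LeaveOneGroupOut().split(X, y, groups=subjects):
    clf = RandomForestClassifier(random_state=0).fit(X[tr], y[tr])
    scores.append(f1_score(y[te], clf.predict(X[te]), average="macro"))
print("Macro-F1 per held-out subject:", np.round(scores, 2))
```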
18 pages, 533 KB  
Article
A Rigorous Comparative Study of Supervised Machine Learning Techniques for Network Anomaly Detection: Empirical Insights from the UNSW-NB15 Dataset
by Nouf Alkhater
Computers 2026, 15(5), 285; https://doi.org/10.3390/computers15050285 - 1 May 2026
Abstract
The increasing complexity of modern network infrastructures has intensified the need for reliable and efficient intrusion detection systems. While advanced deep learning approaches have demonstrated strong performance, their high computational cost and limited interpretability restrict their practical deployment in real-time environments. This study presents a systematic empirical evaluation of four supervised machine learning models—Decision Tree, Random Forest, Support Vector Machine (SVM), and XGBoost—for network anomaly detection using the UNSW-NB15 dataset. To ensure methodological rigor, a structured preprocessing pipeline and a five-fold stratified cross-validation framework were employed. Model performance was assessed using multiple evaluation metrics, including accuracy, precision, recall, F1-score, and area under the ROC curve (AUC). In addition, a feature importance analysis was conducted to identify the most influential network traffic attributes contributing to anomaly detection. The results show that ensemble-based methods outperform individual classifiers, with XGBoost achieving the best overall performance (accuracy = 0.97, AUC = 0.98) along with high stability across validation folds. The analysis further reveals that a subset of flow-based and temporal features—such as sttl, sload, and dload—plays a critical role in distinguishing between normal and malicious traffic. This study provides a rigorous, interpretable, and reproducible benchmarking framework for supervised machine learning in network anomaly detection. The findings provide practical insights for developing efficient and scalable intrusion detection systems suitable for real-world deployment.
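The evaluation protocol — stratified five-fold cross-validation of XGBoost with multiple metrics, plus a feature-importance pass — can be sketched as follows. A synthetic imbalanced dataset stands in for UNSW-NB15, and all hyperparameters are library defaults rather than the paper's settings.

```python
# Sketch: stratified 5-fold CV of XGBoost with accuracy / F1 / AUC,
# followed by a feature-importance ranking.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_validate
from xgboost import XGBClassifier

X, y = make_classification(n_samples=2000, n_features=30, weights=[0.8],
                           random_state=0)   # placeholder for UNSW-NB15

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
res = cross_validate(XGBClassifier(eval_metric="logloss"), X, y, cv=cv,
                     scoring=["accuracy", "f1", "roc_auc"])
for k in ("test_accuracy", "test_f1", "test_roc_auc"):
    print(k, np.round(res[k].mean(), 3))

# Feature importances parallel the paper's importance analysis:
model = XGBClassifier(eval_metric="logloss").fit(X, y)
print("top features:", np.argsort(model.feature_importances_)[::-1][:5])
```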