Search Results (1,047)

Search Parameters:
Keywords = intelligent initiation system

15 pages, 1171 KB  
Article
Person Re-Identification Under Non-Overlapping Cameras Based on Advanced Contextual Embeddings
by Chi-Hung Chuang, Tz-Chian Huang, Chong-Wei Wang, Jung-Hua Lo and Chih-Lung Lin
Algorithms 2025, 18(11), 714; https://doi.org/10.3390/a18110714 - 12 Nov 2025
Abstract
Person Re-identification (ReID), a critical technology in intelligent surveillance, aims to accurately match specific individuals across non-overlapping camera networks. However, real-world factors such as variations in illumination, viewpoint, and pose continuously challenge the matching accuracy of existing models. Although Transformer-based models like TransReID have demonstrated a strong capability for capturing global context in feature extraction, the features they produce still leave room for optimization at the metric matching stage. To address this issue, this study proposes a hybrid framework that combines advanced feature extraction with post-processing optimization. We employed a fixed, pre-trained TransReID model as the feature extractor and introduced a camera-aware Jaccard distance re-ranking algorithm (CA-Jaccard) as a post-processing module. Without retraining the main model, this framework refines the initial distance matrix by analyzing the local neighborhood topology among feature vectors and incorporating camera information. Experiments were conducted on two major public datasets, Market-1501 and MSMT17. The results show that our framework significantly improved the overall ranking quality of the model, increasing the mean Average Precision (mAP) on Market-1501 from 88.2% to 93.58% compared to using TransReID alone, and achieving a gain of nearly 4% in mAP on MSMT17. This research confirms that advanced post-processing techniques can effectively complement powerful feature extraction models, providing an efficient pathway to enhance the robustness of ReID systems in complex scenarios. Additionally, it is the first work to analyze how the modified distance metric improves the ReID task when used specifically with the ViT-based feature extractor TransReID.
(This article belongs to the Special Issue Machine Learning for Pattern Recognition (3rd Edition))
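CA-Jaccard additionally folds camera labels into the neighborhood construction; that detail is omitted here. As a minimal pure-Python illustration of the underlying idea the abstract describes — replacing raw embedding distances with Jaccard distances between k-nearest-neighbor sets — the toy embeddings below are invented:

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def neighborhoods(features, k):
    """k-nearest-neighbor index set of each sample, including the sample itself."""
    sets = []
    for i, fi in enumerate(features):
        order = sorted((j for j in range(len(features)) if j != i),
                       key=lambda j: euclidean(fi, features[j]))
        sets.append({i} | set(order[:k]))
    return sets

def jaccard_rerank(features, k=1):
    """Re-ranked distance: 1 minus the Jaccard similarity of the two neighborhoods."""
    sets = neighborhoods(features, k)
    n = len(features)
    return [[1.0 - len(sets[i] & sets[j]) / len(sets[i] | sets[j])
             for j in range(n)] for i in range(n)]

# Four toy embeddings forming two tight pairs (e.g. the same person seen twice).
feats = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
D = jaccard_rerank(feats, k=1)
```

Samples whose neighborhoods coincide get distance 0, while samples with disjoint neighborhoods get distance 1, which is exactly the local-topology signal re-ranking exploits.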

30 pages, 3885 KB  
Article
Dynamic Pressure Awareness and Spatiotemporal Collaborative Optimization Scheduling for Microgrids Driven by Flexible Energy Storage
by Hao Liu, Li Di, Yu-Rong Hu, Jian-Wei Ma, Jian Zhao, Xiao-Zhao Wei, Ling Miao and Jing-Yuan Yin
Eng 2025, 6(11), 323; https://doi.org/10.3390/eng6110323 - 11 Nov 2025
Abstract
Under the dual carbon goals, microgrids face significant challenges in managing multi-energy flow coupling and maintaining operational robustness with high renewable energy penetration. This paper proposes a novel dynamic pressure-aware spatiotemporal optimization dispatch strategy. The strategy is centered on intelligent energy storage and enables proactive energy allocation for critical pressure moments. We designed and validated the strategy under an ideal benchmark scenario with perfect foresight of the operational cycle. This approach demonstrates its maximum potential for spatiotemporal coordination. On this basis, we propose a Multi-Objective Self-Adaptive Hybrid Enzyme Optimization (MOSHEO) algorithm. The algorithm introduces segmented perturbation initialization, nonlinear search mechanisms, and multi-source fusion strategies. These enhancements improve the algorithm’s global exploration and convergence performance. Specifically, in the ZDT3 test, the IGD metric improved by 7.7% and the SP metric was optimized by 63.4%, while the best HV value of 0.28037 was achieved in the UF4 test. Comprehensive case studies validate the effectiveness of the proposed approach under this ideal setting. Under normal conditions, the strategy successfully eliminates power and thermal deficits of 1120.00 kW and 124.46 kW, respectively, at 19:00. It achieves this through optimal quota allocation, which involves allocating 468.19 kW of electricity at 13:00 and 65.78 kW of thermal energy at 18:00. Under extreme weather, the strategy effectively converts 95.87 kW of electricity to thermal energy at 18:00. This conversion addresses a 444.46 kW thermal deficit. Furthermore, the implementation reduces microgrid cluster trading imbalances from 1300 kW to zero for electricity and from 400 kW to 176.34 kW for thermal energy, significantly enhancing system economics and multi-energy coordination efficiency. This research provides valuable insights and methodological support for advanced microgrid optimization by establishing a performance benchmark, with future work focusing on integration with forecasting techniques.
(This article belongs to the Section Electrical and Electronic Engineering)

22 pages, 1470 KB  
Review
Advancements in Pharmaceutical Lyophilization: Integrating QbD, AI, and Novel Formulation Strategies for Next-Generation Biopharmaceuticals
by Prachi Atre and Syed A. A. Rizvi
Biologics 2025, 5(4), 35; https://doi.org/10.3390/biologics5040035 - 10 Nov 2025
Abstract
Lyophilization (freeze-drying) has become a cornerstone pharmaceutical technology for stabilizing biopharmaceuticals, overcoming the inherent instability of biologics, vaccines, and complex drug formulations in aqueous environments. The appropriate literature for this review was identified through a structured search of several databases (such as PubMed, Scopus) covering publications from the late 1990s to date, with inclusion limited to peer-reviewed studies on lyophilization processes, formulation development, and process analytical technologies. This succinct review examines both fundamental principles and cutting-edge advancements in lyophilization technology, with particular emphasis on Quality by Design (QbD) frameworks for optimizing formulation development and manufacturing processes. The work systematically analyzes the critical three-stage lyophilization cycle—freezing, primary drying, and secondary drying—while detailing how key parameters (shelf temperature, chamber pressure, annealing) influence critical quality attributes (CQAs) including cake morphology, residual moisture content, and reconstitution behavior. Special attention is given to formulation strategies employing synthetic surfactants, cryoprotectants, and stabilizers for complex delivery systems such as liposomes, nanoparticles, and biologics. The review highlights transformative technological innovations, including artificial intelligence (AI)-driven cycle optimization, digital twin simulations, and automated visual inspection systems, which are revolutionizing process control and quality assurance. Practical case studies demonstrate successful applications across diverse therapeutic categories, from small molecules to monoclonal antibodies and vaccines, showcasing improved stability profiles and manufacturing efficiency. Finally, the discussion addresses current regulatory expectations (FDA/ICH) and compliance considerations, particularly regarding cGMP implementation and the evolving landscape of AI/ML (machine learning) validation in pharmaceutical manufacturing. By integrating QbD-driven process design with AI-enabled modeling, process analytical technology (PAT) implementation, and regulatory alignment, this review provides both a strategic roadmap and practical insights for advancing lyophilized drug product development to meet contemporary challenges in biopharmaceutical stabilization and global distribution. Despite several publications addressing individual aspects of lyophilization, there is currently no comprehensive synthesis that integrates formulation science, QbD principles, and emerging digital technologies such as AI/ML and digital twins within a unified framework for process optimization. Future work should integrate advanced technologies, AI/ML standardization, and global access initiatives within a QbD framework to enable next-generation lyophilized products with improved stability and patient focus.

12 pages, 391 KB  
Systematic Review
Contemporary Trends in University Administration with the Integration of Digital/New Technologies
by Sotiria Panagiota Souli and Christos Pierrakeas
Adm. Sci. 2025, 15(11), 437; https://doi.org/10.3390/admsci15110437 - 10 Nov 2025
Abstract
This study conducts a systematic scoping review to explore how universities are integrating digital and emerging technologies into administrative processes. Following the PRISMA-ScR methodology, we systematically searched four major databases—Web of Science, Scopus, IEEE Xplore, and Google Scholar—for peer-reviewed publications between 2019 and 2024. Fifty-two studies met the inclusion criteria after rigorous screening and quality assessment using the CASP and JBI checklists. The originality of this review lies in synthesizing cross-disciplinary perspectives—encompassing digital marketing, artificial intelligence (AI), learning management systems (LMSs), open data, and collaborative digital tools—into a unified framework of administrative innovation. Findings reveal that digital marketing strategies enhance student engagement and institutional visibility, AI improves efficiency and decision-making, LMSs streamline academic and administrative coordination, and open data initiatives promote transparency but encounter legal and cultural resistance. Despite the potential of these technologies, persistent challenges include data privacy concerns, uneven digital infrastructure, and limited institutional readiness. This review contributes to the literature by mapping the intersection of technological innovation and university governance, identifying research gaps, and outlining directions for sustainable digital transformation in higher education.

45 pages, 2852 KB  
Review
The Role of Carbon Capture, Utilization, and Storage (CCUS) Technologies and Artificial Intelligence (AI) in Achieving Net-Zero Carbon Footprint: Advances, Implementation Challenges, and Future Perspectives
by Ife Fortunate Elegbeleye, Olusegun Aanuoluwapo Oguntona and Femi Abiodun Elegbeleye
Technologies 2025, 13(11), 509; https://doi.org/10.3390/technologies13110509 - 8 Nov 2025
Abstract
Carbon dioxide (CO2), the primary anthropogenic greenhouse gas, drives significant and potentially irreversible impacts on ecosystems, biodiversity, and human health. Achieving the Paris Agreement target of limiting global warming to well below 2 °C, ideally 1.5 °C, requires rapid and substantial global emission reductions. While recent decades have seen advances in clean energy technologies, carbon capture, utilization, and storage (CCUS) remain essential for deep decarbonization. Despite proven technical readiness, large-scale carbon capture and storage (CCS) deployment has lagged initial targets. This review evaluates CCS technologies and their contributions to net-zero objectives, with emphasis on sector-specific applications. We found that, in the iron and steel industry, post-combustion CCS and oxy-combustion demonstrate potential to achieve the highest CO2 capture efficiencies, whereas cement decarbonization is best supported by oxy-fuel combustion, calcium looping, and emerging direct capture methods. For petrochemical and refining operations, oxy-combustion, post-combustion, and chemical looping offer effective process integration and energy efficiency gains. Direct air capture (DAC) stands out for its siting flexibility, low land-use conflict, and ability to remove atmospheric CO2, but it is hindered by high costs (~$100–1000/t CO2). Conversely, post-combustion capture is more cost-effective (~$47–76/t CO2) and compatible with existing infrastructure. CCUS could deliver ~8% of required emission reductions for net-zero by 2050, equivalent to ~6 Gt CO2 annually. Scaling deployment will require overcoming challenges through material innovations aided by artificial intelligence (AI) and machine learning, improving capture efficiency, integrating CCS with renewable hybrid systems, and establishing strong, coordinated policy frameworks.
(This article belongs to the Section Environmental Technology)

17 pages, 1904 KB  
Article
Optimal Deployment of Low-Voltage Instrument Transformers Considering Time-Varying Risk Assessment
by Yinglong Diao, Jiawei Fan, Kangmin Hu, Lei Yang and Qiang Yao
Electronics 2025, 14(22), 4361; https://doi.org/10.3390/electronics14224361 - 7 Nov 2025
Abstract
To address the “metering blind zone” problem in distribution networks caused by flood disasters, this paper proposes an optimal deployment strategy for low-voltage instrument transformers (LVITs) based on time-varying risk assessment. A comprehensive model quantifying real-time node importance during disaster progression is established, considering cascading faults and dynamic load fluctuations. A multi-objective optimization model minimizes deployment costs while maximizing fault coverage, incorporating dynamic response constraints. A Genetic-Greedy Hybrid Algorithm (GGHA) with intelligent initialization and elite retention mechanisms is proposed to solve the complex spatiotemporal coupling problem. Simulation results demonstrate that GGHA achieves solution quality of 0.847, outperforming PSO, GA, and GD by 7.5%, 11.7%, and 8.7%, respectively, with convergence stability within ±2.5%. The strategy maintains 100% normal coverage and 73.3–95.5% disaster coverage across flood severity levels, exhibiting strong feasibility and generalizability on IEEE 123-node and 33-node test systems.
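The GGHA itself is not specified in the abstract; as one plausible illustration of a coverage-driven greedy step such as the “intelligent initialization” it mentions, the sketch below places sensors by marginal coverage gain (the site names and observation sets are invented):

```python
def greedy_deployment(coverage, budget):
    """Pick monitoring sites greedily by marginal gain in newly covered nodes.

    coverage: dict mapping a candidate site to the set of node ids it observes.
    budget:   maximum number of instrument transformers to place.
    """
    covered, chosen = set(), []
    for _ in range(budget):
        best = max(coverage, key=lambda s: len(coverage[s] - covered))
        gain = coverage[best] - covered
        if not gain:          # nothing left to gain; stop early
            break
        chosen.append(best)
        covered |= gain
    return chosen, covered

# Toy feeder with three candidate sites and overlapping observation sets.
cov = {"A": {1, 2, 3}, "B": {3, 4}, "C": {4, 5, 6}}
sites, nodes = greedy_deployment(cov, budget=2)
```

Such a greedy seed gives a genetic search a feasible, high-coverage starting population instead of a purely random one, which is the usual motivation for hybridizing the two.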

26 pages, 1952 KB  
Review
Beyond Standard Parameters: Precision Hemodynamic Monitoring in Patients on Veno-Arterial ECMO
by Debora Emanuela Torre and Carmelo Pirri
J. Pers. Med. 2025, 15(11), 541; https://doi.org/10.3390/jpm15110541 - 7 Nov 2025
Abstract
Background: Hemodynamic management in veno-arterial extracorporeal membrane oxygenation (V-A ECMO) is inherently complex, as extracorporeal circulation profoundly alters preload, afterload, ventriculo-arterial coupling and tissue perfusion. This review summarizes current and emerging monitoring strategies to guide initiation, maintenance and weaning. Methods: A structured literature search was performed in PubMed and Scopus (1990–2025), including clinical studies, consensus statements and expert reviews addressing hemodynamic monitoring in V-A ECMO. Results: A multiparametric framework is required. Echocardiography remains central for assessing biventricular performance, aortic valve dynamics and ventricular unloading. Pulmonary artery catheterization provides complementary data on filling pressures, cardiac output and global oxygen balance. Metabolic indices such as lactate clearance and the veno-arterial CO2 gap, together with regional oximetry (NIRS), inform the adequacy of systemic and tissue perfusion. Microcirculatory monitoring, though technically demanding, has shown prognostic value, particularly during weaning. Additional adjuncts include arterial pulse pressure, end-tidal CO2 and waveform analysis. Phenotype-oriented priorities, such as detection of differential hypoxemia, prevention of left ventricular distension or surveillance for limb ischemia, require tailored monitoring strategies. Artificial intelligence and machine learning represent future avenues for integrating multiparametric data into predictive models. Conclusions: No single modality can capture the hemodynamic complexity of V-A ECMO. Precision monitoring demands a dynamic, phenotype-specific and time-dependent approach that integrates systemic, cardiac, metabolic and microcirculatory variables. Such individualized strategies hold promise to optimize outcomes, reduce complications and align V-A ECMO management with the principles of precision medicine.
(This article belongs to the Special Issue Emergency and Critical Care in the Context of Personalized Medicine)

5 pages, 170 KB  
Proceeding Paper
Shaping AI-Based Decision Support in Kidney Cancer: Multidisciplinary Consensus from the IKCSEU25 ART Think Tank
by Ignacio Duran and Jesus Garcia-Donas
Med. Sci. Forum 2025, 39(1), 1; https://doi.org/10.3390/msf2025039001 - 6 Nov 2025
Abstract
Background: Artificial intelligence (AI) has the potential to significantly enhance clinical decision-making in oncology. However, its application in renal cell carcinoma (RCC) remains limited. The ART (Artificial Intelligence in Renal Tumors) project is a Spanish, multi-institutional initiative aimed at developing a dynamic, transcriptomics-based AI model to guide systemic treatment decisions for patients with metastatic RCC (mRCC). Objective: The aim of this paper is to present the rationale, methodology, and early implementation challenges of the ART project, as discussed during a dedicated Think Tank session at the 2025 International Kidney Cancer Symposium Europe (IKCSEU25), and to gather expert insights on its clinical and regulatory viability. Design, Setting, and Participants: The ART project includes three phases: (1) retrospective algorithm training using clinical and transcriptomic data from completed trials; (2) a prospective, non-interventional study collecting multi-omic and clinical data from 500 patients across 30 centers; and (3) a future comparative analysis of ART-guided versus standard clinical decisions. The AI model is designed to evolve continuously through ongoing data integration. Results and Limitations: Experts underscored the importance of integrating multimodal data—including circulating biomarkers and immune profiling—while expressing concerns about the reliance on short-term endpoints. Key barriers identified included data harmonization, external validation, and regulatory uncertainty regarding adaptive algorithms. The absence of a clear approval pathway for non-static clinical decision support systems also poses a challenge. Despite limited initial funding, the ART platform has generated strong institutional engagement and may serve as a scalable model for clinician-oriented AI tools. Conclusions: The ART project represents an innovative approach to AI-driven personalization of kidney cancer treatment. Expert feedback from IKCSEU25 highlighted the scientific robustness of the initiative, while also emphasizing the need for broader validation, regulatory clarity, and the use of clinically meaningful endpoints to support real-world implementation. Patient Summary: Experts reviewed a new AI-based tool being developed in Spain to help doctors choose the best treatments for kidney cancer. The tool shows promise but needs further testing and must meet regulatory standards before it can be used in routine clinical care.
16 pages, 923 KB  
Review
Beyond the Surface: Revealing the Concealed Effects of Hyperglycemia on Ocular Surface Homeostasis and Dry Eye Disease
by Marco Zeppieri, Matteo Capobianco, Federico Visalli, Mutali Musa, Alessandro Avitabile, Rosa Giglio, Daniele Tognetto, Caterina Gagliano, Fabiana D’Esposito and Francesco Cappellani
Medicina 2025, 61(11), 1992; https://doi.org/10.3390/medicina61111992 - 6 Nov 2025
Abstract
Background and Objectives: Dry eye disease (DED) is a multifactorial ocular surface disease that markedly diminishes quality of life. Although diabetes mellitus is well-known for its retinal consequences, anterior segment symptoms including dry eye disease are often overlooked. Chronic hyperglycemia causes metabolic, neurovascular, and immunological changes that undermine tear film stability, corneal innervation, and ocular surface integrity. This review seeks to consolidate existing knowledge regarding the concealed impacts of diabetes on ocular surface homeostasis, highlighting processes, diagnostic difficulties, and treatment prospects. Materials and Methods: A narrative review of the literature was performed by searching PubMed for publications from January 2020 to July 2025 using the terms “diabetic dry eye,” “hyperglycemia AND ocular surface,” “tear proteomics AND diabetes,” “corneal nerves AND diabetes,” and “neurotrophic keratitis.” Eligible studies were experimental research, clinical trials, and translational investigations concerning tear film function, corneal neuropathy, inflammatory indicators, or lacrimal gland dysfunction in diabetes. The exclusion criteria were non-English language, lack of primary data, and inadequate methodological description. Results: Hyperglycemia compromises lacrimal gland functionality, modifies lipid secretion from Meibomian glands, and diminishes corneal nerve density, resulting in neurotrophic deficits. Inflammatory cytokines and oxidative stress compromise epithelial integrity, while proteomic alterations in tears serve as sensitive indicators of disease. Diagnosis is impeded by corneal hypoesthesia, resulting in a disconnection between symptoms and findings. Progress in imaging, proteomics, and artificial intelligence may facilitate earlier detection and improved risk assessment. Novel therapeutics, such as neurotrophic drugs, antioxidants, and customized anti-inflammatory approaches, show promise but remain under clinical evaluation. Conclusions: Diabetes-related dry eye disease is a multifaceted and underappreciated condition influenced by systemic metabolic dysfunction. The ocular surface may act as an initial indicator for systemic disease load. Narrative synthesis emphasizes the necessity for customized diagnostic instruments, individualized treatment approaches, and collaborative management. Reconceptualizing diabetic dry eye disease within the context of systemic metabolic care presents prospects for precision medicine strategies that enhance both ocular and systemic results.
(This article belongs to the Special Issue Ophthalmology: New Diagnostic and Treatment Approaches (2nd Edition))

34 pages, 2046 KB  
Article
Sustainable AI Transformation: A Critical Framework for Organizational Resilience and Long-Term Viability
by Jonathan H. Westover
Sustainability 2025, 17(21), 9822; https://doi.org/10.3390/su17219822 - 4 Nov 2025
Abstract
This research examines how artificial intelligence is reshaping business and labor structures through a sustainability lens. Drawing on survey data from 127 organizations and 14 case studies, we quantify workforce impacts while exposing methodological limitations in current projections. Our analysis reveals implementation variations of 37% across industries and 41% higher user adoption rates for hybrid governance approaches versus centralized models. The evidence supports a three-dimensional strategic framework for sustainable organizational development: comprehensive upskilling fostering behavioral change (2.7× higher implementation success), distributed innovation enabling cross-functional ideation (3.1× more identified use cases), and strategic integration aligning systems across departments (explaining 31% of implementation success variance). Organizations deploying all three dimensions achieved a 74% AI initiative success rate versus 12% for those using none. Implementation barriers include regulatory uncertainty, organizational resistance, and ethical considerations, with data infrastructure maturity (β = 0.32), executive sponsorship (β = 0.29), and change readiness (β = 0.26) explaining 58% of implementation success variance. Our findings indicate that sustainable adaptation capacity—not merely technological investment—determines which organizations successfully navigate this transformation while maintaining long-term organizational viability, workforce resilience, and contribution to broader sustainable development goals.
(This article belongs to the Section Economic and Business Aspects of Sustainability)

23 pages, 3017 KB  
Article
Real-Time Passenger Flow Analysis in Tram Stations Using YOLO-Based Computer Vision and Edge AI on Jetson Nano
by Sonia Diaz-Santos, Pino Caballero-Gil and Cándido Caballero-Gil
Computers 2025, 14(11), 476; https://doi.org/10.3390/computers14110476 - 3 Nov 2025
Abstract
Efficient real-time computer vision-based passenger flow analysis is increasingly important for the management of intelligent transportation systems and smart cities. This paper presents the design and implementation of a system for real-time object detection, tracking, and people counting in tram stations. The proposed approach integrates YOLO-based detection with a lightweight tracking module and is deployed on an NVIDIA Jetson Nano device, enabling operation under resource constraints and demonstrating the potential of edge AI. Multiple YOLO versions, from v3 to v11, were evaluated on data collected in collaboration with Metropolitano de Tenerife. Experimental results show that YOLOv5s achieves the best balance between detection accuracy and inference speed, reaching 96.85% accuracy in counting tasks. The system demonstrates the feasibility of applying edge AI to monitor passenger flow in real time, contributing to intelligent transportation and smart city initiatives.
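The detector and tracker themselves (YOLOv5s plus a lightweight tracking module) are not reproduced here; once tracks exist, the final people-counting step over tracked centroids can be sketched in pure Python. The virtual line position and the toy tracks below are invented for illustration:

```python
def count_crossings(tracks, line_y):
    """Count track crossings of a horizontal virtual line, per direction.

    tracks: dict track_id -> list of (x, y) centroids over consecutive frames.
    Returns (entering, exiting), where "entering" means moving downward past line_y.
    """
    entering = exiting = 0
    for points in tracks.values():
        for (_, y0), (_, y1) in zip(points, points[1:]):
            if y0 < line_y <= y1:     # crossed the line top-to-bottom
                entering += 1
            elif y1 < line_y <= y0:   # crossed the line bottom-to-top
                exiting += 1
    return entering, exiting

tracks = {
    1: [(10, 5), (11, 20), (12, 40)],   # moves downward past y=25: one entry
    2: [(50, 40), (49, 22), (48, 8)],   # moves upward past y=25: one exit
    3: [(30, 5), (30, 10)],             # never crosses the line
}
entering, exiting = count_crossings(tracks, line_y=25)
```

Keeping the counting logic this simple is part of what makes the pipeline viable on a Jetson Nano: the expensive work is the detection, not the bookkeeping.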

22 pages, 3019 KB  
Article
Probabilistic Forecast for Real-Time Control of Rainwater Pollutant Loads in Urban Environments
by Annalaura Gabriele, Federico Di Palma, Ezio Todini and Rudy Gargano
Hydrology 2025, 12(11), 289; https://doi.org/10.3390/hydrology12110289 - 1 Nov 2025
Abstract
Advanced wastewater management systems are necessary to effectively direct severely contaminated initial rainwater runoff to the treatment facility only when pollutant concentrations are elevated during the initial flush event, thereby reducing the risk of water pollution caused by urban drainage systems. This necessitates the implementation of intelligent decision-making systems, forecasting, and monitoring. However, conventional “deterministic” forecasts are inadequate for making informed decisions in the presence of uncertainty regarding future values, even though a variety of modeling techniques have been employed to predict total suspended solids at specific locations. The literature contains a number of “probabilistic” forecasting approaches that take uncertainty into account. Among them, this paper proposes applying the Model Conditional Processor (MCP), well known in the hydrological, hydraulic, and climatological fields, to forecast the predictive probability density of total suspended solids from one or more deterministic predictions, thereby addressing this issue. The decision to divert the first flush is subsequently guided by the predictive density and probabilistic thresholds. The effective implementation of the MCP approach is demonstrated in a real case study that is part of the USGS’s extensive and long-term stormwater monitoring initiative, based on observations of a real stormwater drainage system. The results obtained confirm that probabilistic approaches are suitable instruments for enhancing decision-making.
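The MCP machinery for deriving the predictive density is beyond a short sketch, but the downstream decision rule — divert the first flush when the probability of exceeding a pollutant limit is high enough — is simple to illustrate. The sketch below assumes a Gaussian predictive density; the TSS values, limit, and probability threshold are invented:

```python
import math

def exceedance_prob(mean, std, limit):
    """P(X > limit) for a Gaussian predictive density N(mean, std^2)."""
    z = (limit - mean) / (std * math.sqrt(2.0))
    return 0.5 * (1.0 - math.erf(z))

def divert_first_flush(mean, std, limit, p_threshold=0.5):
    """Divert runoff to treatment when the exceedance probability is high enough."""
    return exceedance_prob(mean, std, limit) > p_threshold

# Predicted TSS of 180 mg/L (sigma 40) against a 150 mg/L limit:
p = exceedance_prob(180.0, 40.0, 150.0)   # about 0.77, so divert
```

The point of the probabilistic formulation is visible here: the same deterministic point forecast of 180 mg/L would trigger very different decisions depending on its uncertainty, which a single-valued forecast cannot express.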
42 pages, 17784 KB  
Article
Research on a Short-Term Electric Load Forecasting Model Based on Improved BWO-Optimized Dilated BiGRU
by Ziang Peng, Haotong Han and Jun Ma
Sustainability 2025, 17(21), 9746; https://doi.org/10.3390/su17219746 - 31 Oct 2025
Viewed by 322
Abstract
In the context of global efforts toward energy conservation and emission reduction, accurate short-term electric load forecasting plays a crucial role in improving energy efficiency, enabling low-carbon dispatching, and supporting sustainable power system operations. To address the growing demand for accuracy and stability in this domain, this paper proposes a novel prediction model tailored for power systems. The proposed method combines Spearman correlation analysis with modal decomposition techniques to compress redundant features while preserving key information, resulting in more informative and cleaner input representations. In terms of model architecture, this study integrates Bidirectional Gated Recurrent Units (BiGRUs) with dilated convolution. This design improves the model’s capacity to capture long-range dependencies and complex relationships. For parameter optimization, an Improved Beluga Whale Optimization (IBWO) algorithm is introduced, incorporating dynamic population initialization, adaptive Lévy flight mechanisms, and refined convergence procedures to enhance search efficiency and robustness. Experiments on real-world datasets demonstrate that the proposed model achieves excellent forecasting performance (RMSE = 26.1706, MAE = 18.5462, R2 = 0.9812), combining high predictive accuracy with strong generalization. These advancements contribute to more efficient energy scheduling and reduced environmental impact, making the model well-suited for intelligent and sustainable load forecasting applications in environmentally conscious power systems. Full article
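The adaptive Lévy flight mechanism cited for the Improved Beluga Whale Optimization can be sketched with Mantegna's classical step generator (a generic illustration; the paper's exact IBWO update rule, the `beta` value, and the commented position update are assumptions, not the authors' implementation):

```python
import numpy as np
from math import gamma, sin, pi

def levy_step(dim, beta=1.5, rng=None):
    """Draw a Lévy-distributed step vector via Mantegna's algorithm (1 < beta <= 2)."""
    rng = rng or np.random.default_rng()
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)  # heavy-tailed numerator
    v = rng.normal(0.0, 1.0, dim)    # unit-variance denominator
    return u / np.abs(v) ** (1 / beta)

# Hypothetical use inside a whale-position update:
# new_pos = best_pos + 0.01 * levy_step(best_pos.size) * (pos - best_pos)
```

The heavy-tailed steps let the search occasionally make long jumps out of local optima while mostly taking small refining moves.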
25 pages, 6312 KB  
Review
Early Insights into AI and Machine Learning Applications in Hydrogel Microneedles: A Short Review
by Jannah Urifa and Kwok Wei Shah
Micro 2025, 5(4), 48; https://doi.org/10.3390/micro5040048 - 31 Oct 2025
Viewed by 441
Abstract
Hydrogel microneedles (HMNs) are non-invasive devices that integrate seamlessly with the human body for drug delivery and diagnostic purposes. Their development, however, is constrained by intricate and iterative challenges in material composition, structural geometry, manufacturing accuracy, and performance enhancement. At present, only a limited number of studies are available, since artificial intelligence and machine learning (AI/ML) for HMNs are only beginning to emerge, and the data are scattered across separate research efforts spanning different fields. This review addresses the fragmented and narrowly focused state of current research on AI/ML applications in HMN technologies by offering a cohesive, comprehensive synthesis of interdisciplinary insights, organized into five thematic areas: (1) material and microneedle design, (2) diagnostics and therapy, (3) drug delivery, (4) drug development, and (5) health and agricultural sensing. For each domain, we detail typical AI methods, integration approaches, proven advantages, and ongoing difficulties. We propose a systematic five-stage developmental pathway, covering material discovery, structural design, manufacturing, biomedical performance, and advanced AI integration, intended to expedite the transition of HMNs from research concepts to clinically and commercially viable systems. The findings of this review indicate that AI/ML can significantly enhance HMN development by addressing design and fabrication constraints via predictive modeling, adaptive control, and process optimization. By aligning these capabilities with clinical and commercial translation requirements, AI/ML can serve as key enablers in converting HMNs from research concepts into scalable, practical biomedical solutions. Full article

44 pages, 4433 KB  
Article
Mathematical Model of the Software Development Process with Hybrid Management Elements
by Serhii Semenov, Volodymyr Tsukur, Valentina Molokanova, Mateusz Muchacki, Grzegorz Litawa, Mykhailo Mozhaiev and Inna Petrovska
Appl. Sci. 2025, 15(21), 11667; https://doi.org/10.3390/app152111667 - 31 Oct 2025
Viewed by 211
Abstract
Reliable schedule-risk estimation in hybrid software development lifecycles is strategically important for organizations adopting AI in software engineering. This study addresses that need by transforming routine process telemetry (CI/CD, SAST, traceability) into explainable, quantitative predictions of completion time and rework. This paper introduces an integrated probabilistic model of the hybrid software development lifecycle that combines Generalized Evaluation and Review Technique (GERT) network semantics with I-AND synchronization, explicit artificial-intelligence (AI) interventions, and a fuzzy treatment of epistemic uncertainty. The model embeds two controllable AI nodes (an AI Requirements Assistant and AI-augmented static code analysis) directly into the process topology and applies an analytical reduction to a W-function to obtain iteration-time distributions and release-success probabilities without resorting solely to simulation. Epistemic uncertainty on critical arcs is represented by fuzzy intervals and propagated via Zadeh’s extension principle, while aleatory variability is captured through stochastic branching. Parameter calibration relies on process telemetry (requirements traceability, static-analysis signals, and continuous integration/continuous delivery (CI/CD) history). A validation case (“system design → UX prototyping → implementation → quality assurance → deployment”) demonstrates practical use: large samples of process trajectories are generated under identical initial conditions and fixed random seeds, and kernel density estimation with Silverman’s bandwidth is applied to normalized histograms of continuous outcomes. Results indicate earlier defect detection, fewer late rework loops, thinner right tails of global duration, and an approximately threefold reduction in the expected number of rework cycles when AI is enabled. The framework yields interpretable, scenario-ready metrics for tuning quality-gate policies and automation levels in Agile/DevOps settings. Full article
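The final density-estimation step, a Gaussian KDE with Silverman's rule-of-thumb bandwidth over simulated trajectory outcomes, can be sketched as follows (the rework-loop simulation, its probabilities, and the gamma base-time distribution are invented placeholders for the paper's GERT-generated trajectories):

```python
import numpy as np

def silverman_bandwidth(x):
    """Silverman's rule-of-thumb bandwidth for a 1-D Gaussian KDE."""
    x = np.asarray(x, dtype=float)
    iqr = np.subtract(*np.percentile(x, [75, 25]))
    return 0.9 * min(x.std(ddof=1), iqr / 1.34) * x.size ** (-0.2)

def gaussian_kde(x, grid, h=None):
    """Evaluate a Gaussian kernel density estimate of samples `x` on `grid`."""
    h = h or silverman_bandwidth(x)
    z = (grid[:, None] - x[None, :]) / h
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (x.size * h * np.sqrt(2 * np.pi))

# Placeholder trajectory model: base task time plus a penalty per rework cycle.
rng = np.random.default_rng(42)
n_rework = rng.geometric(0.7, size=5000) - 1           # rework cycles per run
durations = rng.gamma(4, 2, size=5000) + 3 * n_rework  # base time + rework cost
grid = np.linspace(durations.min(), durations.max(), 200)
density = gaussian_kde(durations, grid)                # smoothed duration density
```

Comparing such densities with AI nodes on and off makes the "thinner right tail" claim directly visible in the estimated distribution.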
