Search Results (779)

Search Parameters:
Keywords = computational intelligence for modeling and control

36 pages, 1952 KB  
Review
Comparative Review of Reactive Power Estimation Techniques for Voltage Restoration
by Natanael Faleiro, Raul Monteiro, André Fonseca, Lina Negrete, Rogério Lima and Jakson Bonaldo
Energies 2026, 19(3), 826; https://doi.org/10.3390/en19030826 - 4 Feb 2026
Abstract
Motivated by the growing concern over voltage instability and its inherent risk of blackouts, this study addresses the importance of Volt/VAR control (VVC) in maintaining voltage stability, optimizing power factor, and reducing losses. It presents a review of the methodologies used to estimate the reactive power required to restore voltage in power grids. Although reviews exist on classical methods, optimization, and machine learning, a study unifying these approaches is lacking; this gap hinders an integrated comparison of methodologies and is the main motivation for the present work. The absence of a consolidated and up-to-date review limits both academic progress and practical decision-making in modern power systems, especially as penetration of Distributed Energy Resources (DERs) accelerates. The research was conducted using the Scopus database, selecting articles that address reactive power estimation methods. The comparative analysis indicates that traditional numerical methods, although highly accurate, are computationally expensive for real-time application; optimization techniques are robust but depend on detailed models that are sensitive to system conditions; and machine learning approaches, including Deep Reinforcement Learning (DRL), offer greater adaptability under uncertainty and dynamic topologies, although they require large datasets and careful training. 
Given these complementary limitations, the review concludes that hybrid approaches, which combine the reliability of classical numerical methods with the flexibility of intelligent models, supported by a robust data infrastructure, are the most promising path for reactive power management, especially in smart grids with dynamic topologies and high DER penetration. Full article
(This article belongs to the Section A1: Smart Grids and Microgrids)
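The sensitivity-based estimation idea this review surveys can be illustrated with a minimal sketch: under a local linearization of the voltage/reactive-power relation, the injection needed to restore a bus voltage follows from the dV/dQ sensitivity. The function name and all numbers below (the `dv_dq` value in particular) are illustrative assumptions, not figures from the paper.

```python
# Minimal sensitivity-based reactive power estimate (illustrative only).
# Assumes a locally linear relation dV ~ (dV/dQ) * dQ around the operating point.

def reactive_power_for_restoration(v_measured, v_target, dv_dq):
    """Estimate the reactive power injection (Mvar) needed to raise
    the bus voltage from v_measured to v_target (per unit)."""
    if dv_dq <= 0:
        raise ValueError("voltage sensitivity dV/dQ must be positive")
    return (v_target - v_measured) / dv_dq

# Example: bus sagging at 0.94 pu, target 1.00 pu,
# assumed sensitivity of 0.004 pu voltage rise per Mvar injected.
q_needed = reactive_power_for_restoration(0.94, 1.00, 0.004)
print(f"Estimated injection: {q_needed:.1f} Mvar")  # 15.0 Mvar
```

This is the kind of one-shot linear estimate the classical methods refine; the DRL and hybrid approaches discussed in the review replace the fixed sensitivity with learned, topology-aware models.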

19 pages, 712 KB  
Review
Reinforcement Learning for UAV Control: From Algorithms to Deployment Readiness
by Georgios Memlikai and Konstantinos A. Tsintotas
Machines 2026, 14(2), 177; https://doi.org/10.3390/machines14020177 - 3 Feb 2026
Abstract
The rapid expansion of unmanned aerial vehicles (UAVs) across diverse application domains has underscored the need for reliable autonomy in complex and dynamic environments. Toward this goal, learning-based control strategies have emerged in recent years as a promising alternative, offering adaptability and decision-making capabilities beyond those of conventional model-based controllers. With this in mind, this article examines reinforcement learning methodologies for controlling UAVs, with particular emphasis on commonly used virtual environments, benchmark tasks, and the challenges of bridging the gap between simulation and real-world deployment. Key limitations, including high computational demands, reliance on extensive training data, and reduced robustness under environmental variability, are critically analyzed from a practical implementation perspective. Rather than adopting an algorithm-centric viewpoint, this work aggregates existing knowledge and categorizes learning-based approaches by their level of control abstraction and their treatment of safety and stability, thereby identifying the key factors limiting large-scale real-world deployment and the trends shaping intelligent controllers. Full article
(This article belongs to the Special Issue Intelligent Control Techniques for Unmanned Aerial Vehicles)
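As a toy illustration of the reinforcement learning control loop such surveys discuss, the sketch below runs tabular Q-learning on a hypothetical one-dimensional altitude-hold task. The state space, rewards, and hyperparameters are invented for illustration and are not benchmarks from the paper.

```python
# Tabular Q-learning sketch on a toy 1-D "altitude hold" task.
# States are discrete altitude levels 0..4; the agent should hover at level 2.
import random

random.seed(0)
STATES, ACTIONS = range(5), (-1, 0, 1)   # climb / hold / descend
TARGET = 2
Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.1

def step(s, a):
    s2 = min(max(s + a, 0), 4)           # clamp to the altitude range
    return s2, (1.0 if s2 == TARGET else -0.1)

for episode in range(500):
    s = random.choice(list(STATES))
    for _ in range(20):
        # epsilon-greedy action selection
        a = random.choice(ACTIONS) if random.random() < eps else max(ACTIONS, key=lambda x: Q[(s, x)])
        s2, r = step(s, a)
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, x)] for x in ACTIONS) - Q[(s, a)])
        s = s2

policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES}
print(policy)  # climbs below the target, descends above it, holds at it
```

Real UAV controllers replace this table with deep function approximators and a physics simulator, which is exactly where the sim-to-real gap the review analyzes arises.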

39 pages, 2492 KB  
Systematic Review
Cloud, Edge, and Digital Twin Architectures for Condition Monitoring of Computer Numerical Control Machine Tools: A Systematic Review
by Mukhtar Fatihu Hamza
Information 2026, 17(2), 153; https://doi.org/10.3390/info17020153 - 3 Feb 2026
Abstract
Condition monitoring has come to the forefront of intelligent manufacturing and is particularly important in Computer Numerical Control (CNC) machining, where reliability, precision, and productivity are crucial. Traditional monitoring methods, mostly premised on single sensors, localized data capture, and offline interpretation, are proving inadequate for modern machining processes: limited in scale and computational power and unresponsive in real time, they fit poorly in dynamic, data-intensive production environments. Recent progress in the Industrial Internet of Things (IIoT), cloud computing, and edge intelligence has driven a shift toward distributed monitoring architectures capable of acquiring, processing, and interpreting large amounts of heterogeneous machining data. These innovations have enabled more adaptive decision-making, supporting predictive maintenance, machining stability, tool lifespan, and data-driven optimization in manufacturing. This systematic review synthesizes over 180 peer-reviewed studies identified through a structured literature search of major scientific databases, applying explicit inclusion criteria and a PRISMA-guided screening and qualitative synthesis process that keeps the review transparent and repeatable. It provides a comprehensive, architecture-oriented examination of sensor technologies, data acquisition systems, cloud–edge–IoT frameworks, and digital twin implementations, while identifying ongoing challenges related to industrial scalability, standardization, and deployment maturity. The combination of cloud platforms and edge intelligence receives particular attention, with emphasis on how the two balance computational load and latency and improve system reliability. 
The review synthesizes the major advances in sensor technologies, data collection approaches, machine learning and deep learning methods, and digital twins, and concludes with a comparative analysis of what is currently feasible, drawing on reported industrial case applications. Key issues, including data inconsistency, lack of standardization, cyber threats, and legacy system integration, are critically analyzed. Finally, emerging research directions are outlined, including hybrid cloud–edge intelligence, advanced AI models, and adaptive multisensor fusion oriented toward autonomous, self-evolving CNC monitoring systems in line with the Industry 4.0 and Industry 5.0 paradigms. Full article
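The cloud–edge division of labor described above can be sketched in miniature: an edge node compresses raw vibration samples into per-window RMS features and forwards only compact features and threshold alerts upstream. The window length, threshold, and signal values below are illustrative assumptions, not figures from the review.

```python
# Edge-side feature extraction sketch for CNC condition monitoring.
# Instead of streaming raw vibration samples to the cloud, the edge node
# forwards one RMS value per window plus a boolean alert.
from math import sqrt

def window_rms(samples, window=4):
    """Compress a raw signal into per-window RMS features."""
    feats = []
    for i in range(0, len(samples) - window + 1, window):
        w = samples[i:i + window]
        feats.append(sqrt(sum(x * x for x in w) / window))
    return feats

raw = [0.1, -0.2, 0.15, -0.1,  2.0, -1.8, 2.2, -2.1]  # second window: a fault-like burst
features = window_rms(raw)
alerts = [f > 1.0 for f in features]                    # only these cross the network
print(features, alerts)
```

The bandwidth/latency balance the review emphasizes is visible even here: eight raw samples become two features, and only one alert needs cloud-side attention.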

38 pages, 1574 KB  
Review
A Review of Intelligent Power Management and AI-Assisted Energy-Efficient Control in Robotics
by Nathaniel Jackson, Francisca Oseghale, Annette von Jouanne and Alex Yokochi
Energies 2026, 19(3), 780; https://doi.org/10.3390/en19030780 - 2 Feb 2026
Abstract
As robotic platforms have become more capable, the need for improved power efficiency has grown with expanding applications and computational loads. Several methods and controllers in various types of robotics can increase power efficiency. This paper reviews intelligent power management methods and energy-efficient controls in untethered, battery-powered robotics, including dynamic power management (DPM), dynamic voltage and frequency scaling (DVFS), AI-assisted adaptive dynamic programming (DP) control systems, AI-assisted model predictive control (MPC) systems, and hybrid energy storage system (HESS) hardware well suited for multi-objective AI integration. Robotic neural networks and AI enhancement are identified as promising directions for advanced research; however, improving training power efficiency calls for further research if these AI-enhanced systems are to be integrated onboard robotic platforms. This paper provides the background and case study implementations of robotic power efficiency methods across various scales of development to illustrate the current capabilities of robotic platforms. Efficiency improvements are quantified, opportunities for advancement are presented, and key findings from this in-depth review are summarized. Full article
(This article belongs to the Section F5: Artificial Intelligence and Smart Energy)
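The DVFS technique named above trades frequency against voltage: dynamic power scales roughly with C·V²·f, so lowering the voltage/frequency operating point cuts the energy of a fixed workload even though the task runs longer. A hedged sketch with invented constants (not values from the paper):

```python
# DVFS intuition sketch: dynamic power ~ C_eff * V^2 * f.
# For a fixed number of cycles, runtime grows as frequency drops, so the
# energy comparison is what matters; V^2 dominates.

def dynamic_power(c_eff, volts, freq_hz):
    return c_eff * volts ** 2 * freq_hz

def task_energy(cycles, c_eff, volts, freq_hz):
    runtime = cycles / freq_hz
    return dynamic_power(c_eff, volts, freq_hz) * runtime  # joules

CYCLES, C_EFF = 1e9, 1e-9                          # illustrative workload/capacitance
e_fast = task_energy(CYCLES, C_EFF, 1.2, 2.0e9)    # high voltage/frequency point
e_slow = task_energy(CYCLES, C_EFF, 0.9, 1.0e9)    # scaled-down point
print(e_fast, e_slow)  # energy drops with V^2 even though runtime doubles
```

This is the lever DPM/DVFS controllers pull when a robot's compute load allows slack in its deadlines.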

22 pages, 1796 KB  
Article
Untargeted Metabolomics and Multivariate Data Processing to Reveal SARS-CoV-2 Specific VOCs for Canine Biodetection
by Diego Pardina Aizpitarte, Eider Larrañaga, Ugo Mayor, Ainhoa Isla, Jose Manuel Amigo and Luis Bartolomé
Chemosensors 2026, 14(2), 35; https://doi.org/10.3390/chemosensors14020035 - 2 Feb 2026
Abstract
The exceptional olfactory capabilities of trained detection dogs show high potential for identifying infectious diseases. However, safe and standardized canine training requires specific chemical targets rather than infectious biological samples. This study presents an analytical proof-of-concept combining untargeted metabolomics and machine learning (ML) to decode the specific odor profile of SARS-CoV-2 infection. Using headspace solid-phase microextraction gas chromatography coupled with time-of-flight mass spectrometry (HS-SPME-GC/MS-ToF), axillary sweat samples from 76 individuals (SARS-CoV-2 positive and negative) were analyzed. Data preprocessing and dimensionality reduction were performed to feed a Partial Least Squares-Discriminant Analysis (PLS-DA) model. In external validation, the optimized model achieved an overall accuracy of 79%, with a specificity of 89% and sensitivity of 70%, distinguishing infected individuals from healthy controls based solely on their volatilome and identifying a specific panel of Volatile Organic Compounds (VOCs) as discriminant biomarkers. Six VOCs were found to be consistently present in COVID-19-positive individuals and were proposed as candidate odor signatures for constructing artificial training aids to standardize and accelerate the training of detection dogs. This study establishes a scalable computational framework in which ML-driven metabolomic profiling translates biological samples into chemical data that directly inform biological sensor training, providing the scientific basis for designing safe, synthetic K9 training aids for future infectious disease outbreaks without the biosafety risks associated with handling live pathogens. Full article
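The reported accuracy, specificity, and sensitivity all follow from a confusion matrix. The sketch below uses hypothetical counts chosen only to be consistent with the stated rates; they are not the paper's actual validation data.

```python
# Confusion-matrix metrics sketch. The counts are hypothetical, picked to be
# consistent with ~79% accuracy, ~89% specificity, and 70% sensitivity.

def metrics(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)                   # true positive rate
    specificity = tn / (tn + fp)                   # true negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

sens, spec, acc = metrics(tp=7, fn=3, tn=8, fp=1)
print(f"sensitivity={sens:.0%} specificity={spec:.0%} accuracy={acc:.0%}")
# sensitivity=70% specificity=89% accuracy=79%
```

Reporting all three together, as the abstract does, matters because a model can trade sensitivity for specificity by simply moving its decision threshold.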

25 pages, 2737 KB  
Review
Integration of Artificial Intelligence in Food Processing Technologies
by Ali Ayoub
Processes 2026, 14(3), 513; https://doi.org/10.3390/pr14030513 - 2 Feb 2026
Abstract
The food processing industry is undergoing a profound transformation with the integration of Artificial Intelligence (AI), evolving from traditional automation to intelligent, adaptive systems aligned with Industry 5.0 principles. This review examines AI’s role across the food value chain, including supply chain management, quality control, process optimization in key unit operations, and emerging areas. Recent advancements in machine learning (ML), computer vision, and predictive analytics have significantly improved detection in food processing, achieving accuracy exceeding 98%. These technologies have also contributed to energy savings of 15–20% and reduced waste through real-time process optimization and predictive maintenance. The integration of blockchain and Internet of Things (IoT) technologies further strengthens traceability and sustainability across the supply chain, while generative AI accelerates the development of novel food products. Despite these benefits, several challenges persist, including substantial implementation costs, heterogeneous data sources, ethical considerations related to workforce displacement, and the opaque, “black box” nature of many AI models. Moreover, the effectiveness of AI solutions remains context-dependent; some studies report only marginal improvements in dynamic or data-poor environments. Looking ahead, the sector is expected to embrace autonomous manufacturing, edge computing, and bio-computing, with projections indicating that the AI market in food processing could approach $90 billion by 2030. Full article

30 pages, 1315 KB  
Review
Abrasive Water Jet Machining (AWJM) of Titanium Alloy—A Review
by Aravinthan Arumugam, Alokesh Pramanik, Amit Rai Dixit and Animesh Kumar Basak
Designs 2026, 10(1), 13; https://doi.org/10.3390/designs10010013 - 31 Jan 2026
Abstract
Abrasive water jet machining (AWJM) is a non-traditional machining process that is increasingly employed for shaping hard-to-machine materials, particularly titanium (Ti)-based alloys such as Ti-6Al-4V. Owing to its non-thermal nature, AWJM enables effective material removal while minimising metallurgical damage and preserving subsurface integrity. Process performance is governed by several interacting parameters, including jet pressure, abrasive type and flow rate, nozzle traverse speed, stand-off distance, jet incident angle, and nozzle design. These parameters collectively influence key output responses such as the material removal rate (MRR), surface roughness, kerf geometry, and subsurface quality. Existing studies consistently report that jet pressure and abrasive flow rate are directly proportional to MRR, whereas nozzle traverse speed and stand-off distance exhibit inverse relationships. Nozzle geometry plays a critical role in jet acceleration and abrasive entrainment through the Venturi effect, thereby affecting cutting efficiency and surface finish. Optimisation studies based on design of experiments identify jet pressure and traverse speed as the most significant parameters controlling surface quality in the AWJM of titanium alloys. Recent research demonstrates the effectiveness of artificial neural networks (ANNs) for process modelling and optimisation of AWJM of Ti-6Al-4V, achieving high predictive accuracy with limited experimental data. This review highlights research gaps in artificial intelligence-based fatigue behaviour prediction, computational fluid dynamics analysis of nozzle wear mechanisms and jet behaviour, and the development of hybrid AWJM systems for enhanced machining performance. Full article
(This article belongs to the Special Issue Studies in Advanced and Selective Manufacturing Technologies)
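The monotonic parameter–response relationships reported above (MRR rising with jet pressure and abrasive flow rate, falling with traverse speed and stand-off distance) can be encoded in a toy power-law trend model. The exponents and operating points below are hypothetical placeholders, not fitted values from the literature.

```python
# Toy power-law trend model for AWJM material removal rate (MRR).
# It encodes only the reported directions of influence; the exponents
# are illustrative, not experimentally fitted.

def mrr_trend(pressure, abrasive_flow, traverse_speed, standoff):
    return (pressure ** 1.0 * abrasive_flow ** 0.5) / (traverse_speed ** 0.5 * standoff ** 0.3)

base          = mrr_trend(300, 0.4, 100, 3)  # baseline operating point
more_pressure = mrr_trend(360, 0.4, 100, 3)  # +20% pressure  -> higher MRR
faster_pass   = mrr_trend(300, 0.4, 150, 3)  # +50% traverse  -> lower MRR
print(more_pressure > base > faster_pass)    # True
```

The ANN models the review discusses effectively learn a richer, interaction-aware version of this mapping from limited experimental data.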
32 pages, 27435 KB  
Review
Artificial Intelligence in Adult Cardiovascular Medicine and Surgery: Real-World Deployments and Outcomes
by Dimitrios E. Magouliotis, Noah Sicouri, Laura Ramlawi, Massimo Baudo, Vasiliki Androutsopoulou and Serge Sicouri
J. Pers. Med. 2026, 16(2), 69; https://doi.org/10.3390/jpm16020069 - 30 Jan 2026
Abstract
Artificial intelligence (AI) is rapidly reshaping adult cardiac surgery, enabling more accurate diagnostics, personalized risk assessment, advanced surgical planning, and proactive postoperative care. Preoperatively, deep-learning interpretation of ECGs, automated CT/MRI segmentation, and video-based echocardiography improve early disease detection and refine risk stratification beyond conventional tools such as EuroSCORE II and the STS calculator. AI-driven 3D reconstruction, virtual simulation, and augmented-reality platforms enhance planning for structural heart and aortic procedures by optimizing device selection and anticipating complications. Intraoperatively, AI augments robotic precision, stabilizes instrument motion, identifies anatomy through computer vision, and predicts hemodynamic instability via real-time waveform analytics. Integration of the Hypotension Prediction Index into perioperative pathways has already demonstrated reductions in ventilation duration and improved hemodynamic control. Postoperatively, machine-learning early-warning systems and physiologic waveform models predict acute kidney injury, low-cardiac-output syndrome, respiratory failure, and sepsis hours before clinical deterioration, while emerging closed-loop control and remote monitoring tools extend individualized management into the recovery phase. Despite these advances, current evidence is limited by retrospective study designs, heterogeneous datasets, variable transparency, and regulatory and workflow barriers. Nonetheless, rapid progress in multimodal foundation models, digital twins, hybrid OR ecosystems, and semi-autonomous robotics signals a transition toward increasingly precise, predictive, and personalized cardiac surgical care. With rigorous validation and thoughtful implementation, AI has the potential to substantially improve safety, decision-making, and outcomes across the entire cardiac surgical continuum. Full article

16 pages, 762 KB  
Perspective
Electric Vehicle Model Predictive Control Energy Management Strategy: Theory, Applications, Perspectives and Challenges
by Xiaohuan Zhao, Guanda Huang, Kaijian Lei, Xiangkai Huang, Yuanhong Zhuo and Jiayi Zhao
Energies 2026, 19(3), 740; https://doi.org/10.3390/en19030740 - 30 Jan 2026
Abstract
Model predictive control (MPC) has become one of the most promising control strategies in electric vehicle energy management due to its rolling optimization and explicit constraint-handling capabilities. This study analyzes the modeling mechanisms and implementation paths of MPC in power allocation, regenerative braking, and energy collaborative control, and elaborates on how predictive modeling and dynamic optimization improve energy efficiency and system stability. The evolution of MPC applications in hybrid power systems, vehicle dynamic stability control, and hierarchical optimization control is discussed, and the synergy of multi-objective optimization and health-conscious control in improving energy efficiency and extending service life is analyzed. With the development of artificial intelligence, MPC is expanding from model-based deterministic control toward intelligent learning and distributed adaptation. Model uncertainty, computational complexity, and real-time solving efficiency remain the main challenges for MPC. Future research will focus on the deep integration of model simplification, rapid solving, and intelligent learning to achieve more efficient and reliable intelligent energy management systems. Full article
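The rolling-optimization idea at the heart of MPC can be sketched minimally: enumerate short action sequences over a small horizon, pick the cheapest feasible plan, apply only its first move, then re-plan at the next step. The battery dispatch problem, prices, and limits below are invented for illustration, not taken from the paper.

```python
# Receding-horizon (MPC-style) battery dispatch sketch.
from itertools import product

ACTIONS = (0.0, 1.0, 2.0)   # candidate battery discharge powers, kW
HORIZON = 3

def plan(soc, demands, prices):
    """Return the first move of the cheapest feasible discharge plan."""
    best, best_cost = None, float("inf")
    for seq in product(ACTIONS, repeat=HORIZON):
        s, cost, feasible = soc, 0.0, True
        for a, d, p in zip(seq, demands, prices):
            a = min(a, d)          # never discharge more than the load
            if s - a < 0:          # battery cannot go below empty
                feasible = False
                break
            s -= a
            cost += p * (d - a)    # pay the grid for the remainder
        if feasible and cost < best_cost:
            best, best_cost = seq, cost
    return best[0]                 # receding horizon: apply first move only

# Peak price in step 0 -> the controller discharges hard now.
first_move = plan(soc=2.0, demands=[2.0, 1.0, 1.0], prices=[0.5, 0.1, 0.1])
print(first_move)  # 2.0
```

Real EV energy management replaces this brute-force enumeration with constrained numerical solvers, which is where the computational-complexity and real-time-solving challenges the abstract lists come from.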

21 pages, 4245 KB  
Article
Floating Fish Residual Feed Identification Based on LMFF–YOLO
by Chengbiao Tong, Jiting Wu, Xinming Xu and Yihua Wu
Fishes 2026, 11(2), 80; https://doi.org/10.3390/fishes11020080 - 30 Jan 2026
Abstract
Identifying floating residual feed is a critical technology in recirculating aquaculture systems, aiding water-quality control and the development of intelligent feeding models. However, existing research is largely based on ideal indoor environments and lacks adaptability to complex outdoor scenarios. Moreover, current methods for this task often suffer from high computational costs, poor real-time performance, and limited recognition accuracy. To address these issues, this study first validates in outdoor aquaculture tanks that instance segmentation is more suitable than individual detection for handling clustered and adhesive feed residues. We therefore propose LMFF–YOLO, a lightweight multi-scale fusion feed segmentation model based on YOLOv8n-seg. This model achieves the first collaborative optimization of lightweight architecture and segmentation accuracy specifically tailored for outdoor residual feed segmentation tasks. To enhance recognition capability, we construct a network using a Context-Fusion Diffusion Pyramid Network (CFDPN) and a novel Multi-scale Feature Fusion Module (MFFM) to improve multi-scale and contextual feature capture, supplemented by an efficient local attention mechanism at the backbone’s end for refined local feature extraction. To reduce computational costs and improve real-time performance, the original C2f module is replaced with a C2f-Reparameterization vision block, and a shared-convolution local-focus lightweight segmentation head is designed. Experimental results show that LMFF–YOLO achieves an mAP50 of 87.1% (2.6% higher than YOLOv8n-seg), enabling more precise estimation of residual feed quantity. Coupled with a 19.1% and 20.0% reduction in parameters and FLOPs, this model provides a practical solution for real-time monitoring, supporting feed waste reduction and intelligent feeding strategies. Full article
(This article belongs to the Section Fishery Facilities, Equipment, and Information Technology)
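The mAP50 figure quoted above rests on intersection-over-union: a predicted mask counts as a hit when its IoU with the ground-truth mask is at least 0.5. A minimal pixel-set sketch (toy masks, not model output):

```python
# IoU sketch underlying the mAP50 metric for instance segmentation.
# Masks here are toy sets of (row, col) pixel coordinates.

def mask_iou(pred, truth):
    inter = len(pred & truth)
    union = len(pred | truth)
    return inter / union if union else 0.0

truth = {(r, c) for r in range(4) for c in range(4)}      # 16-pixel feed pellet
pred  = {(r, c) for r in range(4) for c in range(1, 5)}   # prediction shifted one column
iou = mask_iou(pred, truth)
print(round(iou, 3), iou >= 0.5)  # 0.6 True
```

For clustered, adhesive pellets, overlap-based matching like this is exactly why the authors found instance segmentation more suitable than detecting individual boxes.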

50 pages, 3177 KB  
Review
Computational Entropy Modeling for Sustainable Energy Systems: A Review of Numerical Techniques, Optimization Methods, and Emerging Applications
by Łukasz Łach
Energies 2026, 19(3), 728; https://doi.org/10.3390/en19030728 - 29 Jan 2026
Abstract
Thermodynamic entropy generation quantifies irreversibility in energy conversion processes, providing rigorous thermodynamic foundations for optimizing efficiency and sustainability in thermal and energy systems. This critical review synthesizes advances in computational entropy modeling across numerical methods, optimization strategies, and sustainable energy applications. Computational fluid dynamics, finite element methods, and lattice Boltzmann methods enable spatially resolved entropy analysis in convective, conjugate, and microscale systems, but exhibit varying maturity levels and accuracy–cost trade-offs. The minimization of entropy generation and the integration of artificial intelligence demonstrate quantifiable performance improvements in heat exchangers, renewable energy systems, and smart grids, with reported efficiency gains of 15 to 39% in specific applications under controlled conditions. While overall performance depends critically on system scale, operating regime, and baseline configuration, persistent limitations still constrain practical deployment. Systematic conflation between thermodynamic entropy (quantifying physical irreversibility) and information entropy (measuring statistical uncertainty) leads to inappropriate method selection; validation challenges arise from entropy’s status as a non-directly-measurable state function; high-order maximum entropy models achieve superior uncertainty quantification but require prohibitive computational resources; and standardized benchmarking protocols remain absent. Research fragmentation across thermodynamics, information theory, and machine learning communities limits integrated frameworks capable of addressing multi-scale, transient, multiphysics systems. 
This review provides structured, cross-method, application-aware synthesis identifying where computational entropy modeling achieves industrial readiness versus research-stage development, offering forward-looking insights on physics-informed machine learning, unified theoretical frameworks, and real-time entropy-aware control as critical directions for advancing sustainable energy system design. Full article
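The irreversibility this review centers on has a compact closed form for heat transfer across a finite temperature difference, S_gen = Q(1/T_cold - 1/T_hot), which is always non-negative. A worked numeric sketch with illustrative values (not figures from the review):

```python
# Entropy generation rate for heat crossing a finite temperature gap.
# S_gen = Q * (1/T_cold - 1/T_hot) >= 0; values are illustrative.

def entropy_generation(q_watts, t_hot, t_cold):
    """Entropy generation rate (W/K) for heat flow q from t_hot to t_cold (K)."""
    if t_cold > t_hot:
        raise ValueError("heat must flow from hot to cold")
    return q_watts * (1.0 / t_cold - 1.0 / t_hot)

# 1 kW crossing a 400 K -> 350 K gap in a heat exchanger wall:
s_gen = entropy_generation(1000.0, 400.0, 350.0)
print(f"{s_gen:.3f} W/K")  # 0.357 W/K
```

Entropy generation minimization, as surveyed above, amounts to redesigning geometry and operating points so that terms like this one shrink across the whole device.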

18 pages, 1237 KB  
Article
Real-Time Robotic Navigation with Smooth Trajectory Using Variable Horizon Model Predictive Control
by Guopeng Wang, Guofu Ma, Dongliang Wang, Keqiang Bai, Weicheng Luo, Jiafan Zhuang and Zhun Fan
Electronics 2026, 15(3), 603; https://doi.org/10.3390/electronics15030603 - 29 Jan 2026
Abstract
This study addresses the challenges of real-time performance, safety, and trajectory smoothness in robot navigation by proposing an innovative variable-horizon model predictive control (MPC) scheme that utilizes evolutionary algorithms. To effectively adapt to the complex and dynamic conditions during navigation, a constrained multi-objective evolutionary algorithm is used to tune the control parameters precisely. The optimized parameters are then used to dynamically adjust the MPC’s prediction horizon online. To further enhance the system’s real-time performance, warm start and multiple shooting techniques are introduced, significantly improving the computational efficiency of the MPC. Finally, simulation and real-world experiments are conducted to validate the effectiveness of the proposed method. Experimental results demonstrate that the proposed control scheme exhibits excellent navigation performance in differential-drive robot models, offering a novel solution for intelligent mobile robot navigation. Full article
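The warm-start technique mentioned above reuses the previous horizon's solution, shifted by one step, as the initial guess for the next solve. The sketch below uses a stand-in iterative refiner (not a real optimizer) purely to show why a shifted guess converges in fewer iterations; all names and values are invented.

```python
# Warm-start sketch for receding-horizon MPC.
# The "solver" is a toy fixed-step refiner that counts iterations until the
# guess is close to a known optimum; real MPC uses a numerical optimizer.

def warm_start_guess(prev_solution, pad_value=0.0):
    """Drop the step already applied, append a neutral guess at the tail."""
    return prev_solution[1:] + [pad_value]

def iterations_to_converge(guess, optimum, step=0.25, tol=0.1):
    its = 0
    while max(abs(g - o) for g, o in zip(guess, optimum)) > tol:
        guess = [g + step * (o - g) for g, o in zip(guess, optimum)]
        its += 1
    return its

optimum = [1.0, 1.0, 0.5]
cold = iterations_to_converge([0.0, 0.0, 0.0], optimum)            # solve from scratch
warm = iterations_to_converge(warm_start_guess([0.9, 1.0, 1.0]), optimum)
print(cold, warm, warm < cold)
```

Starting near the previous optimum is what makes warm starting (together with multiple shooting) effective for the real-time performance gains the abstract reports.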

40 pages, 47306 KB  
Review
Advances in EMG Signal Processing and Pattern Recognition: Techniques, Challenges, and Emerging Applications
by Lasitha Piyathilaka, Jung-Hoon Sul, Sanura Dunu Arachchige, Amal Jayawardena and Diluka Moratuwage
Electronics 2026, 15(3), 590; https://doi.org/10.3390/electronics15030590 - 29 Jan 2026
Abstract
Electromyography (EMG) has become essential in biomedical engineering, rehabilitation, and human–machine interfacing due to its ability to capture neuromuscular activation for control, monitoring, and diagnosis. Recent advances in sensing hardware, high-density and flexible electrodes, and embedded acquisition modules combined with modern signal processing and machine learning have significantly enhanced the robustness and applicability of EMG-based systems. This review provides an integrated overview of EMG generation, acquisition standards, and preprocessing techniques, including adaptive filtering, wavelet denoising, and empirical mode decomposition. Feature extraction methods across the time, frequency, time–frequency, and nonlinear domains are compared with respect to computational efficiency and suitability for real-time systems. The review synthesizes classical and contemporary pattern-recognition approaches, from statistical classifiers to deep architectures such as CNNs, RNNs, hybrid CNN–RNN models, transformer-based networks, and graph neural networks. Key challenges, including signal non-stationarity, electrode displacement, muscle fatigue, and poor cross-user or cross-session generalization, are examined alongside emerging strategies such as transfer learning, domain adaptation, and multimodal fusion with IMU or FMG signals. Finally, the paper surveys rapidly growing EMG applications in prosthetics, rehabilitation robotics, human–machine interfaces, clinical diagnostics, and sports analytics. The review highlights ongoing limitations and outlines future pathways toward robust, adaptive, and deployable EMG-driven intelligent systems. Full article
(This article belongs to the Special Issue Image and Signal Processing Techniques and Applications)
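The time-domain features compared in this review (mean absolute value, root mean square, waveform length, zero crossings) follow standard textbook definitions; a minimal sketch for one analysis window, with an arbitrary test signal and zero-crossing threshold chosen here for illustration:

```python
import numpy as np

def time_domain_features(emg, zc_threshold=0.01):
    """Classic time-domain EMG features for one analysis window."""
    mav = np.mean(np.abs(emg))           # mean absolute value
    rms = np.sqrt(np.mean(emg ** 2))     # root mean square
    wl = np.sum(np.abs(np.diff(emg)))    # waveform length
    # zero crossings, counted only when the jump exceeds a noise threshold
    crossings = np.where(np.diff(np.sign(emg)) != 0)[0]
    zc = int(np.sum(np.abs(emg[crossings] - emg[crossings + 1]) > zc_threshold))
    return {"MAV": mav, "RMS": rms, "WL": wl, "ZC": zc}

# Synthetic window standing in for a band-passed EMG burst
window = 0.5 * np.sin(np.linspace(0, 4 * np.pi, 200))
feats = time_domain_features(window)
```

These features are popular for real-time systems precisely because each is a single vectorized pass over the window, with no transform step.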

70 pages, 1137 KB  
Review
A Review of Artificial Intelligence Techniques for Low-Carbon Energy Integration and Optimization in Smart Grids and Smart Homes
by Omosalewa O. Olagundoye, Olusola Bamisile, Chukwuebuka Joseph Ejiyi, Oluwatoyosi Bamisile, Ting Ni and Vincent Onyango
Processes 2026, 14(3), 464; https://doi.org/10.3390/pr14030464 - 28 Jan 2026
Abstract
The growing demand for electricity in residential sectors and the global need to decarbonize power systems are accelerating the transformation toward smart and sustainable energy networks. Smart homes and smart grids, integrating renewable generation, energy storage, and intelligent control systems, represent a crucial step toward achieving energy efficiency and carbon neutrality. However, ensuring real-time optimization, interoperability, and sustainability across these distributed energy resources (DERs) remains a key challenge. This paper presents a comprehensive review of artificial intelligence (AI) applications for sustainable energy management and low-carbon technology integration in smart grids and smart homes. The review explores how AI-driven techniques, including machine learning, deep learning, and bio-inspired optimization algorithms such as particle swarm optimization (PSO), the whale optimization algorithm (WOA), and the cuckoo optimization algorithm (COA), enhance forecasting, adaptive scheduling, and real-time energy optimization. These techniques have shown significant potential in improving demand-side management, dynamic load balancing, and renewable energy utilization efficiency. Moreover, AI-based home energy management systems (HEMSs) enable predictive control and seamless coordination between grid operations and distributed generation. This review also discusses current barriers, including data heterogeneity, computational overhead, and the lack of standardized integration frameworks. Future directions highlight the need for lightweight, scalable, and explainable AI models that support decentralized decision-making in cyber-physical energy systems. Overall, this paper emphasizes the transformative role of AI in enabling sustainable, flexible, and intelligent power management across smart residential and grid-level systems, supporting global energy transition goals and contributing to the realization of carbon-neutral communities. Full article
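To illustrate the bio-inspired optimization family the review surveys, a minimal PSO sketch: the toy cost function (flattening a 24-hour load profile toward a target) and all hyperparameter values are illustrative assumptions, not taken from any reviewed paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def cost(x):
    # Toy scheduling cost: squared deviation of each hourly load from a flat target of 1.0
    return float(np.sum((x - 1.0) ** 2))

n_particles, dim, iters = 20, 24, 100   # 24 dims = one value per hour
w, c1, c2 = 0.7, 1.5, 1.5               # inertia, cognitive, social weights

pos = rng.uniform(0.0, 2.0, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([cost(p) for p in pos])
gbest = pbest[np.argmin(pbest_cost)].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    # Velocity update pulls each particle toward its own best and the swarm best
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[np.argmin(pbest_cost)].copy()
```

WOA and COA follow the same population-based template; only the position-update rule differs.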

17 pages, 1152 KB  
Systematic Review
Use of Artificial Intelligence in Diagnosing Vertical Root Fractures—A Systematic Review
by Abdulmajeed Saeed Alshahrani, Ahmed Ali Alelyani, Ahmad Jabali, Ahmed Abdullah Al Malwi, Riyadh Alroomy, Amal S. Shaiban, Raid Abdullah Almnea, Vini Mehta and Mohammed M. Al Moaleem
Diagnostics 2026, 16(3), 406; https://doi.org/10.3390/diagnostics16030406 - 27 Jan 2026
Abstract
Background/Objectives: Vertical root fractures (VRFs) present significant diagnostic challenges due to their subtle radiographic features and variability across imaging modalities. Artificial intelligence (AI) offers potential to improve detection accuracy, yet evidence regarding its performance across different imaging systems remains fragmented. This review aimed to critically evaluate current evidence on AI-assisted detection of VRFs across periapical radiography, panoramic radiography, and cone-beam computed tomography (CBCT) and to compare diagnostic performance, methodological strengths, and limitations. Methods: A systematic review of literature up to January 2025 was carried out using databases such as PubMed, Scopus, Web of Science, and the Cochrane Library. The studies included in this review utilized AI-based techniques for detecting VRF through periapical, panoramic, or CBCT imaging. Extracted data encompassed study design, AI models, dataset sizes, preprocessing methods, imaging parameters, validation techniques, and diagnostic metrics. The risk of bias in these studies was evaluated using the QUADAS-2 tool. Results: Ten studies met inclusion criteria; CNN-based models predominated, with performance highly dependent on imaging modality. CBCT-based AI systems achieved the highest diagnostic accuracy (91.4–97.8%) and specificity (90.7–100%), followed by periapical radiography models with accuracies up to 95.7% in controlled settings. Panoramic radiography models demonstrated lower sensitivity (0.45–0.75) but maintained high precision (0.93) in certain contexts. Most studies reported improvements over human performance, yet limitations included small datasets, heterogeneous methodologies, and risk of overfitting. Conclusions: AI-assisted VRF detection shows promising accuracy, particularly with CBCT imaging, but current evidence is constrained by methodological variability and limited clinical validation. Full article
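The accuracy, sensitivity, specificity, and precision figures pooled above follow the standard confusion-matrix definitions; a minimal sketch, with the confusion-matrix counts below chosen purely for illustration:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard binary diagnostic metrics from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),              # recall on fractured roots
        "specificity": tn / (tn + fp),              # recall on intact roots
        "precision": tp / (tp + fp),                # positive predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical test set: 100 fractured and 100 intact roots
m = diagnostic_metrics(tp=90, fp=5, tn=95, fn=10)
```

Note that accuracy alone can mask the sensitivity/specificity trade-off the review reports for panoramic radiography, which is why all four metrics are extracted.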