Search Results (1,904)

Search Parameters:
Keywords = virtual machining

41 pages, 5882 KB  
Review
Development of an Advanced Multi-Layer Digital Twin Conceptual Framework for Underground Mining
by Carlos Cacciuttolo, Edison Atencio, Seyedmilad Komarizadehasl and Jose Antonio Lozano-Galant
Sensors 2025, 25(21), 6650; https://doi.org/10.3390/s25216650 - 30 Oct 2025
Abstract
Digital mining has been evolving in recent years under the Industry 4.0 paradigm. In this sense, technological tools such as sensors aid the management and operation of mining projects, reducing the risk of accidents, increasing productivity, and promoting business sustainability. DT is a technological tool that enables the integration of various Industry 4.0 technologies to create a virtual model of a real, physical entity, allowing for the study and analysis of the model’s behavior through real-time data collection. A digital twin of an underground mine is a real-time, virtual replica of an actual mine. It is like an extremely detailed “simulator” that uses data from sensors, machines, and personnel to accurately reflect what is happening in the mine at that very moment. Some of the functionalities of an underground mining DT include (i) accurate geometry of the real physical asset, (ii) real-time monitoring capability, (iii) anomaly prediction capability, (iv) scenario simulation, (v) lifecycle management to reduce costs, and (vi) a support system for smart and proactive decision-making. A digital twin of an underground mine offers transformative benefits, such as real-time operational optimization, improved safety through risk simulation, strategic planning with predictive scenarios, and cost reduction through predictive maintenance. However, its implementation faces significant challenges, including the high technical complexity of integrating diverse data, the high initial cost, organizational resistance to change, a shortage of skilled personnel, and the lack of a comprehensive, multi-layered conceptual framework for an underground mine digital twin. To overcome these barriers and gaps, this paper proposes a strategy that includes defining an advanced, multi-layered conceptual framework for the digital twin. Simultaneously, it advocates for fostering a culture of change through continuous training, establishing partnerships with specialized experts, and investing in robust sensor and connectivity infrastructure to ensure reliable, real-time data flow that feeds the digital twin. Finally, validation of the advanced multi-layered conceptual framework for digital twins of underground mines is carried out through a questionnaire administered to a panel of experts. Full article

14 pages, 20276 KB  
Article
A Discrete Space Vector Modulation MPC-Based Artificial Neural Network Controller for PMSM Drives
by Jiawei Guo, Takahiro Kawaguchi and Seiji Hashimoto
Machines 2025, 13(11), 996; https://doi.org/10.3390/machines13110996 - 30 Oct 2025
Abstract
In addition to the basic voltage vector modulation technique, virtual vectors can be generated through the discrete space vector modulation (DSVM) technique. Consequently, DSVM-based model predictive control (MPC) can reduce current harmonics and torque ripples in permanent magnet synchronous machine (PMSM) drives. However, as the number of candidate virtual voltage vectors becomes excessively large, the computational burden increases significantly. This paper proposes an artificial neural network (ANN) control algorithm in which large input and output datasets generated by an existing DSVM-MPC algorithm are used for offline ANN training. In this way, the ANN can efficiently select the optimal voltage vector without enumerating all candidate voltage vectors, thereby relieving the DSVM-MPC controller of its heavy online computation. Finally, the effectiveness of the proposed ANN controller is validated. Full article
(This article belongs to the Section Electrical Machines and Drives)
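
The offline-imitation idea above can be sketched in a few lines: an existing DSVM-MPC routine logs its state inputs and the vector indices it chose, and a small network learns that mapping so the online controller needs only one forward pass. The sketch below uses scikit-learn with placeholder data; the state features, the 38-vector candidate set, and the network size are illustrative assumptions, not the paper's implementation.

```python
# Sketch: offline training of an ANN to imitate a DSVM-MPC vector selector.
# Assumes a dataset (X, y) already generated by an existing DSVM-MPC routine:
#   X: rows of [i_d, i_q, i_d_ref, i_q_ref, omega_e]  (measured/reference states)
#   y: index of the optimal (virtual) voltage vector chosen by the MPC
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(20_000, 5))          # placeholder for logged MPC states
y = rng.integers(0, 38, size=20_000)      # placeholder for 38 candidate vectors

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
ann = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=200, random_state=0)
ann.fit(X_tr, y_tr)                        # offline training replaces online enumeration

# Online use: one forward pass instead of evaluating every candidate vector.
state = X_te[:1]
best_vector_index = int(ann.predict(state)[0])
```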

26 pages, 1048 KB  
Article
QRoNS: Quantum Resilience over IPsec Tunnels for Network Slicing
by Dimosthenis Iliadis-Apostolidis, Daniel Christian Lawo, Sokol Kosta, Idelfonso Tafur Monroy and Juan Jose Vegas Olmos
Electronics 2025, 14(21), 4234; https://doi.org/10.3390/electronics14214234 - 29 Oct 2025
Abstract
Modern high-performance network infrastructures must address the challenges of scalability and quantum-resistant security, particularly in multi-tenant and virtualized environments. In this work, we introduce a novel implementation of Post-Quantum Cryptography (PQC)-IPsec using the NVIDIA BlueField-3 Data Processing Unit (Santa Clara, CA, USA), capable of achieving 400 Gbit/s. We demonstrate line-rate performance through quantum-resilient communication channels using Kyber1024 (ML-KEM) and Dilithium5 (ML-DSA). We evaluate our implementation on two experimental setups: a host-to-host configuration and a 16 Virtual Machines (VMs)-to-host setup, both across a direct high-speed link. We operate the Data Processing Unit (DPU) both in Network Interface Card (NIC) mode, with no, crypto-only, and full packet offload, and in DPU mode, configuring Open vSwitch (OvS) on the ARM cores and offloading packet processing to the hardware. In DPU mode we achieve 97.5% of the available line rate for 14 VMs and 99.9% for 16 VMs. Our findings confirm that PQC-enabled IPsec can operate at line-rate speeds in modern data centers, providing a practical and future-proof foundation for secure, high-throughput communication in the post-quantum computing era. Full article
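
For readers unfamiliar with the ML-KEM handshake such a tunnel relies on, the sketch below shows the encapsulation/decapsulation exchange using the liboqs-python bindings (the `oqs` module). This is only an assumed illustration of the key-establishment step, not the authors' DPU-offloaded IPsec implementation, and the mechanism name ("Kyber1024" vs. "ML-KEM-1024") depends on the installed liboqs version.

```python
# Sketch of the quantum-resilient key exchange underlying such a tunnel, using
# the liboqs-python bindings ("oqs"); mechanism names vary with liboqs version.
import oqs

KEM_ALG = "Kyber1024"   # ML-KEM-1024 in newer liboqs releases

with oqs.KeyEncapsulation(KEM_ALG) as initiator, \
     oqs.KeyEncapsulation(KEM_ALG) as responder:
    public_key = initiator.generate_keypair()          # initiator -> responder
    ciphertext, secret_resp = responder.encap_secret(public_key)
    secret_init = initiator.decap_secret(ciphertext)   # responder -> initiator
    assert secret_init == secret_resp                  # shared IPsec keying material
```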

28 pages, 1286 KB  
Article
Stability Assessment of Fully Inverter-Based Power Systems Using Grid-Forming Controls
by Zahra Ahmadimonfared and Stefan Eichner
Electronics 2025, 14(21), 4202; https://doi.org/10.3390/electronics14214202 - 27 Oct 2025
Viewed by 137
Abstract
The displacement of synchronous machines by inverter-based resources raises critical concerns regarding the stability of future low-inertia power systems. Grid-forming (GFM) inverters offer a pathway to address these challenges by autonomously establishing voltage and frequency while emulating inertia and damping. This paper investigates the feasibility of operating a transmission-scale network with 100% GFM penetration by fully replacing all synchronous generators in the IEEE 39-bus system with a heterogeneous mix of droop, virtual synchronous machine (VSM), and synchronverter controls. System stability is assessed under a severe fault-initiated separation, focusing on frequency and voltage metrics defined through center-of-inertia formulations and standard acceptance envelopes. A systematic parameter sweep of virtual inertia (H) and damping (Dp) reveals their distinct and complementary roles: inertia primarily shapes the rate of change of frequency and the excursion depth, while damping governs convergence speed and steady-state accuracy. All tested parameter combinations remain within established stability limits, confirming the robust operability of a fully inverter-dominated grid. These findings demonstrate that properly tuned GFM inverters can enable secure and reliable operation of future power systems without reliance on synchronous machines. Full article
(This article belongs to the Topic Power System Dynamics and Stability, 2nd Edition)
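
A single-machine swing-equation surrogate is enough to illustrate the roles the sweep attributes to H and Dp. The sketch below (per-unit values chosen arbitrarily, not the paper's IEEE 39-bus parameters) shows that the initial rate of change of frequency scales with 1/(2H), while the damping term sets the steady-state deviation and the convergence speed.

```python
# Sketch: sweep virtual inertia H and damping D in a per-unit swing equation and
# record the initial RoCoF and frequency nadir after a power-imbalance step.
# Illustrative single-machine surrogate, not the IEEE 39-bus study itself.
import numpy as np

def swing_response(H, D, dP=-0.1, f0=50.0, t_end=10.0, dt=1e-3):
    """2H * d(dw)/dt = dP - D*dw   (all quantities in per unit)."""
    n = int(t_end / dt)
    dw = 0.0
    freq = np.empty(n)
    for k in range(n):
        dw += dt * (dP - D * dw) / (2.0 * H)
        freq[k] = f0 * (1.0 + dw)
    rocof = f0 * dP / (2.0 * H)          # initial rate of change of frequency, Hz/s
    return rocof, freq.min()

for H in (2.0, 5.0, 10.0):
    for D in (10.0, 25.0, 50.0):
        rocof, nadir = swing_response(H, D)
        print(f"H={H}, D={D}: RoCoF={rocof:6.3f} Hz/s, nadir={nadir:7.3f} Hz")
```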

23 pages, 1943 KB  
Article
Modeling of New Agents with Potential Antidiabetic Activity Based on Machine Learning Algorithms
by Yevhen Pruhlo, Ivan Iurchenko and Alina Tomenko
AppliedChem 2025, 5(4), 30; https://doi.org/10.3390/appliedchem5040030 - 27 Oct 2025
Viewed by 142
Abstract
Type 2 diabetes mellitus (T2DM) is a growing global health challenge, expected to affect over 600 million people by 2045. The discovery of new antidiabetic agents remains resource-intensive, motivating the use of machine learning (ML) for virtual screening based on molecular structure. In this study, we developed a predictive pipeline integrating two distinct descriptor types: high-dimensional numerical features from the Mordred library (>1800 2D/3D descriptors) and categorical ontological annotations from the ClassyFire and ChEBI systems, which encode hierarchical chemical classifications and functional group labels. The dataset included 45 active compounds and thousands of inactive molecules, depending on the descriptor system. To address class imbalance, we applied SMOTE and created balanced training and test sets while preserving independent validation sets. Thirteen ML models, including regression, SVM, naive Bayes, decision trees, ensemble methods, and others, were trained using stratified 12-fold cross-validation and evaluated on the training, test, and validation sets. Ridge Regression showed the best generalization (MCC = 0.814), followed by Gradient Boosting (MCC = 0.570). Feature importance analysis highlighted the complementary nature of the descriptors: Ridge Regression emphasized ClassyFire taxonomies such as CHEMONTID:0000229 and CHEBI:35622, while Mordred-based models (e.g., Random Forest) prioritized structural and electronic features like MAXsssCH and ETA_dEpsilon_D. This study is the first to systematically integrate and compare structural and ontological descriptors for antidiabetic compound prediction. The framework offers a scalable and interpretable approach to virtual screening and can be extended to other therapeutic domains to accelerate early-stage drug discovery. Full article
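
A minimal version of such an imbalanced-screening pipeline, assuming scikit-learn and imbalanced-learn and using random placeholder descriptors rather than the Mordred/ClassyFire features, might look like the sketch below; SMOTE is applied inside each cross-validation fold so the resampling never leaks into held-out data.

```python
# Sketch of a descriptor -> SMOTE -> classifier pipeline with stratified 12-fold
# cross-validation scored by Matthews correlation. X_desc / y_active are
# placeholders for the real descriptor matrix and activity labels.
import numpy as np
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.linear_model import RidgeClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.metrics import matthews_corrcoef, make_scorer

rng = np.random.default_rng(1)
X_desc = rng.normal(size=(500, 200))                  # placeholder descriptors
y_active = (rng.random(500) < 0.09).astype(int)       # heavily imbalanced labels

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("smote", SMOTE(random_state=1)),                 # oversample only inside each fold
    ("clf", RidgeClassifier()),
])
cv = StratifiedKFold(n_splits=12, shuffle=True, random_state=1)
mcc = cross_val_score(pipe, X_desc, y_active, cv=cv,
                      scoring=make_scorer(matthews_corrcoef))
print(f"MCC: {mcc.mean():.3f} +/- {mcc.std():.3f}")
```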

14 pages, 2105 KB  
Article
A Unified Control Strategy Integrating VSG and LVRT for Current-Source PMSGs
by Yang Yang, Zaijun Wu, Xiangjun Quan, Junjie Xiong, Zijing Wan and Zetao Wei
Processes 2025, 13(11), 3432; https://doi.org/10.3390/pr13113432 - 25 Oct 2025
Viewed by 379
Abstract
The growing penetration of renewable energy has reduced system inertia and damping, threatening grid stability. This paper proposes a novel control strategy that seamlessly integrates virtual synchronous generator (VSG) emulation with low-voltage ride-through (LVRT) capability for direct-drive permanent magnet synchronous generators (PMSGs). The unified control framework enables simultaneous inertia support during frequency disturbances and compliant reactive current injection during voltage sags, eliminating mode switching. Furthermore, the proposed strategy has been validated through both a single-machine model and an actual wind farm topology. The results demonstrate that the strategy achieves VSG control functionality while simultaneously meeting LVRT requirements. Full article
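
As a point of reference for the LVRT side of such a unified controller, the sketch below encodes a typical grid-code reactive-current characteristic (a k-factor rule with converter current limiting). The k = 2 slope, the 0.9 pu threshold, and the 1.0 pu current limit are illustrative assumptions, not the specific grid code the paper targets.

```python
# Sketch: a typical LVRT reactive-current characteristic of the kind such a
# unified controller must satisfy. Below 0.9 pu voltage, inject k*(0.9 - v) pu
# reactive current, then fit the active current into the 1.0 pu converter limit.
def lvrt_current_refs(v_pu: float, i_d_ref: float, k: float = 2.0,
                      i_max: float = 1.0) -> tuple[float, float]:
    i_q = min(i_max, k * max(0.0, 0.9 - v_pu))       # reactive current priority
    i_d = min(i_d_ref, (i_max**2 - i_q**2) ** 0.5)   # remaining headroom for active current
    return i_d, i_q

# Normal operation vs. a 0.4 pu voltage sag:
print(lvrt_current_refs(1.00, i_d_ref=0.8))   # -> (0.8, 0.0)
print(lvrt_current_refs(0.40, i_d_ref=0.8))   # -> (0.0, 1.0)
```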

33 pages, 3585 KB  
Article
Identifying the Location of Dynamic Load Using a Region’s Asymptotic Approximation
by Yuantian Qin, Jiakai Zheng and Vadim V. Silberschmidt
Aerospace 2025, 12(11), 953; https://doi.org/10.3390/aerospace12110953 - 24 Oct 2025
Viewed by 110
Abstract
Since it is difficult to obtain the positions of dynamic loads on structures, this paper suggests a new method to identify the locations of dynamic loads step by step based on the correlation coefficients of dynamic responses. First, a recognition model for the dynamic load position based on a finite-element scheme is established, with the finite-element domain divided into several regions. Second, virtual loads are applied at the central points of these regions, and acceleration responses are calculated at the sensor measurement points. Third, the maximum correlation coefficient between the calculated and measured accelerations is obtained, and the dynamic load is located in the region whose virtual load corresponds to the maximum correlation coefficient. Finally, this region is continuously subdivided with a refined mesh until the dynamic load is pinpointed in a sufficiently small area. Different virtual load construction methods are proposed according to different types of loads. The frequency response function, unresolvable for the actual problem due to the unknown location of the real dynamic load, can be transformed into a solvable form involving only known points. This transformation simplifies the analysis, making it more efficient and applicable to the study of the system's dynamic behavior. The identification of the dynamic load position in the entire structure is then transformed into a sub-region approach, focusing on the area where the dynamic load acts. Simulations for case studies demonstrate that the proposed method can effectively identify the positions of single and multiple dynamic loads. The correctness of the theory and simulation model is verified with experiments. Compared to recent methods that use machine learning and neural networks to identify positions of dynamic loads, the approach proposed in this paper avoids the heavy computational cost and time required for data training. Full article
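
The core region-selection step can be sketched directly: for each candidate region, compare the acceleration record produced by its virtual load with the measured record and keep the region with the highest correlation coefficient. The toy signals below stand in for the finite-element responses; in the method above, the winning region would then be re-meshed and the step repeated.

```python
# Sketch of the correlation-based region selection step: given the measured
# sensor record and one simulated record per candidate region (toy signals here),
# keep the region whose response correlates best with the measurement.
import numpy as np

def best_region(measured: np.ndarray, candidate_responses: list[np.ndarray]) -> int:
    """measured: (n_samples,) sensor record; candidate_responses: one record per region."""
    scores = [abs(np.corrcoef(measured, resp)[0, 1]) for resp in candidate_responses]
    return int(np.argmax(scores))

# Toy usage: region 2 hosts the "true" load, so its response matches best.
t = np.linspace(0.0, 1.0, 1000)
true_response = np.sin(40 * t) * np.exp(-2 * t)
candidates = [np.sin((30 + 5 * r) * t) * np.exp(-2 * t) for r in range(4)]
print(best_region(true_response, candidates))   # -> 2
```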

27 pages, 1802 KB  
Perspective
Toward Artificial Intelligence in Oncology and Cardiology: A Narrative Review of Systems, Challenges, and Opportunities
by Visar Vela, Ali Yasin Sonay, Perparim Limani, Lukas Graf, Besmira Sabani, Diona Gjermeni, Andi Rroku, Arber Zela, Era Gorica, Hector Rodriguez Cetina Biefer, Uljad Berdica, Euxhen Hasanaj, Adisa Trnjanin, Taulant Muka and Omer Dzemali
J. Clin. Med. 2025, 14(21), 7555; https://doi.org/10.3390/jcm14217555 - 24 Oct 2025
Viewed by 303
Abstract
Background: Artificial intelligence (AI), the overarching field that includes machine learning (ML) and its subfield deep learning (DL), is rapidly transforming clinical research by enabling the analysis of high-dimensional data and automating the output of diagnostic and prognostic tests. As clinical trials become increasingly complex and costly, ML-based approaches (especially DL for image and signal data) offer promising solutions, although they require new approaches in clinical education. Objective: To explore current and emerging AI applications in oncology and cardiology, highlight real-world use cases, and discuss the challenges and future directions for responsible AI adoption. Methods: This narrative review summarizes various aspects of AI technology in clinical research, exploring its promise, use cases, and limitations. The review was based on a literature search in PubMed covering publications from 2019 to 2025. Search terms included “artificial intelligence”, “machine learning”, “deep learning”, “oncology”, “cardiology”, “digital twin”, and “AI-ECG”. Preference was given to studies presenting validated or clinically applicable AI tools, while non-English articles, conference abstracts, and gray literature were excluded. Results: AI demonstrates significant potential in improving diagnostic accuracy, facilitating biomarker discovery, and detecting disease at an early stage. In clinical trials, AI improves patient stratification, site selection, and virtual simulations via digital twins. However, challenges remain in harmonizing data, validating models, providing cross-disciplinary training, and ensuring fairness, explainability, and the robustness of the gold standards against which AI models are built. Conclusions: The integration of AI in clinical research can enhance efficiency, reduce costs, and lead the way towards personalized medicine. Realizing this potential requires robust validation frameworks, transparent model interpretability, and collaborative efforts among clinicians, data scientists, and regulators. Interoperable data systems and cross-disciplinary education will be critical to enabling the integration of scalable, ethical, and trustworthy AI into healthcare. Full article
(This article belongs to the Section Clinical Research Methods)

21 pages, 7994 KB  
Article
Power Analysis Produced by Virtual Inertia in Single-Phase Grid-Forming Converters Under Frequency Events Intended for Bidirectional Battery Chargers
by Erick Pantaleon, Jhonatan Paucara and Damián Sal y Rosas
Energies 2025, 18(21), 5560; https://doi.org/10.3390/en18215560 - 22 Oct 2025
Viewed by 249
Abstract
The widespread integration of renewable energy sources (RESs) into the grid through inertia-less power converters is reducing the overall system inertia, leading to large frequency variations. To mitigate this issue, grid-forming (GFM) control strategies in bidirectional battery chargers have emerged as a promising solution, since the inertial response of synchronous generators (SGs) can be emulated by power converters. However, unlike SGs, which can withstand currents above their rated values, the output current of a power converter is limited to its nominal design value. Therefore, estimating the power delivered by the GFM power converter during frequency events, known as Virtual Inertia (VI) support, is essential to prevent exceeding the rated current. This article analyzes the VI response of GFM power converters, classifying the dynamic behavior as underdamped, critically damped, or overdamped according to the selected inertia constant and damping coefficient, which are parameters of the GFM control strategy. Subsequently, the transient power response under step-shaped and ramp-shaped frequency deviations is quantified. The proposed analysis is validated using a 1.2 kW single-phase power converter. The simulation and experimental results confirm that the overdamped response under a ramp-shaped frequency event shows higher fidelity to the theoretical model. Full article
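
The underdamped/critically damped/overdamped classification follows from treating the linearised GFM power loop as a second-order system in the power angle. The sketch below computes the damping ratio for a few damping coefficients; the values of J, D, and the synchronising coefficient Ks are illustrative assumptions, not the 1.2 kW prototype's parameters.

```python
# Sketch: classifying the virtual-inertia power response from the linearised
# GFM power loop  J*delta'' + D*delta' + Ks*delta = dP, where Ks ~ E*V/X is the
# synchronising coefficient. Parameter values are illustrative only.
import math

def classify_vi_response(J: float, D: float, Ks: float) -> str:
    wn = math.sqrt(Ks / J)                 # natural frequency, rad/s
    zeta = D / (2.0 * math.sqrt(Ks * J))   # damping ratio
    if math.isclose(zeta, 1.0, rel_tol=1e-3):
        kind = "critically damped"
    elif zeta < 1.0:
        kind = "underdamped"
    else:
        kind = "overdamped"
    return f"wn={wn:.2f} rad/s, zeta={zeta:.2f} -> {kind}"

for D in (5.0, 20.0, 60.0):
    print(classify_vi_response(J=0.2, D=D, Ks=500.0))
```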

13 pages, 504 KB  
Article
MambaNet0: Mamba-Based Sustainable Cloud Resource Prediction Framework Towards Net Zero Goals
by Thananont Chevaphatrakul, Han Wang and Sukhpal Singh Gill
Future Internet 2025, 17(10), 480; https://doi.org/10.3390/fi17100480 - 21 Oct 2025
Viewed by 376
Abstract
With the ever-growing reliance on cloud computing, efficient resource allocation is crucial for maximising the effective use of resources provisioned from cloud service providers. Proactive resource management is therefore critical for minimising costs and striving towards net zero emission goals. One of the most promising methods involves the use of Artificial Intelligence (AI) techniques to analyse and predict resource demand, such as cloud CPU utilisation. This paper presents MambaNet0, a Mamba-based cloud resource prediction framework. The model is implemented on Google's Vertex AI workbench and uses the real-world Bitbrains Grid Workload Archive-T-12 dataset, which contains the resource usage metrics of 1750 virtual machines. The Mamba model's performance is then evaluated against established baseline models, including Autoregressive Integrated Moving Average (ARIMA), Long Short-Term Memory (LSTM), and Amazon Chronos, to demonstrate its potential for accurate prediction of CPU utilisation. The MambaNet0 model achieved a 29% improvement in Symmetric Mean Absolute Percentage Error (SMAPE) compared to the best-performing baseline, Amazon Chronos. These findings reinforce the Mamba model's ability to accurately forecast CPU utilisation, highlighting its potential for optimising cloud resource allocation in support of net zero goals. Full article
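
For context on the headline metric, the snippet below shows one common SMAPE convention and how a relative-improvement figure such as the quoted 29% is derived; the exact SMAPE variant and the numbers are placeholders, not values from the paper.

```python
# Sketch: Symmetric Mean Absolute Percentage Error and the relative-improvement
# calculation used to compare forecasters; illustrative numbers only.
import numpy as np

def smape(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """SMAPE in percent (one common convention; the paper's exact variant may differ)."""
    denom = (np.abs(y_true) + np.abs(y_pred)) / 2.0
    return 100.0 * np.mean(np.abs(y_pred - y_true) / denom)

def relative_improvement(smape_new: float, smape_baseline: float) -> float:
    return 100.0 * (smape_baseline - smape_new) / smape_baseline

y_true = np.array([40.0, 55.0, 63.0, 48.0])       # e.g. CPU utilisation (%)
y_pred = np.array([42.0, 52.0, 60.0, 50.0])
print(f"SMAPE = {smape(y_true, y_pred):.2f}%")
print(f"Improvement vs. baseline: {relative_improvement(12.0, 17.0):.0f}%")  # ~29%
```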

27 pages, 2075 KB  
Review
Physics-Informed Machine Learning for Intelligent Gas Turbine Digital Twins: A Review
by Hiyam Farhat and Amani Altarawneh
Energies 2025, 18(20), 5523; https://doi.org/10.3390/en18205523 - 20 Oct 2025
Viewed by 600
Abstract
This review surveys recent progress in hybrid artificial intelligence (AI) approaches for gas turbine intelligent digital twins, with an emphasis on models that integrate physics-based simulations and machine learning. The main contribution is the introduction of a structured classification of hybrid AI methods tailored to gas turbine applications, the development of a novel comparative maturity framework, and the proposal of a layered roadmap for integration. The classification organizes hybrid AI approaches into four categories: (1) artificial neural network (ANN)-augmented thermodynamic models, (2) physics-integrated operational architectures, (3) physics-constrained neural networks (PcNNs) with computational fluid dynamics (CFD) surrogates, and (4) generative and model discovery approaches. The maturity framework evaluates these categories across five criteria: data dependency, interpretability, deployment complexity, workflow integration, and real-time capability. Industrial case studies—including General Electric (GE) Vernova’s SmartSignal, Siemens’ Autonomous Turbine Operation and Maintenance (ATOM), and the Electric Power Research Institute (EPRI) turbine digital twin—illustrate applications in real-time diagnostics, predictive maintenance, and performance optimization. Together, the classification and maturity framework provide the means for systematic assessment of hybrid AI methods in gas turbine intelligent digital twins. The review concludes by identifying key challenges and outlining a roadmap for the future development of scalable, interpretable, and operationally robust intelligent digital twins for gas turbines. Full article
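
The defining trait of category (3), physics-constrained neural networks, is a loss that mixes a data-fit term with a governing-equation residual. The toy below illustrates that structure with Newton cooling standing in for turbine thermodynamics; the network, loss weights, and equation are illustrative assumptions, not any of the cited industrial systems.

```python
# Toy sketch of a physics-constrained training loss: a network fits sparse sensor
# data while penalising violation of a governing equation. Here the "physics" is
# Newton cooling dT/dt = -k (T - T_amb); everything below is illustrative.
import torch

k, T_amb = 0.5, 300.0
net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

t_data = torch.tensor([[0.0], [1.0], [4.0]])                 # sparse "sensor" samples
T_data = T_amb + 200.0 * torch.exp(-k * t_data)
t_phys = torch.linspace(0.0, 6.0, 60).reshape(-1, 1).requires_grad_(True)

for step in range(2000):
    opt.zero_grad()
    loss_data = ((net(t_data) - T_data) ** 2).mean()         # data-fit term
    T_p = net(t_phys)
    dT_dt = torch.autograd.grad(T_p.sum(), t_phys, create_graph=True)[0]
    residual = dT_dt + k * (T_p - T_amb)                     # physics residual
    loss = loss_data + 0.1 * residual.pow(2).mean()
    loss.backward()
    opt.step()
```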

25 pages, 2621 KB  
Article
Analysis of a Driving Simulator’s Steering System for the Evaluation of Autonomous Vehicle Driving
by Juan F. Dols, Samuel Boix, Jaime Molina, Sara Moll, Francisco J. Camacho and Griselda López
Sensors 2025, 25(20), 6471; https://doi.org/10.3390/s25206471 - 20 Oct 2025
Viewed by 424
Abstract
The integration of autonomous vehicles (AVs) into road transport requires robust experimental tools to analyze the human–machine interaction, particularly under conditions of system disengagement. This study presents the primary-controls calibration and virtual-scenario validation of the EVACH autonomous driving simulator, designed to reproduce the SAE Level 2 and Level 3 driving modes in rural road scenarios. The simulator was customized through hardware and software developments, including a dedicated data acquisition system, to ensure the accurate detection of braking, steering, and other critical control inputs. Calibration tests demonstrated high fidelity, with minor errors in brake and steering control measurements, consistent with values observed in production vehicles. To validate the virtual rural driving environment, comparative experiments were conducted between naturalistic road tests and simulator-based autonomous driving, in which five volunteers participated in a preliminary pilot test. The results showed that average speeds in the simulation closely matched those recorded on real roads, with differences of less than 1 km/h and minimal standard deviations and confidence intervals. These findings confirm that the EVACH simulator provides a stable and faithful reproduction of autonomous driving conditions. The experimental platform offers valuable support for current and future research on the safe deployment of automated vehicles. Full article

11 pages, 649 KB  
Review
A Narrative Review of Photon-Counting CT and Radiomics in Cardiothoracic Imaging: A Promising Match?
by Salvatore Claudio Fanni, Ilaria Ambrosini, Francesca Pia Caputo, Maria Emanuela Cuibari, Domitilla Deri, Alessio Guarracino, Camilla Guidi, Vincenzo Uggenti, Giancarlo Varanini, Emanuele Neri, Dania Cioni, Mariano Scaglione and Salvatore Masala
Diagnostics 2025, 15(20), 2631; https://doi.org/10.3390/diagnostics15202631 - 18 Oct 2025
Viewed by 364
Abstract
Photon-counting computed tomography (PCCT) represents a major technological innovation compared to conventional CT, offering improved spatial resolution, reduced electronic noise, and intrinsic spectral capabilities. These advances open new perspectives for synergy with radiomics, a field that extracts quantitative features from medical images. The ability of PCCT to generate multiple types of datasets, including high-resolution conventional images, iodine maps, and virtual monoenergetic reconstructions, increases the richness of extractable features and potentially enhances radiomics performance. This narrative review investigates the current evidence on the interplay between PCCT and radiomics in cardiothoracic imaging. Phantom studies demonstrate reduced reproducibility between PCCT and conventional CT systems, while intra-scanner repeatability remains high. Nonetheless, PCCT introduces additional complexity, as reconstruction parameters and acquisition settings may significantly affect feature stability. In chest imaging, early studies suggest that PCCT-derived features may improve nodule characterization, but existing machine learning models, such as those applied to interstitial lung disease, may require recalibration to accommodate the new imaging paradigm. In cardiac imaging, PCCT has shown particular promise: radiomic features extracted from myocardial and epicardial tissues can provide additional diagnostic insights, while spectral reconstructions improve plaque characterization. Proof-of-concept studies already suggest that PCCT radiomics can capture myocardial aging patterns and discriminate high-risk coronary plaques. In conclusion, the evidence supports a growing synergy between PCCT and radiomics, with applications already emerging in both lung and cardiac imaging. By enhancing the reproducibility and richness of quantitative features, PCCT may significantly broaden the clinical potential of radiomics in computed tomography. Full article
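
In practice, the feature-extraction step behind such reproducibility comparisons is often done with pyradiomics; the sketch below extracts the same first-order features from several PCCT reconstructions of one lesion. The file names, the single shared mask, and the choice of feature class are hypothetical placeholders, not any specific study's setup.

```python
# Sketch: extracting the same radiomic features from several PCCT reconstructions
# of one lesion using pyradiomics (paths are hypothetical placeholders).
from radiomics import featureextractor

extractor = featureextractor.RadiomicsFeatureExtractor()
extractor.disableAllFeatures()
extractor.enableFeatureClassByName("firstorder")   # e.g. first-order statistics only

reconstructions = {
    "conventional": "lesion_conventional.nrrd",
    "iodine_map": "lesion_iodine.nrrd",
    "vmi_50keV": "lesion_vmi50.nrrd",
}
mask_path = "lesion_mask.nrrd"                      # same segmentation for all datasets

for name, image_path in reconstructions.items():
    features = extractor.execute(image_path, mask_path)
    print(name, features.get("original_firstorder_Mean"))
```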

36 pages, 552 KB  
Review
Review of Applications of Regression and Predictive Modeling in Wafer Manufacturing
by Hsuan-Yu Chen and Chiachung Chen
Electronics 2025, 14(20), 4083; https://doi.org/10.3390/electronics14204083 - 17 Oct 2025
Viewed by 658
Abstract
Semiconductor wafer manufacturing is one of the most complex and data-intensive industrial processes, comprising 500–1000 tightly interdependent steps, each requiring nanometer-level precision. As device nodes approach 3 nm and beyond, even minor deviations in parameters such as oxide thickness or critical dimensions can lead to catastrophic yield loss, challenging traditional physics-based control methods. In response, the industry has increasingly adopted regression analysis and predictive modeling as essential analytical frameworks. Classical regression, long used to support design of experiments (DOE), process optimization, and yield analysis, has evolved to enable multivariate modeling, virtual metrology, and fault detection. Predictive modeling extends these capabilities through machine learning and AI, leveraging massive sensor and metrology data streams for real-time process monitoring, yield forecasting, and predictive maintenance. These data-driven tools are now tightly integrated into advanced process control (APC), digital twins, and automated decision-making systems, transforming fabs into agile, intelligent manufacturing environments. This review synthesizes foundational and emerging methods, industry applications, and case studies, emphasizing their role in advancing Industry 4.0 initiatives. Future directions include hybrid physics–ML models, explainable AI, and autonomous manufacturing. Together, regression and predictive modeling provide semiconductor fabs with a robust ecosystem for optimizing performance, minimizing costs, and accelerating innovation in an increasingly competitive, high-stakes industry. Full article
(This article belongs to the Special Issue Advances in Semiconductor Devices and Applications)
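
A stripped-down virtual-metrology example makes the regression framing concrete: per-wafer sensor summaries predict a metrology target so that not every wafer needs a physical measurement. Everything below (synthetic data, Ridge model, control-limit flag) is an illustrative sketch, not a description of any fab's deployed model.

```python
# Sketch: a minimal virtual-metrology regression mapping equipment sensor
# summaries to a synthetic "oxide thickness" target; real fabs use far richer
# features and model-update schemes.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 30))                              # per-wafer sensor summaries
w = rng.normal(size=30)
y = 50.0 + 0.2 * (X @ w) + rng.normal(scale=0.3, size=2000)  # synthetic thickness (nm)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=42)
vm = Ridge(alpha=1.0).fit(X_tr, y_tr)
pred = vm.predict(X_te)
print(f"MAE: {mean_absolute_error(y_te, pred):.3f} nm")

# Control-chart-style screen: flag wafers whose predicted value leaves +/- 3 sigma
# of the training distribution, so they can be routed to physical metrology.
ucl, lcl = y_tr.mean() + 3 * y_tr.std(), y_tr.mean() - 3 * y_tr.std()
print(f"Flagged wafers: {int(((pred > ucl) | (pred < lcl)).sum())}")
```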

29 pages, 1030 KB  
Protocol
Secondary Prevention of AFAIS: Deploying Traditional Regression, Machine Learning, and Deep Learning Models to Validate and Update CHA2DS2-VASc for 90-Day Recurrence
by Jenny Simon, Łukasz Kraiński, Michał Karliński, Maciej Niewada and on behalf of the VISTA-Acute Collaboration
J. Clin. Med. 2025, 14(20), 7327; https://doi.org/10.3390/jcm14207327 - 16 Oct 2025
Viewed by 399
Abstract
Backgrounds/Objectives: Atrial fibrillation (AF) confers a fivefold greater risk of acute ischaemic stroke (AIS) relative to normal sinus rhythm. Among patients with AF-related AIS (AFAIS), recurrence is common: AFAIS rate is sixfold higher in secondary versus primary prevention patients. Guidelines recommend oral anticoagulation for primary and secondary prevention on the basis of CHA2DS2-VASc. However, guideline adherence is poor for secondary prevention. This is, in part, because the predictive value of CHA2DS2-VASc has not been ascertained with respect to recurrence: patients with and without previous stroke were not routinely differentiated in validation studies. We put forth a protocol to (1) validate, and (2) update CHA2DS2-VASc for secondary prevention, aiming to deliver a CPR that better captures 90-day recurrence risk for a given AFAIS patient. Overwhelmingly poor quality of reporting has been deplored among published clinical prediction rules (CPRs). Combined with the fact that machine learning (ML) and deep learning (DL) methods are rife with challenges, registered protocols are needed to make the CPR literature more validation-oriented, transparent, and systematic. This protocol aims to lead by example for prior planning of primary and secondary analyses to obtain incremental predictive value for existing CPRs. Methods: The Virtual International Stroke Trials Archive (VISTA), which has compiled data from 38 randomised controlled trials (RCTs) in AIS, was screened for patients that (1) had an AF diagnosis, and (2) were treated with vitamin K antagonists (VKAs) or without any antithrombotic medication. This yielded 2763 AFAIS patients. Patients without an AF diagnosis were also retained under the condition that they were treated with VKAs or without any antithrombotic medication, which yielded 7809 non-AF AIS patients. We will validate CHA2DS2-VASc for 90-day recurrence and secondary outcomes (7-day recurrence, 7- and 90-day haemorrhagic transformation, 90-day decline in functional status, and 90-day all-cause mortality) by examining discrimination, calibration, and clinical utility. To update CHA2DS2-VASc, logistic regression (LR), extreme gradient boosting (XGBoost), and multilayer perceptron (MLP) models will be trained using nested cross-validation. The MLP model will employ transfer learning to leverage information from the non-AF AIS patient cohort. Results: Models will be assessed on a hold-out test set (25%) using area under the receiver operating characteristic curve (AUC), calibration curves, and F1 score. Shapley additive explanations (SHAP) will be used to interpret the models and construct the updated CPRs. Conclusions: The CPRs will be compared by means of discrimination, calibration, and clinical utility. In so doing, the CPRs will be evaluated against each other, CHA2DS2-VASc, and default strategies, with test tradeoff analysis performed to balance ease-of-use with clinical utility. Full article
(This article belongs to the Special Issue Application of Anticoagulation and Antiplatelet Therapy)
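
The nested cross-validation the protocol prescribes separates hyperparameter tuning (inner folds) from performance estimation (outer folds). A minimal scikit-learn sketch of that structure, on placeholder data rather than the VISTA cohort and with only a logistic-regression arm, is shown below.

```python
# Sketch of nested cross-validation: an inner loop tunes hyperparameters, an
# outer loop estimates generalisation. Data, features, and event rate are
# placeholders; the real protocol adds XGBoost/MLP arms, calibration, and SHAP.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score

rng = np.random.default_rng(7)
X = rng.normal(size=(2763, 12))                 # e.g. CHA2DS2-VASc items + candidate predictors
y = (rng.random(2763) < 0.07).astype(int)       # placeholder 90-day recurrence labels

inner = StratifiedKFold(n_splits=5, shuffle=True, random_state=7)
outer = StratifiedKFold(n_splits=5, shuffle=True, random_state=7)
model = GridSearchCV(
    LogisticRegression(max_iter=1000, class_weight="balanced"),
    param_grid={"C": [0.01, 0.1, 1.0, 10.0]},
    cv=inner, scoring="roc_auc",
)
auc = cross_val_score(model, X, y, cv=outer, scoring="roc_auc")
print(f"Nested-CV AUC: {auc.mean():.3f} +/- {auc.std():.3f}")
```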
