Search Results (2,085)

Search Parameters:
Keywords = virtual machines

50 pages, 56524 KB  
Review
Toward Digital Twins in 3D IC Packaging: A Critical Review of Physics, Data, and Hybrid Architectures
by Gourab Datta, Sarah Safura Sharif and Yaser Mike Banad
Electronics 2026, 15(8), 1740; https://doi.org/10.3390/electronics15081740 - 20 Apr 2026
Abstract
Three-dimensional integrated circuit (3D IC) packaging and heterogeneous integration have emerged as central pillars of contemporary semiconductor scaling. Yet, the multi-physics coupling inherent to stacked architectures manifesting as thermal hot spots, warpage-induced stresses, and interconnect aging demands monitoring and control capabilities that surpass traditional offline metrology. Although Digital Twin (DT) technology provides a principled route to real-time reliability management, the existing literature remains fragmented and frequently blurs the distinction between static multi-physics simulation workflows and truly dynamic, closed-loop twins. This critical review addresses these deficiencies through three main contributions. First, we clarify the Digital Twin hierarchy to resolve terminological ambiguity between digital models, shadows, and twins. Second, we synthesize three foundational enabling technologies. We examine physics-based modeling, emphasizing the shift from finite-element analysis (FEA) to real-time surrogates. We analyze data-driven paradigms, highlighting virtual metrology (VM) for inferring latent metrics. Finally, we explore in situ sensing, which serves as the “nervous system” coupling the physical stack to its virtual counterpart. Third, beyond a descriptive survey, we outline a possible hybrid DT architecture that leverages physics-informed machine learning (e.g., PINNs) to help reconcile data scarcity with latency constraints. Finally, we outline a standards-aligned roadmap incorporating IEEE 1451 and UCIe protocols to support the transition from passive digital shadows toward more adaptive and fully coupled Digital Twin frameworks for 3D IC manufacturing and field operation. Full article
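The review's first contribution is disentangling digital models, shadows, and twins. One commonly used way to draw that hierarchy (an assumption here, not spelled out in the abstract) is by how automated the data links are between the physical asset and its virtual counterpart:

```python
def dt_level(physical_to_virtual_automated: bool,
             virtual_to_physical_automated: bool) -> str:
    """Classify a digital representation by the automation of its data links.

    Assumed taxonomy (illustrative, not the authors' definition):
    - digital model:  both links are manual
    - digital shadow: automated physical -> virtual link only
    - digital twin:   automated links in both directions (closed loop)
    """
    if physical_to_virtual_automated and virtual_to_physical_automated:
        return "digital twin"
    if physical_to_virtual_automated:
        return "digital shadow"
    return "digital model"
```

Under this reading, the "passive digital shadows" the roadmap aims to move beyond are exactly the configurations where the virtual-to-physical link is still manual.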
16 pages, 1926 KB  
Article
Performance Evaluation of a Cloud-Native Open-Source Power System Digital Twin for Real-Time Simulation
by Juan-Pablo Noreña and Ernesto Perez
Energies 2026, 19(8), 1982; https://doi.org/10.3390/en19081982 - 20 Apr 2026
Abstract
The increasing complexity of Cyber-Physical Energy Systems, driven by the high penetration of power electronics, advanced control, and digitalization, demands scalable, flexible real-time simulation platforms beyond traditional laboratory-based solutions. This paper investigates the feasibility of deploying open-source real-time power system simulation frameworks on cloud-based infrastructures while meeting real-time computational constraints. An open-source architecture based on DPsim and the VILLAS framework is implemented and evaluated across five computing environments using open-source tools: bare-metal, non-cloud virtual machines, private cloud Kubernetes clusters, public cloud virtual machines, and public cloud Kubernetes clusters. Each environment is carefully configured and tuned using real-time operating systems, CPU isolation, and affinity mechanisms to improve deterministic behavior. Performance and scalability are assessed through a benchmark based on replicated IEEE 9-bus systems, progressively increasing system size, and measuring simulation-timestep execution time. The results show that cloud and cloud-like infrastructures can support soft and, under controlled conditions, firm real-time simulation tasks, although achievable system scale decreases as additional abstraction layers are introduced. The study identifies practical performance limits for each infrastructure and discusses their suitability for different real-time simulation and co-simulation applications. These findings demonstrate that cloud-based real-time simulation can complement traditional digital real-time simulators, enabling scalable and cost-effective CPES experimentation. Full article
(This article belongs to the Section F1: Electrical Power System)
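The benchmark above boils down to measuring per-timestep execution time against a real-time deadline: a firm real-time run tolerates no deadline overruns, a soft real-time run tolerates a small fraction. A hypothetical post-processing helper for such a log (names illustrative, unrelated to DPsim/VILLAS internals) might look like this:

```python
def timestep_stats(durations_us, deadline_us):
    """Summarize measured per-timestep execution times (microseconds)
    against a real-time deadline. Zero overrun_fraction would indicate
    firm real-time behavior; a small nonzero value, soft real-time."""
    overruns = sum(1 for d in durations_us if d > deadline_us)
    return {
        "mean_us": sum(durations_us) / len(durations_us),
        "max_us": max(durations_us),
        "overrun_fraction": overruns / len(durations_us),
    }
```

Scaling the replicated IEEE 9-bus benchmark then amounts to watching `max_us` and `overrun_fraction` grow as system size increases on each infrastructure.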
19 pages, 4280 KB  
Article
Adaptive Recursive Model Predictive Current Control for Linear Motor Drives in CNC Machine Tools Based on Cartesian Distance Minimization
by Lin Song, Ziling Nie, Jun Sun, Yangwei Zhou, Jingxin Yuan and Huayu Li
Mathematics 2026, 14(8), 1377; https://doi.org/10.3390/math14081377 - 20 Apr 2026
Abstract
With the increasing demand for high speed and high-precision motion control in CNC machine tools, permanent magnet linear synchronous motors (PMLSMs) have been widely adopted in feed drive systems due to their excellent dynamic performance and positioning accuracy. However, existing model predictive current control (MPCC) variants still face challenges regarding high computational overhead and strong dependency on accurate motor parameters, which limit their industrial applicability. To address these issues, this paper proposes an adaptive recursive MPCC for PMLSM drives based on the Cartesian distance minimization principle. An adaptive recursive prediction scheme that is inspired by the feedback structure of recurrent architectures is first introduced. By cyclically utilizing the previously sampled current to predict the next period’s state, the strategy effectively decouples the control law from inductance variations. The dependence on resistance is further mitigated by analyzing the correlation between the ideal current vector and voltage vector deviations. Second, the selection of the optimal voltage vector is transformed into a geometric problem: minimizing the Cartesian distance between the reference voltage and 19 candidate deviations within a proposed virtual voltage vector hexagon. To minimize the computational burden, the vector space is partitioned into eight regions, allowing the optimal candidate to be selected from only two pre-derived deviations. The experimental results demonstrate that the proposed method significantly outperforms existing MPCC benchmarks. Specifically, the execution time is reduced by 63.6%. Under severe parameter mismatch, the current THD is reduced from 14.82% to 6.35%, and the thrust ripple is improved from 12.06 N to 5.25 N, validating its superior robustness and efficiency. Full article
(This article belongs to the Special Issue Advances in Control Theory and Applications in Energy Systems)
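The core selection step the abstract describes is geometric: pick the candidate voltage vector whose Cartesian distance to the reference is smallest. A minimal sketch of that step alone (illustrative Python, not the authors' implementation; the region partitioning that narrows 19 candidates down to 2 is omitted):

```python
import math

def select_voltage_vector(v_ref, candidates):
    """Return the candidate (alpha, beta) voltage vector closest to v_ref
    by Euclidean (Cartesian) distance in the stationary frame."""
    return min(candidates,
               key=lambda v: math.hypot(v[0] - v_ref[0], v[1] - v_ref[1]))
```

The paper's eight-region partition of the vector space serves only to avoid evaluating this distance for all 19 candidates, which is where the reported 63.6% execution-time reduction comes from.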
33 pages, 2685 KB  
Review
Comparative Molecular Insights and Computational Modeling of Multiple Myeloma and Osteosarcoma
by Alina Ioana Ghiță, Vadim V. Silberschmidt and Mariana Ioniță
Int. J. Mol. Sci. 2026, 27(8), 3611; https://doi.org/10.3390/ijms27083611 - 18 Apr 2026
Abstract
Multiple myeloma (MM) and osteosarcoma (OS) are two biologically distinct osseous malignancies with similar molecular networks that present translational challenges for their computational modeling. This comparative research analyzes MM and OS biology relevant to in silico approaches, focusing on PI3K-AKT-mTOR signaling, the RANK-RANKL-OPG axis, angiogenic factors (VEGF, TGFs), and immune mediators in MM, alongside the transcription factors (SOX9, RUNX2), signaling pathways (PI3K-AKT-mTOR, NOTCH), immune cell state (TAM2), and interleukins in OS. Based on this pathophysiologic foundation, the review outlines five computational paradigms: (i) mechanistic models; (ii) data-driven/machine learning schemes; (iii) hybrid mechanistic approaches; (iv) digital twins/virtual cohorts, and (v) MIDD/PBPK models for real-world applications. A cross-cancer comparison section summarizes common and distinct biological axes and their computational translation as well as the overlapping features from the bone microenvironment. For both MM and OS, the research assesses strengths, limitations, and data needs of current models, outlining the strategic objectives for next-generation multiscale, AI-enabled models providing a roadmap for tissue engineers, oncology scientists, and translational researchers to design clinically relevant preclinical tests and accelerate safer, more effective strategies for tumor-affected bones. The differences between MM and OS impose distinct biological constraints, so their comparisons are rare. Combining all these features with artificial intelligence capabilities will underpin a promising transition in the development of in silico adaptive and learning models. Full article
(This article belongs to the Section Molecular Oncology)
17 pages, 2172 KB  
Article
Combining Augmented Reality Guidance and Virtual Constraints for Skilled Epidural Needle Placement
by Daniel Haro-Mendoza, Marcos Lopez-Magaña, Luis Jimenez-Angeles and Victor J. Gonzalez-Villela
Machines 2026, 14(4), 446; https://doi.org/10.3390/machines14040446 - 17 Apr 2026
Abstract
Accurate needle insertion during epidural anesthesia is challenging due to strong dependence on clinician experience and the limited integration of guidance modalities that simultaneously provide visual feedback and physical motion constraints. Current approaches, including ultrasound guidance and augmented reality visualization, mainly offer passive assistance and do not actively regulate insertion trajectory and depth, which may lead to variability in accuracy and increased risk of complications. This work presents a multimodal human–machine assistance system that combines augmented reality guidance with virtual fixtures to support lumbar epidural needle placement. A Tuohy needle is coupled to a haptic device interacting with a patient-specific L3–L4 lumbar phantom fabricated using 3D printing and ballistic gel. A model-based force profile reproduces the mechanical response of anatomical layers during insertion. Three experimental conditions are evaluated: freehand execution, augmented reality guidance with trajectory and depth visualization, and cooperative guidance using virtual fixtures defined by a cylindrical corridor and a depth-limiting plane. Results show a progressive reduction in mean depth error from 6.82 ± 3.46 mm (freehand) to 4.96 ± 2.41 mm (augmented reality) and 2.21 ± 1.73 mm (virtual fixtures). These findings indicate that the integration of visual and haptic guidance significantly enhances insertion precision and control. The proposed approach highlights the potential of multimodal human–machine cooperation for safer training and assisted interventions. Full article
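The "cylindrical corridor plus depth-limiting plane" fixture described here has a simple geometric core: decompose the commanded tip position into axial and radial components, clamp the depth to the plane, and clamp the radial offset to the corridor radius. A sketch under those assumptions (parameter names are illustrative, not the authors' API):

```python
import math

def apply_virtual_fixture(p, origin, axis, radius, max_depth):
    """Constrain a commanded needle-tip point p to a cylindrical corridor
    starting at origin along unit vector axis, capped by a depth plane."""
    rel = [p[i] - origin[i] for i in range(3)]
    t = sum(rel[i] * axis[i] for i in range(3))        # axial depth
    radial = [rel[i] - t * axis[i] for i in range(3)]  # radial offset
    t = max(0.0, min(t, max_depth))                    # depth-limiting plane
    r_norm = math.sqrt(sum(c * c for c in radial))
    if r_norm > radius:                                # corridor wall
        radial = [c * radius / r_norm for c in radial]
    return tuple(origin[i] + t * axis[i] + radial[i] for i in range(3))
```

In a cooperative setup, the haptic device would render the difference between the commanded and constrained points as a restoring force.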
21 pages, 8107 KB  
Systematic Review
A Systematic Review of Kernel-Level Security Mechanisms, Vulnerability Detection and Mitigation in Modern Operating Systems
by Zeeshan Ali, Naeem Aslam, Andrea Marotta, Walter Tiberti and Dajana Cassioli
Sensors 2026, 26(8), 2452; https://doi.org/10.3390/s26082452 - 16 Apr 2026
Abstract
Kernel attacks are still one of the most severe threats to modern operating systems (OS) due to the kernel’s privileged control over hardware, memory, and process management. This study reviews some significant kernel-level security mechanisms regarding vulnerability detection, as well as the prevention and mitigation of exploitation in today’s OSs. Using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology, a total of 30 high-quality, peer-reviewed studies were examined and analyzed in detail using the Critical Appraisal Skills Programme (CASP) quality framework. Discussion about the leading research directions emanated from three central questions of this review: What are the predominant kernel attack vectors? How are the techniques for protection and detection that are currently available assessed? What are the emerging research directions? The study identifies the following as the principal sources of kernel compromise: memory corruption, privilege escalation, rootkits, and race condition exploits. It also identifies several techniques for kernel hardening, such as Mandatory Access Control (MAC), the use of SELinux and AppArmor, kernel integrity monitoring, secure and measured boot, fuzz testing, and hardware-assisted protection. Some of these emerged as having a great deal of promise for proactive defense against zero-day vulnerabilities, including machine learning-based detection and live kernel patching. Issues regarding scalability, detection accuracy, and securing containerized and virtualized environments need to be solved. This paper aims to provide relevant, structured, and up-to-date research on kernel security synthesis and offer valuable guidance on the development of robust, adaptive, and novel OS defense mechanisms. Full article
(This article belongs to the Section Sensor Networks)
22 pages, 4742 KB  
Article
A Novel E-Nose Architecture Based on Virtual Sensor-Augmented Embedded Intelligence for a Real-Time In-Vehicle Carbon Monoxide Concentration Estimation System
by Dharmendra Kumar, Anup Kumar Rabha, Ashutosh Mishra, Rakesh Shrestha and Navin Singh Rajput
Electronics 2026, 15(8), 1671; https://doi.org/10.3390/electronics15081671 - 16 Apr 2026
Abstract
The increasing risk of air pollution in closed areas like passenger vehicles requires smart, real-time air quality monitoring solutions. Gases such as carbon monoxide (CO), a colorless and odorless gas produced by exhaust systems, air conditioners, and combustion sources, are very dangerous to health because they can cause respiratory distress and poisoning at high levels. Traditional in-vehicle CO monitoring systems use a single-point sensor and a fixed threshold, which are insufficient in a dynamic cabin environment subject to factors such as vehicle size, ventilation rate, number of occupants, and incoming traffic. To address these drawbacks, this paper proposes a new E-Nose system with Virtual Sensor-Augmented Embedded Intelligence to estimate the CO concentration in vehicle cabins in real time. The system combines data from low-cost gas sensors and improves it using virtual-sensor machine learning models trained to predict or enhance sensor responses in real time. Embedded intelligence, deployed locally on edge hardware, supports low-latency processing, dynamic calibration, and noise filtering to respond adaptively to fluctuating environmental conditions. This architecture enables more accurate, robust, and context-aware estimation of CO levels than traditional threshold-based methods. Experimental validation across varied vehicular scenarios demonstrates superior precision and responsiveness, providing timely warnings even under complex dispersion patterns. The Gradient Boosting classifier, which builds an ensemble of weak learners sequentially, matched the Random Forest with 99.94% training and 98.59% model accuracy, confirming its strong predictive capability. The system is designed to be cost-effective, scalable, and easily integrable into modern automotive platforms. This study also contributes to the field of smart ecological recording and demonstrates that the virtual sensor-enhanced embedded system is an effective way to improve passenger safety by providing pre-emptive on-board air quality monitoring. Full article
(This article belongs to the Special Issue Emerging IoT Sensor Network Technologies and Applications)
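The "ensemble of weak learners built sequentially" idea credited to Gradient Boosting in this abstract can be shown with a toy, from-scratch squared-loss booster over 1-D decision stumps. This is purely illustrative: it is neither the authors' model nor a library implementation, and every name below is hypothetical.

```python
def fit_stump(xs, ys):
    """Least-squares 1-D decision stump: a threshold and two leaf means."""
    best = None
    for thr in sorted(set(xs)):
        right = [y for x, y in zip(xs, ys) if x > thr]
        if not right:
            continue  # degenerate split: everything on one side
        left = [y for x, y in zip(xs, ys) if x <= thr]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((y - (lm if x <= thr else rm)) ** 2 for x, y in zip(xs, ys))
        if best is None or err < best[0]:
            best = (err, thr, lm, rm)
    return best[1], best[2], best[3]

def boost(xs, ys, rounds=50, lr=0.5):
    """Gradient boosting for squared loss: each stump fits the residuals
    of the ensemble built so far, shrunk by the learning rate lr."""
    pred = [0.0] * len(xs)
    ensemble = []
    for _ in range(rounds):
        thr, lm, rm = fit_stump(xs, [y - p for y, p in zip(ys, pred)])
        ensemble.append((thr, lm, rm))
        pred = [p + lr * (lm if x <= thr else rm) for x, p in zip(xs, pred)]
    return ensemble

def predict(ensemble, x, lr=0.5):
    return sum(lr * (lm if x <= thr else rm) for thr, lm, rm in ensemble)
```

Each round fits a stump to the current residuals, so the ensemble's error shrinks geometrically on separable data; real systems like the paper's would use many features and a tuned number of rounds.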
23 pages, 1403 KB  
Article
Toward Mechanism-Driven Control: A Soft-Sensor for Zeta Potential and Settling-Decisive Parameters in Coal Slime Water Treatment
by Jing Chang, Bianbian Guo, Guoyu Bai, Xinyuan Zhang, Hang Zhang, Wei Zhao and Zhen Li
Separations 2026, 13(4), 115; https://doi.org/10.3390/separations13040115 - 13 Apr 2026
Abstract
Intelligent dosing in coal slime water treatment remains a challenge due to the lack of real-time and solid hardware-based measurement of key microscopic parameters governing the settling process, particularly zeta potential. This study proposes a soft-sensor method using Sparrow Search Algorithm-optimized Extreme Learning Machine (SSA-ELM) to simultaneously predict four critical settling process parameters: settling velocity, supernatant turbidity, sediment layer height, and zeta potential. Key variables influencing the coal slime water settling process, including coal slime water concentration, fines content, water hardness, pH, and chemical dosage, were investigated, and the experimental data were used as inputs for the development of the prediction model. The prediction performance of the proposed SSA-ELM model was evaluated against standard ELM and SSA-optimized Back Propagation (BP) models. The results demonstrate that the SSA-ELM model achieved superior prediction accuracy for all parameters, with R2 values ranging from 0.95 to 0.98, while maintaining favorable computational efficiency. This study establishes a method for virtual measurement of zeta potential, providing a crucial data foundation for developing mechanism-driven, intelligent dosing systems aimed at precise intelligent control and reduced chemical consumption for coal preparation plants. Full article
(This article belongs to the Special Issue Separation Techniques for Wastewater Treatment)
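The soft-sensor's prediction quality is reported as R² values between 0.95 and 0.98. For reference, the coefficient of determination used for such scoring can be computed directly (a standard formula, not code from the paper):

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot.
    1.0 means perfect prediction; 0.0 means no better than the mean."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

One such score would be computed per predicted parameter (settling velocity, turbidity, sediment height, zeta potential) on held-out experimental data.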
17 pages, 2175 KB  
Article
Evaluating and Classifying Gentleness in VR-Based Surgical Simulation: A VR + fNIRS Study
by Suveyda Sanli and Hasan Onur Keles
Sensors 2026, 26(8), 2388; https://doi.org/10.3390/s26082388 - 13 Apr 2026
Abstract
Gentleness, defined as the ability to handle tissues delicately while minimizing unnecessary force, is a critical indicator of surgical proficiency. Objective and real-time assessment of gentleness in virtual reality (VR)-based training can improve the understanding of both psychomotor and cognitive components of surgical skill. This study evaluates and classifies participants’ gentleness during VR-based laparoscopic simulations using fNIRS-derived hemodynamic features. Twenty-three volunteers with no prior laparoscopic experience performed a VR-based double-grasper task while hemodynamic activity over frontal and motor cortical regions was recorded using eighteen fNIRS channels. In parallel, subjective workload (NASA-TLX), error counts, and gentleness performance score (GPS) were collected. Temporal features, including slope, root mean square, and standard deviation, were extracted from the fNIRS signals and used to train multiple machine learning classifiers. Performance labels were binarized into low and high groups using median splits of the gentleness performance score. Models were evaluated using stratified 5-fold cross-validation. Results revealed stronger right-frontal HbO activity and increased left-motor HbR responses in the low-performance group, suggesting higher cognitive effort and less efficient motor strategies. Across classifiers, slope-based features consistently outperformed variability- and amplitude-based metrics. The highest classification performance was achieved using HbR slope features with Random Forest classifiers (accuracy ≈ 0.85, AUC up to 0.93). These findings highlight the potential of fNIRS-based metrics for automated performance assessment in VR surgical training. Full article
(This article belongs to the Section Optical Sensors)
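The three temporal features the abstract names, slope, root mean square, and standard deviation, can be extracted from one fNIRS signal window as follows (a generic sketch; the sampling interval and windowing are assumptions, not the study's exact preprocessing):

```python
import math

def window_features(signal, dt=1.0):
    """Slope (least-squares line fit), RMS, and standard deviation of one
    signal window sampled at interval dt."""
    n = len(signal)
    t = [i * dt for i in range(n)]
    mt, ms = sum(t) / n, sum(signal) / n
    slope = (sum((ti - mt) * (si - ms) for ti, si in zip(t, signal))
             / sum((ti - mt) ** 2 for ti in t))
    rms = math.sqrt(sum(s * s for s in signal) / n)
    std = math.sqrt(sum((s - ms) ** 2 for s in signal) / n)
    return {"slope": slope, "rms": rms, "std": std}
```

Per the study's results, it is the slope-type features (computed per channel for HbO/HbR) that drove the best classification performance.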
28 pages, 473 KB  
Article
De-Anonymization Techniques in the Tor Network Using an Experimental Testbed
by Ondrej Kainz, Sebastián Petro, Miroslav Michalko, Miroslav Murin and Ervín Šimko
J. Cybersecur. Priv. 2026, 6(2), 72; https://doi.org/10.3390/jcp6020072 - 13 Apr 2026
Abstract
Tor is an anonymization network that enables access to hidden services and protects user identity through layered encryption. While its core technology offers strong privacy, users can still be exposed through indirect attack methods or configuration mistakes. This research not only explores de-anonymization techniques but also provides a practical guide for constructing a fully functional experimental Tor environment using virtual machines. The custom-built testbed allows for safe simulation of attacks without impacting the public Tor network. Within this environment, three key information-gathering approaches were evaluated: (1) malware-based reverse shells that establish external communication, (2) malicious PDF and Office files used to trigger outbound connections, and (3) analysis of service misconfigurations that may reveal the IP address of hidden services. The results confirm that although the Tor network itself is resilient, user behavior, improper configurations, and insecure content handling can lead to significant privacy risks. By combining practical environment setup with real-world attack scenarios, this paper serves both as a reference for building experimental Tor networks and as a security-oriented analysis of known de-anonymization vectors. The findings emphasize the critical need for user awareness and precise configuration in privacy-focused technologies. Full article
(This article belongs to the Section Security Engineering & Applications)
40 pages, 4155 KB  
Review
Artificial Intelligence in Pulmonary Endoscopy: Current Evidence, Limitations, and Future Directions
by Sara Lopes, Miguel Mascarenhas, João Fonseca and Adelino F. Leite-Moreira
J. Imaging 2026, 12(4), 167; https://doi.org/10.3390/jimaging12040167 - 12 Apr 2026
Abstract
Background: Artificial intelligence (AI) is increasingly applied in pulmonary endoscopy, including diagnostic bronchoscopy, interventional pulmonology and endobronchial imaging. Advances in computer vision, machine learning and robotic systems have expanded the potential for automated lesion detection, navigation to peripheral pulmonary lesions, and real-time procedural support. However, the current evidence base remains heterogeneous, and translational challenges persist. Methods: This review summarizes current applications and developments of AI across white-light bronchoscopy (WLB), image-enhanced bronchoscopy (e.g., narrow-band imaging and autofluorescence imaging), endobronchial ultrasound (EBUS), virtual and robotic bronchoscopies, and workflow optimization and training. The authors also examine the methodological limitations, regulatory considerations, and implementation barriers that affect translation into routine practice. Results: Reported developments include deep learning-based models for mucosal abnormality detection, lymph-node characterization during EBUS-guided transbronchial needle aspiration (EBUS-TBNA), improved lesion localization, and reduction in operator-dependent variability. Additionally, AI-assisted simulation platforms and decision-support tools are reshaping training paradigms. Nevertheless, most studies remain retrospective or single-center, with limited external validation, dataset heterogeneity, unclear model explainability, and incomplete integration into clinical workflows. Conclusions: AI has the potential to support lesion detection, navigation, and training in pulmonary endoscopy. However, robust prospective validation, standardized datasets, transparent model reporting, robust data governance, multidisciplinary collaboration, and careful integration into clinical practice are required before widespread adoption. Full article
(This article belongs to the Section AI in Imaging)
16 pages, 7123 KB  
Article
Digital Twin of a Material Handling System Based on a Physical Construction-Kit Model for Educational Applications
by Ladislav Rigó, Jana Fabianová, Lucia Čabaníková and Ján Palinský
Machines 2026, 14(4), 429; https://doi.org/10.3390/machines14040429 - 11 Apr 2026
Abstract
Digital twin (DT) technology is a key element of Industry 4.0. Despite its rapid development, current research is mainly focused on industrial optimisation and machine-level monitoring. However, its implementation in the educational process lags significantly behind practice. Moreover, existing DT implementations in education often emphasise visualisation or simulation, while neglecting synchronisation and verification of functional equivalence between the physical and virtual systems. This study presents the design, development and experimental verification of a digital twin of a laboratory material handling system. The virtual model created in Tecnomatix Plant Simulation is connected to the physical system controlled by a Siemens PLC SIMATIC S7-1200 and equipped with industrial sensors and an HMI interface. Real-time bidirectional communication is established via the OPC UA protocol using KEPServerEX, ensuring synchronisation between the physical and virtual systems. Experiments confirmed the functional synchronisation of both systems. Additionally, the study presents that DT technology can be adapted for educational purposes and implemented in engineering education. Full article
(This article belongs to the Special Issue Digital Twins Applications in Manufacturing Optimization)
14 pages, 2318 KB  
Article
A Flexible Wearable Data Glove Based on Hybrid Fiber-Optic Sensing for Hand Motion Monitoring
by Jing Li, Xiangting Hou, Ke Du, Huiying Piao and Cheng Li
Materials 2026, 19(8), 1525; https://doi.org/10.3390/ma19081525 - 10 Apr 2026
Abstract
Wearable data gloves often suffer from electromagnetic interference, insufficient substrate stability, and limited capability for multi-degree-of-freedom motion measurement. To address these limitations, a flexible glove incorporating a hybrid POF-FBG sensing scheme was designed and fabricated. Plastic optical fibers (POFs) were side-polished and patterned with long-period gratings to improve sensitivity to wrist flexion-extension and abduction-adduction. Then fiber Bragg gratings (FBGs) were embedded in a polydimethylsiloxane substrate and encapsulated using thermoplastic polyurethane fixtures to reduce the influence of skin stretching and improve measurement accuracy of finger-joint angle. Moreover, a thermoplastic polyurethane skeleton with an adaptive sliding-rail structure was 3D printed to maintain the stability of the sensor placement at the joints. Experimental results demonstrated the mean absolute errors of 4.06°, 1.38° and 1.70° for wrist flexion-extension, abduction-adduction and finger-joint bending, respectively, along with excellent gesture classification using a support vector machine algorithm, which indicates great potential in virtual reality interaction and hand rehabilitation applications. Full article
(This article belongs to the Special Issue Advances in Optical Fiber Materials and Their Applications)
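The reported angle errors correspond to a standard mean-absolute-error metric. A minimal sketch in Python, with made-up angle values (the reference device and numbers here are illustrative, not the paper's data):

```python
# Mean absolute error between glove-estimated joint angles and a reference
# measurement. All values below are invented for illustration.
def mean_absolute_error(estimated, reference):
    assert len(estimated) == len(reference)
    return sum(abs(e - r) for e, r in zip(estimated, reference)) / len(estimated)

glove_deg = [10.2, 31.5, 58.9, 74.0]   # wrist flexion angles from the glove
ref_deg   = [12.0, 30.0, 60.0, 75.6]   # hypothetical reference (e.g., goniometer)

print(round(mean_absolute_error(glove_deg, ref_deg), 2))  # ≈ 1.5
```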

41 pages, 84120 KB  
Article
DDS-over-TSN Framework for Time-Critical Applications in Industrial Metaverses
by Taemin Nam, Seongjin Yun and Won-Tae Kim
Appl. Sci. 2026, 16(8), 3641; https://doi.org/10.3390/app16083641 - 8 Apr 2026
Abstract
The industrial metaverse is a digital twin space that integrates the real world with virtual environments through bidirectional synchronization. It supports critical services, such as time-sensitive machine control and large-scale collaboration, which require Time-Sensitive Networking (TSN) and scalable Data Distribution Services (DDS). DDS, developed by the Object Management Group, provides excellent scalability and diverse QoS policies but struggles to guarantee transmission delay and jitter for time-critical applications. TSN, based on the IEEE 802.1 standards, addresses these challenges by ensuring deterministic, time-critical transmission. However, current research lacks comprehensive integration mechanisms for DDS and TSN, particularly from the perspectives of semantics and system frameworks. Additionally, no adaptive QoS mapping exists to convert abstract DDS QoS policies into fine-grained TSN QoS parameters. This paper presents a novel DDS-over-TSN framework that incorporates three key functions to address these challenges. First, Cross-layer QoS Mapping automates correspondences between DDS and TSN parameters, deriving technical constraints from standards documentation through retrieval-augmented generation. Second, Semantic Priority Estimation extracts meaningful priority levels by utilizing language-model embedding vectors as high-dimensional feature extractors. Third, Adaptive Resource Allocation performs dynamic bandwidth distribution for each priority level through reinforcement learning. Simulation results reveal over 99% mapping accuracy and 97% consistency in priority extraction. The applied deep reinforcement learning paradigm allocated 99% of required resources to high-priority classes and reduced resource wastage by 15% compared with conventional methods. This methodology meets industrial requirements by ensuring both deterministic real-time performance and efficient resource isolation. Full article
(This article belongs to the Special Issue Digital Twin and IoT, 2nd Edition)
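The cross-layer QoS mapping idea can be illustrated with a toy sketch. This is not the paper's actual mapping: the field names, the deadline threshold, and the gate-window values are hypothetical; only the general shape (abstract DDS policy in, concrete TSN parameters such as the IEEE 802.1Q Priority Code Point out) follows the abstract.

```python
# Toy cross-layer QoS mapping: an abstract DDS QoS profile is translated into
# concrete TSN parameters. Thresholds and field names are hypothetical.
from dataclasses import dataclass

@dataclass
class DdsQos:
    transport_priority: int   # DDS TRANSPORT_PRIORITY (unbounded integer)
    deadline_ms: float        # DDS DEADLINE period in milliseconds

@dataclass
class TsnParams:
    pcp: int                  # IEEE 802.1Q Priority Code Point (0-7)
    gate_window_us: int       # per-class gate-open time for 802.1Qbv scheduling

def map_qos(dds: DdsQos) -> TsnParams:
    # Clamp the unbounded DDS priority into the eight PCP levels.
    pcp = max(0, min(7, dds.transport_priority))
    # Tighter deadlines get a wider (hypothetical) gate-window allocation.
    gate_window_us = 500 if dds.deadline_ms <= 10 else 100
    return TsnParams(pcp=pcp, gate_window_us=gate_window_us)

print(map_qos(DdsQos(transport_priority=12, deadline_ms=5)))
```

The paper derives such correspondences automatically via retrieval-augmented generation over the standards documents; the hard-coded rules above only stand in for that learned mapping.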

25 pages, 2957 KB  
Article
Automating the Detection of Evasive Windows Malware: An Evaluated YARA Rule Library for Anti-VM and Anti-Sandbox Techniques
by Sebastien Kanj, Gorka Vila and Josep Pegueroles
J. Cybersecur. Priv. 2026, 6(2), 69; https://doi.org/10.3390/jcp6020069 - 8 Apr 2026
Abstract
Anti-analysis techniques, also known as evasive techniques, enable Windows malware to detect and evade dynamic inspection environments, undermining the effectiveness of virtual-machine and sandbox-based inspection. Despite extensive prior research, no unified classification has been paired with a large-scale empirical evaluation of static detection capabilities for these behaviors. This paper addresses this gap by presenting a comprehensive classification and detection framework. We consolidate 94 anti-analysis techniques from academic, community, and threat-intelligence sources into nine mechanistic categories and derive corresponding YARA rules for static identification. In total, 82 YARA signatures were authored or refined and evaluated on 459,508 malware and 92,508 goodware samples. After iterative refinement using precision thresholds, 42 rules achieved high precision (≥75%), 16 showed moderate precision (50–75%), and 24 were discarded as unreliable. The results indicate strong static detectability for firmware- and BIOS-based checks, but limited precision for timing-based evasions, which frequently overlap with benign behavior. Although YARA provides broad coverage of observable artifacts, its static nature limits detection under obfuscation or runtime mutation; our measurements therefore represent conservative estimates of technique prevalence. All validated rules are released in an open-source repository to support reproducibility, improve incident-response workflows, and strengthen prevention and mitigation against real-world threats. Future work will explore hybrid validation, container-evasion extensions, and forensic attribution based on signature co-occurrence patterns. Full article
(This article belongs to the Special Issue Intrusion/Malware Detection and Prevention in Networks—2nd Edition)
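As a rough illustration of what a static, YARA-style check looks for, the sketch below scans a raw binary for well-known virtualization-artifact strings that evasive malware references when probing for a VM or sandbox. The string list is a small sample of genuine, widely documented indicators, but the rule logic is a simplified stand-in, not one of the paper's published rules:

```python
# YARA-style static scan (simplified): flag binaries that reference known
# virtualization/sandbox artifacts, a common anti-VM probing pattern.
ANTI_VM_STRINGS = [
    b"VBoxGuest",      # VirtualBox guest driver
    b"vmtoolsd.exe",   # VMware Tools daemon process name
    b"SbieDll.dll",    # Sandboxie injection DLL
    b"QEMU",           # QEMU/KVM firmware strings
]

def scan_for_anti_vm_artifacts(binary: bytes) -> list[str]:
    """Return the virtualization indicators found in a raw binary image."""
    return [s.decode() for s in ANTI_VM_STRINGS if s in binary]

sample = b"\x90\x90check VBoxGuest then SbieDll.dll before payload\x00"
print(scan_for_anti_vm_artifacts(sample))  # ['VBoxGuest', 'SbieDll.dll']
```

Real YARA rules add conditions (PE structure checks, string counts, offsets) to reach the precision thresholds the paper reports; a bare substring match like this would over-trigger on benign security tooling.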
