Search Results (1,197)

Search Parameters:
Keywords = integrated processes of order two

24 pages, 1137 KB  
Article
Detecting TLS Protocol Anomalies Through Network Monitoring and Compliance Tools
by Diana Gratiela Berbecaru and Marco De Santo
Future Internet 2026, 18(1), 62; https://doi.org/10.3390/fi18010062 - 21 Jan 2026
Abstract
The Transport Layer Security (TLS) protocol is widely used nowadays to create secure communications over TCP/IP networks. Its purpose is to ensure confidentiality, authentication, and data integrity for messages exchanged between two endpoints. In order to facilitate its integration into widely used applications, the protocol is typically implemented through libraries, such as OpenSSL, BoringSSL, LibreSSL, WolfSSL, NSS, or mbedTLS. These libraries encompass functions that execute the specialized TLS handshake required for channel establishment, as well as the construction and processing of TLS records, and the procedures for closing the secure channel. However, these software libraries may contain vulnerabilities or errors that could potentially jeopardize the security of the TLS channel. To identify flaws or deviations from established standards within the implemented TLS code, a specialized tool known as TLS-Anvil can be utilized. This tool also verifies the compliance of TLS libraries with the specifications outlined in the Request for Comments documents published by the IETF. TLS-Anvil conducts numerous tests with a client/server configuration utilizing a specified TLS library and subsequently generates a report that details the number of successful tests. In this work, we exploit the results obtained from a selected subset of TLS-Anvil tests to generate rules used for anomaly detection in Suricata, a well-known signature-based Intrusion Detection System. During the tests, TLS-Anvil generates .pcap capture files that report all the messages exchanged. Such files can be subsequently analyzed with Wireshark, allowing for a detailed examination of the messages exchanged during the tests and a thorough understanding of their structure on a byte-by-byte basis. Through the analysis of the TLS handshake messages produced during testing, we develop customized Suricata rules aimed at detecting TLS anomalies that result from flawed implementations within the intercepted traffic. Furthermore, we describe the specific test environment established for the purpose of deriving and validating certain Suricata rules intended to identify anomalies in nodes utilizing a version of the OpenSSL library that does not conform to the TLS specification. The rules that delineate TLS deviations or potential attacks may subsequently be integrated into a threat detection platform supporting Suricata. This integration will enhance the capability to identify TLS anomalies arising from code that fails to adhere to the established specifications.
(This article belongs to the Special Issue DDoS Attack Detection for Cyber–Physical Systems)
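As a hedged illustration of the byte-level pcap analysis described above, the sketch below scans a capture for TLS records whose record-layer version bytes fall outside expected values, the same kind of deviation a Suricata rule could be written to match. It assumes scapy is installed; the capture file name and the version whitelist are illustrative assumptions, not the authors' tooling.

```python
# Hypothetical sketch: scan a TLS-Anvil capture for TLS records whose
# version bytes deviate from expected values (illustrative check only).
from scapy.all import rdpcap, TCP, Raw

# Illustrative whitelist: 0x0301 (TLS 1.0, used in initial ClientHello records)
# and 0x0303 (TLS 1.2, also the legacy record version mandated by TLS 1.3).
ALLOWED_VERSIONS = {(3, 1), (3, 3)}

for pkt in rdpcap("handshake.pcap"):  # placeholder file name
    if not (pkt.haslayer(TCP) and pkt.haslayer(Raw)):
        continue
    payload = bytes(pkt[Raw].load)
    # A TLS record starts with: content type (1 byte), version (2 bytes), length (2 bytes).
    if len(payload) >= 5 and payload[0] in (0x14, 0x15, 0x16, 0x17):
        version = (payload[1], payload[2])
        if version not in ALLOWED_VERSIONS:
            print(f"anomalous TLS record version {version}: {pkt.summary()}")
```

An equivalent Suricata rule would anchor the same header bytes with content and offset matches against the start of the TLS record.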
18 pages, 587 KB  
Article
Bridging the Engagement–Regulation Gap: A Longitudinal Evaluation of AI-Enhanced Learning Attitudes in Social Work Education
by Duen-Huang Huang and Yu-Cheng Wang
Information 2026, 17(1), 107; https://doi.org/10.3390/info17010107 - 21 Jan 2026
Abstract
The rapid adoption of generative artificial intelligence (AI) in higher education has intensified a pedagogical dilemma: while AI tools can increase immediate classroom engagement, they do not necessarily foster the self-regulated learning (SRL) capacities required for ethical and reflective professional practice, particularly in human-service fields. In this two-time-point, pre-post cohort-level (repeated cross-sectional) evaluation, we examined a six-week AI-integrated curriculum incorporating explicit SRL scaffolding among social work undergraduates at a Taiwanese university (pre-test N = 37; post-test N = 35). Because the surveys were administered anonymously and individual responses could not be linked across time, pre-post comparisons were conducted at the cohort level using independent samples. The participating students completed the AI-Enhanced Learning Attitude Scale (AILAS), a 30-item instrument grounded in the Technology Acceptance Model, Attitude Theory and SRL frameworks that assesses six dimensions of AI-related learning attitudes. Prior pilot evidence suggested an engagement–regulation gap, characterized by relatively strong learning process engagement but weaker learning planning and learning habits. Accordingly, the curriculum incorporated weekly goal-setting activities, structured reflection tasks, peer accountability mechanisms, explicit instructor modeling of SRL strategies and simple progress-tracking tools. The psychometric analyses demonstrated excellent internal consistency for the total scale at the post-test stage (Cronbach’s α = 0.95). The independent-samples t-tests indicated that, at the post-test stage, the cohorts reported higher mean scores across most dimensions, with the largest cohort-level differences in Learning Habits (Cohen’s d = 0.75, p = 0.003) and Learning Process (Cohen’s d = 0.79, p = 0.002). After Bonferroni adjustment, improvements in the Learning Desire, Learning Habits and Learning Process dimensions and the Overall Attitude scores remained statistically robust. In contrast, the Learning Planning dimension demonstrated only marginal improvement (d = 0.46, p = 0.064), suggesting that higher-order planning skills may require longer or more sustained instructional support. No statistically significant gender differences were identified at the post-test stage. Taken together, the findings presented in this study offer preliminary, design-consistent evidence that SRL-oriented pedagogical scaffolding, rather than AI technology itself, may help narrow the engagement–regulation gap, while the consolidation of autonomous planning capacities remains an ongoing instructional challenge.
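For readers unfamiliar with the statistics reported above, the following minimal sketch reproduces the style of analysis, an independent-samples t-test plus a pooled-SD Cohen's d; the score arrays are invented placeholders, not the study's data.

```python
# Independent-samples t-test and Cohen's d (pooled SD) on placeholder scores.
import numpy as np
from scipy import stats

pre = np.array([3.1, 3.4, 2.9, 3.6, 3.2, 3.0, 3.5])   # pre-test cohort (illustrative)
post = np.array([3.8, 3.6, 4.0, 3.7, 3.9, 3.5, 4.1])  # post-test cohort (illustrative)

t, p = stats.ttest_ind(pre, post)  # independent samples, as in the study

# Cohen's d with the pooled standard deviation.
n1, n2 = len(pre), len(post)
pooled_sd = np.sqrt(((n1 - 1) * pre.std(ddof=1) ** 2 +
                     (n2 - 1) * post.std(ddof=1) ** 2) / (n1 + n2 - 2))
d = (post.mean() - pre.mean()) / pooled_sd
print(f"t = {t:.2f}, p = {p:.3f}, Cohen's d = {d:.2f}")
```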

16 pages, 493 KB  
Article
‘Layered Resilience’ in Urban Context: An Investigation into the Interplay Between the Local State and Ethnic Minority Groups in Two European Cities During the COVID-19 Pandemic
by Jörg Dürrschmidt and John Eade
Soc. Sci. 2026, 15(1), 53; https://doi.org/10.3390/socsci15010053 - 21 Jan 2026
Abstract
This article explores urban ‘societal resilience’ during the global pandemic of 2020–2021. This health crisis involved a complex interweaving of social, cultural, political, and economic processes, encompassing both top-down measures undertaken by nation-state governments and bottom-up actions by local residents. In a research study undertaken in two European cities—Stuttgart and London—we focussed on two migrant minorities and the involvement of ‘experts’ and ‘non-experts’ at the meso-level where these top-down measures and bottom-up actions met. Our study provided a grounded understanding of ‘layered resilience’ where resiliency develops through the disjunctive order of communication patterns, public service delivery, institutionalized dialogue, narratives, and values. Through distinguishing between resiliency and resilience, we seek to illustrate the ‘elastic’ character of urban modes of integration. Our study suggests the need for more empirically grounded investigations into the continuity and difference between adaptation and adjustment, normality and normalcy, and resilience and resiliency. It also highlights the importance of context-specific and path-dependent notions of resilience and resiliency.
(This article belongs to the Special Issue Understanding Societal Resilience)

19 pages, 4790 KB  
Article
Enhancing First-Year Mathematics Achievement Through a Complex Gamified Learning System
by Anna Muzsnay, Sára Szörényi, Anna K. Stirling, Csaba Szabó and Janka Szeibert
Educ. Sci. 2026, 16(1), 159; https://doi.org/10.3390/educsci16010159 - 20 Jan 2026
Abstract
The transition from high school to university-level mathematics is often accompanied by significant challenges. During the COVID-19 pandemic, these difficulties were further exacerbated by the abrupt shift to online learning. In response, educators increasingly turned to gamification—“a process of enhancing a service with affordances for gameful experiences in order to support users’ overall value creation”—as a strategy to address the limitations of remote instruction. In this study, we designed a gamified environment for a first-year Number Theory course. The system was constructed using targeted game elements such as leaderboards, optional challenge exams, and recognition for elegant solutions. These features were then integrated into a comprehensive point-based assessment system, which accounted for weekly quizzes and active participation. Following a quasi-experimental design, this study compared two groups of pre-service mathematics teachers: the class of 2017 (N = 62), which received traditional in-person instruction (control group), and the class of 2020 (N = 61), which participated in an online, gamified version of the course (experimental group). Both groups were taught by the same lecturer, using identical content, concepts, and similar tasks throughout the course. Academic performance was measured using midterm exam results. While no significant difference emerged on the first midterm in week 6 (their average percentages were 50% and 51%), the experimental group significantly outperformed the control group on the second midterm at the end of the term (their average percentages were 65% and 49%). These results suggest that a thoughtfully designed, gamified approach can enhance learning outcomes in an online mathematics course.

19 pages, 1418 KB  
Article
Eco-Efficiency Assessment as an Enabler to Achieve Zero-Waste Manufacturing
by Marcelo Sousa, Sara M. Pinto, Venus Hydar and Flavia V. Barbosa
Sustainability 2026, 18(2), 997; https://doi.org/10.3390/su18020997 - 19 Jan 2026
Abstract
Achieving the ambitious EU goals of zero-waste manufacturing requires innovative tools and methodologies that address both efficiency and environmental sustainability. This study presents a comprehensive methodology for assessing the efficiency and eco-efficiency of industrial processes, in order to support zero-waste manufacturing strategies. The proposed approach assesses critical performance metrics while integrating environmental-impact analysis to provide a holistic view of process optimization. The methodology was applied to two industrial use cases in the composites sector, a field with significant environmental impact due to the resource-intensive nature of composite manufacturing and the challenges associated with end-of-life management. By implementing this dual assessment, the study identifies key areas for improvement in operational performance and sustainability, offering actionable insights for process optimization and waste reduction. The results reveal that labor costs were the primary contributor to total costs in both use cases, accounting for more than 50%, while the resin infusion phase dominated the environmental impacts, contributing more than 70% of the total. This analysis highlights that eco-efficiency assessments, integrating environmental and cost data, allow the identification of inefficiencies, helping industries to prioritize improvement areas. In this specific case, the high environmental impact of resin infusion calls for enhanced waste monitoring and process optimization, while the labor-intensive operations require streamlined workflows to reduce operational time and associated costs. The methodology is intended to serve as a practical tool for industries aiming to balance high-performance manufacturing with reduced environmental impact.
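The dual cost/impact breakdown described above can be illustrated with a small sketch; the phase names and figures below are invented placeholders, not the study's data.

```python
# Per-phase cost and environmental-impact shares (all numbers illustrative).
phases = {
    # phase: (cost in EUR, environmental impact in kg CO2-eq)
    "resin infusion": (1200.0, 950.0),
    "labor":          (2600.0, 120.0),
    "curing":         (700.0,  180.0),
}
total_cost = sum(c for c, _ in phases.values())
total_impact = sum(i for _, i in phases.values())
for name, (cost, impact) in phases.items():
    print(f"{name:15s} cost share {cost / total_cost:5.1%}, "
          f"impact share {impact / total_impact:5.1%}")
```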

27 pages, 1283 KB  
Article
Supplier Evaluation in the Electric Vehicle Industry: A Hybrid Model Integrating AHP-TOPSIS and XGBoost for Risk Prediction
by Weikai Yan, Ziqi Song, Senyi Liu and Ershun Pan
Sustainability 2026, 18(2), 977; https://doi.org/10.3390/su18020977 - 18 Jan 2026
Abstract
As the supply chain of the electric vehicle (EV) industry becomes increasingly complex and vulnerable, traditional supplier evaluation methods reveal inherent limitations. These approaches primarily emphasize static performance while neglecting dynamic future risks. To address this issue, this study proposes a comprehensive supplier evaluation model that integrates a hybrid Analytic Hierarchy Process (AHP) and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) framework with the Extreme Gradient Boosting (XGBoost) algorithm, contextualized for the EV sector. The hybrid AHP-TOPSIS framework is first applied to rank suppliers based on multidimensional performance criteria, including quality, delivery capability, supply stability and scale. Subsequently, the XGBoost algorithm uses historical monthly data to capture nonlinear relationships and predict future supplier risk probabilities. Finally, a risk-adjusted framework combines these two components to construct a dynamic dual-dimensional performance–risk evaluation system. A case study using real data from an automobile manufacturer demonstrates that the hybrid AHP–TOPSIS model effectively distinguishes suppliers’ historical performance, while the XGBoost model achieves high predictive accuracy under five-fold cross-validation, with an AUC of 0.851 and an F1 score of 0.928. After risk adjustment, several suppliers exhibiting high performance but elevated risk experienced significant declines in their overall rankings, thereby validating the robustness and practicality of the integrated model. This study provides a feasible theoretical framework and empirical evidence for EV enterprises to develop supplier decision-making systems that balance performance and risk, offering valuable insights for enhancing supply chain resilience and intelligence.
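As a rough illustration of the TOPSIS step in the hybrid framework, the sketch below ranks three hypothetical suppliers on four criteria using AHP-style weights; the decision matrix and weights are invented, and all criteria are treated as benefits for simplicity.

```python
# Compact TOPSIS sketch on a made-up decision matrix (rows = suppliers).
import numpy as np

X = np.array([[0.85, 0.90, 0.70, 0.60],   # supplier A: quality, delivery, stability, scale
              [0.75, 0.95, 0.80, 0.70],   # supplier B
              [0.90, 0.70, 0.60, 0.90]])  # supplier C
w = np.array([0.40, 0.25, 0.20, 0.15])    # criterion weights (e.g., from AHP)

# Vector-normalize each criterion column, then apply the weights.
V = w * X / np.linalg.norm(X, axis=0)

# Ideal and anti-ideal solutions, and distances to each.
best, worst = V.max(axis=0), V.min(axis=0)
d_best = np.linalg.norm(V - best, axis=1)
d_worst = np.linalg.norm(V - worst, axis=1)

closeness = d_worst / (d_best + d_worst)  # higher = closer to the ideal
print("ranking (best first):", np.argsort(-closeness))
```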

28 pages, 12687 KB  
Article
Fatigue Analysis and Numerical Simulation of Loess Reinforced with Permeable Polyurethane Polymer Grouting
by Lisha Yue, Xiaodong Yang, Shuo Liu, Chengchao Guo, Zhihua Guo, Loukai Du and Lina Wang
Polymers 2026, 18(2), 242; https://doi.org/10.3390/polym18020242 - 16 Jan 2026
Abstract
Loess subgrades are prone to significant strength reduction and deformation under cyclic traffic loads and moisture ingress. Permeable polyurethane polymer grouting has emerged as a promising non-excavation technique for rapid subgrade reinforcement. This study systematically investigated the fatigue behavior of polymer-grouted loess using laboratory fatigue tests and numerical simulations. A series of stress-controlled cyclic tests were conducted on grouted loess specimens under varying moisture contents and stress levels, revealing that fatigue life decreased with increasing moisture and stress levels, with a maximum life of 200,000 cycles achieved under optimal conditions. The failure process was categorized into three distinct stages, culminating in a “multiple-crack” mode, indicating improved stress distribution and ductility. Statistical analysis confirmed that fatigue life followed a two-parameter Weibull distribution, enabling the development of a probabilistic fatigue life prediction model. Furthermore, a 3D finite element model of the road structure was established in Abaqus and integrated with Fe-safe for fatigue life assessment. The results demonstrated that polymer grouting reduced subgrade stress by nearly one order of magnitude and increased fatigue life by approximately tenfold. The consistency between the simulation outcomes and experimentally derived fatigue equations underscores the reliability of the proposed numerical approach. This research provides a theoretical and practical foundation for the fatigue-resistant design and maintenance of loess subgrades reinforced with permeable polyurethane polymer grouting, contributing to the development of sustainable infrastructure in loess-rich regions.
(This article belongs to the Section Polymer Applications)
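A minimal sketch of the probabilistic step described above: fitting a two-parameter Weibull distribution to fatigue-life data with SciPy. The cycle counts and reliability level are invented placeholders, not the study's measurements.

```python
# Two-parameter Weibull fit to fatigue lives (location fixed at zero).
import numpy as np
from scipy import stats

lives = np.array([42e3, 58e3, 75e3, 91e3, 120e3, 155e3, 200e3])  # cycles (illustrative)

shape, loc, scale = stats.weibull_min.fit(lives, floc=0)
print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.0f} cycles")

# Predicted life at a chosen reliability level, e.g., 90% survival probability.
print(f"N(R=0.9) = {stats.weibull_min.ppf(0.10, shape, scale=scale):.0f} cycles")
```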

31 pages, 33847 KB  
Article
Incremental Data Cube Architecture for Sentinel-2 Time Series: Multi-Cube Approaches to Dynamic Baseline Construction
by Roxana Trujillo and Mauricio Solar
Remote Sens. 2026, 18(2), 260; https://doi.org/10.3390/rs18020260 - 14 Jan 2026
Abstract
Incremental computing is becoming increasingly important for processing large-scale datasets. In satellite imagery, spatial resolution, temporal depth, and large files pose significant computational challenges, requiring efficient architectures to manage processing time and resource usage. Accordingly, in this study, we propose a dynamic architecture, termed Multi-Cube, for optical satellite time series. The framework introduces a modular and baseline-aware approach that enables scalable subdivision, incremental growth, and consistent management of spatiotemporal data. Built on NetCDF, xarray, and Zarr, Multi-Cube automatically constructs stable multidimensional data cubes while minimizing redundant reprocessing, formalizing automated internal decisions governing cube subdivision, baseline reuse, and incremental updates to support recurrent monitoring workflows. Its performance was evaluated using more than 83,000 Sentinel-2 images (covering 2016–2024) across multiple areas of interest. The proposed approach achieved a 5.4× reduction in end-to-end runtime, decreasing execution time from 53 h to 9 h, while disk I/O requirements were reduced by more than two orders of magnitude compared with a traditional sequential reprocessing pipeline. The framework supports parallel execution and on-demand sub-cube extraction for responsive large-area monitoring while internally handling incremental updates and adaptive cube management without requiring manual intervention. The results demonstrate that the Multi-Cube architecture provides a decision-driven foundation for integrating dynamic Earth observation workflows with analytical modules.
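The incremental-growth idea can be sketched with standard xarray/Zarr calls: write a baseline cube once, then append later acquisitions along the time dimension instead of reprocessing everything. The store path, variable name, and array sizes below are assumptions for illustration, not the Multi-Cube API.

```python
# Incremental update of a Zarr-backed cube: one baseline write, then appends.
import numpy as np
import pandas as pd
import xarray as xr

def slice_for(date: str) -> xr.Dataset:
    """Build a one-timestep cube (random data standing in for a processed scene)."""
    return xr.Dataset(
        {"ndvi": (("time", "y", "x"), np.random.rand(1, 256, 256).astype("float32"))},
        coords={"time": pd.to_datetime([date]), "y": np.arange(256), "x": np.arange(256)},
    )

slice_for("2024-06-01").to_zarr("cube.zarr", mode="w")           # baseline write
slice_for("2024-06-11").to_zarr("cube.zarr", append_dim="time")  # incremental update
```

Appending along `time` touches only the new chunks, which is what keeps disk I/O far below a full sequential reprocess.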

22 pages, 4042 KB  
Article
The Concept of a Hierarchical Digital Twin
by Magdalena Jarzyńska, Andrzej Nierychlok and Małgorzata Olender-Skóra
Appl. Sci. 2026, 16(2), 605; https://doi.org/10.3390/app16020605 - 7 Jan 2026
Abstract
The concept of a digital twin has become a key driver of industrial transformation, enabling a seamless connection between physical systems and their virtual counterparts. The growing need for adaptability has accelerated the use of advanced technologies and tools to maintain competitiveness. In this context, the article introduces the concept of a hierarchical digital twin and illustrates its operation through a practical example. Production resource structures and timing data were generated in the KbRS (Knowledge-based Rescheduling System), which serves as the Level II digital twin in this article. The acquired data is transferred via Excel to the FlexSim simulation environment, which represents the Level I digital twin responsible for modeling the flow of production processes. Because a digital twin must accurately reflect a specific production system, the study begins by formulating a general mathematical model. Algorithms for product ordering and for constructing the digital twin of the production processes were developed. Furthermore, three implementation scenarios for the hierarchical digital twin were proposed using the KbRS and FlexSim tools. The implementation of the hierarchical digital twin concept facilitated the development of a more comprehensive virtual model. At the same time, the integration of data between the two software environments enabled the generation of more detailed and precise results. Traditionally, a digital twin created solely within a single simulation platform is unable to represent all the structural components of a production system—an issue addressed by the hierarchical approach presented in this study.
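The Excel hand-off between the two levels might look like the sketch below: read the schedule exported by the Level II tool with pandas and write a normalized table for the Level I simulation model. The file and column names are assumptions for illustration, not the actual KbRS or FlexSim formats.

```python
# Hypothetical Excel bridge between the Level II and Level I twins.
import pandas as pd

schedule = pd.read_excel("kbrs_schedule.xlsx")  # Level II export (placeholder file)

# Normalize to one row per operation with the fields a flow model needs.
flow = (schedule.rename(columns={"Resource": "station", "Start": "start_time",
                                 "End": "end_time", "Product": "item"})
                [["item", "station", "start_time", "end_time"]]
                .sort_values("start_time"))
flow.to_csv("flexsim_import.csv", index=False)  # consumed by the Level I model
```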

15 pages, 1604 KB  
Article
Host-Filtered Blood Nucleic Acids for Pathogen Detection: Shared Background, Sparse Signal, and Methodological Limits
by Zhaoxia Wang, Guangchan Chen, Mei Yang, Saihua Wang, Jiahui Fang, Ce Shi, Yuying Gu and Zhongping Ning
Pathogens 2026, 15(1), 55; https://doi.org/10.3390/pathogens15010055 - 6 Jan 2026
Abstract
Plasma cell-free RNA (cfRNA) metagenomics is increasingly explored for blood-based pathogen detection, but the structure of the shared background “blood microbiome”, the reproducibility of reported signals, and the practical limits of this approach remain unclear. We performed a critical re-analysis and benchmarking (“stress test”) of host-filtered blood RNA sequencing data from two cohorts: a bacteriologically confirmed tuberculosis (TB) cohort (n = 51) previously used only to derive host cfRNA signatures, and a coronary artery disease (CAD) cohort (n = 16) previously reported to show a CAD-shifted “blood microbiome” enriched for periodontal taxa. Both datasets were processed with a unified pipeline combining stringent human read removal and taxonomic profiling using the latest versions of specialized tools Kraken2 and MetaPhlAn4. Across both cohorts, only a minority of non-host reads were classifiable; under strict host filtering, classified non-host reads comprised 7.3% (5.0–12.0%) in CAD and 21.8% (5.4–31.5%) in TB, still representing only a small fraction of total cfRNA. Classified non-host communities were dominated by recurrent, low-abundance taxa from skin, oral, and environmental lineages, forming a largely shared, low-complexity background in both TB and CAD. Background-derived bacterial signatures showed only modest separation between disease and control groups, with wide intra-group variability. Mycobacterium tuberculosis-assigned reads were detectable in many TB-positive samples but accounted for ≤0.001% of total cfRNA and occurred at similar orders of magnitude in a subset of TB-negative samples, precluding robust discrimination. Phylogeny-aware visualization confirmed that visually “enriched” taxa in TB-positive plasma arose mainly from background-associated clades rather than a distinct pathogen-specific cluster. Collectively, these findings provide a quantitative benchmark of the background-dominated regime and practical limits of plasma cfRNA metagenomics for pathogen detection, highlighting that practical performance is constrained more by a shared, low-complexity background and sparse pathogen-derived fragments than by large disease-specific shifts, underscoring the need for transparent host filtering, explicit background modeling, and integration with targeted or orthogonal assays. Full article
(This article belongs to the Section Bacterial Pathogens)
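As a hedged sketch of how a classified-read fraction like those quoted above can be computed, the function below parses a standard Kraken2 report (tab-separated columns: percentage, clade reads, direct reads, rank code, taxid, name); the report path is a placeholder, and this is not the authors' pipeline.

```python
# Classified fraction of reads from a Kraken2 report (standard 6-column format).
def classified_fraction(report_path: str) -> float:
    unclassified = classified = 0
    with open(report_path) as fh:
        for line in fh:
            fields = line.rstrip("\n").split("\t")
            clade_reads, rank, name = int(fields[1]), fields[3], fields[5].strip()
            if rank == "U":                  # the single "unclassified" line
                unclassified = clade_reads
            elif rank == "R" and name == "root":
                classified = clade_reads     # reads placed anywhere in the taxonomy
    return classified / (classified + unclassified)

print(f"classified: {classified_fraction('sample.kreport'):.1%}")  # placeholder path
```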

18 pages, 3115 KB  
Article
A Novel Reactive Power Decoupling Strategy for VSG Inverter Systems Using Adaptive Dynamic Virtual Impedance
by Wei Luo, Chenwei Zhang, Weizhong Chen, Bin Zhang and Zhenyu Lv
Electronics 2026, 15(1), 241; https://doi.org/10.3390/electronics15010241 - 5 Jan 2026
Abstract
Virtual synchronous generator (VSG) technology provides a robust framework for integrating electric vehicle energy storage into modern microgrids. Nonetheless, conventional VSG control often suffers from intense interaction between active and reactive power flows, which can trigger persistent steady-state errors, power fluctuations, and potential system collapse. This research addresses these challenges by developing a 5th-order electromagnetic dynamic model tailored for a two-stage cascaded bridge inverter. By synthesizing a 3rd-order power regulation loop with a 2nd-order output stage, the proposed model captures stability boundaries across an extensive parameter spectrum. Unlike traditional 3rd-order “quasi-steady-state” approaches—which overlook essential dynamics under weak-damping or low-inertia conditions—this study utilizes the 5th-order model to derive an adaptive dynamic virtual impedance decoupling technique. This strategy facilitates real-time compensation of the cross-coupling between active and reactive channels, significantly boosting the inverter’s damping ratio. Quantitative analysis confirms that this approach curtails overshoot by 85.6% and accelerates the stabilization process by 42%, markedly enhancing the overall dynamic performance of the grid-connected system.
(This article belongs to the Special Issue Intelligent Control Strategies for Power Electronics)
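The link between damping ratio and overshoot invoked above follows from the standard second-order step response; the sketch below evaluates the textbook peak-overshoot formula for a few example damping ratios (the values are illustrative, not the paper's operating points).

```python
# Peak overshoot of a canonical second-order step response vs. damping ratio.
import math

def overshoot(zeta: float) -> float:
    """Peak overshoot for 0 < zeta < 1: exp(-pi*zeta / sqrt(1 - zeta^2))."""
    return math.exp(-math.pi * zeta / math.sqrt(1.0 - zeta ** 2))

for zeta in (0.2, 0.5, 0.7):
    print(f"zeta = {zeta:.1f} -> overshoot = {overshoot(zeta):.1%}")
```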

16 pages, 2031 KB  
Article
Cooperative 4D Trajectory Prediction and Conflict Detection in Integrated Airspace
by Xin Ma, Linxin Zheng, Jiajun Zhao and Yuxin Wu
Algorithms 2026, 19(1), 32; https://doi.org/10.3390/a19010032 - 1 Jan 2026
Abstract
To ensure the flight safety of unmanned aerial vehicles (UAVs) and address the risks of integrated airspace operation, this study explored and verified a series of key technologies. For data processing, the density-based spatial clustering of applications with noise (DBSCAN) method is used to preprocess UAV automatic dependent surveillance–broadcast (ADS-B) data, purifying the data at the source by eliminating noise and outliers in the spatial and spatial-temporal dimensions of the track data. This significantly improves data quality, standardizes the data characteristics, and lays a reliable, high-quality foundation for subsequent trajectory analysis and prediction. For trajectory prediction, a convolutional neural network-bidirectional gated recurrent unit (CNN-BiGRU) trajectory prediction model is constructed, realizing integrated ‘prediction-judgment’ computation. The model output can accurately and prospectively judge the conflict situation and conflict degree between any two trajectories, providing core technical support for trajectory conflict warning. For conflict detection, the performance of the model and the effectiveness of the detection are verified by simulation experiments. Comparing the model’s predictions with real track data confirms that the CNN-BiGRU model calculates inter-aircraft distances with high accuracy and reliability. Further verification with the preset conflict detection method shows no conflict risk between the UAV and the manned aircraft in integrated airspace during the full 800 s of terminal-area flight. In summary, the trajectory prediction model and conflict detection method proposed in this study provide a key technical guarantee for building an active and accurate integrated airspace safety management system, with practical value for improving airspace management efficiency and preventing flight conflicts.
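The DBSCAN preprocessing step can be sketched in a few lines with scikit-learn: cluster the track points and drop those labeled as noise. The synthetic track and the eps/min_samples settings below are illustrative, not the study's parameters.

```python
# Flag spatial outliers in a (placeholder) track with DBSCAN.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
track = rng.normal(0.0, 1.0, size=(300, 3))         # x, y, altitude (synthetic track)
track[:6] += rng.uniform(10.0, 40.0, size=(6, 3))   # inject scattered gross outliers

labels = DBSCAN(eps=1.5, min_samples=5).fit_predict(track)
clean = track[labels != -1]                         # DBSCAN marks noise points as -1
print(f"kept {len(clean)} of {len(track)} points")
```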

34 pages, 2000 KB  
Article
Unlocking Organizational Performance Through Employee Experience Capital: Mediation of Resonance and Vitality with Employee Well-Being as Moderator
by Mohammad Ahmad Al-Omari, Jihene Mrabet, Yamijala Suryanarayana Murthy, Rohit Bansal, Ridhima Sharma, Aulia Luqman Aziz and Arfendo Propheto
Adm. Sci. 2026, 16(1), 20; https://doi.org/10.3390/admsci16010020 - 30 Dec 2025
Abstract
This research develops and empirically verifies an integrative model that describes how bundles of workplace resources combine to improve employee and organizational outcomes. Drawing on the Job Demands–Resources model and the Resource-Based View, it conceptualizes Employee Experience Capital (EEC) as a higher-order construct comprising seven interrelated drivers: digital autonomy, inclusive cognition, sustainability alignment, AI synergy, mindful design, learning agility, and wellness technology. The study examines how these resources develop two psychological processes, work resonance and employee vitality, which in turn improve organizational performance, and how employee well-being acts as a contextual moderator of these relationships. Based on a cross-sectional design and a diversified sample of employees working in digitally transformed industries, the study finds that EEC substantially enhances resonance and vitality, which operate as mutually complementary mediators between resource bundles and performance outcomes. Employee well-being emerges as a factor shaping performance rather than a mere boundary condition. The results position EEC as a strategic form of human capital that links digital, sustainable, and wellness-oriented practices to employee well-being and sustained organizational performance, offering new theoretical contributions and practical guidance for leaders striving to create resource-rich, high-performing workplaces.

34 pages, 6240 KB  
Article
Mechanistic Prediction of Machining-Induced Deformation in Metallic Alloys Using Property-Based Regression and Principal Component Analysis
by Mohammad S. Alsoufi and Saleh A. Bawazeer
Machines 2026, 14(1), 37; https://doi.org/10.3390/machines14010037 - 28 Dec 2025
Abstract
Accurately predicting machining-induced deformation is crucial for high-precision CNC turning, particularly when working with dissimilar metallic alloys. This study presents a novel, data-driven framework that integrates empirical deformation analysis, multivariate regression, and principal component analysis (PCA) to predict axial deformation as a function of intrinsic material properties, including Brinell hardness, thermal conductivity, and Young’s modulus. The approach begins with second-order polynomial modeling of experimentally observed force–deformation behavior, from which three physically interpretable coefficients, nonlinear (a), load-sensitive (b), and intercept (c), are extracted. Each coefficient is then modeled using log-linear power-law regression, revealing strong statistical relationships with material properties. Specifically, the nonlinear coefficient correlates predominantly with thermal conductivity, while both the linear and offset terms are governed mainly by hardness, with average R² values exceeding 0.999 across all materials. To improve physical insight and reduce dimensionality, three non-dimensional ratios (H/E, k/E, H/k) are also introduced, enhancing correlation and interpretability. PCA further confirms that over 93% of the total variance in deformation behavior can be captured using just two principal components, with clear separation of materials based on thermomechanical signature and deformation coefficients. This is the first comprehensive study to unify empirical modeling, property-driven regression, and PCA for deformation prediction in CNC-machined alloys. The resulting framework offers a scalable, interpretable, and physically grounded alternative to black-box models, providing rapid screening of new materials, reduced experimental demand, and support for smart manufacturing applications, such as digital twins and material-informed process optimization.
(This article belongs to the Section Advanced Manufacturing)
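The first two modeling steps described above can be sketched as follows: a second-order polynomial fit to force-deformation data, then a log-linear power-law fit relating the nonlinear coefficient to thermal conductivity. All numbers are invented placeholders, not the study's measurements.

```python
import numpy as np

# Step 1: second-order fit of a force-deformation curve (placeholder data).
force = np.linspace(50, 500, 10)                        # N
deformation = 2e-7 * force**2 + 4e-5 * force + 0.003    # mm
a, b, c = np.polyfit(force, deformation, 2)             # nonlinear, load-sensitive, intercept
print(f"a = {a:.2e}, b = {b:.2e}, c = {c:.2e}")

# Step 2: power law a = K * k^m  ->  log a = log K + m * log k, fitted in log space.
conductivity = np.array([16.0, 45.0, 120.0, 200.0])     # W/(m*K), one value per alloy
a_coeffs = np.array([3.1e-7, 2.2e-7, 1.6e-7, 1.3e-7])   # fitted "a" per alloy (invented)
m, log_K = np.polyfit(np.log(conductivity), np.log(a_coeffs), 1)
print(f"exponent m = {m:.2f}, prefactor K = {np.exp(log_K):.2e}")
```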

15 pages, 1569 KB  
Article
Integrative COI Barcoding and Species Delimitation in Echinodermata from Vietnam
by Tran My Linh, Nguyen Chi Mai, Pham Thi Hoe, Le Quang Trung, Nguyen Tuong Van, Luu Xuan Hoa, Hoang Dinh Chieu, Pham Tran Dinh Nho, Nguyen Kim Thoa, Le Quynh Lien and Do Cong Thung
Fishes 2026, 11(1), 15; https://doi.org/10.3390/fishes11010015 - 27 Dec 2025
Abstract
Echinoderms are marine invertebrates that play important roles in structuring marine benthic ecosystems. DNA barcoding has become a valuable tool for species identification; however, reference DNA barcode libraries for echinoderms remain incomplete. This study aims to: (i) develop a COI-5′ reference dataset for echinoderms from Vietnam by integrating DNA barcodes with morphological data; (ii) evaluate species resolution and barcode gaps using multiple analytical approaches; (iii) assess the consistency of species assignments from BOLD and GenBank for echinoderms collected in Vietnam; (iv) make barcode data publicly available to support global reference database development. Thirty-two echinoderm specimens representing 16 species were analyzed for COI-5′ sequences, and BLAST assignments were highly concordant with those from GenBank and BOLD. Integrative validation confirmed that all taxa were monophyletic in the Neighbor Joining Tree, formed single OTUs in Cluster Sequences, and exhibited clear barcode gaps greater than 3% to the nearest-neighbor species. These results provided species-level resolution for 75% and genus-level resolution for 90% of the records. The dataset, spanning four classes, eight orders, and eleven families, enhances barcode coverage and contributes records (ProcessIDs, BINs, and GenBank accessions) to public repositories. This study delivers the first curated COI-5′ reference library, supporting regional baselines for taxonomy, conservation, and biodiversity assessment.
(This article belongs to the Special Issue Molecular Phylogeny and Taxonomy of Aquatic Animals)
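The barcode-gap check described above reduces to comparing intra- and inter-species distances; the toy sketch below computes uncorrected p-distances between aligned sequences and reports each record's nearest neighbour. The sequences are invented placeholders, far shorter than real COI-5′ barcodes.

```python
# Nearest-neighbour p-distances for a toy set of aligned barcode sequences.
def p_distance(a: str, b: str) -> float:
    """Proportion of differing sites between two equal-length aligned sequences."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

seqs = {
    "sp1_A": "ACCTGGGATTACCCTA",
    "sp1_B": "ACCTGGGATTACCGTA",   # conspecific: small distance
    "sp2_A": "ACTTGAGCTTACACTA",   # different species: larger distance
}

for name, s in seqs.items():
    others = {k: p_distance(s, v) for k, v in seqs.items() if k != name}
    nn = min(others, key=others.get)
    print(f"{name}: nearest neighbour {nn} at {others[nn]:.1%}")
```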
