Search Results (2,450)

Search Parameters:
Keywords = high-dimensional sampling

12 pages, 307 KB  
Article
Blockwise Exponential Covariance Modeling for High-Dimensional Portfolio Optimization
by Congying Fan and Jacquline Tham
Symmetry 2026, 18(1), 171; https://doi.org/10.3390/sym18010171 - 16 Jan 2026
Abstract
This paper introduces a new framework for high-dimensional covariance matrix estimation, the Blockwise Exponential Covariance Model (BECM), which extends the traditional block-partitioned representation to the log-covariance domain. By exploiting the block-preserving properties of the matrix logarithm and exponential transformations, the proposed model guarantees strict positive definiteness while substantially reducing the number of parameters to be estimated through a blockwise log-covariance parameterization, without imposing any rank constraint. Within each block, intra- and inter-group dependencies are parameterized through interpretable coefficients and kernel-based similarity measures of factor loadings, enabling a data-driven representation of nonlinear groupwise associations. Using monthly stock return data from the U.S. stock market, we conduct extensive rolling-window tests to evaluate the empirical performance of the BECM in minimum-variance portfolio construction. The results reveal three main findings. First, the BECM consistently outperforms the Canonical Block Representation Model (CBRM) and the naive 1/N benchmark in terms of out-of-sample Sharpe ratios and risk-adjusted returns. Second, adaptive determination of the number of clusters through cross-validation effectively balances structural flexibility and estimation stability. Third, the model maintains numerical robustness under fine-grained partitions, avoiding the loss of positive definiteness common in high-dimensional covariance estimators. Overall, the BECM offers a theoretically grounded and empirically effective approach to modeling complex covariance structures in high-dimensional financial applications. Full article
(This article belongs to the Section Mathematics)
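
The abstract's central mechanism, that modeling in the log-covariance domain preserves block structure and guarantees positive definiteness on the way back, is easy to verify numerically. The sketch below (NumPy/SciPy) illustrates only those two properties; it is not the authors' BECM estimator, and the block sizes and the 0.5 shrinkage factor are arbitrary illustrative choices.

```python
import numpy as np
from scipy.linalg import logm, expm

rng = np.random.default_rng(0)

def random_spd(n):
    """Random symmetric positive-definite matrix."""
    a = rng.standard_normal((n, n))
    return a @ a.T + n * np.eye(n)

# Block-diagonal covariance for two asset groups of sizes 3 and 4.
sigma = np.block([
    [random_spd(3), np.zeros((3, 4))],
    [np.zeros((4, 3)), random_spd(4)],
])

# The matrix logarithm of a block-diagonal SPD matrix is itself block-diagonal,
# so a blockwise parameterization in the log domain is self-consistent.
log_sigma = logm(sigma).real
print("max |off-block entry of log(Sigma)|:", np.abs(log_sigma[:3, 3:]).max())

# Any symmetric modification in the log domain maps back to a strictly
# positive-definite matrix, because exp of a symmetric matrix is always SPD.
shrunk = expm(0.5 * log_sigma)
print("min eigenvalue after shrinking in the log domain:",
      np.linalg.eigvalsh(shrunk).min())
```
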
23 pages, 773 KB  
Article
Predicting Employee Turnover Based on Improved ADASYN and GS-CatBoost
by Shuigen Hu and Kai Dong
Mathematics 2026, 14(2), 313; https://doi.org/10.3390/math14020313 - 16 Jan 2026
Abstract
In corporate management practices, human resources are among the most active and critical elements, and frequent employee turnover can impose substantial losses on firms. Accurately predicting employee turnover dynamics and identifying turnover propensity in advance is therefore of significant importance for organizational development. To improve turnover prediction performance, this study proposes an employee turnover prediction model that integrates an improved ADASYN data rebalancing algorithm with a grid-search-optimized CatBoost classifier. In practice, turnover instances typically constitute a minority class; severe class imbalance may lead to overfitting or underfitting and thus degrade predictive performance. To mitigate imbalance, we employ ADASYN oversampling to reduce skewness in the dataset. However, because ADASYN is primarily designed for continuous features, it may generate invalid or meaningless values when discrete variables are present. Accordingly, we improve ADASYN by introducing a new distance metric and an enhanced sample generation strategy, making it applicable to turnover data with mixed (continuous and discrete) features. Given CatBoost’s strong predictive capability in high-dimensional settings, we adopt CatBoost as the base learner. Nonetheless, CatBoost performance is highly sensitive to hyperparameter choices, and different parameter combinations can yield markedly different results. Therefore, we apply grid search (GS) to efficiently optimize CatBoost hyperparameters and obtain the best-performing configuration. Experimental results on three datasets demonstrate that the proposed improved-ADASYN GS-CatBoost model effectively enhances turnover prediction performance, exhibiting strong robustness and adaptability. Compared with existing models, our approach improves predictive accuracy by approximately 4.6112%. Full article
(This article belongs to the Section E5: Financial Mathematics)
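
A baseline version of the pipeline sketched in the abstract can be assembled from off-the-shelf components: imbalanced-learn's stock ADASYN (not the paper's improved variant with a mixed-feature distance metric) followed by a grid search over CatBoost hyperparameters. The synthetic data, parameter grid, and scoring choice below are placeholders, not the paper's settings.

```python
from catboost import CatBoostClassifier
from imblearn.over_sampling import ADASYN
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import GridSearchCV, train_test_split

# Imbalanced toy data standing in for a turnover dataset (minority = leavers).
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1],
                           random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=42)

# Rebalance the training split with standard ADASYN oversampling.
X_bal, y_bal = ADASYN(random_state=42).fit_resample(X_tr, y_tr)

# Grid search (GS) over a small CatBoost hyperparameter grid.
grid = GridSearchCV(
    CatBoostClassifier(verbose=0, random_seed=42),
    param_grid={"depth": [4, 6], "learning_rate": [0.03, 0.1],
                "iterations": [200, 400]},
    scoring="f1", cv=3,
)
grid.fit(X_bal, y_bal)
print("best params:", grid.best_params_)
print("test accuracy:", accuracy_score(y_te, grid.predict(X_te)))
```
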
25 pages, 7202 KB  
Article
Optimal Design of a Coaxial Magnetic Gear Considering Thermal Demagnetization and Structural Robustness for Torque Density Enhancement
by Tae-Kyu Ji and Soo-Whang Baek
Actuators 2026, 15(1), 59; https://doi.org/10.3390/act15010059 - 16 Jan 2026
Abstract
This study presents an optimal design combined with comprehensive multiphysics validation to enhance the torque density of a coaxial magnetic gear (CMG) incorporating an overhang structure. Four high non-integer gear-ratio CMG configurations exceeding 1:10 were designed using different pole-pair combinations, and the three-dimensional finite element method (3D FEM) was employed to accurately capture axial leakage flux and overhang-induced three-dimensional effects. Eight key geometric design variables were selected within non-saturating limits, and 150 sampling points were generated using an Optimal Latin Hypercube Design (OLHD). Multiple surrogate models were constructed and evaluated using the root-mean-square error (RMSE), and the Kriging model was selected for multi-objective optimization using a genetic algorithm. The optimized CMG with a 1:10.66 gear ratio achieved a 130.76% increase in average torque (65.75 Nm) and a 162.51% improvement in torque density (117.14 Nm/L) compared with the initial design. Harmonic analysis revealed a strengthened fundamental component and a reduction in total harmonic distortion, indicating improved waveform quality. To ensure the feasibility of the optimized design, comprehensive multiphysics analyses—including electromagnetic–thermal coupled simulation, high-temperature demagnetization analysis, and structural stress evaluation—were conducted. The results confirm that the proposed CMG design maintains adequate thermal stability, magnetic integrity, and mechanical robustness under rated operating conditions. These findings demonstrate that the proposed optimal design approach provides a reliable and effective means of enhancing the torque density of high gear-ratio CMGs, offering practical design guidance for electric mobility, robotics, and renewable energy applications. Full article
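
The sampling-surrogate-optimization loop described above (space-filling design, Kriging fit, RMSE check, optimization on the surrogate) can be sketched with SciPy and scikit-learn. In the sketch, a cheap analytic function stands in for the 3D-FEM torque evaluation, plain Latin hypercube sampling stands in for the OLHD, and a dense candidate search replaces the genetic algorithm; only the shape of the workflow follows the paper.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from sklearn.metrics import mean_squared_error

def torque(x):
    """Cheap placeholder objective standing in for the 3D-FEM torque evaluation."""
    return np.sin(3 * x[:, 0]) + x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 1]

# 150 space-filling samples over 2 of the 8 design variables (normalized to [0, 1]).
sampler = qmc.LatinHypercube(d=2, seed=1)
X = sampler.random(150)
y = torque(X)

# Fit a Kriging (Gaussian-process) surrogate and check its RMSE on fresh points.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
X_val = sampler.random(50)
rmse = np.sqrt(mean_squared_error(torque(X_val), gp.predict(X_val)))
print(f"surrogate RMSE: {rmse:.4f}")

# Cheap surrogate-based search standing in for the genetic-algorithm step.
candidates = sampler.random(5000)
best = candidates[np.argmax(gp.predict(candidates))]
print("surrogate-optimal design point:", best)
```
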
17 pages, 1704 KB  
Article
Multi-Objective Optimization of Meat Sheep Feed Formulation Based on an Improved Non-Dominated Sorting Genetic Algorithm
by Haifeng Zhang, Yuwei Gao, Xiang Li and Tao Bai
Appl. Sci. 2026, 16(2), 912; https://doi.org/10.3390/app16020912 - 15 Jan 2026
Abstract
Feed formulation is a typical multi-objective optimization problem that aims to minimize cost while satisfying multiple nutritional constraints. However, existing methods often suffer from limitations in handling nonlinear constraints, high-dimensional decision spaces, and solution feasibility. To address these challenges, this study proposes a multi-objective feed formulation method based on an improved Non-dominated Sorting Genetic Algorithm II (NSGA-II). A hybrid Dirichlet–Latin Hypercube Sampling (Dirichlet-LHS) strategy is introduced to generate an initial population with high feasibility and diversity, together with an iterative normalization-based dynamic repair operator to efficiently handle ingredient proportion and nutritional constraints. In addition, an adaptive termination mechanism based on the hypervolume improvement rate (Hypervolume Termination, HVT) is designed to avoid redundant computation while ensuring effective convergence of the Pareto front. Experimental results demonstrate that the Dirichlet–LHS strategy outperforms random sampling, Dirichlet sampling, and Latin hypercube sampling in terms of hypervolume and solution diversity. Under identical nutritional constraints, the improved NSGA-II reduces formulation cost by 1.52% compared with multi-objective Bayesian optimization and by 2.17% relative to conventional feed formulation methods. In a practical application to meat sheep diet formulation, the optimized feed cost is reduced to 1162.23 CNY per ton, achieving a 4.83% cost reduction with only a 1.09 s increase in computation time. These results indicate that the proposed method effectively addresses strongly constrained multi-objective feed formulation problems and provides reliable technical support for precision feeding in intelligent livestock production. Full article
(This article belongs to the Section Agricultural Science and Technology)
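
The two constraint-handling ingredients named in the abstract, a Dirichlet–Latin hypercube hybrid for initializing proportion vectors and an iterative normalization-based repair, can be sketched as follows. The exponential-quantile construction and the clip-and-renormalize loop are plausible realizations chosen for illustration; the paper's exact operators may differ.

```python
import numpy as np
from scipy.stats import qmc

def dirichlet_lhs(n, d, seed=0):
    """Initial population on the simplex: Latin-hypercube-stratified uniforms are
    pushed through the Exponential(1) quantile function and normalized, so each
    row is an ingredient-proportion vector summing to 1 (marginally a flat
    Dirichlet) while retaining the LHS stratification."""
    u = qmc.LatinHypercube(d=d, seed=seed).random(n)
    e = -np.log(1.0 - u)
    return e / e.sum(axis=1, keepdims=True)

def repair(x, lower, upper, iters=100):
    """Alternately clip each ingredient to its bounds and renormalize to the
    simplex; repeated application drives the vector toward feasibility (a simple
    stand-in for the paper's dynamic repair operator)."""
    for _ in range(iters):
        x = np.clip(x, lower, upper)
        x = x / x.sum()
    return x

pop = dirichlet_lhs(n=100, d=6)          # 100 candidate rations over 6 ingredients
print(pop.sum(axis=1)[:3])               # each candidate sums to 1
fixed = repair(pop[0], lower=np.full(6, 0.05), upper=np.full(6, 0.5))
print(fixed.round(3), fixed.sum())
```
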
24 pages, 4850 KB  
Article
Multi-Dimensional Monitoring of Agricultural Drought at the Field Scale
by Yehao Wu, Liming Zhu, Maohua Ding and Lijie Shi
Agriculture 2026, 16(2), 227; https://doi.org/10.3390/agriculture16020227 - 15 Jan 2026
Abstract
The causes of agricultural drought are complex, and its actual occurrence process is often characterized by rapid onset in terms of time and small scale in terms of space. Monitoring agricultural drought using satellite remote sensing with low spatial resolution makes it difficult to accurately capture the details of small-scale drought events. High-resolution satellite remote sensing has relatively long revisit cycles, making it difficult to capture the rapid evolution of drought conditions. Furthermore, the occurrence of agricultural drought is linked to multiple factors including precipitation, evapotranspiration, soil properties, and crop physiological characteristics. Consequently, relying on a single variable or indicator is insufficient for multidimensional monitoring of agricultural drought. This study takes Hebi City, Henan Province as the research area. It uses Sentinel-1 satellite data (HV, VV), Sentinel-2 data (NDVI, B2, B11), elevation, slope, aspect, and GPM precipitation data from 2019 to 2024 as independent variables. Three machine learning algorithms—Random Forest (RF), Random Forest-Recursive Feature Elimination (RF-RFE), and eXtreme Gradient Boosting (XGBoost)—were employed to construct a multi-dimensional agricultural drought monitoring model at the field scale. Additionally, the study verified the sensitivity of different environmental variables to agricultural drought monitoring and analyzed the accuracy performance of different machine learning algorithms in agricultural drought monitoring. The research results indicate that under the condition of full-factor input, all three models exhibit the optimal predictive performance. Among them, the XGBoost model performs the best, with the smallest Relative Root Mean Square Error (RRMSE) of 0.45 and the highest Correlation Coefficient (R) of 0.79. The absence of Digital Elevation Model (DEM) data impairs the models’ ability to capture the patterns of key features, which in turn leads to a reduction in predictive accuracy. Meanwhile, there is a significant correlation between model performance and sample size. Ultimately, the constructed XGBoost model takes the lead with an accuracy of 89%, while the accuracies of Random Forest (RF) and Random Forest-Recursive Feature Elimination (RF-RFE) are 88% and 86%, respectively. Based on these three drought monitoring models, this study further monitored a drought event that occurred in Hebi City in 2023, presented the spatiotemporal distribution of agricultural drought in Hebi City, and applied the Mann–Kendall test for time series analysis, aiming to identify the abrupt change process of agricultural drought. Meanwhile, on the basis of the research results, the feasibility of verifying drought occurrence using irrigation signals was discussed, and the potential reasons for the significantly lower drought occurrence probability in the western mountainous areas of the study region were analyzed. Full article
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)
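
A minimal regression baseline in the spirit of the study's XGBoost model, reporting the two accuracy metrics used above (relative RMSE and the correlation coefficient R), might look like the sketch below. The synthetic features merely stand in for the Sentinel-1/2, terrain, and GPM precipitation variables, and normalizing the RRMSE by the observed range is one common convention that the paper may define differently.

```python
import numpy as np
import xgboost as xgb
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the nine predictors (HV, VV, NDVI, B2, B11, elevation,
# slope, aspect, GPM precipitation) and a drought-index-like target.
X, y = make_regression(n_samples=1500, n_features=9, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = xgb.XGBRegressor(n_estimators=400, max_depth=6, learning_rate=0.05)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

# Relative RMSE (normalized here by the observed range) and Pearson correlation R.
rmse = np.sqrt(np.mean((pred - y_te) ** 2))
rrmse = rmse / (y_te.max() - y_te.min())
r = np.corrcoef(pred, y_te)[0, 1]
print(f"RRMSE = {rrmse:.2f}, R = {r:.2f}")
```
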
20 pages, 8641 KB  
Article
A Novel Stochastic Finite Element Model Updating Method Based on Multi-Point Sensitivities
by Zheng Yang, Zhiyu Shi and Jinyan Li
Appl. Sci. 2026, 16(2), 867; https://doi.org/10.3390/app16020867 - 14 Jan 2026
Abstract
A novel stochastic finite element model updating method based on multi-point sensitivities is proposed to improve the reproduction and prediction ability of finite element models for experimental data. Drawing upon the theory of small perturbations, this approach employs the sensitivity matrix in conjunction with the probability distribution of responses evaluated at multiple parameter points to determine the probability density associated with each parameter point and to estimate the statistical properties of the parameters. To achieve this objective, principal component analysis is employed to unify the dimensionality of the parameters and the responses; the least squares method was used to estimate the characteristics of the parameters. The reliability and validity of this method were confirmed through experimentation with a 3-degree-of-freedom spring-mass system and an aerospace thermal insulation structure. A comparison of this method with classical methods reveals significant advantages in terms of robustness across varying computational scales. Notably, it attains superior accuracy with smaller sample sizes while maintaining precision comparable to conventional methods with large samples. Consequently, this characteristic confers upon the method a distinct advantage in scenarios where the costs of finite element computation are prohibitively high. Full article
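
The deterministic core of sensitivity-based updating, a finite-difference sensitivity matrix followed by a least-squares parameter correction, can be shown on a 3-degree-of-freedom spring-mass chain like the one used for validation in the paper. The sketch covers only that classical step; the stochastic extension (multiple parameter points, PCA to unify dimensionality, and probability-weighted estimates of the parameter statistics) is not reproduced, and all stiffness values are invented.

```python
import numpy as np

def eigfreqs(k):
    """Natural frequencies of a 3-DOF spring-mass chain with unit masses and the
    first spring anchored to ground, for stiffnesses k = [k1, k2, k3]."""
    K = np.array([[k[0] + k[1], -k[1], 0.0],
                  [-k[1], k[1] + k[2], -k[2]],
                  [0.0, -k[2], k[2]]])
    return np.sqrt(np.linalg.eigvalsh(K))   # M = I, so frequencies are sqrt of eigenvalues

k_true = np.array([100.0, 80.0, 60.0])      # "measured" structure (assumed values)
k_model = np.array([90.0, 90.0, 70.0])      # initial finite element model (assumed values)
f_meas = eigfreqs(k_true)

# Finite-difference sensitivities of the frequencies w.r.t. each stiffness,
# then one Gauss-Newton (least-squares) correction of the parameters.
eps = 1e-3
S = np.column_stack([
    (eigfreqs(k_model + eps * np.eye(3)[i]) - eigfreqs(k_model)) / eps
    for i in range(3)
])
dk, *_ = np.linalg.lstsq(S, f_meas - eigfreqs(k_model), rcond=None)
print("updated stiffnesses:", k_model + dk)
```
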
23 pages, 9357 KB  
Article
Intelligent Evaluation of Rice Resistance to White-Backed Planthopper (Sogatella furcifera) Based on 3D Point Clouds and Deep Learning
by Yuxi Zhao, Huilai Zhang, Wei Zeng, Litu Liu, Qing Li, Zhiyong Li and Chunxian Jiang
Agriculture 2026, 16(2), 215; https://doi.org/10.3390/agriculture16020215 - 14 Jan 2026
Abstract
Accurate assessment of rice resistance to Sogatella furcifera (Horváth) is essential for breeding insect-resistant cultivars. Traditional assessment methods rely on manual scoring of damage severity, which is subjective and inefficient. To overcome these limitations, this study proposes an automated resistance evaluation approach based on multi-view 3D reconstruction and deep learning–based point cloud segmentation. Multi-view videos of rice materials with different resistance levels were collected over time and processed using Structure from Motion (SfM) and Multi-View Stereo (MVS) to reconstruct high-quality 3D point clouds. A well-annotated “3D Rice WBPH Damage” dataset comprising 174 samples (15 rice materials, three replicates each, 45 pots) was established, where each sample corresponds to a reconstructed 3D point cloud from a video sequence. A comparative study of various point cloud semantic segmentation models, including PointNet, PointNet++, ShellNet, and PointCNN, revealed that the PointNet++ (MSG) model, which employs a Multi-Scale Grouping strategy, demonstrated the best performance in segmenting complex damage symptoms. To further accurately quantify the severity of damage, an adaptive point cloud dimensionality reduction method was proposed, which effectively mitigates the interference of leaf shrinkage on damage assessment. Experimental results demonstrated a strong correlation (R2 = 0.95) between automated and manual evaluations, achieving accuracies of 86.67% and 93.33% at the sample and material levels, respectively. This work provides an objective, efficient, and scalable solution for evaluating rice resistance to S. furcifera, offering promising applications in crop resistance breeding. Full article
(This article belongs to the Section Crop Protection, Diseases, Pests and Weeds)
30 pages, 3060 KB  
Article
LLM-Based Multimodal Feature Extraction and Hierarchical Fusion for Phishing Email Detection
by Xinyang Yuan, Jiarong Wang, Tian Yan and Fazhi Qi
Electronics 2026, 15(2), 368; https://doi.org/10.3390/electronics15020368 - 14 Jan 2026
Abstract
Phishing emails continue to evade conventional detection systems due to their increasingly sophisticated, multi-faceted social engineering tactics. To address the limitations of single-modality or rule-based approaches, we propose SAHF-PD, a novel phishing detection framework that integrates multi-modal feature extraction with semantic-aware hierarchical fusion, based on large language models (LLMs). Our method leverages modality-specialized large models, each guided by domain-specific prompts and constrained to a standardized output schema, to extract structured feature representations from four complementary sources associated with each phishing email: email body text; open-source intelligence (OSINT) derived from the key embedded URL; screenshot of the landing page; and the corresponding HTML/JavaScript source code. This design mitigates the unstructured and stochastic nature of raw generative outputs, yielding consistent, interpretable, and machine-readable features. These features are then integrated through our Semantic-Aware Hierarchical Fusion (SAHF) mechanism, which organizes them into core, auxiliary, and weakly associated layers according to their semantic relevance to phishing intent. This layered architecture enables dynamic weighting and redundancy reduction based on semantic relevance, which in turn highlights the most discriminative signals across modalities and enhances model interpretability. We also introduce PhishMMF, a publicly released multimodal feature dataset for phishing detection, comprising 11,672 human-verified samples with meticulously extracted structured features from all four modalities. Experiments with eight diverse classifiers demonstrate that the SAHF-PD framework enables exceptional performance. For instance, XGBoost equipped with SAHF attains an AUC of 0.99927 and an F1-score of 0.98728, outperforming the same model using the original feature representation. Moreover, SAHF compresses the original 228-dimensional feature space into a compact 56-dimensional representation (a 75.4% reduction), reducing the average training time across all eight classifiers by 43.7% while maintaining comparable detection accuracy. Ablation studies confirm the unique contribution of each modality. Our work establishes a transparent, efficient, and high-performance foundation for next-generation anti-phishing systems. Full article
(This article belongs to the Section Artificial Intelligence)
28 pages, 1779 KB  
Review
Two-Dimensional Carbon-Based Electrochemical Sensors for Pesticide Detection: Recent Advances and Environmental Monitoring Applications
by K. Imran, Al Amin, Gajapaneni Venkata Prasad, Y. Veera Manohara Reddy, Lestari Intan Gita, Jeyaraj Wilson and Tae Hyun Kim
Biosensors 2026, 16(1), 62; https://doi.org/10.3390/bios16010062 - 14 Jan 2026
Abstract
Pesticides have been widely applied in agricultural practices over the past decades to protect crops from pests and other harmful organisms. However, their extensive use results in the contamination of soil, water, and agricultural products, posing significant risks to human and environmental health. Exposure to pesticides can lead to skin irritation, respiratory disorders, and various chronic health problems. Moreover, pesticides frequently enter surface water bodies such as rivers and lakes through agricultural runoff and leaching processes. Therefore, developing effective analytical methods for the rapid and sensitive detection of pesticides in food and water is of great importance. Electrochemical sensing techniques have shown remarkable progress in pesticide analysis due to their high sensitivity, simplicity, and potential for on-site monitoring. Two-dimensional (2D) carbon nanomaterials have emerged as efficient electrocatalysts for the precise and selective detection of pesticides, owing to their large surface area, excellent electrical conductivity, and unique structural features. In this review, we summarize recent advancements in the electrochemical detection of pesticides using 2D carbon-based materials. Comprehensive information on electrode fabrication, sensing mechanisms, analytical performance—including sensing range and limit of detection—and the versatility of 2D carbon composites for pesticide detection is provided. Challenges and future perspectives in developing highly sensitive and selective electrochemical sensing platforms are also discussed, highlighting their potential for simultaneous pesticide monitoring in food and environmental samples. Carbon-based electrochemical sensors have been the subject of many investigations, but their practical application in actual environmental and food samples is still restricted because of matrix effects, operational instability, and repeatability issues. In order to close the gap between laboratory research and real-world applications, this review critically examines sensor performance in real-sample conditions and offers innovative approaches for in situ pesticide monitoring. Full article
13 pages, 2281 KB  
Article
Microstructural Engineering of Magnetic Wood for Enhanced Magnetothermal Conversion
by Yuxi Lin, Chen Chen and Wei Xu
Magnetochemistry 2026, 12(1), 11; https://doi.org/10.3390/magnetochemistry12010011 - 13 Jan 2026
Abstract
The increasing energy crisis demands sustainable functional materials. Wood, with its natural three-dimensional porous structure, offers an ideal renewable template. This study demonstrates that microstructural engineering of wood is a decisive strategy for enhancing magnetothermal conversion. Using eucalyptus wood, we precisely tailored its pore architecture via delignification and synthesized Fe3O4 nanoparticles in situ through coprecipitation. We systematically investigated the effects of delignification and precursor immersion time (24, 48, 72 h) on the loading, distribution, and magnetothermal performance of the composites. Delignification drastically increased wood porosity, raising the Fe3O4 loading capacity from ~5–6% (in non-delignified wood) to over 14%. Immersion time critically influenced nanoparticle distribution: 48 h achieved optimal deep penetration and uniformity, whereas extended time (72 h) induced minor local agglomeration. The optimized composite (MDW-48) achieved an equilibrium temperature of 51.2 °C under a low alternating magnetic field (0.06 mT, 35 kHz), corresponding to a temperature rise (ΔT) > 24 °C and a Specific Loss Power (SLP) of 1.31 W·g−1. This performance surpasses that of the 24 h sample (47 °C, SLP = 1.16 W·g−1) and rivals other bio-based magnetic systems. This work establishes a clear microstructure–property relationship: delignification enables high loading, while controlled impregnation tunes distribution uniformity, both directly governing magnetothermal efficiency. Our findings highlight delignified magnetic wood as a robust, sustainable platform for efficient low-field magnetothermal conversion, with promising potential in low-carbon thermal management. Full article
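
For readers unfamiliar with the Specific Loss Power figure quoted above, the standard initial-slope estimate is SLP = c_sample · (m_sample / m_Fe3O4) · (dT/dt) at t → 0, i.e. the heating power normalized by the mass of magnetic material. The snippet below just evaluates that formula; every number in it is an illustrative assumption rather than a value from the paper, apart from using a loading of the order of the reported ~14%.

```python
# Initial-slope estimate of the Specific Loss Power (SLP):
#   SLP = c_sample * (m_sample / m_Fe3O4) * (dT/dt) at t -> 0
# All numeric values are illustrative assumptions, not data from the paper.
c_sample = 1.9   # J g^-1 K^-1, effective specific heat of the composite (assumed)
m_sample = 2.0   # g, total sample mass (assumed)
loading = 0.14   # Fe3O4 mass fraction, of the order of the reported >14% loading
m_fe3o4 = loading * m_sample
dT_dt = 0.10     # K s^-1, initial heating slope under the AC field (assumed)

slp = c_sample * (m_sample / m_fe3o4) * dT_dt
print(f"SLP = {slp:.2f} W per gram of Fe3O4")   # about 1.36 W/g with these inputs
```
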
21 pages, 1234 KB  
Article
ReShuffle-MS: Region-Guided Data Augmentation Improves Artificial Intelligence-Based Resistance Prediction in Escherichia coli from MALDI-TOF Mass Spectrometry
by Dongbo Dai, Chenyang Huang, Junjie Li, Xiao Wei, Shengzhou Li, Qiong Wu and Huiran Zhang
Microorganisms 2026, 14(1), 177; https://doi.org/10.3390/microorganisms14010177 - 13 Jan 2026
Abstract
Rapid antimicrobial resistance (AMR) prediction from MALDI-TOF mass spectrometry (MS) remains challenging, particularly when training artificial intelligence (AI) models under small-sample constraints. Performance is often hampered by the high dimensionality of spectral data and the subtle nature of resistance-related signals: full-spectrum approaches risk overfitting to high-dimensional noise, whereas peak-selection strategies risk discarding structurally informative, low-intensity signals. Here, we propose ReShuffle-MS, a region-guided data augmentation framework for MS data. Each spectrum is partitioned into a Main Discriminative Region (MDR) and a Peripheral Peak Region (PPR). By recombining signals within the PPR across samples of the same class while keeping the MDR intact, ReShuffle-MS generates structure-preserving augmented samples. On a clinical dataset for Escherichia coli (E. coli) levofloxacin resistance prediction, ReShuffle-MS delivered significant and consistent performance gains. It improved the average accuracy of classical machine learning models by 3.7% and enabled a one-dimensional convolutional neural network (CNN) to achieve 83.25% accuracy and 97.28% recall. Visualization using Grad-CAM revealed a shift from sparse, peak-dependent attention toward broader and more meaningful spectral patterns. Validation on the external DRIAMS-C dataset for ceftriaxone resistance further demonstrated that the method generalizes to a distinct laboratory setting and a different antibiotic target. These findings suggest that ReShuffle-MS can enhance the robustness and clinical utility of AI-based AMR prediction from routinely acquired MALDI-TOF spectra. Full article
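
The core augmentation move, keeping a spectrum's Main Discriminative Region (MDR) intact while borrowing the Peripheral Peak Region (PPR) from another spectrum of the same class, can be sketched in a few lines of NumPy. The fixed index slice used as the MDR and the uniform choice of donor spectra are simplifications for illustration; the paper's region definition is data driven.

```python
import numpy as np

rng = np.random.default_rng(0)

def reshuffle_ms(spectra, labels, mdr, n_aug=100):
    """Region-guided augmentation sketch: each synthetic spectrum keeps its own
    MDR (an index slice) and takes its PPR from a randomly chosen spectrum of
    the same class, so class-discriminative structure is preserved."""
    aug_x, aug_y = [], []
    for _ in range(n_aug):
        i = rng.integers(len(spectra))
        donor = rng.choice(np.flatnonzero(labels == labels[i]))
        new = spectra[donor].copy()      # donor supplies the peripheral region
        new[mdr] = spectra[i][mdr]       # recipient keeps its discriminative region
        aug_x.append(new)
        aug_y.append(labels[i])
    return np.array(aug_x), np.array(aug_y)

# Toy "spectra": 200 samples x 6000 m/z bins with binary resistance labels.
X = rng.random((200, 6000))
y = rng.integers(0, 2, size=200)
X_aug, y_aug = reshuffle_ms(X, y, mdr=slice(2000, 4000))
print(X_aug.shape, y_aug.shape)
```
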
19 pages, 6871 KB  
Article
A BIM-Derived Synthetic Point Cloud (SPC) Dataset for Construction Scene Component Segmentation
by Yiquan Zou, Tianxiang Liang, Wenxuan Chen, Zhixiang Ren and Yuhan Wen
Data 2026, 11(1), 16; https://doi.org/10.3390/data11010016 - 12 Jan 2026
Abstract
In intelligent construction and BIM–Reality integration applications, high-quality, large-scale construction scene point cloud data with component-level semantic annotations constitute a fundamental basis for three-dimensional semantic understanding and automated analysis. However, point clouds acquired from real construction sites commonly suffer from high labeling costs, severe occlusion, and unstable data distributions. Existing public datasets remain insufficient in terms of scale, component coverage, and annotation consistency, limiting their suitability for data-driven approaches. To address these challenges, this paper constructs and releases a BIM-derived synthetic construction scene point cloud dataset, termed the Synthetic Point Cloud (SPC), targeting component-level point cloud semantic segmentation and related research tasks. The dataset is generated from publicly available BIM models through physics-based virtual LiDAR scanning, producing multi-view and multi-density three-dimensional point clouds while automatically inheriting component-level semantic labels from BIM without any manual intervention. The SPC dataset comprises 132 virtual scanning scenes, with an overall scale of approximately 8.75 × 10⁹ points, covering typical construction components such as walls, columns, beams, and slabs. By systematically configuring scanning viewpoints, sampling densities, and occlusion conditions, the dataset introduces rich geometric and spatial distribution diversity. This paper presents a comprehensive description of the SPC data generation pipeline, semantic mapping strategy, virtual scanning configurations, and data organization scheme, followed by statistical analysis and technical validation in terms of point cloud scale evolution, spatial coverage characteristics, and component-wise semantic distributions. Furthermore, baseline experiments on component-level point cloud semantic segmentation are provided. The results demonstrate that models trained solely on the SPC dataset can achieve stable and engineering-meaningful component-level predictions on real construction point clouds, validating the dataset’s usability in virtual-to-real research scenarios. As a scalable and reproducible BIM-derived point cloud resource, the SPC dataset offers a unified data foundation and experimental support for research on construction scene point cloud semantic segmentation, virtual-to-real transfer learning, scan-to-BIM updating, and intelligent construction monitoring. Full article
54 pages, 3354 KB  
Review
Mamba for Remote Sensing: Architectures, Hybrid Paradigms, and Future Directions
by Zefeng Li, Long Zhao, Yihang Lu, Yue Ma and Guoqing Li
Remote Sens. 2026, 18(2), 243; https://doi.org/10.3390/rs18020243 - 12 Jan 2026
Abstract
Modern Earth observation combines high spatial resolution, wide swath, and dense temporal sampling, producing image grids and sequences far beyond the regime of standard vision benchmarks. Convolutional networks remain strong baselines but struggle to aggregate kilometre-scale context and long temporal dependencies without heavy tiling and downsampling, while Transformers incur quadratic costs in token count and often rely on aggressive patching or windowing. Recently proposed visual state-space models, typified by Mamba, offer linear-time sequence processing with selective recurrence and have therefore attracted rapid interest in remote sensing. This survey analyses how far that promise is realised in practice. We first review the theoretical substrates of state-space models and the role of scanning and serialization when mapping two- and three-dimensional EO data onto one-dimensional sequences. A taxonomy of scan paths and architectural hybrids is then developed, covering centre-focused and geometry-aware trajectories, CNN– and Transformer–Mamba backbones, and multimodal designs for hyperspectral, multisource fusion, segmentation, detection, restoration, and domain-specific scientific applications. Building on this evidence, we delineate the task regimes in which Mamba is empirically warranted—very long sequences, large tiles, or complex degradations—and those in which simpler operators or conventional attention remain competitive. Finally, we discuss green computing, numerical stability, and reproducibility, and outline directions for physics-informed state-space models and remote-sensing-specific foundation architectures. Overall, the survey argues that Mamba should be used as a targeted, scan-aware component in EO pipelines rather than a drop-in replacement for existing backbones, and aims to provide concrete design principles for future remote sensing research and operational practice. Full article
(This article belongs to the Section AI Remote Sensing)
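
To make the linear-time claim concrete: a plain (non-selective) diagonal state-space layer is the recurrence h_t = a * h_{t-1} + b * x_t, y_t = c · h_t, which one O(L) scan evaluates with constant memory in the sequence length. The toy NumPy version below omits Mamba's input-dependent (selective) parameters and its hardware-aware parallel scan; it only illustrates the cost model that makes such layers attractive for long Earth-observation sequences.

```python
import numpy as np

def ssm_scan(x, a, b, c):
    """Minimal diagonal state-space recurrence evaluated as a linear-time scan:
        h_t = a * h_{t-1} + b * x_t,   y_t = c . h_t
    a, b, c are fixed per-state parameters; Mamba additionally makes them
    input-dependent ("selective"), which this toy version omits."""
    h = np.zeros_like(a)
    ys = np.empty(len(x))
    for t, xt in enumerate(x):           # O(L) time, O(1) memory in sequence length
        h = a * h + b * xt
        ys[t] = c @ h
    return ys

rng = np.random.default_rng(0)
L, N = 1024, 16                           # sequence length, state dimension
a = np.exp(-rng.uniform(0.01, 0.5, N))    # stable decay rates, all inside (0, 1)
y = ssm_scan(rng.standard_normal(L), a,
             rng.standard_normal(N), rng.standard_normal(N))
print(y.shape)
```
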
35 pages, 15567 KB  
Article
Multi-Module Collaborative Optimization for SAR Image Aircraft Recognition: The SAR-YOLOv8l-ADE Network
by Xing Wang, Wen Hong, Qi Li, Yunqing Liu, Qiong Zhang and Ping Xin
Remote Sens. 2026, 18(2), 236; https://doi.org/10.3390/rs18020236 - 11 Jan 2026
Abstract
As a core node of the air transportation network, airports rely on aircraft model identification as a key link to support the development of smart aviation. Synthetic Aperture Radar (SAR), with its strong-penetration imaging capabilities, provides high-quality data support for this task. However, the field of SAR image interpretation faces numerous challenges. To address the core challenges in SAR image-based aircraft recognition, including insufficient dataset samples, single-dimensional target features, significant variations in target sizes, and high missed-detection rates for small targets, this study proposed an improved network architecture, SAR-YOLOv8l-ADE. Four modules achieve collaborative optimization: SAR-ACGAN integrates a self-attention mechanism to expand the dataset; SAR-DFE, a parameter-learnable dual-residual module, extracts multidimensional, detailed features; SAR-C2f, a residual module with multi-receptive field fusion, adapts to multi-scale targets; and 4SDC, a four-branch module with adaptive weights, enhances small-target recognition. Experimental results on the fused dataset SAR-Aircraft-EXT show that the mAP50 of the SAR-YOLOv8l-ADE network is 6.1% higher than that of the baseline network YOLOv8l, reaching 96.5%. Notably, its recognition accuracy for small aircraft targets shows a greater improvement, reaching 95.2%. The proposed network outperforms existing methods in terms of recognition accuracy and generalization under complex scenarios, providing technical support for airport management and control, as well as for emergency rescue in smart aviation. Full article
24 pages, 7954 KB  
Article
Machine Learning-Based Prediction of Maximum Stress in Observation Windows of HOV
by Dewei Li, Zhijie Wang, Zhongjun Ding and Xi An
J. Mar. Sci. Eng. 2026, 14(2), 151; https://doi.org/10.3390/jmse14020151 - 10 Jan 2026
Abstract
With advances in deep-sea exploration technologies, utilizing human-occupied vehicles (HOV) in marine science has become widespread. The observation window is a critical component, as its structural strength affects submersible safety and performance. Under load, it experiences stress concentration, deformation, cracking, and catastrophic failure. The observation window will experience different stress distributions in high-pressure environments. The maximum principal stress is the most significant phenomenon that determines the most likely failure of materials in windows of HOV. This study proposes an artificial intelligence-based method to predict the maximum principal stress of observation windows in HOV for rapid safety assessment. Samples were designed, while strain data with corresponding maximum principal stress values were collected under different loading conditions. Three machine learning algorithms—transformer–CNN-BiLSTM, CNN-LSTM, and Gaussian process regression (GP)—were employed for analysis. Results show that the transformer–CNN-BiLSTM model achieved the highest accuracy, particularly at the point exhibiting the maximum principal stress value. Evaluation metrics, including mean squared error (MSE), mean absolute error (MAE), and root squared residual (RSR), confirmed its superior performance. The proposed hybrid model incorporates a positional encoding layer to enrich input data with locational information and combines the strengths of bidirectional long short-term memory (LSTM), one-dimensional CNN, and transformer–CNN-BiLSTM encoders. This approach effectively captures local and global stress features, offering a reliable predictive tool for health monitoring of submersible observation windows. Full article
(This article belongs to the Section Ocean Engineering)