Search Results (2,537)

Search Parameters:
Keywords = high-dimensional sampling

19 pages, 1999 KB  
Article
A Small-Sample Fault Diagnosis Method for High-Voltage Circuit Breaker Spring Mechanisms Based on Multi-Source Feature Fusion and Stacking Ensemble Learning
by Xining Li, Hanyan Xiao, Ke Zhao, Lei Sun, Tianxin Zhuang, Haoyan Zhang and Hongwei Mei
Sensors 2026, 26(5), 1485; https://doi.org/10.3390/s26051485 - 26 Feb 2026
Abstract
To address the practical engineering challenges of limited fault samples for high-voltage circuit breaker spring operating mechanisms and the inability of single features to fully reflect equipment status, this paper proposes a small-sample fault diagnosis method based on multi-source feature fusion and Stacking ensemble learning. First, a multi-source sensing system covering MEMS (Micro-Electro-Mechanical System) pressure and travel signals together with coil and motor currents was constructed to achieve comprehensive monitoring of the mechanical and electrical states of a 220 kV circuit breaker; in particular, the introduction of non-invasive MEMS sensors effectively solves the difficulty of capturing static spring fatigue characteristics inherent in traditional methods. Second, a high-dimensional feature space was constructed using Savitzky–Golay filtering and physical feature extraction techniques. To address the characteristics of small-sample data distribution, a two-layer Stacking ensemble learning model based on 5-fold cross-validation was designed. This model utilizes the SVM (Support Vector Machine), RF (Random Forest), and KNN (K-Nearest Neighbors) as base classifiers and Logistic Regression as the meta-learner, achieving an adaptive fusion of the advantages of heterogeneous algorithms. True-type experimental results show that the average diagnostic accuracy of this method under normal conditions and four typical fault conditions reaches 96.1%, which is superior to single base models (the RF was 94.2%). Feature importance analysis further confirms that closing and opening pressures are the most critical features for distinguishing mechanical faults. This study provides an effective theoretical basis and technical support for condition-based maintenance of high-voltage circuit breakers under small-sample conditions. Full article
(This article belongs to the Special Issue Advanced Sensor Technologies for Corrosion Monitoring)
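The abstract above leans on Savitzky–Golay filtering for signal preprocessing. As a minimal illustration (not the authors' code; the window length and polynomial degree are assumptions, since the paper's settings are not given in the abstract), the classic window-5, degree-2 smoother can be written in a few lines of Python:

```python
def savgol_smooth_5_2(y):
    """Savitzky-Golay smoothing, window 5, polynomial degree 2.

    Uses the closed-form convolution coefficients (-3, 12, 17, 12, -3)/35;
    endpoints are left unfiltered for simplicity.
    """
    c = (-3, 12, 17, 12, -3)
    out = list(y)
    for i in range(2, len(y) - 2):
        out[i] = sum(ci * y[i + k] for ci, k in zip(c, range(-2, 3))) / 35.0
    return out
```

A useful property: a degree-2 filter reproduces quadratic trends exactly at interior points while averaging out high-frequency noise, which is why it preserves waveform shape better than a plain moving average.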
19 pages, 2815 KB  
Article
Federated Intrusion Detection via Unidirectional Serialization and Multi-Scale 1D Convolutions with Attention Reweighting
by Wenqing Li, Di Gao and Tianrong Zhang
Future Internet 2026, 18(3), 117; https://doi.org/10.3390/fi18030117 - 26 Feb 2026
Abstract
Deployed in distributed organizations and edge networks, contemporary intrusion detection increasingly requires high-performing models without centralizing sensitive traffic logs. This study presents a lightweight federated intrusion detection framework that integrates (i) unidirectional serialization to convert tabular flow records into short sequences, (ii) multi-scale one-dimensional convolutions to capture heterogeneous temporal–statistical patterns at different receptive fields, and (iii) an attention-based reweighting module that emphasizes informative feature channels prior to classification. A sample-size-weighted FedAvg aggregation protocol is used to train a global detector without transferring raw data. Experiments on three widely used benchmarks (UNSW-NB15, KDD Cup 99, and NSL-KDD) under multiple client configurations report consistently high detection effectiveness, with peak accuracies of 99.38% (UNSW-NB15), 99.86% (KDD Cup 99), and 99.02% (NSL-KDD), alongside strong precision, recall, and F1 scores. In addition, the proposed framework is quantitatively benchmarked on UNSW-NB15 against two recent federated intrusion detection baselines, FedMSP-SPEC and a multi-view federated CAE-NSVM model, demonstrating improvements of more than 10 percentage points in macro F1-score while retaining a compact architecture. The manuscript further specifies a concrete threat model, clarifies the client data partitioning strategy and Non-IID quantification, and provides a reproducibility protocol (hyperparameters, random seeds, and evaluation procedures) to facilitate independent verification. Full article
(This article belongs to the Section Cybersecurity)
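The sample-size-weighted FedAvg aggregation mentioned above reduces to a weighted mean of client parameters. A minimal sketch, assuming each client model is flattened to a plain parameter list (the real framework aggregates full network weight tensors):

```python
def fedavg(client_weights, client_sizes):
    """Sample-size-weighted FedAvg: average each parameter position,
    weighting every client's model by its local sample count, so that
    clients with more data pull the global model proportionally harder."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [
        sum(w[j] * n for w, n in zip(client_weights, client_sizes)) / total
        for j in range(n_params)
    ]
```

Only model parameters and sample counts cross the network, which is how the framework avoids centralizing raw traffic logs.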

16 pages, 305 KB  
Article
Development and Pilot Validation of an Age-Friendly City Assessment Tool Based on Older Adults’ Perspectives in a Semi-Urban Community
by Autchariya Punyakaew, Pich Karakate, Tanaporn Nukeaw, Thanaporn Saopasee and Supawadee Putthinoi
Int. J. Environ. Res. Public Health 2026, 23(3), 287; https://doi.org/10.3390/ijerph23030287 - 26 Feb 2026
Abstract
Background: Age-friendly city (AFC) initiatives are widely promoted to support healthy aging; however, most existing AFC assessments rely on administrative or expert-driven evaluations that primarily reflect institutional perspectives. These approaches may overlook how age-friendly characteristics are experienced by older adults—the population most directly affected by community environments—particularly in semi-urban settings. This study aimed to develop and conduct a preliminary psychometric evaluation of an AFC assessment tool based on older adults’ perspectives. Methods: A Research and Development (R&D) design was employed. The instrument was conceptually grounded in the World Health Organization Age-Friendly Cities framework and adapted from a governmental checklist through item reformulation and contextual modification for semi-urban application in a Thai setting. Content validity was examined by an expert panel using the Index of Item–Objective Congruence (IOC). Preliminary internal consistency reliability testing was conducted with a small purposive sample of older adults. The refined instrument was then pilot-tested with an independent sample of community-dwelling older adults to evaluate feasibility and descriptive response patterns. Internal consistency reliability was assessed using Cronbach’s alpha, and descriptive analyses were performed across domains and subdomains. Results: The finalized instrument comprised 52 items across three domains and eight subdomains. Content validity was strong, with IOC values ranging from 0.80 to 1.00. Preliminary reliability testing demonstrated high internal consistency (Cronbach’s alpha = 0.97), indicating suitability for pilot use while suggesting potential item redundancy. Pilot responses showed predominantly high perceived age-friendliness, with moderate scores in selected subdomains.
Conclusions: The AFC Assessment Tool demonstrated strong preliminary psychometric properties and practical feasibility for use among community-dwelling older adults in semi-urban settings. By incorporating older adults’ perspectives, the tool provides a context-sensitive approach that complements existing administrative and objective assessments. Further validation using larger and more diverse samples is needed to establish construct validity, confirm dimensional structure, and strengthen applicability in public health and environmental gerontology research. Full article
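Cronbach's alpha, used above as the internal-consistency measure, has a simple closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A stdlib-Python sketch (population variances are used here; psychometric packages may use sample variances, which shifts the value slightly):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns.

    `items` is a list of k items, each a list of the respondents' scores
    on that item. Variances are population variances (divide by n).
    """
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(items)
    totals = [sum(row) for row in zip(*items)]  # per-respondent total score
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

Perfectly correlated items drive alpha toward 1, which is also why the very high 0.97 reported above can signal item redundancy rather than pure quality.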

18 pages, 2374 KB  
Article
Parametric Sensitivity of Shear Correction Factors for Multiwall Corrugated Structures
by Julia Graczyk, Jędrzej Tworzydło and Tomasz Garbowski
Materials 2026, 19(5), 863; https://doi.org/10.3390/ma19050863 - 26 Feb 2026
Abstract
Transverse shear deformation plays a non-negligible role in lightweight periodic-core structures and motivates the use of shear-corrected reduced-order plate and beam models. However, the shear correction factor ks is often treated as a constant despite its strong dependence on cross-sectional heterogeneity and geometry. This work quantifies the global sensitivity of ks in corrugated paperboard by combining an energy-consistent pixel-based identification of the effective shear stiffness (GA)eff with a space-filling exploration of the parameter domain. Representative three-ply (single-wall) and five-ply (double-wall) configurations are generated directly in the pixel domain using sinusoidal fluting descriptions and non-overlapping liner bands. The effective shear stiffness is obtained from a heterogeneous shear-energy equivalence, where a normalized two-dimensional shear-stress shape function is computed from pixel-based sectional descriptors and integrated with spatially varying shear moduli. Latin Hypercube Sampling is employed to explore wide ranges of flute period, height, and thickness, liner thicknesses, and liner–flute shear-modulus contrasts. Global sensitivity is reported using unit-free normalized indices, including log-elasticities (based on the slope of ln ks versus ln x) and partial rank correlation coefficients. The results demonstrate that flute geometry is the primary driver of ks variability, while material contrast significantly modulates shear-energy localization, particularly in double-wall boards with two distinct flutings. The proposed framework enables high-throughput shear correction assessment and supports robust parameterized reduced-order models for corrugated structures. Full article
(This article belongs to the Section Materials Simulation and Design)
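The log-elasticity index described above is simply the least-squares slope of ln ks against ln x: an elasticity of e means a 1% change in the input produces roughly an e% change in ks. A small sketch (the data in the test are hypothetical, not the paper's samples):

```python
import math

def log_elasticity(xs, ys):
    """Least-squares slope of ln(y) on ln(x): a unit-free sensitivity
    index comparable across inputs with different physical units."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    mx = sum(lx) / len(lx)
    my = sum(ly) / len(ly)
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den
```

For a power-law relation y = C * x**p the estimator recovers p exactly, which is what makes log-elasticities convenient for ranking geometric versus material drivers.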

28 pages, 2771 KB  
Article
Improving Tree-Based Lung Disease Classification from Chest X-Ray Images Using Deep Feature Representations
by Abdulaziz A. Alsulami, Qasem Abu Al-Haija, Rayed Alakhtar, Huda Alsobhi, Rayan A. Alsemmeari, Badraddin Alturki and Ahmad J. Tayeb
Bioengineering 2026, 13(3), 267; https://doi.org/10.3390/bioengineering13030267 - 25 Feb 2026
Abstract
Healthcare systems worldwide face increasing pressure to deliver accurate, affordable, and scalable diagnostic services while maintaining long-term sustainability. Chest X-ray screening is considered one of the most cost-effective methods for detecting lung disease. However, many deep learning approaches are computationally intensive and difficult to interpret, which limits their adoption in high-throughput, resource-constrained clinical settings. This study proposes a hybrid CNN–tree framework for automated lung disease classification from chest X-ray images, which targets COVID-19, pneumonia, tuberculosis, lung cancer, and normal cases. To ensure robustness and generalization, four publicly available chest X-ray datasets from different sources are merged into a unified five-class dataset, which introduces realistic variations in imaging conditions and patient populations. A ResNet-18 model is fine-tuned to extract domain-specific deep feature representations. Feature dimensionality and redundancy are reduced using Principal Component Analysis, while class imbalance is addressed through the Synthetic Minority Over-sampling Technique. The resulting compact feature vectors are used to train interpretable tree-based classifiers, which include Decision Tree, Random Forest, and XGBoost. Experiments conducted using five-fold stratified cross-validation demonstrate substantial and consistent performance gains. When trained on fine-tuned and preprocessed deep features, all evaluated tree-based classifiers achieve weighted F1-scores between 0.977 and 0.982 using five-fold cross-validation, with a significant reduction in inter-class confusion. In addition, the proposed framework maintains low per-sample inference latency, which supports energy-efficient and scalable deployment. 
These results indicate that combining deep feature learning with interpretable tree-based models provides a practical and reliable solution for sustainable chest X-ray screening in real-world clinical environments. Full article
(This article belongs to the Section Biosignal Processing)
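The Synthetic Minority Over-sampling Technique used above creates new minority-class samples by interpolation. A simplified sketch: real SMOTE interpolates toward one of a sample's k nearest neighbours, whereas this toy version pairs random minority samples, which preserves the core idea that every synthetic point lies on a segment between two real ones:

```python
import random

def smote_like(minority, n_new, seed=0):
    """SMOTE-style oversampling sketch: each synthetic sample is a random
    point on the segment between two minority samples (real SMOTE picks
    the second point among the first sample's k nearest neighbours)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_new):
        a, b = rng.sample(minority, 2)
        t = rng.random()  # interpolation fraction in [0, 1)
        out.append([ai + t * (bi - ai) for ai, bi in zip(a, b)])
    return out
```

Because synthetic points are convex combinations, they never leave the bounding box of the minority class, unlike naive duplication plus noise.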

19 pages, 3397 KB  
Article
Nyemo Xuelai Tibetan Paper (Tibet, China): Research on Synergistic Correlations Between Surface Properties, Aging Resistance Mechanisms, Traditional Papermaking Crafts, and Protection Strategies
by Zhipeng Xiao, Xinyun Zhang, Yanxiang Li, Zhengfeng Liu, Haomiao Li, Xinyuan Zhang and Ruiying Ma
Coatings 2026, 16(3), 273; https://doi.org/10.3390/coatings16030273 - 25 Feb 2026
Abstract
As a representative intangible cultural heritage of Tibet, China, Nyemo Xuelai Tibetan paper has maintained its millennium-old inheritance, relying on its unique surface properties and aging resistance. However, at present, there remains a research gap regarding the surface characteristics of Nyemo Xuelai Tibetan paper and their correlation with aging mechanisms. To reveal their intrinsic mechanisms and provide scientific protection schemes, this study systematically analyzed the surface microstructure, chemical composition, pH variation, and aging resistance of 7 groups of Xuelai Tibetan paper samples using SEM-EDS, ATR-FTIR, pH testing, and dry-heat aging experiments (105 °C, 144 h). Combined with traditional crafts, the formation mechanism of properties was clarified, and multi-dimensional protection strategies were proposed. The results show that aging time exerted a highly significant effect on the D65 brightness, pH value, and tensile index of Xuelai Tibetan paper (p < 0.001). The fibers of Xuelai Tibetan paper are flat and ribbon-like, with an aspect ratio of 50–80, forming a tightly intertwined network structure. The core chemical component is cellulose with a relatively low lignin content, and the elemental composition is dominated by carbon and oxygen. Some samples contain calcium-based substances (0%–1.79%) derived from salt lake alkali. After aging, the D65 blue light diffuse reflectance factor (abbreviated as D65 brightness) retention rate of the samples ranges from 84.81% to 92.21%, and the tensile strength retention rate ranges from 30.78% to 90.00%. Calcium-based substances can inhibit the hydrolysis of cellulose glycosidic bonds through a weak alkaline buffering effect, improving aging-resistance stability.
The excellent performance of Tibetan paper originates from the synergistic effect of traditional crafts: Stellera chamaejasme as raw material provides the material basis of high cellulose and long fibers; alkaline cooking removes lignin and retains the buffering components; manual beating optimizes the fiber’s interweaving structure; and natural air-drying ensures surface uniformity. Based on this, a multi-dimensional strategy of preventive protection and living inheritance is proposed: cultural relic protection focuses on pH stabilization, controlled storage, and non-destructive cleaning, and craft inheritance achieves sustainable development through raw material standardization, process refinement, and digital training. This study establishes the craft–characteristic–performance correlation mechanism of Xuelai Tibetan paper, verifying the statistical significance of aging-induced property changes and providing a scientific basis for the protection and inheritance of traditional handmade paper. Full article

16 pages, 2082 KB  
Article
MFF-AE: Enhanced Quality Control for Proteomics Mass Spectrometry Data via Multi-Scale Feature Fusion
by Guangkui Fan, Xinyu Ji, Hunyue Liao, Bo Meng, Duotao Pan, Jinze Huang and Yang Zhao
Int. J. Mol. Sci. 2026, 27(5), 2121; https://doi.org/10.3390/ijms27052121 - 25 Feb 2026
Abstract
Mass spectrometry (MS) is a core analytical tool in proteomics, and the quality of the generated data directly determines the effectiveness of downstream analyses and the reliability of final research conclusions. While MS is also widely used in other omics applications, this study focuses on label-free quantitative proteomics, where samples are represented as protein-abundance matrices derived from MaxQuant. However, MS data are typically characterized by high dimensionality and substantial noise, posing serious challenges for quality control (QC). Existing QC methods have limited feature extraction capabilities and struggle to capture the key information embedded in the data, resulting in poor performance in identifying anomalous samples. Here, we propose the Multi-Scale Feature Fusion-based Autoencoder (MFF-AE). This deep learning-based anomaly detection model achieves precise identification of anomalous samples by integrating both global and local data features. The model consists of three modules: an autoencoder-based backbone network that efficiently embeds raw data into a low-dimensional semantic space, a local feature extraction and fusion module designed to capture and integrate multi-scale features within MS data, and a sample identification module that enhances discriminative representations to enable accurate anomaly detection. To evaluate the effectiveness of the proposed model, we conduct extensive experiments on a benchmark dataset with synthesized anomalies. Quantitative results on the benchmark dataset show that, compared with 15 baseline models from statistical learning, deep learning, and ensemble learning, our model consistently achieves the best performance across key metrics. Furthermore, through linear relationship analysis on real-world clinical datasets, the exclusion of outlier samples significantly increased the statistical significance and fold change in the identified differential proteins.
Overall, the proposed model establishes a solid data foundation, paving the way for downstream mechanistic studies and target discovery. Full article
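Autoencoder-based anomaly detection of the kind described above typically ranks samples by reconstruction error. A deliberately degenerate sketch, with the feature-wise training mean standing in for a trained decoder (a real model like MFF-AE learns a far richer reconstruction; the mean-reconstructor here only illustrates the scoring logic):

```python
def make_mean_reconstructor(train):
    """Stand-in for a trained autoencoder: 'reconstructs' every sample as
    the feature-wise training mean (a zero-dimensional bottleneck)."""
    means = [sum(col) / len(col) for col in zip(*train)]
    return lambda x: means

def anomaly_scores(samples, reconstruct):
    """Score each sample by mean squared reconstruction error; samples
    the model reconstructs poorly get high scores and are flagged."""
    scores = []
    for x in samples:
        r = reconstruct(x)
        scores.append(sum((a - b) ** 2 for a, b in zip(x, r)) / len(x))
    return scores
```

In practice a threshold (e.g. a high quantile of training-set scores) converts these scores into keep/exclude decisions for QC.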

20 pages, 10551 KB  
Article
Tribological Behavior and Material Removal Mechanisms in Sapphire Lapping Using HFCVD Diamond-Coated Tools
by Wei Feng, Xiaokang Sun and Shuai Zhou
Materials 2026, 19(5), 831; https://doi.org/10.3390/ma19050831 - 24 Feb 2026
Abstract
Diamond coatings with three distinct surface textures, namely spherical, pyramidal, and prismatic morphologies, were fabricated using the hot-filament chemical-vapor deposition (HFCVD) method. Scanning electron microscopy (SEM) was employed to analyze the surface morphological characteristics and differences among the coatings. Raman spectroscopic analysis further confirmed that all three diamond films exhibited excellent deposition uniformity and high crystalline quality. A three-dimensional optical microscopy system was used to measure the surface roughness values, which were determined to be Ra 0.423 μm, Ra 0.515 μm, and Ra 0.809 μm, respectively. An HFCVD diamond-coated tool was innovatively employed for the lapping of sapphire wafers, enabling a systematic investigation of the tribological behavior during the lapping process. Based on the experimental results, three morphological material removal models were established. The study demonstrates that the spherical diamond coating achieves a superior surface finish (Ra 0.22 μm) due to its continuous multi-point contact geometry, governed by the agglomerated nanocrystalline structure. Sample 3 had the highest removal rate, 24.3 μm/min, attributable both to its surface morphology and to the two-body contact between the diamond-coated tool and the sapphire, making this configuration a high-efficiency alternative for precision machining. Full article
(This article belongs to the Section Carbon Materials)
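The Ra values quoted above are arithmetic mean roughness: the mean absolute deviation of the height profile from its mean line. A one-function sketch of the definition (profilometers evaluate this over standardized sampling lengths; the test data are illustrative):

```python
def roughness_ra(profile):
    """Arithmetic mean roughness Ra: mean absolute deviation of the
    measured height profile from its mean line."""
    m = sum(profile) / len(profile)  # mean line height
    return sum(abs(z - m) for z in profile) / len(profile)
```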

21 pages, 4748 KB  
Article
Quantitative Analysis of Polyphenols in Lonicera caerulea Based on Mid-Infrared Spectroscopy and Hybrid Variable Selection
by Haiwei Wu, Xuexin Li, Jianwei Liu, Zhihao Wang and Yuchun Liu
Molecules 2026, 31(4), 750; https://doi.org/10.3390/molecules31040750 - 23 Feb 2026
Abstract
Lonicera caerulea L. (blue honeysuckle) is rich in antioxidant polyphenols, and rapid and accurate determination of its polyphenol content is of great significance for functional food quality control. This study proposed a hybrid variable selection strategy designed for high-dimensional small-sample scenarios and developed a quantitative prediction model for polyphenol content based on mid-infrared (MIR) spectroscopy. A total of 191 Lonicera caerulea samples were collected from Northeast China, and 7468-dimensional spectral data were acquired using a Fourier transform infrared spectrometer. Polyphenol reference values were determined by the Folin–Ciocalteu method. Samples were divided into calibration (n = 152) and prediction (n = 39) sets using the SPXY algorithm. Among the 10 preprocessing methods evaluated, MSC combined with Savitzky–Golay first derivative achieved the best performance and was therefore used for subsequent modeling. The proposed hybrid variable selection method (VIP1.0∩RFR30%) intersected PLS variable importance in projection (VIP ≥ 1.0) with the top 30% important variables from random forest regression, selecting 984 key wavelengths and achieving 86.8% dimensionality reduction. A three-stage hyperparameter tuning strategy was implemented across four models (PLS, RFR, SVR, and XGBoost) to validate feature stability and control overfitting. The optimized XGBoost model achieved excellent performance on the independent test set (R2 = 0.92, RMSE = 0.098, RPD = 3.47). Compared with the classical CARS method (R2 = 0.78, RPD = 2.14), R2 improved by 16.3% and RPD improved by 55.2%. The results demonstrate that the proposed hybrid variable selection strategy can effectively address the challenges of high-dimensional MIR spectral data in small-sample modeling, providing a reliable tool for rapid and non-destructive quantitative analysis of polyphenols in Lonicera caerulea. Full article
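The hybrid VIP1.0∩RFR30% selection above intersects two variable rankings: wavelengths with PLS-VIP at or above 1.0, and wavelengths in the top 30% of random-forest importances. A sketch with made-up scores (the paper operates on 7468 wavelengths; only the two cutoff rules are taken from the abstract):

```python
def hybrid_select(vip, rf_importance, vip_cut=1.0, rf_top_frac=0.30):
    """Keep variable indices that satisfy BOTH criteria: PLS-VIP >= vip_cut
    and membership in the top rf_top_frac of random-forest importances."""
    idx_vip = {i for i, v in enumerate(vip) if v >= vip_cut}
    order = sorted(range(len(rf_importance)), key=lambda i: -rf_importance[i])
    idx_rf = set(order[: max(1, int(len(order) * rf_top_frac))])
    return sorted(idx_vip & idx_rf)
```

Intersecting the two sets is what drives the 86.8% dimensionality reduction reported above: a wavelength survives only if both models agree it matters.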

23 pages, 649 KB  
Article
Manifold Causal Conditional Deep Networks for Heterogeneous Treatment Effect Estimation and Policy Evaluation
by Jong-Min Kim
Mathematics 2026, 14(4), 738; https://doi.org/10.3390/math14040738 - 22 Feb 2026
Abstract
We present a comprehensive framework for estimating heterogeneous treatment effects and evaluating decision-making policies in high-dimensional settings. Our approach combines nonlinear manifold learning techniques—UMAP, t-SNE, and Isomap—with a Causal Conditional Deep Network (CCDN) to model complex nonlinear interactions among covariates, treatments, and outcomes. Within this framework, we assess five treatment assignment policies—Greedy, Thompson Sampling, Epsilon-Greedy, Random, and a novel LLM-guided Thompson policy—across simulated and real-world datasets, including Adult, Wine Quality, and Boston Housing. Empirical results reveal a fundamental trade-off: exploitative policies like Greedy minimize cumulative regret but underperform in recovering heterogeneous treatment effects, whereas exploratory policies, particularly Random and LLM-Thompson, achieve a lower Conditional Average Treatment Effect Root Mean Squared Error (CATE RMSE) by providing broader coverage of the action–covariate space. Notably, LLM-Thompson consistently delivers strong performance across noisy, real-world datasets, highlighting the advantage of uncertainty-aware exploration in capturing treatment heterogeneity. Overall, the framework demonstrates that integrating manifold-informed deep networks with principled exploration strategies enhances both policy optimization and individualized treatment effect estimation in high-dimensional, complex environments. Full article
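Thompson Sampling, one of the assignment policies compared above, can be illustrated with the standard Beta-Bernoulli bandit (a far simpler setting than the paper's covariate-conditional policies, but the same explore/exploit mechanism):

```python
import random

def thompson_bernoulli(reward_probs, rounds, seed=0):
    """Beta-Bernoulli Thompson sampling: draw a success rate from each
    arm's Beta posterior, play the arm with the highest draw, update."""
    rng = random.Random(seed)
    k = len(reward_probs)
    wins = [1] * k    # Beta(1, 1) uniform priors
    losses = [1] * k
    pulls = [0] * k
    for _ in range(rounds):
        draws = [rng.betavariate(wins[i], losses[i]) for i in range(k)]
        arm = draws.index(max(draws))
        pulls[arm] += 1
        if rng.random() < reward_probs[arm]:
            wins[arm] += 1
        else:
            losses[arm] += 1
    return pulls
```

Because arms are chosen by posterior sampling rather than greedily, under-explored arms keep a nonzero chance of being played, which is the property the abstract credits for better CATE recovery.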

22 pages, 4040 KB  
Article
Data-Driven Design of Epoxy–Granite Machine Foundations: Bayesian Optimization for Enhanced Compressive Strength and Vibration Damping
by Mohammed Y. Abdellah, Osama M. Irfan and Hanafy M. Omar
Polymers 2026, 18(4), 532; https://doi.org/10.3390/polym18040532 - 21 Feb 2026
Abstract
Epoxy–granite (EG) composites, comprising granite quarry waste and low-cost epoxy, present a sustainable alternative to cast iron for machine tool foundations. This study develops a data-driven simulation framework to enhance the mechanical properties of epoxy–granite systems by integrating published experimental data with Gaussian Process Regression (GPR) surrogate modeling and Bayesian optimization (BO). The objective is to maximize compressive strength and vibration damping—both critical factors for machining accuracy and dynamic stability. Experimental results from composites with 12–25 wt% epoxy and varied aggregate gradations demonstrate compressive strengths up to 76.8 MPa and flexural strengths reaching 35.4 MPa. The peak damping ratio of 0.0202 was observed at intermediate epoxy content. Mixtures enriched with fine particles also exhibited enhanced fracture toughness and low water absorption, outperforming cementitious concretes, polymer concretes, and natural granite. To address the limitations of experimental coverage, a GPR-based simulation model was employed to explore the four-dimensional design space defined by epoxy content and aggregate fractions. Integrated with BO under realistic manufacturing constraints, the framework identifies optimal formulations comprising 22–26 wt% epoxy and 55–70% fine aggregates. These compositions yield predicted compressive strengths of 78–85 MPa and damping ratios approaching 0.022, indicating significant improvement in overall mechanical properties. Bayesian Weibull analysis further quantifies reliability, revealing shape parameters α ≈ 2.4–2.9, which indicate consistent performance with moderate variability. This work presents the first reported application of an integrated GPR-BO-Bayesian Weibull simulation framework to epoxy–granite composites, enabling simultaneous optimization of conflicting objectives and probabilistic reliability assessment of key mechanical properties. 
The approach reduces experimental effort by over 70% and supports the circular economy through valorization of granite waste in high-value manufacturing. Nonetheless, predictive uncertainty remains high in under-sampled regions (e.g., damping with n = 2). Future experimental validation—comprising at least 10–15 data points across varied epoxy ratios and gradations—is essential to corroborate the predicted optimum. Full article
(This article belongs to the Section Artificial Intelligence in Polymer Science)
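The Weibull reliability analysis mentioned above rests on the survival function R(t) = exp(-(t/λ)^α). A minimal sketch using the abstract's α as the shape parameter; the scale value in the test is hypothetical, since the abstract does not report λ:

```python
import math

def weibull_reliability(t, shape, scale):
    """Weibull survival/reliability function R(t) = exp(-(t/scale)**shape).

    `shape` corresponds to the alpha reported in the abstract (2.4-2.9);
    shape > 1 indicates an increasing hazard, i.e. wear-out behavior.
    """
    return math.exp(-((t / scale) ** shape))
```

At t equal to the scale parameter, reliability is always exp(-1) ≈ 0.368 regardless of shape, which gives λ its usual "characteristic life" interpretation.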

30 pages, 543 KB  
Article
Corporate ESG Performance and Export Product Quality: Evidence from Chinese Listed Companies
by Mingguo Xia, Bing Jian and Ye Tian
Sustainability 2026, 18(4), 2118; https://doi.org/10.3390/su18042118 - 20 Feb 2026
Abstract
While it is a global imperative that firms should achieve superior environmental, social, and governance (ESG) performance, the specific impact of ESG on export product quality remains under-explored. Based on stakeholder theory and principal–agent theory, this paper utilizes a sample of Chinese listed companies and the High-Dimensional Fixed Effects (HDFE) Model to empirically examine the impact and underlying mechanisms of ESG performance on export product quality. The results indicate a U-shaped relationship between ESG performance and export product quality, a non-linear correlation that has received limited attention in the previous literature. This U-shaped relationship is more pronounced among state-owned enterprises (SOEs), firms producing non-high-tech products, and those in heavy-polluting industries. Mechanism analysis reveals that ESG performance influences export product quality primarily through three channels: innovation levels, total factor productivity (TFP), and supply chain stability. By unveiling these non-linear dynamics and their underlying pathways, this study provides a novel theoretical framework and critical empirical evidence that reconcile conflicting views on ESG effects. These findings offer important insights for policymakers and exporters seeking to align ESG practices with export objectives, thereby contributing to more sustainable and high-quality development of foreign trade in China and beyond. Full article
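A U-shaped effect of the kind reported above is typically established by a significant positive quadratic term in the regression; the quality effect then bottoms out at the vertex of the fitted parabola. A sketch with illustrative coefficients (not estimates from the paper):

```python
def quadratic_turning_point(b1, b2):
    """Vertex of y = a + b1*x + b2*x**2. With b2 > 0 this is the ESG level
    at which a U-shaped effect bottoms out and turns positive; the
    coefficients passed in are hypothetical regression estimates."""
    if b2 == 0:
        raise ValueError("no curvature: the fitted relationship is linear")
    return -b1 / (2.0 * b2)
```

Checking that the turning point falls inside the observed range of the regressor is the usual sanity test before claiming a genuine U-shape.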
25 pages, 2500 KB  
Article
Mechanistic Insights into AAV Capsid–Stationary Phase Interactions Governing Native Stability and Chromatographic Separation Using AAV8 as a Model System
by Timotej Žvanut, Mitja Martelanc, Aleš Štrancar and Andreja Gramc Livk
Pharmaceutics 2026, 18(2), 263; https://doi.org/10.3390/pharmaceutics18020263 - 20 Feb 2026
Abstract
Background/Objectives: Adeno-associated viruses (AAVs) are widely used gene therapy vectors, yet their physicochemical stability and chromatographic behavior are highly sensitive to their solution conditions. Effective separation of full (F), empty (E), and partially filled (P) capsids—most commonly achieved by anion exchange (AEX) chromatography—is essential for standard analytical characterization, process development, and product safety. However, conventional AEX methods rely on low-conductivity alkaline mobile phases with low salt, which promote capsid binding and therefore higher resolution, at the expense of structural stability. Conversely, formulations such as near-neutral buffers may preserve capsid integrity but often impair AEX retention and separation resolution. Methods: Here, we extend a mechanistic investigation using AAV8 capsids as a model system, focusing on detailed capsid interactions with a strong AEX stationary phase, and present novel AAV8 separation strategies on a weak AEX stationary phase. Results: By systematically varying buffer pH and ionic strength, we identify operational regimes that balance capsid stability with chromatographic separation efficiency. In parallel, we introduce an integrated two-dimensional (2D) in-line buffer exchange configuration that decouples AEX performance from sample formulation, enabling robust separation of samples in stability-optimized, high-salt matrices without off-line desalting. Conclusions: By elucidating the roles of capsid charge modulation, ligand physicochemical properties, and local microenvironmental buffering, this study establishes practical design principles for stability-preserving chromatography. It lays a foundation for more reliable analytical and future preparative AAV workflows.
(This article belongs to the Special Issue Adeno-Associated Virus (AAV) as a Vector for Gene Therapy)
21 pages, 2951 KB  
Article
Evaluating SWIR Spectral Data and Random Forest Models for Copper Mineralization Discrimination in the Zhunuo Porphyry Deposit
by Jiale Cao, Lifang Wang, Xiaofeng Liu and Song Wu
Minerals 2026, 16(2), 213; https://doi.org/10.3390/min16020213 - 19 Feb 2026
Abstract
In recent years, with the widespread application of shortwave infrared (SWIR) spectroscopy in mineral identification and hydrothermal alteration studies, an increasing number of studies have attempted to integrate SWIR spectral data with machine learning approaches to fully exploit mineralization-related discriminative information embedded in high-dimensional spectral datasets. In this study, the Zhunuo porphyry copper deposit in Tibet was selected as the research target. SWIR drill core spectral data were systematically acquired, and a random forest (RF) machine learning model was applied to full-band SWIR spectra (1300–2500 nm) to conduct integrated analyses of copper grade regression and mineralization discrimination. A total of 2140 drill core samples were measured, with three replicate measurements per sample, yielding 6420 spectra. After standardized preprocessing and interpolation resampling, a unified spectral feature dataset was constructed for regression and classification analyses. SWIR spectral data are characterized by a large number of bands, strong inter-band correlations, and relatively limited sample sizes; under such conditions, model generalization ability and stability become critical factors in method selection. Based on ensemble learning, the random forest model constructs multiple decision trees and aggregates their predictions through voting or averaging, effectively reducing model variance and mitigating overfitting, and is therefore well suited to high-dimensional, small-sample, and highly correlated geological spectral datasets. In porphyry copper systems, the spectral characteristics of hydrothermal alteration minerals and mineralization intensity commonly exhibit complex nonlinear relationships, which random forest models can capture effectively without requiring predefined functional forms. The regression results indicate that accurate quantitative prediction of copper grade based solely on SWIR spectral data remains limited. In contrast, when a threshold-based binary classification was introduced using an industrial cutoff grade of 0.2% Cu, the model achieved an overall accuracy of 75%, an F1 score of 0.69, and an area under the ROC curve (AUC) of 0.80, demonstrating strong mineralization discrimination capability and stability. Overall, the integration of SWIR spectroscopy with machine learning methods provides an efficient, reliable, and geologically interpretable technical approach for early-stage exploration and detailed drill core interpretation in porphyry copper deposits.
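The thresholded classification setup described above can be sketched as follows. This is an illustrative example on synthetic data, not the authors' pipeline: the band count, grade model, and train/test split are assumptions; only the 0.2% Cu cutoff and the random-forest classifier mirror the abstract.

```python
# Illustrative sketch (synthetic data): binary mineralization discrimination
# on high-dimensional, correlated spectral features with a random forest,
# using a 0.2% Cu cutoff to derive class labels from grade.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

rng = np.random.default_rng(0)
n_samples, n_bands = 600, 120                 # stand-in for resampled SWIR bands
X = rng.normal(size=(n_samples, n_bands))
X += X[:, :1] * 0.5                           # induce inter-band correlation

# Synthetic grade driven by the first few bands, plus measurement noise.
grade = 0.1 + 0.3 * (X[:, :5].mean(axis=1) > 0) + rng.normal(0, 0.05, n_samples)
y = (grade >= 0.2).astype(int)                # threshold at the 0.2% Cu cutoff

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)[:, 1]
print(accuracy_score(y_te, clf.predict(X_te)),
      f1_score(y_te, clf.predict(X_te)),
      roc_auc_score(y_te, proba))
```

The variance-reduction argument in the abstract corresponds to the `n_estimators` ensemble averaging: each tree overfits individually, but the vote across trees stabilizes predictions on correlated, small-sample inputs.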
30 pages, 1973 KB  
Article
Human-Centered AI Perception Prediction in Construction: A Regularized Machine Learning Approach for Industry 5.0
by Annamária Behúnová, Matúš Pohorenec, Tomáš Mandičák and Marcel Behún
Appl. Sci. 2026, 16(4), 2057; https://doi.org/10.3390/app16042057 - 19 Feb 2026
Abstract
Industry 5.0 emphasizes human-centered integration of artificial intelligence in industrial contexts, yet successful adoption depends critically on workforce perception and acceptance. This research develops and validates a machine learning framework for predicting AI-related perceptions and expected impacts in the construction industry under the small-sample constraints typical of specialized industrial surveys. Specifically, the study aims to develop and empirically validate a predictive AI decision support model that estimates the expected impact of AI adoption in the construction sector based on digital competencies, ICT utilization, AI training and experience, and AI usage at both individual and organizational levels, operationalized through a composite AI Impact Index and two process-oriented outcomes (perceived task automation and perceived cost reduction). Using a dataset of 51 survey responses from Slovak construction professionals collected in 2025, we implement a methodologically rigorous approach specifically designed for limited-data regimes. The framework encompasses ordinal target simplification from five to three classes, dimensionality reduction through theoretically grounded composite indices reducing features from 15 to 7, exclusive deployment of low-variance regularized models, and leave-one-out cross-validation for unbiased performance estimation. The optimal model (Lasso regression with recursive feature elimination) predicts cost reduction perception with R2 = 0.501, MAE = 0.551, and RMSE = 0.709, while six classification targets achieve a weighted F1 = 0.681, representing statistically optimal performance given sample constraints and perception measurement variability. Comparative evaluation confirms that regularized models outperform high-variance alternatives: random forest (R2 = 0.412) and gradient boosting (R2 = 0.292) exhibit substantially lower generalization performance, empirically validating the bias-variance trade-off rationale. Key methodological contributions include explicit bias-variance optimization preventing overfitting, feature selection via RFE reducing the input space to six predictors (personal AI usage, AI impact on budgeting, ICT utilization, AI training, company size, and age), and demonstration that principled statistical approaches achieve meaningful predictions without requiring large-scale datasets or complex architectures. The framework provides a replicable blueprint for perception and impact prediction in data-constrained Industry 5.0 contexts, enabling targeted interventions, including customized training programs, strategic communication prioritization, and resource allocation for change management initiatives aligned with predicted adoption patterns.
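The combination of Lasso, recursive feature elimination, and leave-one-out cross-validation described above can be sketched in a few lines. This is a minimal sketch on synthetic data, not the study's code: the sample size and feature count echo the abstract (n = 51, 7 composite indices, 6 selected predictors), but the alpha value, data-generating process, and scoring setup are assumptions.

```python
# Minimal sketch (synthetic data): Lasso regression with recursive feature
# elimination, scored by leave-one-out cross-validation for a small sample.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.feature_selection import RFE
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score, mean_absolute_error

rng = np.random.default_rng(1)
n, p = 51, 7                              # small survey sample, composite indices
X = rng.normal(size=(n, p))
# Synthetic outcome driven by two of the seven predictors, plus noise.
y = X[:, 0] * 1.2 + X[:, 1] * 0.8 + rng.normal(0, 0.4, n)

model = make_pipeline(
    StandardScaler(),
    RFE(Lasso(alpha=0.05), n_features_to_select=6),  # keep 6 predictors, as in RFE
)
# One held-out prediction per observation: unbiased for n = 51.
pred = cross_val_predict(model, X, y, cv=LeaveOneOut())
print(r2_score(y, pred), mean_absolute_error(y, pred))
```

Leave-one-out is the natural choice here because with only 51 observations a conventional k-fold split would leave validation folds too small to estimate performance reliably, while LOO uses n - 1 observations for every fit.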