Search Results (2,640)

Search Parameters:
Keywords = novel forests

28 pages, 4176 KB  
Article
Evaluating the Financial Performance of CSR Strategies and Sustainable Operations in Mexican Companies: An Explainable Machine Learning Approach
by Laura Elena Jiménez-Casillas, Román Rodríguez-Aguilar, Marisol Velázquez-Salazar and Santiago García-Álvarez
Mathematics 2026, 14(3), 557; https://doi.org/10.3390/math14030557 - 4 Feb 2026
Abstract
Research on how corporate social responsibility (CSR) practices linked to sustainable operations (SO) affect corporate financial performance (FP) is still limited. This study presents a novel methodological proposal to measure the individual impact of such practices on the profitability of companies listed on the Mexican Stock Exchange. The method consists of a Random Forest (RF) model complemented by Explainable Machine Learning (XML) techniques, namely Individual Conditional Expectation (ICE), Partial Dependence Plots (PDPs), and SHapley Additive exPlanations (SHAP), to calculate the individualized marginal effect on the return on assets (RoA), return on equity (RoE), and return on invested capital (ROIC) for each company, explained by the environmental, social, and governance scores provided by Bloomberg (Bloomberg Finance, L.P., New York, NY, USA), together with market capitalization, the debt-to-equity ratio, sales growth, and years since listing. The novelty of this model lies in the application of RF and XML, which offers a comprehensive and interpretable perspective on the CSR–FP relationship, and in the use of lagged explanatory variables to avoid endogeneity problems, overcoming the limitations of traditional analyses. The results indicate that environmental scores exhibit the most consistent contribution to FP, whereas social and governance effects are highly metric-dependent. The SHAP analysis reveals substantial heterogeneity in the drivers of firm FP, highlighting the relevance of XML methods.
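The ICE/PDP machinery described above can be sketched in a few lines: a Random Forest is fit on synthetic firm data, each observation's prediction is traced while one feature is swept over a grid (the ICE curves), and the average of those traces gives the PDP. This is an illustrative reconstruction, not the authors' code; the variable names (`esg`, `roa`, `size`) and the data are invented.

```python
# Sketch of ICE / PDP marginal effects with a Random Forest on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 300
esg = rng.uniform(0, 100, n)           # hypothetical ESG score
size = rng.normal(0, 1, n)             # hypothetical firm-size control
roa = 0.05 * esg + 2.0 * size + rng.normal(0, 1, n)  # toy profitability

X = np.column_stack([esg, size])
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, roa)

# ICE: for each firm, sweep the ESG score over a grid while holding its
# other features fixed; the PDP is the average of the ICE curves.
grid = np.linspace(0, 100, 25)
ice = np.empty((n, grid.size))
for j, g in enumerate(grid):
    Xg = X.copy()
    Xg[:, 0] = g
    ice[:, j] = rf.predict(Xg)
pdp = ice.mean(axis=0)
```

With a positive simulated ESG effect, the PDP should rise across the grid; the per-firm ICE curves show how heterogeneous that effect is.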
30 pages, 3451 KB  
Article
A Novel Investment Risk Assessment Model for Complex Construction Projects Based on the IFA-LSSVM
by Rupeng Ren, Shengmin Wang and Jun Fang
Buildings 2026, 16(3), 624; https://doi.org/10.3390/buildings16030624 - 2 Feb 2026
Abstract
The project cycle of complex construction projects covers the whole process from project decision-making through design, bidding, construction, and completion acceptance to the initial stage of operation. Investment risk assessment of complex construction projects focuses on the early decision-making stage, aiming to provide a basis for investment feasibility analysis. The investment risk of complex construction projects is highly nonlinear and uncertain, and traditional risk assessment methods have limited generalization ability and prediction accuracy. To improve the accuracy and reliability of quantitative risk assessment, this study proposed a novel investment risk assessment model from the perspective of investors. Firstly, through a literature review, a multi-dimensional comprehensive risk assessment index system covering policies and regulations, the economic environment, technical management, construction safety, and financial cost was systematically identified and constructed. Subsequently, the Least Squares Support Vector Machine (LSSVM) was used to establish a nonlinear mapping between risk indicators and final risk levels. Because parameter selection has a significant impact on the performance of the standard LSSVM model, this paper proposed an improved Firefly Algorithm (IFA) to automatically optimize the penalty factor and kernel function parameters of the LSSVM, overcoming the arbitrariness of manual parameter selection and improving the convergence speed and generalization ability of the model. Compared with the classical Firefly Algorithm, the IFA strengthens the search with additional learning and adaptive strategies. The conclusions are as follows. (1) Compared with the Backpropagation Neural Network (BPNN), Random Forest (RF), and eXtreme Gradient Boosting (XGBoost), this model showed higher prediction accuracy on the test set, with its prediction error reduced by about 3%. (2) Compared with the FA, Genetic Algorithm (GA), and Particle Swarm Optimization (PSO), the IFA had stronger global search ability. (3) The model could effectively fit complex nonlinear risk relationships, and the risk assessment results were highly consistent with the actual situation. Therefore, the risk assessment model based on the improved LSSVM constructed in this study not only provides a more scientific and accurate quantitative tool for investment decision-making in construction projects, but also has important theoretical and practical significance for preventing and resolving major investment risks.
(This article belongs to the Special Issue Advances in Life Cycle Management of Buildings)
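For readers unfamiliar with firefly optimization, the classical algorithm (which the paper's IFA extends) can be sketched as follows: dimmer fireflies move toward brighter (lower-cost) ones with distance-decayed attraction plus a small random step. The toy quadratic objective stands in for LSSVM hyperparameter tuning (penalty factor and kernel width); all parameter values here are illustrative, not the paper's.

```python
# Minimal classical firefly algorithm (FA), minimizing a toy 2-D objective.
import numpy as np

def firefly_minimize(f, bounds, n_fireflies=20, n_iter=100,
                     beta0=1.0, gamma=0.01, alpha=0.2, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, (n_fireflies, len(lo)))
    cost = np.array([f(p) for p in x])
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if cost[j] < cost[i]:          # move i toward brighter j
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)  # attraction decays with distance
                    x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(len(lo)) - 0.5)
                    x[i] = np.clip(x[i], lo, hi)
                    cost[i] = f(x[i])
    best = np.argmin(cost)
    return x[best], cost[best]

# Toy objective with known minimum at (3, -2), standing in for CV error
# as a function of two LSSVM hyperparameters.
f = lambda p: (p[0] - 3) ** 2 + (p[1] + 2) ** 2
bounds = np.array([[-10.0, 10.0], [-10.0, 10.0]])
best_x, best_cost = firefly_minimize(f, bounds)
```

The IFA's extra learning and adaptive strategies would modify the attraction and step-size schedule; the skeleton above is only the classical baseline.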

20 pages, 1258 KB  
Review
Increasing Forest Ecosystem Resilience Is a Matter of Ecosystem Legacy Management: Conceptual Model for Restoration in Hemiboreal Forests
by Kalev Jõgiste, Lee E. Frelich, Floortje Vodde, Āris Jansons, Endijs Bāders, Peter B. Reich, John A. Stanturf, Sille Rebane, Kajar Köster and Marek Metslaid
Forests 2026, 17(2), 197; https://doi.org/10.3390/f17020197 - 2 Feb 2026
Abstract
In the face of accelerating climate change and increasingly complex disturbance regimes, enhancing forest ecosystem resilience has become a core priority in forest ecology and management. This paper argues that long-term resilience in hemiboreal forests depends fundamentally on the management of ecosystem legacies—structural, compositional, and functional remnants that persist following past disturbances and land use. Organized under the resilience framework, this perspective emphasizes that resilience is not solely a matter of response or effect, but an emergent property shaped by abiotic and biotic legacies, including life history traits, landscape heterogeneity, and both anthropogenic and natural disturbance. In this paper, drawing from disturbance ecology, resilience theory, and regional empirical studies, a conceptual model is presented that integrates legacy attributes, environmental filters, and management objectives to support adaptive restoration strategies. It helps design restoration pathways that are ecologically meaningful, operationally realistic, and robust to novel disturbance regimes. By operationalizing legacy–action linkages, the model offers practitioners concrete entry points for retention, disturbance use, and landscape design to enhance resilience.
(This article belongs to the Section Forest Ecology and Management)

11 pages, 1985 KB  
Article
Design of Double-Lattice Photonic Crystal of DUV Laser by ANN-RBF Neural Network
by Bochao Zhang, Minyan Zhang, Lei Li, Jianglang Bie, Shuoyi Jiao, Zhuanzhuan Guo, Xinjie Cai and Bowen Hou
Optics 2026, 7(1), 11; https://doi.org/10.3390/opt7010011 - 2 Feb 2026
Abstract
In this study, a double-lattice photonic crystal structure was designed to achieve deep ultraviolet lasing without any Distributed Bragg Reflector (DBR), a configuration known as a photonic-crystal surface-emitting laser (PCSEL). The plane wave expansion (PWE) method was used to study the influence of various structural parameters on the resonant wavelength. Utilizing the random forest algorithm, we determined that the importance of the lattice constant to the resonant wavelength is 95.24%. Furthermore, we realized the inverse design of double-lattice photonic crystals, from target wavelength to optimal structural parameters, through a radial basis function (RBF) network algorithm. Comparative analysis against the extreme learning machine (ELM) and backpropagation (BP) algorithms demonstrated that the RBF network notably outperformed the other algorithms. On the test set, the mean absolute error (MAE) of the lattice constant was 0.7610 nm, the root mean square error (RMSE) was 1.143 × 10⁻³ nm, and the mean absolute relative error (MARE) was 5.489 × 10⁻³. We verified the reliability of the algorithm and designed 13 groups of photonic crystals with different epitaxial structures. The mean square error (MSE) was 0.6188 nm² compared with the plane wave expansion method. This work demonstrates applicability across various wavebands and epitaxial structures in GaN-based devices, providing a novel approach for the rapid iteration of deep ultraviolet PCSELs.
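As a rough illustration of the RBF-network idea behind the inverse design, the sketch below fits Gaussian basis functions by linear least squares to learn a synthetic wavelength-to-lattice-constant map. The forward map and all numbers are invented; this is not the paper's model or data.

```python
# RBF network fitted by linear least squares: inverse map wavelength -> lattice constant.
import numpy as np

rng = np.random.default_rng(1)
a = np.linspace(100.0, 200.0, 40)          # synthetic lattice constants (nm)
wl = 1.8 * a + 5.0 * np.sin(a / 10.0)      # toy monotone forward map to wavelength

def rbf_features(x, centers, width):
    # Gaussian basis evaluated at each input for each center.
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

centers = np.linspace(wl.min(), wl.max(), 12)   # centers spread over the inputs
width = (wl.max() - wl.min()) / 12
Phi = rbf_features(wl, centers, width)
w, *_ = np.linalg.lstsq(Phi, a, rcond=None)     # linear output weights

a_pred = rbf_features(wl, centers, width) @ w
mae = np.mean(np.abs(a_pred - a))               # training-set fit quality
```

A real inverse-design workflow would train on PWE-simulated pairs and validate on held-out wavelengths; here the point is only the basis-expansion-plus-least-squares structure of an RBF network.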

16 pages, 3751 KB  
Article
Combined Transcriptomic and Metabolomic Analyses of Low-Temperature Adaptation in Bursaphelenchus xylophilus
by Xiong Xiong, Jie Li, Shuaibin Sun, Chengming Yu, Yehan Tian, Chuanrong Li and Huixiang Liu
Int. J. Mol. Sci. 2026, 27(3), 1470; https://doi.org/10.3390/ijms27031470 - 2 Feb 2026
Abstract
Bursaphelenchus xylophilus (PWN), a highly destructive invasive forest pest, has expanded northward in China, even colonizing cold regions, implying evolved low-temperature tolerance. To explore its cold adaptation mechanisms, we selected PWN isolates from diverse origins, screened cold-tolerant strains via low-temperature stress assays, and conducted integrative transcriptomic and metabolomic analyses. We also compared invasive and native populations to clarify adaptive pattern differentiation. The results showed that northern Chinese isolates had significantly higher survival rates, with cold tolerance closely linked to lysophosphatidylethanolamine (LysoPE). Silencing the LysoPE-related gene BX02G0260 markedly elevated nematode mortality under low temperatures. Unlike native populations, invasive PWN may have developed a cold adaptation strategy centered on genetic material protection, with xanthosine as a key metabolite. These findings provide critical molecular insights into invasive species’ rapid cold adaptation in novel environments.
(This article belongs to the Section Molecular Plant Sciences)

32 pages, 2526 KB  
Article
HSE-GNN-CP: Spatiotemporal Teleconnection Modeling and Conformalized Uncertainty Quantification for Global Crop Yield Forecasting
by Salman Mahmood, Raza Hasan and Shakeel Ahmad
Information 2026, 17(2), 141; https://doi.org/10.3390/info17020141 - 1 Feb 2026
Abstract
Global food security faces escalating threats from climate variability and resource constraints. Accurate crop yield forecasting is essential; however, existing methods frequently overlook complex spatial dependencies driven by climate teleconnections, such as ENSO, and lack rigorous uncertainty quantification. This paper presents HSE-GNN-CP, a novel framework integrating heterogeneous stacked ensembles, graph neural networks (GNNs), and conformal prediction (CP). Domain-specific features, including growing degree days and climate suitability scores, are engineered, and spatial patterns are modeled explicitly via rainfall correlation graphs. The ensemble combines random forest and gradient boosting learners with bootstrap aggregation, while GNNs encode inter-regional climate dependencies. Conformalized quantile regression ensures statistically valid prediction intervals. Evaluated on a global dataset spanning 15 countries and six major crops from 1990 to 2023, the framework achieves an R² of 0.9594 and an RMSE of 4882 hg/ha. Crucially, it delivers calibrated 80% prediction intervals with 80.72% empirical coverage, significantly outperforming uncalibrated baselines at 40.03%. SHAP analysis identifies crop type and rainfall as dominant predictors, while the integrated drought classifier achieves perfect accuracy. These contributions advance agricultural AI by merging robust ensemble learning with explicit teleconnection modeling and trustworthy uncertainty quantification.
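The conformal-prediction ingredient can be illustrated with the simplest split-conformal recipe: calibrate a symmetric interval half-width on held-out absolute residuals so that the target coverage holds on new data. The paper uses the richer conformalized quantile regression; the forecaster and data below are toy stand-ins.

```python
# Split-conformal prediction intervals around an arbitrary point forecaster.
import numpy as np

rng = np.random.default_rng(2)
n = 2000
x = rng.uniform(0, 10, n)
y = 2.0 * x + rng.normal(0, 1, n)            # synthetic "yield" data

# A deliberately crude point forecaster (slope known, noise ignored).
predict = lambda x: 2.0 * x

# Split the data: half for calibration, half to check coverage.
cal, test = np.arange(0, n // 2), np.arange(n // 2, n)
alpha = 0.2                                   # target 80% coverage
scores = np.abs(y[cal] - predict(x[cal]))     # nonconformity = |residual|
k = int(np.ceil((len(cal) + 1) * (1 - alpha)))
q = np.sort(scores)[k - 1]                    # conformal quantile (half-width)

covered = np.abs(y[test] - predict(x[test])) <= q
coverage = covered.mean()                     # should land near 0.80
```

The coverage guarantee holds for any point forecaster, however crude, which is exactly why it pairs well with black-box ensembles.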

24 pages, 2031 KB  
Article
A Unified Approach for Ensemble Function and Threshold Optimization in Anomaly-Based Failure Forecasting
by Nikolaos Kolokas, Vasileios Tatsis, Angeliki Zacharaki, Dimosthenis Ioannidis and Dimitrios Tzovaras
Appl. Sci. 2026, 16(3), 1452; https://doi.org/10.3390/app16031452 - 31 Jan 2026
Abstract
This paper introduces a novel approach to anomaly-based failure forecasting that jointly optimizes both the ensemble function and the anomaly threshold used for decision making. Unlike conventional methods that apply fixed or classifier-defined thresholds, the proposed framework simultaneously tunes the threshold of the failure probability or anomaly score and the parameters of an ensemble function that integrates multiple machine learning models—specifically, Random Forest and Isolation Forest classifiers trained under diverse preprocessing configurations. The distinctive contribution of this work lies in introducing a weighted mean ensemble function, whose coefficients are co-optimized with the anomaly threshold using a global optimization algorithm, enabling adaptive, data-driven decision boundaries. The method is designed for predictive maintenance applications and validated using sensor data from three industrial domains: aluminum anode production, plastic injection molding, and automotive manufacturing. The experimental results demonstrate that the proposed combined optimization significantly enhances forecasting reliability, improving the Matthews Correlation Coefficient by up to 6.5 percentage points compared to previous approaches. Beyond its empirical gains, this work establishes a scalable and computationally efficient framework for integrating threshold and ensemble optimization in real-world, cross-industry predictive maintenance systems.
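The core idea, co-optimizing an ensemble weight and the anomaly threshold, can be sketched with a plain grid search maximizing the Matthews Correlation Coefficient; the paper uses a global optimizer instead, and the two scorers below are synthetic stand-ins for the Random Forest probability and Isolation Forest score.

```python
# Joint grid search over ensemble weight and anomaly threshold, scored by MCC.
import numpy as np
from sklearn.metrics import matthews_corrcoef

rng = np.random.default_rng(3)
n = 1000
y = (rng.random(n) < 0.2).astype(int)            # 1 = upcoming failure (synthetic)
# Two imperfect anomaly scorers (hypothetical stand-ins for RF / Isolation Forest).
s1 = 0.6 * y + rng.normal(0, 0.3, n)
s2 = 0.4 * y + rng.normal(0, 0.3, n)

best_mcc, best_w, best_t = -1.0, 0.0, 0.0
for w in np.linspace(0, 1, 21):                  # ensemble weight sweep
    s = w * s1 + (1 - w) * s2                    # weighted-mean ensemble score
    for t in np.linspace(s.min(), s.max(), 50):  # anomaly threshold sweep
        mcc = matthews_corrcoef(y, (s >= t).astype(int))
        if mcc > best_mcc:
            best_mcc, best_w, best_t = mcc, w, t
```

Optimizing weight and threshold together matters because the best threshold shifts as the score distribution changes with the weights; tuning them separately can leave that interaction on the table.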

27 pages, 7975 KB  
Article
Identification and Prediction of the Invasion Pattern of the Mikania micrantha with WaveEdgeNet Model Using UAV-Based Images in Shenzhen
by Hui Lin, Yang Yin, Xiaofen He, Jiangping Long, Tingchen Zhang, Zilin Ye and Xiaojia Deng
Remote Sens. 2026, 18(3), 437; https://doi.org/10.3390/rs18030437 - 30 Jan 2026
Abstract
Mikania micrantha is one of the most detrimental invasive plant species in the southeastern coastal region of China. To accurately predict its invasion pattern and offer guidance for production practices, it is essential to determine its precise location and driving factors. Therefore, a wavelet convolution and dynamic feature fusion module was designed, and WaveEdgeNet was proposed. This model can deeply extract semantic image features, retain features, and perform multi-scale segmentation and fusion. Moreover, to quantify the impact of human and natural factors, we developed a novel proximity factor based on land use data. Additionally, a new feature selection framework was applied to identify driving factors by analyzing the relationships between environmental variables and Mikania micrantha. Finally, the MaxEnt model was utilized to forecast its potential future habitats. The results demonstrate that WaveEdgeNet effectively extracts image features and improves model performance, attaining an MIoU of 85% and an overall accuracy of 98.62%, outperforming existing models. Spatial analysis shows that the invaded area in 2024 was smaller than that in 2023, indicating that human intervention measures have achieved some success. Furthermore, the feature selection framework not only enhances MaxEnt’s accuracy but also reduces computational time by 82.61%. According to MaxEnt modeling, human disturbance, proximity to forests, distance from roads, and elevation are recognized as the primary driving factors. In future work, we will concentrate on overcoming seasonal limitations and on predicting the growth and reproduction of Mikania micrantha before they occur, which can offer a foundation for manual intervention. This study lays a solid technical foundation and offers comprehensive data support for understanding the species’ dispersal patterns and driving factors and for guiding environmental conservation.
(This article belongs to the Section Forest Remote Sensing)

18 pages, 2504 KB  
Article
Prediction of PM2.5 Concentrations in the Pearl River Delta by Integrating the PLUS and LUR Models
by Xiyao Zhang, Peizhe Chen, Ying Cai and Jinyao Lin
Land 2026, 15(2), 240; https://doi.org/10.3390/land15020240 - 30 Jan 2026
Abstract
Since land use considerably affects the spatial variation of PM2.5 levels, it is crucial to predict PM2.5 concentrations under future land use changes. However, prior research has primarily concentrated on meteorological factors influencing PM2.5 predictions, while neglecting the effect of land use configurations. Consequently, in our study, a novel Patch-generating Land Use Simulation–Land Use Regression (PLUS-LUR) method was developed by integrating the PLUS model’s dynamic prediction capability with the LUR model’s spatial interpretation strength. The incorporation of landscape indices as key variables was essential for predicting PM2.5 concentrations. First, the random-forest-optimized LUR model was trained with PM2.5 datasets from the Pearl River Delta (PRD) monitoring stations and multi-source spatial datasets. We assessed the modeling accuracy with and without landscape indices using the test dataset. Subsequently, the PLUS approach was applied to forecast land use and the associated landscape indices in 2028. Based on these projections, grid-scale influencing factors were input into the previously constructed LUR model to forecast future PM2.5 distributions at the grid scale. The results reveal a spatial pattern with higher PM2.5 levels in central areas and lower levels in peripheral regions. Furthermore, the predicted PM2.5 concentrations in the PRD in 2028 are all below the Grade II threshold of China’s Ambient Air Quality Standards. Notably, predictions incorporating landscape indices demonstrate higher accuracy and reliability than those excluding them. These results provide methodological support for future PM2.5 assessment and land use management.
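A minimal land-use-regression sketch, assuming invented predictors: PM2.5 at pseudo-stations is regressed on buffer-level land-use shares plus a landscape index with a random forest, mirroring the RF-optimized LUR step (the PLUS simulation itself is out of scope here).

```python
# Toy random-forest land-use regression (LUR) for station-level PM2.5.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(9)
n = 150                                    # pseudo monitoring stations
urban = rng.uniform(0, 1, n)               # hypothetical built-up share in buffer
green = rng.uniform(0, 1, n)               # hypothetical vegetation share in buffer
edge = rng.uniform(0, 5, n)                # hypothetical landscape (edge-density) index
pm25 = 40 * urban - 15 * green + 2 * edge + rng.normal(0, 2, n)

X = np.column_stack([urban, green, edge])
lur = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, pm25)
r2 = lur.score(X, pm25)                    # in-sample fit of the toy LUR
```

In the paper's workflow, the same fitted model would then be fed simulated 2028 land-use grids to map future concentrations.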

33 pages, 3882 KB  
Article
Hybrid Feature Selection and Interpretable Random Forest Modeling for Olympic Medal Forecasting: Integrating CFO Optimization and Uncertainty Analysis
by Xinran Chen, Xuming Yan and Tanran Zhang
Mathematics 2026, 14(3), 478; https://doi.org/10.3390/math14030478 - 29 Jan 2026
Abstract
This study develops a data-driven predictive framework integrating hybrid feature selection, interpretable machine learning, and uncertainty quantification to forecast Olympic medal performance among elite nations. Focusing on the top ten countries from Paris 2024, the analysis employs a three-stage feature selection procedure combining Spearman correlation screening, random forest embedded importance, and the Caterpillar Fungus Optimizer (CFO) to identify stable long-term predictors. A novel test variable, rank, capturing historical competitive strength, and a refined continuous host-effect indicator derived from gravity-type trade models are introduced. Two complementary modeling strategies—a two-way fixed-effects econometric model and a CFO-optimized random forest—are implemented and validated. SHAP, LIME, and partial dependence plots enhance model interpretability, revealing nonlinear mechanisms underlying medal outcomes. Kernel density estimation generates probabilistic interval forecasts for Los Angeles 2028. Results demonstrate that historical performance and event-specific characteristics dominate medal predictions, while macroeconomic factors (GDP, population) and conventional host status contribute marginally once related variables are controlled. Consistent variable rankings across models and close alignment between 2028 projections and 2024 outcomes validate the framework’s robustness and practical applicability for sports policy and resource allocation decisions.
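The kernel-density interval-forecast step can be sketched simply: fit a KDE to (synthetic) historical medal totals and read a central interval from its samples. All numbers are illustrative, not the paper's data.

```python
# Central 90% interval forecast from a Gaussian KDE over historical values.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(4)
history = rng.normal(40, 6, 200)      # hypothetical past medal totals

kde = gaussian_kde(history)           # bandwidth chosen by Scott's rule
np.random.seed(5)                     # gaussian_kde.resample uses np.random
samples = kde.resample(20_000).ravel()
lo, hi = np.quantile(samples, [0.05, 0.95])   # central 90% interval
```

Unlike a normal-theory interval, the KDE interval inherits any skew or multimodality in the historical distribution, which is the point of using it for probabilistic forecasts.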

17 pages, 3692 KB  
Article
Data-Driven Optimization and Modelling of the Gap Bridgeability Performance of Multi-Pin Friction Stir Welded EN AW 7020-T651 Joints
by Ramin Delir Nazarlou, Pouya Zarei, Samita Salim, Michael Wiegand, Martin Kahlmeyer and Stefan Böhm
Materials 2026, 19(3), 544; https://doi.org/10.3390/ma19030544 - 29 Jan 2026
Abstract
Friction stir welding (FSW) of high-strength aluminum alloys, including EN AW 7020-T651, encounters significant challenges under weld line gap conditions, leading to compromised joint integrity. This study develops a predictive, data-driven framework to assess and optimize the gap bridgeability performance of FSW joints with weld line gaps ranging from 0 to 4 mm in 2 mm thick plates. A structured experimental matrix was implemented, systematically varying rotational speed, welding speed, axial force, and tool shoulder diameter. To promote stable material flow and consistent weld quality under varying gap conditions, a multi-pin tool was employed throughout the welding trials. This configuration supported defect-free weld formation across a broad process window and contributed to improved weld soundness under gap conditions. Weld quality was evaluated using a comprehensive, multi-criteria approach that required (i) defect-free joints verified by visual and cross-sectional (metallographic) inspection, (ii) an ultimate tensile strength of at least 230 MPa, and (iii) a novel metric termed weak area percentage (WAP). Derived from micro-hardness mapping, WAP quantified the proportion of the heat-affected zone (HAZ) exhibiting hardness below 96 HV, providing a more robust and spatially sensitive measure of mechanical integrity than conventional average hardness values. Two machine learning models, Logistic Regression and Random Forest, were trained to classify weld acceptability. The Random Forest model demonstrated superior performance, achieving 92.5% classification accuracy and an F1-score of 0.90. Feature importance analysis identified the interaction terms “welding speed × gap size” and “rotational speed × gap size” as the most influential predictors of weld quality.
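The role of explicit interaction terms can be sketched with a toy weld-acceptability classifier: interaction features such as welding speed × gap are appended to the design matrix before fitting a Random Forest, and their importances can then be inspected directly. The acceptance rule, variable ranges, and data below are invented, not the paper's.

```python
# Random Forest weld-acceptability classifier with explicit interaction features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(6)
n = 600
rpm = rng.uniform(800, 2000, n)        # rotational speed (hypothetical range)
v = rng.uniform(100, 600, n)           # welding speed, mm/min (hypothetical)
gap = rng.uniform(0, 4, n)             # weld line gap, mm

# Toy rule: fast welding over a wide gap tends to produce rejects.
ok = ((v * gap < 900) & (rpm * gap < 4500)).astype(int)

# Append the interaction terms as their own columns.
X = np.column_stack([rpm, v, gap, v * gap, rpm * gap])
names = ["rpm", "v", "gap", "v*gap", "rpm*gap"]
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, ok)

importances = dict(zip(names, rf.feature_importances_))
acc = rf.score(X, ok)                  # in-sample accuracy on the toy data
```

Because trees split on one feature at a time, precomputing products like `v*gap` lets the forest express the joint effect in a single split, which is why such terms can surface as top predictors.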

24 pages, 3822 KB  
Article
Optimising Calculation Logic in Emergency Management: A Framework for Strategic Decision-Making
by Yuqi Hang and Kexi Wang
Systems 2026, 14(2), 139; https://doi.org/10.3390/systems14020139 - 29 Jan 2026
Abstract
Rapid emergency management decision-making must be both timely and reliable; even slight delays can result in substantial human and economic losses. However, current systems and recent state-of-the-art work often use inflexible rule-based logic that cannot adapt to rapidly changing emergency conditions or dynamically optimise response allocation. Our study therefore presents the Calculation Logic Optimisation Framework (CLOF), a novel data-driven approach that enhances decision-making through learning-based prediction and multi-objective optimisation, utilising the 911 Emergency Calls data set, comprising more than half a million records from Montgomery County, Pennsylvania, USA. The CLOF examines patterns over space and time and uses optimised calculation logic to reduce response latency and increase decision reliability. The suggested framework outperforms the standard Decision Tree, Random Forest, Gradient Boosting, and XGBoost baselines, achieving 94.68% accuracy, a log-loss of 0.081, and a reliability score (R²) of 0.955. The mean response-time error was reduced by 19%, illustrating robustness to real-world uncertainty. These results confirm the scalability, interpretability, and efficiency of the CLOF as a modern EM framework, thereby improving safety, risk awareness, and operational quality in large-scale emergency networks.
(This article belongs to the Section Artificial Intelligence and Digital Systems Engineering)

24 pages, 8057 KB  
Article
Retrieval of Mangrove Leaf Area Index Using Multispectral Vegetation Indices and Machine Learning Regression Algorithms
by Liangchao Deng, Xuyang Chen, Li Xu, Bolin Fu, Yongze Xing, Shuo Yu, Tengfang Deng, Yuzhou Huang and Qianguang Liu
Forests 2026, 17(2), 180; https://doi.org/10.3390/f17020180 - 29 Jan 2026
Abstract
Leaf Area Index (LAI) is the total leaf area per unit of land surface area and is a crucial parameter for assessing vegetation growth and productivity. Machine learning regression algorithms are widely applied for LAI estimation. Due to spectral response variations among sensors and the susceptibility of mangrove-derived variables to environmental noise, obtaining sensitive indices and optimal machine learning regression models is essential for retrieving mangrove LAI at the population scale. This study proposes a novel approach to processing and retrieving mangrove LAI data by integrating multispectral indices with machine learning methods. Box–Cox transformation and CatBoost-based feature selection were employed to obtain the optimal dataset. Random Forest (RF), Gradient Boosting Regression Trees (GBRT), and Categorical Boosting (CatBoost) algorithms were used to evaluate the accuracy of LAI retrieval from Unmanned Aerial Vehicle (UAV) and Gaofen-6 (GF-6) data. Results indicate that when LAI > 3, indices such as CVI and MTVI2 do not immediately saturate as LAI increases, demonstrating higher sensitivity. UAV data outperformed GF-6 data in retrieving LAI for diverse mangrove populations; during model training, RF proved more suitable for small-sample datasets, while CatBoost effectively suppressed environmental noise. RF and CatBoost demonstrated higher robustness in estimating Avicennia marina (AM) (R² = 0.704) and Aegiceras corniculatum (AC) (R² = 0.766), respectively. Spatial distribution analysis of LAI indicates that healthy AM and AC cover 85.36% and 96.67% of their respective areas. Spartina alterniflora and aquaculture wastewater may be among the factors affecting the health of mangrove forests in the study area. LAI retrieval holds significant importance for mangrove health monitoring and risk early warning.
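The Box–Cox preprocessing step can be sketched in two lines: a skewed, strictly positive (synthetic) LAI-like variable is transformed with a maximum-likelihood λ, reducing skewness before feature selection and regression. The data here are invented.

```python
# Box-Cox transform of a right-skewed, positive variable toward normality.
import numpy as np
from scipy.stats import boxcox, skew

rng = np.random.default_rng(7)
lai = rng.lognormal(mean=0.8, sigma=0.5, size=500)   # positive, right-skewed

transformed, lam = boxcox(lai)   # lambda chosen by maximum likelihood
```

Box–Cox requires strictly positive inputs, which LAI satisfies; for variables with zeros, a shifted variant (or Yeo–Johnson) would be needed instead.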

20 pages, 4026 KB  
Article
Ensemble Machine Learning for Operational Water Quality Monitoring Using Weighted Model Fusion for pH Forecasting
by Wenwen Chen, Yinzi Shao, Zhicheng Xu, Bing Zhou, Shuhe Cui, Zhenxiang Dai, Shuai Yin, Yuewen Gao and Lili Liu
Sustainability 2026, 18(3), 1200; https://doi.org/10.3390/su18031200 - 24 Jan 2026
Abstract
Water quality monitoring faces increasing challenges due to accelerating industrialization and urbanization, demanding accurate, real-time, and reliable prediction technologies. This study presents a novel ensemble learning framework integrating Gaussian Process Regression, Support Vector Regression, and Random Forest algorithms for high-precision water quality pH prediction. The research utilized a comprehensive spatiotemporal dataset, comprising 11 water quality parameters from 37 monitoring stations across Georgia, USA, spanning 705 days from January 2016 to January 2018. The ensemble model employed a dynamic weight allocation strategy based on cross-validation error performance, assigning optimal weights of 34.27% to Random Forest, 33.26% to Support Vector Regression, and 32.47% to Gaussian Process Regression. The integrated approach achieved superior predictive performance, with a mean absolute error of 0.0062 and coefficient of determination of 0.8533, outperforming individual base learners across multiple evaluation metrics. Statistical significance testing using Wilcoxon signed-rank tests with a Bonferroni correction confirmed that the ensemble significantly outperforms all individual models (p < 0.001). Comparison with state-of-the-art models (LightGBM, XGBoost, TabNet) demonstrated competitive or superior ensemble performance. Comprehensive ablation experiments revealed that Random Forest removal causes the largest performance degradation (+4.43% MAE increase). Feature importance analysis revealed the dissolved oxygen maximum and conductance mean as the most influential predictors, contributing 22.1% and 17.5%, respectively. Cross-validation results demonstrated robust model stability with a mean absolute error of 0.0053 ± 0.0002, while bootstrap confidence intervals confirmed narrow uncertainty bounds of 0.0060 to 0.0066. Spatiotemporal analysis identified station-specific performance variations ranging from 0.0036 to 0.0150 MAE. 
High-error stations (12, 29, 33) were analyzed to identify distinguishing characteristics, including higher pH variability and potential upstream pollution influences. An integrated software platform was developed featuring an intuitive interface, real-time prediction, and comprehensive visualization tools for environmental monitoring applications. Full article
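The dynamic weight allocation this abstract describes, where each base learner's weight is derived from its cross-validation error, can be sketched as inverse-error weighting. The MAE values and predictions below are hypothetical placeholders, not the study's actual cross-validation results:

```python
# Inverse-error weight allocation: models with lower cross-validation
# MAE receive proportionally larger weights, and weights sum to 1.
cv_mae = {"rf": 0.0070, "svr": 0.0072, "gpr": 0.0074}  # illustrative values

inv = {name: 1.0 / err for name, err in cv_mae.items()}
total = sum(inv.values())
weights = {name: v / total for name, v in inv.items()}

def fuse(predictions, weights):
    """Blend per-model pH predictions with the normalized weights."""
    return sum(weights[name] * p for name, p in predictions.items())

blended = fuse({"rf": 7.10, "svr": 7.20, "gpr": 7.15}, weights)
```

Because the weights are normalized, the fused prediction is a convex combination of the base predictions and therefore always lies between the lowest and highest individual model outputs.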
24 pages, 5858 KB  
Article
NADCdb: A Joint Transcriptomic Database for Non-AIDS-Defining Cancer Research in HIV-Positive Individuals
by Jiajia Xuan, Chunhua Xiao, Runhao Luo, Yonglei Luo, Qing-Yu He and Wanting Liu
Int. J. Mol. Sci. 2026, 27(3), 1169; https://doi.org/10.3390/ijms27031169 - 23 Jan 2026
Abstract
Non-AIDS-defining cancers (NADCs) have emerged as an increasingly prominent cause of non-AIDS-related morbidity and mortality among people living with HIV (PLWH). However, the scarcity of NADC clinical samples, compounded by privacy and security constraints, continues to present formidable obstacles to advancing pathological and clinical investigations. In this study, we adopted a joint analysis strategy and deeply integrated and analyzed transcriptomic data from 12,486 PLWH and cancer patients to systematically identify potential key regulators for 23 NADCs. This effort culminated in NADCdb—a database specifically engineered for NADC pathological exploration, structured around three mechanistic frameworks rooted in the interplay of immunosuppression, chronic inflammation, carcinogenic viral infections, and HIV-derived oncogenic pathways. The “rNADC” module performed risk assessment by prioritizing genes with aberrant expression trajectories, deploying bidirectional stepwise regression coupled with logistic modeling to stratify the risks for 21 NADCs. The “dNADC” module synergized patients’ dysregulated genes with their regulatory networks, using Random Forest (RF) and Conditional Inference Trees (CITs) to identify pathogenic drivers of NADCs with an accuracy exceeding 75% (in the external validation cohort, the prediction accuracy of the HIV-associated clear cell renal cell carcinoma model exceeded 90%). Meanwhile, “iPredict” identified 1905 key immune biomarkers for 16 NADCs based on the distinct immune statuses of patients. Importantly, we conducted multi-dimensional profiling of these key determinants, including in-depth functional annotations, phenotype correlations, protein–protein interaction (PPI) networks, TF-miRNA-target regulatory networks, and drug prediction, to deeply dissect their mechanistic roles in NADC pathogenesis.
In summary, NADCdb serves as a novel, centralized resource that integrates data and provides analytical frameworks, offering fresh perspectives and a valuable platform for the scientific exploration of NADCs. Full article
(This article belongs to the Special Issue Novel Molecular Pathways in Oncology, 3rd Edition)
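The Random Forest classification underlying the “dNADC” module can be sketched with scikit-learn. The synthetic data below is purely illustrative; it stands in for an expression matrix and is not the NADCdb transcriptomic dataset, and the hyperparameters are assumptions rather than the authors' settings:

```python
# A minimal Random Forest classification sketch: fit on a synthetic
# feature matrix and check held-out accuracy, as the dNADC module
# reports accuracies above 75% on its own data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_features=20,
                           n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
```

On real transcriptomic data, an external validation cohort (as the abstract describes) rather than a single random split would be the appropriate accuracy check.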