Search Results (4,743)

Search Parameters:
Keywords = local forest

19 pages, 5562 KB  
Article
Integrative Transcriptomic and Biochemical Profiling Reveals Bacillus amyloliquefaciens JL54 Primes Larix olgensis Defenses Against Neofusicoccum laricinum Attack
by Xiangyu Zhao, Fengze Yang, Lingyu Kong, Yanru Wang, Kexin Liu, Yinjuan Zhao, Xun Deng, Liwen Song, Ke Wei and Jiajin Tan
Plants 2026, 15(8), 1181; https://doi.org/10.3390/plants15081181 - 11 Apr 2026
Abstract
Larix olgensis, a keystone timber species in Northeast China, is increasingly threatened by Neofusicoccum laricinum-induced shoot blight, a devastating disease that compromises forest health and necessitates sustainable management strategies. Here, we demonstrate that the endophytic bacterium Bacillus amyloliquefaciens JL54 elicits multifaceted defense responses in L. olgensis, enhancing resistance to pathogen infection. Greenhouse assays revealed that JL54 pretreatment reduced disease incidence by 12.5% and achieved 43.75% control efficacy while maintaining host vigor. Histochemical analyses identified JL54-induced rapid hydrogen peroxide (H2O2) accumulation, extensive lignin deposition, and localized programmed cell death (PCD), indicative of a primed immune response. Transcriptomic analyses uncovered distinct temporal defense patterns: early-stage responses (0 h post-inoculation) were characterized by upregulation of cutin, suberin, and wax biosynthesis pathways, reinforcing physical barriers, whereas late-stage responses (12 h post-inoculation) were dominated by ribosome- and proteostasis-related pathways (e.g., heat shock proteins [HSPs], glutathione S-transferases [GSTs]) to mitigate cellular damage. Biochemical assays corroborated these findings, with JL54 colonization reducing membrane lipid peroxidation (27.2% decrease in malondialdehyde content) and significantly elevating the activity of key defense enzymes, including peroxidase (POD), phenylalanine ammonia-lyase (PAL), and GST. Phytohormone profiling implicated jasmonic acid (JA) as the central mediator of induced systemic resistance (ISR), with JL54-potentiated JA signaling preceding pathogen containment. Collectively, these results demonstrate that JL54 contributes to a coordinated defense strategy in L. olgensis, integrating structural reinforcement (cuticle/lignin), oxidative stress management, and JA-mediated immune priming. These insights advance the understanding of endophyte-conferred resistance in conifers and highlight JL54’s potential as a biocontrol agent for sustainable forestry. Full article
(This article belongs to the Section Plant Protection and Biotic Interactions)

20 pages, 4549 KB  
Article
Online Track Anomaly Detection: Comparison of Different Machine Learning Techniques Through Injection of Synthetic Defects on Experimental Datasets
by Giovanni Bellacci, Luca Di Carlo, Marco Fiaschi, Luca Bocciolini, Carmine Zappacosta and Luca Pugi
Machines 2026, 14(4), 424; https://doi.org/10.3390/machines14040424 - 10 Apr 2026
Abstract
The adoption of instrumented wheelsets on diagnostic trains offers the possibility of continuous monitoring of wheel–rail contact forces. The collection of large datasets can be exploited for diagnostic purposes, aiming to localize specific track defects, allowing significant improvements in terms of safety and maintenance costs. Machine learning (ML) techniques can be used to automate anomaly detection. In this work, the authors compare the application of various ML algorithms based on the identification of different frequency or time-based features of analyzed signals. To perform the activity, a significant number and variety of local defects have been included in the recorded data. From a practical point of view, the insertion of real known defects into an existing line is extremely time-consuming, expensive, and not immune to safety issues. On the other hand, the design of anomaly detection algorithms involves the usage of relatively extended datasets with different faulty conditions. The authors propose deliberately adding real contact force profiles of healthy lines to a mix of synthetic signals, which substantially reproduce the behavior and the variability of foreseen faulty conditions. The results of this work, although preliminary and still to be completed, offer a contribution to the scientific community both in terms of obtained results and adopted methodologies. Full article
(This article belongs to the Special Issue AI-Driven Reliability Analysis and Predictive Maintenance)
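The dataset-enrichment idea above, injecting synthetic local defects into recorded healthy contact-force profiles, can be sketched as follows. The signal model, the Gaussian-windowed oscillation used as a defect signature, and all parameter values are illustrative assumptions, not the authors' implementation:

```python
import math
import random

def inject_defect(signal, center, width, amplitude):
    """Superimpose a synthetic local defect (a windowed oscillation,
    a rough proxy for an impact-like wheel-rail event) onto a healthy
    contact-force profile. Samples outside the window are untouched."""
    out = list(signal)
    for i in range(len(signal)):
        d = i - center
        if abs(d) <= width:
            # Gaussian-windowed cosine as the hypothetical defect shape
            out[i] += amplitude * math.exp(-(d / (width / 2)) ** 2) \
                      * math.cos(2 * math.pi * d / max(width, 1))
    return out

# Example: healthy profile = roughly constant vertical force with noise
random.seed(0)
healthy = [100.0 + random.gauss(0, 1) for _ in range(500)]
faulty = inject_defect(healthy, center=250, width=20, amplitude=30.0)
```

A labeled set of such `faulty` traces, mixed with real healthy recordings, is the kind of training corpus the abstract describes.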

20 pages, 881 KB  
Article
Characterization of Residual Woody Biomass for the Production of Densified Solid Biofuels and Their Local Utilization
by Mario Morales-Máximo, Ramiro Gudiño-Macedo, José Guadalupe Rutiaga-Quiñones, Juan Carlos Coral-Huacuz, Luis Fernando Pintor-Ibarra, Luis Bernardo López-Sosa and Víctor Manuel Ruíz-García
Fuels 2026, 7(2), 23; https://doi.org/10.3390/fuels7020023 - 10 Apr 2026
Abstract
The energy utilization of residual woody biomass is a relevant strategy for the decentralized energy transition and local waste management in rural areas. The objective of this study was to characterize (physically, chemically, and energetically) five types of residual biomass: pine branches, huinumo (this material refers to the long, thin pine needles that, after drying and falling, form a layer on the forest floor), cherry branches and leaves, and grass waste generated in the community of San Francisco Pichátaro, Michoacán, Mexico, in order to evaluate its viability for the production of densified solid biofuels. A comprehensive analysis was conducted, including moisture content, higher heating value, proximate characterization, structural chemical analysis (using the Van Soest method), elemental CHONS analysis, ash microanalysis (by ICP-OES), and a multicriteria analysis with normalized energy and compositional indicators. The results showed that huinumo and cherry leaves were the most outstanding biomasses, presenting the highest heating values (20.7 MJ/kg) and low moisture and ash contents. Pine branches obtained the most balanced results, characterized by their equilibrium in fixed carbon and lignin, as well as their low potassium content. The multicriteria analysis showed that there is no absolute optimal biomass; however, it indicates that pine branches and huinumo are the most robust feedstocks for the production of briquettes or pellets. The results confirm the significant technical and environmental potential of local lignocellulosic residues for the production of solid biofuels and for contributing to sustainable energy solutions at the local scale. Full article
(This article belongs to the Special Issue Biofuels and Bioenergy: New Advances and Challenges)
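The multicriteria step described above can be illustrated with a minimal normalized-indicator ranking. The indicator values below are hypothetical placeholders, not the paper's measurements, and the equal weighting is an assumption:

```python
def minmax(values, benefit=True):
    """Min-max normalize an indicator to [0, 1]; invert cost criteria."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    s = [(v - lo) / (hi - lo) for v in values]
    return s if benefit else [1 - x for x in s]

# Hypothetical indicator values (NOT the paper's data)
names = ["pine branches", "huinumo", "cherry leaves"]
hhv   = [19.8, 20.7, 20.7]   # higher heating value, MJ/kg (benefit)
ash   = [1.2, 2.5, 4.0]      # ash content, % (cost)
moist = [9.0, 8.0, 8.5]      # moisture content, % (cost)

# Equal-weight composite score per biomass
scores = [sum(col) / 3 for col in zip(minmax(hhv),
                                      minmax(ash, benefit=False),
                                      minmax(moist, benefit=False))]
ranking = sorted(zip(names, scores), key=lambda t: -t[1])
```

With real weights and the full indicator set, the same scheme yields the kind of robustness comparison the abstract reports.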

20 pages, 702 KB  
Article
Tree Height Prediction Using a Double Hidden-Layer Neural Network and a Mixed-Effects Model
by Jianbo Shen, Xiangdong Lei, Yutang Li, Yuehong Pan and Gongming Wang
Plants 2026, 15(8), 1176; https://doi.org/10.3390/plants15081176 - 10 Apr 2026
Abstract
The double hidden-layer neural network has increasingly been applied in tree height modeling due to its superior performance. To improve the precision of tree height estimation, this study compared the performance of a double hidden-layer neural network with that of a nonlinear mixed-effects model, aiming to provide a new method for tree height prediction. Taking the Larix olgensis forest plantation in Jilin Province as the research object, a double hidden-layer back propagation (BP) neural network was established for tree height prediction by adopting trial and error, k-fold cross-validation, and near-domain optimization strategies. In constructing the nonlinear mixed-effects model, the overall and local differences in forest growth data, as well as the autocorrelation among the various levels of data, were considered. Accordingly, after determining the base model, random effects were introduced, the correlation variance–covariance matrix was calculated, and random parameters were estimated to compare the predictive performance of the two aforementioned models. For the mixed-effects model, the coefficient of determination R2 was 0.8590, the root mean square error (RMSE) was 1.6230, and the mean absolute error (MAE) was 2.2658. For the double hidden-layer BP neural network, the R2 reached 0.9068 (an increase of 5.56%), the RMSE was 1.3197 (a decrease of 18.69%), and the MAE was 1.2736 (a decrease of 43.79%). The results demonstrate that the double hidden-layer BP neural network is superior to the nonlinear mixed-effects model for tree height prediction. Therefore, the results provide a more accurate method for tree height prediction. Full article
(This article belongs to the Special Issue AI-Driven Machine Vision Technologies in Plant Science)
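The double hidden-layer topology can be sketched as a forward pass. Layer sizes, activations, and the random weights below are illustrative assumptions; the paper's trained network, inputs, and tuning procedure are not reproduced here:

```python
import math
import random

def layer(x, W, b, act=True):
    """One dense layer: sigmoid(Wx + b), or linear when act=False."""
    y = [sum(w * xi for w, xi in zip(row, x)) + bi
         for row, bi in zip(W, b)]
    return [1 / (1 + math.exp(-v)) for v in y] if act else y

def predict_height(dbh, params):
    """Forward pass of a 1 -> 4 -> 4 -> 1 network: two hidden layers,
    the topology family compared against the mixed-effects model."""
    W1, b1, W2, b2, W3, b3 = params
    h1 = layer([dbh], W1, b1)
    h2 = layer(h1, W2, b2)
    return layer(h2, W3, b3, act=False)[0]

random.seed(1)
def rand_mat(r, c):
    return [[random.uniform(-1, 1) for _ in range(c)] for _ in range(r)]

params = (rand_mat(4, 1), [0.0] * 4,
          rand_mat(4, 4), [0.0] * 4,
          rand_mat(1, 4), [0.0])
h = predict_height(0.3, params)   # DBH scaled to [0, 1] (assumption)
```

Training by back propagation, k-fold cross-validation, and the trial-and-error sizing the authors mention would sit on top of this forward pass.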

31 pages, 1306 KB  
Article
Governing Forest Rights Mortgage Loans Through Hybrid Governance: Institutional Innovation and Organizational Mediation in China’s Collective Forest Regions
by Liushan Fan, Wenlan Wang, Yuanzhu Wei, Yongbo Lai and Xingwei Ye
Forests 2026, 17(4), 464; https://doi.org/10.3390/f17040464 - 10 Apr 2026
Abstract
Forest Rights Mortgage Loans (FRMLs) have grown quickly in China’s collective forest areas, even though the basic conditions for this type of lending remain far from ideal. In many places, forest holdings are small and scattered, property rights are complex and not fully consolidated, and channels for disposing of collateral are limited. Under these circumstances, the Fulin Loan Model (FLM) in Fujian provides a useful case for understanding how forest-rights lending can still function in practice. Drawing on fieldwork, semi-structured interviews, and process tracing, this study explores both how the model was established and how it has been sustained over time. The analysis suggests that the FLM is neither a straightforward market-based lending tool nor merely a top-down policy arrangement. Rather, it relies on a more mixed form of governance in which local government support, banking procedures, and village-level social relations are brought together through specific organizational arrangements. These arrangements help lower the costs of early institutional experimentation, distribute and manage lending risks, and translate locally rooted trust into a form of credit support that formal financial institutions can recognize. As a single-case study, the FLM points to one possible way in which rural finance can be made workable under conditions of incomplete markets and strong social embeddedness. Full article

27 pages, 524 KB  
Article
Synthetic Data Augmentation for Imbalanced Tabular Protein Subcellular Localization: A Comparative Study of SMOTE, CTGAN, TVAE, and TabDDPM Methods
by Ali Fatih Gündüz and Canan Batur Şahin
Appl. Sci. 2026, 16(8), 3694; https://doi.org/10.3390/app16083694 - 9 Apr 2026
Abstract
Class imbalance is a persistent challenge in supervised machine learning, particularly in biological datasets where minority classes represent functionally critical categories. Synthetic data generation has emerged as a principal strategy for mitigating this problem, yet systematic comparisons of classical and modern deep generative approaches remain limited. This study presents a comprehensive benchmark evaluation of four synthetic data generation methods—SMOTE, CTGAN, TVAE, and TabDDPM—across two well-established biological datasets from the UCI Machine Learning Repository: the E. coli protein localization dataset (307 samples, 6 features, 4 classes) and the yeast protein localization dataset (1299 samples, 8 features, 4 classes). Synthetic data quality was rigorously assessed using a multi-dimensional evaluation framework encompassing distributional fidelity (Fréchet Distance, Wasserstein Distance), machine learning utility (Train-on-Synthetic-Test-on-Real and Train-on-Real-Test-on-Real protocols using XGBoost version 3.2.0, Logistic Regression, Support Vector Machines, and Random Forest), and distinguishability (Classifier Two-Sample Test). Both datasets are strongly imbalanced. During the experiments, each dataset was augmented to three times its original size while preserving the imbalanced class-sample ratio. To evaluate the quality of synthetic data, the max(AUC, 1−AUC) score is proposed; values near 0.5 indicate that a classifier cannot easily distinguish synthetic data from real data. Per-class analysis reveals that minority classes remain the primary challenge across all generative methods. SMOTE and TabDDPM obtained the highest predictive utility F1-scores across both datasets. TVAE offers the strongest distributional fidelity among deep generative models, producing synthetic samples that are most difficult to distinguish from real data (lowest C2ST scores). CTGAN exhibits significant performance degradation on both small- and medium-scale datasets, with F1 utility ratios below 0.50. Full article
(This article belongs to the Section Computing and Artificial Intelligence)
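Two of the ingredients above, SMOTE interpolation and the max(AUC, 1−AUC) distinguishability score, can be sketched in a few lines. The toy minority class and the parameter choices are illustrative, not the study's setup:

```python
import random

def smote(minority, k=3, n_new=4, seed=0):
    """Minimal SMOTE: synthesize a point by interpolating between a
    minority sample and one of its k nearest minority neighbours."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_new):
        a = rng.choice(minority)
        # k nearest neighbours of a within the minority class
        nbrs = sorted((p for p in minority if p is not a),
                      key=lambda p: sum((u - v) ** 2 for u, v in zip(a, p)))[:k]
        b = rng.choice(nbrs)
        t = rng.random()
        out.append(tuple(u + t * (v - u) for u, v in zip(a, b)))
    return out

def c2st_quality(auc):
    """Proposed score max(AUC, 1-AUC) from a classifier two-sample
    test; 0.5 means synthetic data are indistinguishable from real."""
    return max(auc, 1 - auc)

minority = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
synthetic = smote(minority)
```

Because SMOTE only interpolates, every synthetic point stays inside the convex hull of the minority class, which is one reason it fares well on small tabular datasets like these.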

23 pages, 13020 KB  
Article
Identification of Key Osteoarthritis-Associated Genes Based on DNA Methylation
by Jian Zhao, Changwu Wu, Zhejun Kuang, Han Wang and Lijuan Shi
Int. J. Mol. Sci. 2026, 27(8), 3388; https://doi.org/10.3390/ijms27083388 - 9 Apr 2026
Abstract
Osteoarthritis (OA) is a complex degenerative joint disease for which early diagnosis and clear molecular characterization remain limited. DNA methylation has been increasingly recognized as an important regulatory factor in OA pathogenesis. In this study, we proposed an integrative computational framework combining statistical analysis, machine learning, deep learning, and functional genomics to identify and validate OA-associated genes and methylation biomarkers for diagnostic and biological interpretation. Candidate CpG sites were obtained using two complementary strategies: differential methylation analysis and selection of loci located near transcription start sites of previously reported OA-related genes. Key features were further refined using support vector machine recursive feature elimination and random forest algorithms. Based on the selected loci, we developed a feature-fusion diagnostic model that combines Transformer and convolutional neural networks with adaptive weighting to capture both global dependency structures and local methylation patterns. A panel of 220 methylation sites demonstrated stable and reproducible diagnostic performance in an independent cohort. Functional annotation and pathway analysis highlighted several established OA-associated genes, including TGFBR2, SMAD3, PPARG, and MAPK3, and suggested INHBB as a potential novel effector gene, with additional support for AMH and INHBE involvement. Overall, this study presents a robust methylation-based framework for identifying key OA-associated genes and provides new insights into the epigenetic mechanisms underlying OA. Full article
(This article belongs to the Section Molecular Genetics and Genomics)
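The recursive feature elimination step used to refine candidate CpG sites has the general shape below. The importance function and the CpG relevance values are hypothetical stand-ins for the SVM weights or random-forest importances the authors actually used:

```python
def rfe(features, importance_fn, n_keep):
    """Generic recursive feature elimination: repeatedly drop the
    least-important feature until n_keep remain."""
    kept = list(features)
    while len(kept) > n_keep:
        scores = importance_fn(kept)
        worst = min(range(len(kept)), key=lambda i: scores[i])
        kept.pop(worst)
    return kept

# Toy importance: a fixed relevance per CpG site (hypothetical values)
relevance = {"cg01": 0.9, "cg02": 0.1, "cg03": 0.7,
             "cg04": 0.3, "cg05": 0.8}
selected = rfe(list(relevance),
               lambda fs: [relevance[f] for f in fs],
               n_keep=3)
```

In the real pipeline the importances are re-estimated at each elimination round by retraining the model on the surviving features, which is what makes RFE more robust than one-shot ranking.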

21 pages, 1931 KB  
Article
A Shapelet Transform-Based Method for Structural Damage Identification: A Case Study on a Wooden Truss Bridge
by Ke Gan, Yingzhuo Ye, Fulin Nie, Ching Tai Ng and Liujie Chen
Sensors 2026, 26(8), 2323; https://doi.org/10.3390/s26082323 - 9 Apr 2026
Abstract
The impact of environmental disturbances and sensor deployment variations on damage identification represents a critical bottleneck that constrains the practical effectiveness of structural health monitoring. Existing methods addressing these challenges often suffer from poor interpretability due to information loss during feature extraction or exhibit insufficient sensitivity in identifying early-stage minor damage. This paper proposes a damage identification method based on the Shapelet Transform and Random Forest classifier, which extracts highly interpretable local shape features from vibration response signals to achieve robust identification of structural state changes. The study utilizes measured random vibration response data from a timber truss bridge. The dataset comprises four reference states collected on different dates and five damage states simulated by additional masses ranging from +23.5 g to +193.7 g, with sensors deployed in both vertical and horizontal directions. The Shapelet Transform selects local subsequences with high information gain from the original time series as features, which are subsequently classified using the Random Forest algorithm. The experimental design systematically investigates the influence of different damage severities, sensor locations, and environmental variations on method performance. The results demonstrate that with a Shapelet extraction time of 10 min, the method achieves 100% identification accuracy across multiple operating conditions comprehensively considering environmental variations, sensor location differences, and varying damage severities. When the extraction time is reduced to 5 min, 3 min, and 1 min, the average accuracies are 93.98%, 89.51%, and 58.48%, respectively. The method effectively identifies the minimum simulated damage (+23.5 g), which represents only 0.07% of the total structural mass, while maintaining stable performance under varying sensor locations and environmental conditions. Compared to traditional methods based on global frequency-domain features or statistical characteristics, the proposed method extracts physically meaningful local Shapelet features, offering significant advantages in interpretability. In contrast to deep learning approaches, this method demonstrates greater robustness under limited sample conditions. This study confirms that the combined framework of the Shapelet Transform and Random Forest can effectively address multiple real-world challenges in structural health monitoring, delivering high accuracy, strong robustness, and excellent interpretability, thereby providing a novel approach for developing practical real-time damage identification systems. Full article
(This article belongs to the Section Industrial Sensors)
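The core Shapelet Transform feature is the minimum distance between a candidate subsequence and all same-length windows of a signal. A minimal sketch, with toy signals and a hypothetical discriminative shapelet rather than the bridge data:

```python
def shapelet_distance(series, shapelet):
    """Minimum Euclidean distance between a candidate shapelet and
    every same-length window of the series: the scalar feature the
    Shapelet Transform feeds to the downstream classifier."""
    m = len(shapelet)
    best = float("inf")
    for i in range(len(series) - m + 1):
        d = sum((series[i + j] - shapelet[j]) ** 2 for j in range(m))
        best = min(best, d)
    return best ** 0.5

healthy = [0, 0, 0, 1, 0, 0, 0, 0]
damaged = [0, 0, 0, 1, 0, 3, -2, 0]   # local shape change from damage
spike = [0, 3, -2]                    # hypothetical learned shapelet
f_healthy = shapelet_distance(healthy, spike)
f_damaged = shapelet_distance(damaged, spike)
```

Because the feature is tied to an explicit local waveform, a Random Forest trained on such distances stays interpretable: one can inspect which shapelet, and hence which physical signature, drove a classification.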

29 pages, 3241 KB  
Article
Evaluation of Global Data for National-Scale Soil Depth Mapping in Data-Scarce Regions: A Case Study from Sri Lanka
by Ebrahim Jahanshiri, Eranga M. Wimalasiri, Yinan Yu and Ranjith B. Mapa
Soil Syst. 2026, 10(4), 47; https://doi.org/10.3390/soilsystems10040047 - 9 Apr 2026
Abstract
High-resolution soil depth maps are valuable for environmental modelling, yet reliable data remains scarce in the tropics. This study evaluates the feasibility of mapping depth to bedrock (DTB) in Sri Lanka using a legacy dataset (n = 88) and global environmental covariates (n = 247). A robust machine learning workflow was employed—including feature selection, hyperparameter tuning, and a stacked ensemble of four algorithms (Random Forest, XGBoost, Cubist, SVM)—to test the limits of global data for local mapping. Despite rigorous optimization, the final ensemble model achieved a performance of R2 = 0.197 (RMSE = 35.4 cm) under spatial cross-validation. While still modest, this result significantly outperforms existing global products and quantifies the “prediction gap” inherent in using ~1 km resolution global covariates to model micro-scale soil variability. An initial exploration involved log-transforming the target variable; however, following rigorous testing, the untransformed depth was modelled directly to avoid bias in back-transformation. A robustness experiment was further conducted, reducing predictors from 24 to 12, which degraded performance, confirming that the model captures complex, physically meaningful climatic interactions rather than fitting noise. The study concludes that while global covariates can capture regional meso-scale trends (explaining ~20% of variance), they are insufficient for resolving local micro-relief (<50 m). The resulting map and uncertainty products provide a critical “baseline” for national planning, but effectively demonstrate that future improvements will require investment in higher-resolution local covariates (e.g., LiDAR) rather than more complex algorithms. Full article
(This article belongs to the Special Issue Use of Modern Statistical Methods in Soil Science)
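The spatial cross-validation mentioned above differs from random k-fold in that whole spatial blocks, not random rows, are held out, which prevents optimistic scores from spatial autocorrelation. A minimal block-assignment sketch with hypothetical site coordinates:

```python
def spatial_blocks(points, block_size):
    """Assign sample points to grid blocks; each block becomes a
    cross-validation fold that is held out as a unit."""
    folds = {}
    for idx, (x, y) in enumerate(points):
        key = (int(x // block_size), int(y // block_size))
        folds.setdefault(key, []).append(idx)
    return list(folds.values())

# Hypothetical soil-profile coordinates (km), NOT the Sri Lanka data
pts = [(0.2, 0.1), (0.4, 0.3), (5.1, 0.2), (5.3, 5.4), (0.1, 5.2)]
folds = spatial_blocks(pts, block_size=1.0)
```

Nearby profiles (the first two points) land in the same fold, so a model is never tested on sites adjacent to its training data, the condition under which the reported R2 = 0.197 was obtained.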

26 pages, 6248 KB  
Article
Slope–Wind Coupling Effects on Fire Behavior and Emission Dynamics During Prescribed Burning in Mountainous Yunnan Pine Forests
by Tengteng Long, Yun Liu, Xiaohui Pu, Zhi Li, Shun Li, Qiuhua Wang, Li Han, Ning Lu, Leiguang Wang and Weiheng Xu
Fire 2026, 9(4), 155; https://doi.org/10.3390/fire9040155 - 9 Apr 2026
Abstract
Prescribed burning is important for reducing wildfire risk and regulating fuel loads, but its implementation in mountainous forests is strongly influenced by the coupled effects of the wind field and topography, making it difficult to control. This study focuses on Yunnan pine (Pinus yunnanensis) forests in southwestern China. A three-dimensional Fire Dynamics Simulator (FDS) combined with measured fuel characteristics was used to simulate 21 slope (0–35°) and wind speed (0–2 m s−1) combinations to quantitatively analyze the fire spread, flame structure, and gaseous emission characteristics during downslope prescribed burning. The local fire spread rate (ROS), evaluated along three lateral lines (Y = 2.5, 5.0, and 7.5 m), exhibits a non-monotonic dependence on slope over the tested range, with a minimum near 30° and a modest rebound at 35°. A downslope wind of 1 m s−1 promotes near-surface heating and accelerates spread, whereas a stronger wind of 2 m s−1 lifts flames away from the fuel bed and suppresses combustion. Thermal field analysis reveals that peak temperature decreases with increasing slope and that a late-stage secondary heating episode occurs at 35°. CO2 emissions are significantly positively correlated with fuel consumption, reaching a peak of 717.5 kg under a 35° slope and no-wind conditions. CO emissions, as an indicator of combustion efficiency, reach their highest value of 2.23 kg at a 35° slope and a wind speed of 1 m s−1, indicating that their trend is not entirely consistent with the ROS and temperature and that there is a certain degree of decoupling. The interaction between slope and wind speed transforms fire behavior from a cooperative to a competitive mechanism, and the topography–wind field coupling provides differentiated control over the combustion intensity and completeness. This study provides a scientific basis for the safe implementation of mountain burning programs and for regional carbon emission assessments. Full article

18 pages, 3582 KB  
Article
Multi-Objective Eco-Routing Optimization for Timber Transportation Considering Carbon Emissions and Ecological Disturbance
by Dongtao Han and Yuewei Ma
Sustainability 2026, 18(8), 3706; https://doi.org/10.3390/su18083706 - 9 Apr 2026
Abstract
Forest harvesting transportation planning must balance operational efficiency with environmental sustainability, because timber transportation can cause both soil disturbance and carbon emissions. However, most vehicle routing studies primarily focus on economic objectives such as distance or cost minimization, whereas environmental impacts are often considered separately. The integrated optimization of ecological disturbance and carbon emissions remains limited in forest transportation planning. To address this gap, this study formulates a multi-vehicle routing optimization model for timber transportation that simultaneously minimizes transportation distance, makespan, soil disturbance, and CO2 emissions within a hierarchical forest road network. An enhanced evolutionary algorithm, Eco-Constrained Lévy-flight Local Search NSGA-II (ECLS-NSGA-II), is proposed to improve convergence and maintain environmentally favorable routing solutions. Simulation experiments comparing ECLS-NSGA-II with NSGA-II, MOPSO, MOEA/D, and WS-GA demonstrate that the proposed method achieves superior performance across all objectives, producing shorter routes, lower completion times, and reduced CO2 emissions while maintaining minimal ecological disturbance. Additional experiments on randomly generated networks further confirm the robustness of the proposed approach. These results indicate that the proposed framework provides an effective methodological tool for environmentally sustainable timber transportation planning in forest operations. Full article
(This article belongs to the Topic Mobility Engineering and Sustainability)
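The ranking step at the heart of NSGA-II-style algorithms such as ECLS-NSGA-II is non-dominated sorting over the objective vectors. A minimal first-front sketch; the route scores are hypothetical, and the real algorithm adds crowding distance, Lévy-flight local search, and ecological constraints:

```python
def dominates(a, b):
    """a dominates b if it is no worse on every (minimised) objective
    and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and \
           any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """First non-dominated front among candidate routes scored on
    (distance, makespan, soil disturbance, CO2 emissions)."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Hypothetical route scores: (distance km, makespan h, disturbance, CO2 kg)
routes = [(42, 3.0, 0.2, 110), (40, 3.5, 0.1, 100), (45, 4.0, 0.3, 130)]
front = pareto_front(routes)
```

The third route is worse on every objective than the first, so it is dominated and excluded; the surviving front is the trade-off set a planner would choose from.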

14 pages, 2627 KB  
Article
Comparative Assessment of Hyperspectral Image Segmentation Algorithms for Fruit Defect Detection Under Different Illumination Conditions
by Anastasia Zolotukhina, Anton Sudarev, Georgiy Nesterov and Demid Khokhlov
J. Imaging 2026, 12(4), 160; https://doi.org/10.3390/jimaging12040160 - 8 Apr 2026
Abstract
This study presents a comparative analysis of hyperspectral image segmentation algorithms for fruit defect detection under different illumination conditions. The research evaluates the performance of four segmentation methods (Spectral Angle Mapper, Random Forest, Support Vector Machine, and Neural Network) using three distinct illumination modes (local, simultaneous and sequential). The experimental setup employed hyperspectral imaging to assess tomato fruit samples, with data acquisition performed across the 450–850 nm spectral range. Quantitative metrics, including accuracy, error rate, precision, recall, F1-score, and Intersection over Union (IoU), were used to evaluate algorithm performance. Key findings indicate that Random Forest demonstrated superior performance across most metrics, particularly under simultaneous illumination conditions. The highest accuracy was achieved by Random Forest under sequential illumination (0.9971), while the best combination of segmentation metrics was obtained under simultaneous illumination, with an F1-score of 0.8996 and an IoU of 0.8176. The Neural Network showed competitive results. The Spectral Angle Mapper proved sensitive to illumination variations but excelled in specific scenarios requiring minimal memory usage. By demonstrating that acquisition protocol optimization can substantially improve segmentation performance, our results support the development of accurate, non-contact, high-throughput inspection systems and contribute to reducing postharvest losses and improving supply chain quality control. Full article
(This article belongs to the Section Color, Multi-spectral, and Hyperspectral Imaging)
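The pixel-wise metrics used to compare the four segmenters can be computed directly from binary defect masks. A minimal sketch with toy flattened masks, not the tomato data:

```python
def seg_metrics(pred, truth):
    """Pixel-wise precision, recall, F1 and IoU for binary masks
    (1 = defect pixel), the metrics reported in the comparison."""
    tp = sum(1 for p, t in zip(pred, truth) if p and t)
    fp = sum(1 for p, t in zip(pred, truth) if p and not t)
    fn = sum(1 for p, t in zip(pred, truth) if t and not p)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) \
        if precision + recall else 0.0
    iou = tp / (tp + fp + fn) if tp + fp + fn else 0.0
    return precision, recall, f1, iou

truth = [1, 1, 1, 0, 0, 0]
pred  = [1, 1, 0, 1, 0, 0]
p, r, f1, iou = seg_metrics(pred, truth)
```

Note that IoU is always at most F1 for the same masks (here 0.5 versus 0.667), which is why the reported IoU of 0.8176 sits below the F1 of 0.8996.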

18 pages, 2029 KB  
Article
Revolutionizing Pediatric Myopia Care: A Machine Learning Approach for Rapid and Accurate Pre-Clinical Screening
by Siqi Zhang and Qi Zhao
J. Clin. Med. 2026, 15(8), 2834; https://doi.org/10.3390/jcm15082834 - 8 Apr 2026
Abstract
Background/Objective: Myopia has become a prominent public health issue in China, significantly impacting the visual health of children and adolescents. The condition is characterized by a high incidence rate, increasing prevalence, and a trend toward earlier onset, highlighting the critical need for early and accurate diagnosis. Current clinical diagnostic methods primarily depend on subjective evaluations by optometrists and the use of isolated parameters, leading to inefficiencies and inconsistent outcomes. Moreover, there remains a lack of diagnostic tools that can effectively integrate multi-parameter analysis while ensuring robust data privacy protection. This study aims to develop an artificial intelligence (AI) diagnostic model that achieves objective, accurate, and safe diagnosis of myopia in children without cycloplegia through multi-parameter fusion and to enable local deployment. The proposed model is intended to be a reliable tool for clinical applications and large-scale screening projects, while ensuring strong protection of patient privacy. Methods: We built a transparent, rule-driven AI framework using clinical guidelines. Key ocular parameters—visual acuity, spherical equivalent, axial length, corneal curvature, and axial ratio—were encoded as logical rules in Python and incorporated via instruction fine-tuning. The model was trained and validated on retrospective clinical data (70% training, 15% validation, 15% test) using five algorithms: gradient boosting, logistic regression, random forest, SVM, and XGBoost. Performance was evaluated using accuracy, precision, recall, F1 score, and mean AUC across classes. Results: The model classifies refractive status into five categories: hyperopia, pre-myopia, mild, moderate, and high myopia. All five algorithms demonstrated excellent diagnostic and classification performance. Gradient boosting achieved the best overall performance, with an accuracy of 98.67%, an F1 score of 98.67%, and a mean AUC of 0.957—outperforming all other models. Conclusions: This study successfully developed an artificial intelligence-based myopia diagnosis system for children under non-dilated pupil conditions. The system is interpretable and privacy-preserving, and has excellent diagnostic and classification performance, making it suitable for clinical decision support and large-scale screening applications. It has great potential to promote the development of early intervention, precision prevention, and control strategies for childhood myopia. Full article
(This article belongs to the Section Ophthalmology)
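The "mean AUC across classes" reported for this five-category classifier is typically a macro average of one-vs-rest ROC AUCs. A minimal sketch using the rank (Mann–Whitney) formulation of AUC follows; the function names and the assumption that score-matrix columns follow sorted class order are illustrative, not taken from the paper:

```python
import numpy as np

def ovr_auc(y_true, scores, cls):
    """One-vs-rest ROC AUC for a single class.

    Uses the Mann-Whitney identity: AUC equals the fraction of
    (positive, negative) pairs ranked correctly, with ties counting half.
    """
    pos = scores[y_true == cls]
    neg = scores[y_true != cls]
    wins = (pos[:, None] > neg[None, :]).sum() \
         + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

def mean_ovr_auc(y_true, score_matrix):
    """Macro-average one-vs-rest AUCs over all classes.

    Assumes column k of score_matrix holds scores for the k-th
    class in np.unique(y_true) order.
    """
    classes = np.unique(y_true)
    return float(np.mean([ovr_auc(y_true, score_matrix[:, k], c)
                          for k, c in enumerate(classes)]))
```

In practice one would call `sklearn.metrics.roc_auc_score(y, proba, multi_class="ovr")` on the model's predicted probabilities; the sketch above just makes the averaging explicit.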

26 pages, 3800 KB  
Article
Prediction of Ship Estimated Time of Arrival Based on BO-CNN-LSTM Model
by Qiong Chen, Zhipeng Yang, Jiaqi Gao, Yui-yip Lau and Pengfei Zhang
J. Mar. Sci. Eng. 2026, 14(8), 694; https://doi.org/10.3390/jmse14080694 - 8 Apr 2026
Abstract
Accurate prediction of a ship’s Estimated Time of Arrival (ETA) is of great significance for port scheduling, logistics management, and navigation safety. Traditional ETA prediction approaches often rely on manual experience for parameter tuning, which tends to be inefficient and susceptible to subjective factors. To address this issue and improve prediction accuracy, this study proposes a hybrid modeling framework, integrating Bayesian Optimization (BO), Convolutional Neural Networks (CNNs), and Long Short-Term Memory (LSTM) networks. In this approach, Automatic Identification System (AIS) data is leveraged to predict the total voyage duration before departure, thereby deriving the vessel’s ETA. The model, referred to as BO-CNN-LSTM, utilizes BO for automatic hyperparameter tuning, employs a CNN to extract local features, and applies an LSTM network to capture temporal dependencies. The model is developed using a dataset of 32,972 distinct voyage records, among which 23,947 are retained as valid samples after data cleaning. Pearson correlation analysis is conducted to select key input variables, including navigation speed, ship type, sailing distance, and deadweight tonnage. Additionally, sailing distance is processed using the Ramer–Douglas–Peucker algorithm. Experimental evaluation indicates that the BO-CNN-LSTM model achieves a coefficient of determination of 0.987, along with a mean absolute error and root mean square error of 6.078 and 8.730, respectively. These results significantly outperform comparison models such as CNN, LSTM, CNN-LSTM, random forest, AdaBoost, and Elman neural networks. Overall, this study validates the effectiveness and superiority of the proposed BO-CNN-LSTM model in ship ETA prediction, providing an efficient and effective prediction solution for intelligent maritime transportation systems. Full article
(This article belongs to the Section Ocean Engineering)
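The Ramer–Douglas–Peucker step mentioned in this abstract simplifies a track by keeping only points that deviate from the chord between the endpoints by more than a tolerance, recursing on each side of the farthest such point. A self-contained sketch follows (planar coordinates are assumed for brevity; real AIS tracks would use geodesic distances, and this is not the authors' implementation):

```python
import math

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker polyline simplification.

    points: list of (x, y) tuples; epsilon: distance tolerance.
    Endpoints are always kept.
    """
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = math.hypot(dx, dy)

    def dist(p):
        # Perpendicular distance from p to the line through the endpoints;
        # degenerates to point distance when the endpoints coincide.
        if norm == 0:
            return math.hypot(p[0] - x1, p[1] - y1)
        return abs(dy * p[0] - dx * p[1] + x2 * y1 - y2 * x1) / norm

    idx = max(range(1, len(points) - 1), key=lambda i: dist(points[i]))
    if dist(points[idx]) > epsilon:
        left = rdp(points[:idx + 1], epsilon)
        right = rdp(points[idx:], epsilon)
        return left[:-1] + right  # drop the duplicated split point
    return [points[0], points[-1]]
```

Running the simplified track through a path-length sum then yields the "sailing distance" feature with noisy AIS jitter removed.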

28 pages, 658 KB  
Article
Dual-Branch Deep Remote Sensing for Growth Anomaly and Risk Perception in Smart Horticultural Systems
by Yan Bai, Ceteng Fu, Shen Liu, Xichen Wang, Jibo Fan, Yuecheng Li and Yihong Song
Horticulturae 2026, 12(4), 461; https://doi.org/10.3390/horticulturae12040461 - 8 Apr 2026
Abstract
In the context of the rapid development of smart horticulture, a deep remote sensing-based dual detection method for horticultural crop growth anomalies and safety risks was proposed to address the limitations of existing remote sensing monitoring approaches. These conventional methods, which predominantly focused on growth vigor assessment or single-task anomaly detection, had difficulty distinguishing anomalies from actual production risks and exhibited insufficient sensitivity to weak anomalies and complex temporal disturbances. Within a unified framework, a growth state modeling branch and an anomaly perception branch were constructed, enabling the joint modeling of normal growth trajectories and anomalous deviation features. By further introducing a risk joint discrimination mechanism, an integrated analysis pipeline from anomaly identification to risk assessment was achieved. Multi-temporal remote sensing features were used as inputs, through which normal crop growth patterns were characterized via trend perception, texture modeling, and temporal aggregation, while sensitivity to local disturbances and weak anomaly signals was enhanced by anomaly embeddings and energy representations. Systematic experiments conducted on multi-regional and multi-crop horticultural remote sensing datasets demonstrated that the proposed method significantly outperformed comparative approaches, including traditional threshold-based methods, support vector machines, random forests, autoencoders, ConvLSTM, and temporal transformer models. In the dual task of horticultural crop growth anomaly detection and safety risk identification, an accuracy of approximately 0.91 and an F1 score of 0.88 were achieved, indicating higher anomaly recognition accuracy and more stable risk discrimination capability. Further anomaly-type awareness experiments showed that consistent performance was maintained across diverse real-world production scenarios, including climate stress, disease-induced anomalies, and management errors. Full article
(This article belongs to the Special Issue New Trends in Smart Horticulture)
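The "energy representations" this abstract uses for weak-anomaly sensitivity can be illustrated, in a much-simplified form, by scoring each time step's squared robust z-score against a local baseline trajectory. The toy sketch below (rolling-median baseline, MAD scale) is an assumption-laden stand-in for the paper's learned embeddings, not its method:

```python
import numpy as np

def anomaly_energy(series, window=5, z_thresh=3.0):
    """Per-step anomaly energy for a 1-D time series.

    A rolling median models the 'normal' trajectory; the energy is the
    squared robust z-score of the residual (MAD scale), and steps with
    energy above z_thresh**2 are flagged as anomalous.
    """
    x = np.asarray(series, dtype=float)
    pad = window // 2
    padded = np.pad(x, pad, mode="edge")
    baseline = np.array([np.median(padded[i:i + window])
                         for i in range(len(x))])
    resid = x - baseline
    # 1.4826 * MAD approximates the standard deviation for Gaussian noise;
    # fall back to 1.0 when the MAD degenerates to zero.
    scale = np.median(np.abs(resid - np.median(resid))) * 1.4826 or 1.0
    energy = (resid / scale) ** 2
    return energy, energy > z_thresh ** 2
```

The point of the energy view is that a brief, weak deviation still accumulates a large score relative to the local baseline, which is the sensitivity property the abstract claims for its anomaly branch.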
