Article

Sparse-MoE-SAM: A Lightweight Framework Integrating MoE and SAM with a Sparse Attention Mechanism for Plant Disease Segmentation in Resource-Constrained Environments

1 School of Electronic Information and Physics, Central South University of Forestry and Technology, Changsha 410004, China
2 School of Computer, Jiangsu University of Science and Technology, Zhenjiang 212100, China
3 School of Forestry, Central South University of Forestry and Technology, Changsha 410004, China
4 Bangor College, Central South University of Forestry and Technology, Changsha 410004, China
* Authors to whom correspondence should be addressed.
Plants 2025, 14(17), 2634; https://doi.org/10.3390/plants14172634
Submission received: 17 July 2025 / Revised: 17 August 2025 / Accepted: 18 August 2025 / Published: 24 August 2025
(This article belongs to the Special Issue Advances in Artificial Intelligence for Plant Research)

Abstract

Plant disease segmentation has achieved significant progress with the help of artificial intelligence. However, deploying high-accuracy segmentation models in resource-limited settings faces three key challenges: (A) Traditional dense attention mechanisms incur quadratic computational complexity (O(n²d)), rendering them ill-suited for low-power hardware. (B) The naturally sparse spatial distribution and large scale variation of leaf lesions require models that concurrently capture long-range dependencies and local details. (C) Complex backgrounds and variable lighting in field images often induce segmentation errors. To address these challenges, we propose Sparse-MoE-SAM, an efficient framework based on an enhanced Segment Anything Model (SAM). This deep learning framework integrates sparse attention mechanisms with a two-stage mixture-of-experts (MoE) decoder. The sparse attention dynamically activates key channels aligned with lesion sparsity patterns, reducing self-attention complexity while preserving long-range context. Stage 1 of the MoE decoder performs coarse-grained boundary localization; Stage 2 achieves fine-grained segmentation by leveraging specialized experts within the MoE, significantly enhancing edge discrimination accuracy. The expert repository—comprising standard convolutions, dilated convolutions, and depthwise separable convolutions—dynamically routes features through optimized processing paths based on input texture and lesion morphology. This enables robust segmentation across diverse leaf textures and plant developmental stages. Further, we design a sparse attention-enhanced Atrous Spatial Pyramid Pooling (ASPP) module to capture multi-scale contexts for both extensive lesions and small spots.
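The top-k expert routing described above can be illustrated with a minimal sketch. This is a toy pure-Python version, not the authors' implementation: the three convolutional experts are replaced with simple stand-in transforms, and all function names here are hypothetical.

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

# Toy stand-ins for the expert repository (standard, dilated,
# and depthwise separable convolutions in the real model).
def standard_expert(x):   return [v * 2.0 for v in x]
def dilated_expert(x):    return [v + 1.0 for v in x]
def depthwise_expert(x):  return [v * v for v in x]

EXPERTS = [standard_expert, dilated_expert, depthwise_expert]

def route(x, gate_logits, top_k=1):
    """Dispatch input x to the top_k experts scored by the gate,
    then mix their outputs with renormalized gate weights."""
    probs = softmax(gate_logits)
    chosen = sorted(range(len(probs)), key=probs.__getitem__,
                    reverse=True)[:top_k]
    norm = sum(probs[i] for i in chosen)
    out = [0.0] * len(x)
    for i in chosen:
        w = probs[i] / norm
        out = [o + w * v for o, v in zip(out, EXPERTS[i](x))]
    return out
```

With `top_k=1` the gate acts as a hard switch, so only one expert runs per input; larger `top_k` trades compute for a smoother mixture.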
Evaluations on three heterogeneous datasets (PlantVillage Extended, CVPPP, and our self-collected field images) show that Sparse-MoE-SAM achieves a mean Intersection-over-Union (mIoU) of 94.2%—surpassing standard SAM by 2.5 percentage points—while reducing computational costs by 23.7% compared to the original SAM baseline. The model also demonstrates balanced performance across disease classes and enhanced hardware compatibility. Our work validates that integrating sparse attention with MoE mechanisms sustains accuracy while drastically lowering computational demands, enabling the scalable deployment of plant disease segmentation models on mobile and edge devices.
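As a back-of-envelope illustration of why restricting attention helps on edge hardware (this is not the paper's exact cost accounting, and the example sizes are assumptions), compare dense self-attention, where each of n tokens attends to all n positions across d channels, with a sparse variant limited to k active positions per token:

```python
def dense_attention_cost(n, d):
    # Dense self-attention: every token attends to all n positions -> O(n^2 * d)
    return n * n * d

def sparse_attention_cost(n, k, d):
    # Sparse attention: each token attends to only k selected positions -> O(n * k * d)
    return n * k * d

# Hypothetical sizes: a 64x64 feature map (n = 4096 tokens),
# d = 256 channels, k = 256 active positions per token.
n, d, k = 4096, 256, 256
savings = 1 - sparse_attention_cost(n, k, d) / dense_attention_cost(n, d)
```

The fractional saving reduces to 1 − k/n, so the benefit grows with image resolution while the sparsity budget k stays fixed.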
Keywords: plant disease segmentation; sparse attention; mixture of experts; SAM (Segment Anything Model)

Share and Cite

MDPI and ACS Style

Zhao, B.; Kang, X.; Zhou, H.; Shi, Z.; Li, L.; Zhou, G.; Wan, F.; Zhu, J.; Yan, Y.; Li, L.; et al. Sparse-MoE-SAM: A Lightweight Framework Integrating MoE and SAM with a Sparse Attention Mechanism for Plant Disease Segmentation in Resource-Constrained Environments. Plants 2025, 14, 2634. https://doi.org/10.3390/plants14172634


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
