SymbioMamba: An Efficient Dual-Stream State-Space Framework for Real-Time Maize Disease and Yield Analysis on UAV Platforms
Abstract
1. Introduction
- A heterogeneous dual-stream encoding framework is proposed, which, unlike standard homogeneous backbones, utilizes architectural asymmetry to fundamentally resolve the scale conflict between microscopic lesion recognition and macroscopic biomass continuity features.
- A pathology–biomass collaborative interaction (PBCI) mechanism is designed to replace traditional black-box multi-task heads. By explicitly embedding the agronomic prior that “disease stress suppresses yield” into the gating logic, the framework ensures that predictions are not merely numerically accurate but also biologically consistent.
- A topology-aligned cross-architecture knowledge distillation paradigm is introduced. This method addresses the foundational challenge of transferring knowledge across mathematically distinct domains—specifically from global state-space Mamba representations to lightweight local convolutions—by aligning the underlying feature manifold topology.
- Comprehensive experimental validation on field-scale UAV datasets and the NVIDIA Jetson AGX Orin platform demonstrates that SymbioMamba outperforms conventional SOTA architectures in both predictive precision and deployment efficiency, establishing a new benchmark for causality-driven precision agriculture.
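The gating prior in the PBCI bullet above can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's module: the names `pbci_gate` and the steepness `w` are hypothetical, and in the actual framework the gate is learned end-to-end rather than fixed.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pbci_gate(yield_feat, disease_severity, w=4.0):
    """Suppress yield features in proportion to disease severity.

    yield_feat: (d,) feature vector from the biomass stream.
    disease_severity: scalar in [0, 1] from the pathology stream.
    The gate is monotonically decreasing in severity, which is one way
    to encode the prior that disease stress suppresses yield.
    """
    gate = sigmoid(w * (0.5 - disease_severity))  # near 1 when healthy, near 0 when severe
    return gate * yield_feat
```

The key design point is monotonicity: for a fixed biomass feature, a higher estimated disease severity can only lower, never raise, the gated yield signal.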
2. Related Work
2.1. UAV-Based Deep Learning Methods for Crop Phenotyping
2.2. Evolution of Visual Backbone Networks
2.3. Efficient Computation and Knowledge Transfer for UAV Edge Deployment
3. Materials and Methods
3.1. Dataset Acquisition
3.2. Data Preprocessing and Augmentation
3.3. The SymbioMamba Framework
3.3.1. Overall Architecture
3.3.2. Stem Layer: Visual Embedding
3.4. Heterogeneous Dual-Stream Encoder
3.4.1. Micro-Texture Stream
3.4.2. Macro-Context-Scan Stream
3.5. Pathology–Biomass Collaborative Interaction
3.6. Task-Specific Decoupled Prediction Heads
3.7. Topology-Aligning Cross-Architecture Distillation
3.8. Evaluation Metrics
- Parameter count (Params): the total number of learnable weights in the model, measured in millions (M).
- Floating-point operations (FLOPs): the computational complexity of the model, measured in billions of floating-point operations (G).
- Frames per second (FPS): the actual inference speed measured on the target hardware, used to assess whether the model meets real-time performance requirements.
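The parameter count and FPS metrics above can be measured with a short harness like the following sketch. The helper names and the warm-up/run counts are illustrative assumptions; FLOPs are typically obtained from a profiler rather than computed by hand.

```python
import time
import numpy as np

def count_params(weights):
    """Total number of learnable weights, reported in millions (M)."""
    return sum(w.size for w in weights) / 1e6

def measure_fps(infer_fn, frame, warmup=5, runs=50):
    """Average frames per second of infer_fn on the target hardware."""
    for _ in range(warmup):  # discard warm-up runs (JIT compilation, cache effects)
        infer_fn(frame)
    start = time.perf_counter()
    for _ in range(runs):
        infer_fn(frame)
    elapsed = time.perf_counter() - start
    return runs / elapsed
```

Averaging over many timed runs after a warm-up phase matters on edge devices such as the Jetson AGX Orin, where the first few inferences are dominated by one-time initialization cost.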
4. Results and Discussion
4.1. Experiment Settings
4.1.1. Implementation Details
4.1.2. Baselines
- Task-Specific Agricultural Baselines: YOLOv11 [42] and YOLO-Mamba [43] serve as the primary benchmarks for disease perception, representing high-speed edge detection and contemporary SSM-based detection, respectively. Additionally, AgriTransformer [44] is included as a specialized baseline that utilizes self-attention mechanisms tailored for agricultural scene understanding, allowing us to evaluate SymbioMamba against models specifically optimized for crop phenotypic variations.
- Agricultural Yield Regression Baselines: For yield estimation, we selected ConvLSTM [45] and TasselNetV3 [46]. These represent two divergent paradigms in agronomic modeling: the former focuses on spatiotemporal gating for growth dynamics, while the latter represents the SOTA in morphological counting and density-based regression. Their inclusion allows us to demonstrate how SymbioMamba bridges the gap between purely structural modeling and stress-aware yield calibration.
- Architectural Evolution Baselines: To verify whether the observed performance gains stem from our symbiotic logic rather than mere backbone capacity, we included a spectrum of general-purpose encoders. This includes the convolution-centered ConvNeXt [47] and the mobile-optimized EfficientNet-Lite4 [48], which represents the pinnacle of NAS-driven CNN efficiency for edge devices. We also compared against the self-attention-driven Swin Transformer [49], the hybrid MobileViT [50]—which integrates local convolutions with global Transformers for mobile deployment—and the pure SSM-based VMamba [51].
4.2. Performance Comparison on Disease Detection Task Across Different Models
4.3. Performance Comparison on Yield-Prediction Task Across Different Models
4.4. Ablation Studies
4.5. Robustness and Cross-Scenario Adaptability Analysis
4.6. Edge Deployment and Field Application Validation
4.7. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Cole, M.B.; Augustin, M.A.; Robertson, M.J.; Manners, J.M. The science of food security. NPJ Sci. Food 2018, 2, 14. [Google Scholar] [CrossRef] [PubMed]
- Primicerio, J.; Di Gennaro, S.F.; Fiorillo, E.; Genesio, L.; Lugato, E.; Matese, A.; Vaccari, F.P. A flexible unmanned aerial vehicle for precision agriculture. Precis. Agric. 2012, 13, 517–523. [Google Scholar] [CrossRef]
- Toscano, F.; Fiorentino, C.; Capece, N.; Erra, U.; Travascia, D.; Scopa, A.; Drosos, M.; D’Antonio, P. Unmanned aerial vehicle for precision agriculture: A review. IEEE Access 2024, 12, 69188–69205. [Google Scholar] [CrossRef]
- Velusamy, P.; Rajendran, S.; Mahendran, R.K.; Naseer, S.; Shafiq, M.; Choi, J.G. Unmanned Aerial Vehicles (UAV) in precision agriculture: Applications and challenges. Energies 2021, 15, 217. [Google Scholar] [CrossRef]
- Chriki, A.; Touati, H.; Snoussi, H.; Kamoun, F. Deep learning and handcrafted features for one-class anomaly detection in UAV video. Multimed. Tools Appl. 2021, 80, 2599–2620. [Google Scholar] [CrossRef]
- Bouguettaya, A.; Zarzour, H.; Kechida, A.; Taberkit, A.M. Deep learning techniques to classify agricultural crops through UAV imagery: A review. Neural Comput. Appl. 2022, 34, 9511–9536. [Google Scholar] [CrossRef]
- MirhoseiniNejad, S.M.; Abbasi-Moghadam, D.; Sharifi, A. ConvLSTM-ViT: A deep neural network for crop yield prediction using Earth observations and remotely sensed data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 17489–17502. [Google Scholar] [CrossRef]
- Senarathna, J.I. Enhancing Rice Production Through a Multi-Task Neural Network Framework: A Smart Agricultural Solution for Growth Monitoring, Disease Detection, and Yield Prediction. Preprints 2025. [Google Scholar] [CrossRef]
- Alirezazadeh, P.; Schirrmann, M.; Stolzenburg, F. A comparative analysis of deep learning methods for weed classification of high-resolution UAV images. J. Plant Dis. Prot. 2024, 131, 227–236. [Google Scholar] [CrossRef]
- Castellano, G.; De Marinis, P.; Vessio, G. Applying knowledge distillation to improve weed mapping with drones. In Proceedings of the 2023 18th Conference on Computer Science and Intelligence Systems (FedCSIS), Warsaw, Poland, 17–20 September 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 393–400. [Google Scholar]
- Zhao, M.; Wang, D.; Zhang, G.; Cao, W.; Xu, S.; Li, Z.; Liu, X. Evaluating Maize Emergence Quality with Multi-task YOLO11-Mamba and UAV-RGB Remote Sensing. Smart Agric. Technol. 2025, 12, 101351. [Google Scholar] [CrossRef]
- Mittal, P. A comprehensive survey of deep learning-based lightweight object detection models for edge devices. Artif. Intell. Rev. 2024, 57, 242. [Google Scholar] [CrossRef]
- Liu, S.; Liang, Y.; Gitter, A. Loss-balanced task weighting to reduce negative transfer in multi-task learning. In Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; AAAI Press: Palo Alto, CA, USA, 2019; Volume 33, pp. 9977–9978. [Google Scholar]
- Tahir, M.N.; Lan, Y.; Zhang, Y.; Wenjiang, H.; Wang, Y.; Naqvi, S.M.Z.A. Application of unmanned aerial vehicles in precision agriculture. In Precision Agriculture; Elsevier: San Diego, CA, USA, 2023; pp. 55–70. [Google Scholar]
- Bai, X.; Liu, P.; Cao, Z.; Lu, H.; Xiong, H.; Yang, A.; Cai, Z.; Wang, J.; Yao, J. Rice plant counting, locating, and sizing method based on high-throughput UAV RGB images. Plant Phenomics 2023, 5, 0020. [Google Scholar] [CrossRef] [PubMed]
- Mora, J.J.; Selvaraj, M.G.; Alvarez, C.I.; Safari, N.; Blomme, G. From pixels to plant health: Accurate detection of banana Xanthomonas wilt in complex African landscapes using high-resolution UAV images and deep learning. Discov. Appl. Sci. 2024, 6, 377. [Google Scholar] [CrossRef]
- Lyu, Y.; Han, X.; Wang, P.; Shin, J.Y.; Ju, M.W. Unmanned aerial vehicle-based rgb imaging and lightweight deep learning for downy mildew detection in kimchi cabbage. Remote Sens. 2025, 17, 2388. [Google Scholar] [CrossRef]
- You, S.; Li, B.; Chen, Y.; Ren, Z.; Liu, Y.; Wu, Q.; Tao, J.; Zhang, Z.; Zhang, C.; Xue, F.; et al. Rose-Mamba-YOLO: An enhanced framework for efficient and accurate greenhouse rose monitoring. Front. Plant Sci. 2025, 16, 1607582. [Google Scholar] [CrossRef]
- Sharma, V.; Patel, V.K.; Sahu, Y.; Vyas, M. AgriTransformer: Synergizing Robust Deep Transformer Model for Intelligent Farming. In Communication and Intelligent Systems, Proceedings of ICCIS 2024; Springer: Singapore, 2024; pp. 397–406. [Google Scholar]
- Bhuvaneswari, P.; Srilatha, K.; Kavya, A.; Anitha, T.; Sasikala, V. Enhanced Plant Pest Classification by Leveraging CBAM Attention in ResNet-9. In Proceedings of the 2025 Fourth International Conference on Smart Technologies, Communication and Robotics (STCR), Sathyamangalam, India, 9–10 May 2025; IEEE: Piscataway, NJ, USA, 2025; pp. 1–5. [Google Scholar]
- Ray, R.K.; Chakravarty, S.; Dash, S.; Ghosh, A.; Mohanty, S.N.; Chirra, V.R.R.; Ayouni, S.; Khan, M.I. Precision pest management in agriculture using Inception V3 and EfficientNet B4: A deep learning approach for crop protection. Inf. Process. Agric. 2025, 13, 142–161. [Google Scholar] [CrossRef]
- Barman, U.; Sarma, P.; Rahman, M.; Deka, V.; Lahkar, S.; Sharma, V.; Saikia, M.J. Vit-SmartAgri: Vision transformer and smartphone-based plant disease detection for smart agriculture. Agronomy 2024, 14, 327. [Google Scholar] [CrossRef]
- Duc, C.M.; Fukui, H. SatMamba: Development of Foundation Models for Remote Sensing Imagery Using State Space Models. arXiv 2025, arXiv:2502.00435. [Google Scholar] [CrossRef]
- Zhang, Q.; Zhang, X.; Quan, C.; Zhao, T.; Huo, W.; Huang, Y. Mamba-STFM: A Mamba-Based Spatiotemporal Fusion Method for Remote Sensing Images. Remote Sens. 2025, 17, 2135. [Google Scholar] [CrossRef]
- Gu, A.; Dao, T. Mamba: Linear-time sequence modeling with selective state spaces. In Proceedings of the First Conference on Language Modeling, Philadelphia, PA, USA, 7–9 October 2024. [Google Scholar]
- Chen, K.; Chen, B.; Liu, C.; Li, W.; Zou, Z.; Shi, Z. Rsmamba: Remote sensing image classification with state space model. IEEE Geosci. Remote Sens. Lett. 2024, 21, 8002605. [Google Scholar] [CrossRef]
- Cao, Y.; Liu, C.; Wu, Z.; Zhang, L.; Yang, L. Remote sensing image segmentation using vision mamba and multi-scale multi-frequency feature fusion. Remote Sens. 2025, 17, 1390. [Google Scholar] [CrossRef]
- Li, D.; Sun, J.; Liu, Y. Hierarchical semantic alignment heterogeneous knowledge distillation model for smart agriculture crop leaf disease recognition. Expert Syst. Appl. 2025, 296, 129100. [Google Scholar] [CrossRef]
- Chowdhury, R.H.; Ahmed, S. MangoLeafViT: Leveraging Lightweight Vision Transformer with Runtime Augmentation for Efficient Mango Leaf Disease Classification. In Proceedings of the 2024 27th International Conference on Computer and Information Technology (ICCIT); IEEE: Piscataway, NJ, USA, 2024; pp. 699–704. [Google Scholar]
- Yang, Z.; Li, Z.; Zeng, A.; Li, Z.; Yuan, C.; Li, Y. Vitkd: Feature-based knowledge distillation for vision transformers. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 17–18 June 2024; pp. 1379–1388. [Google Scholar]
- Patel, D.J.; Patel, P.S.; Patel, T.J.; Viradiya, M.D.; Patel, J.B.; Garg, D. Real-Time Object Detection and Recognition on Jetson Nano. In ICT Analysis and Applications, Proceedings of ICT4SD 2024; Springer: Singapore, 2024; pp. 349–360. [Google Scholar]
- Espejo-Garcia, B.; Güldenring, R.; Nalpantidis, L.; Fountas, S. Foundation vision models in agriculture: DINOv2, LoRA and knowledge distillation for disease and weed identification. Comput. Electron. Agric. 2025, 239, 110900. [Google Scholar] [CrossRef]
- He, J.; Jiang, J.; Zhang, C. A Survey of Lightweight Methods for Object Detection Networks. Array 2025, 29, 100589. [Google Scholar] [CrossRef]
- Cheng, S.; Das, S.; Qu, S.; Ballan, L. KD-Mamba: Selective state space models with knowledge distillation for trajectory prediction. Comput. Vis. Image Underst. 2025, 261, 104499. [Google Scholar] [CrossRef]
- Sanabria-Velazquez, A.D.; Enciso-Maldonado, G.A.; Maidana-Ojeda, M.; Diaz-Najera, J.F.; Thiessen, L.D.; Shew, H.D. Validation of standard area diagrams to estimate the severity of Septoria leaf spot on stevia in Paraguay, Mexico, and the United States. Plant Dis. 2023, 107, 1829–1838. [Google Scholar] [CrossRef]
- Xie, D.; Ye, W.; Pan, Y.; Wang, J.; Qiu, H.; Wang, H.; Li, Z.; Chen, T. GCPDFFNet: Small Object Detection for Rice Blast Recognition. Phytopathology 2024, 114, 1490–1501. [Google Scholar] [CrossRef]
- Dehghani, A.; Sarbishei, O.; Glatard, T.; Shihab, E. A quantitative comparison of overlapping and non-overlapping sliding windows for human activity recognition using inertial sensors. Sensors 2019, 19, 5026. [Google Scholar] [CrossRef]
- Hamilton, A.; Culhane, M. Spherical redshift distortions. arXiv 1995, arXiv:astro-ph/9507021. [Google Scholar]
- Finkelstein, A.; Range, M. Image mosaics. In Electronic Publishing, Artistic Imaging, and Digital Typography, Proceedings of the International Conference on Raster Imaging and Digital Typography; Springer: Berlin/Heidelberg, Germany, 1998; pp. 11–22. [Google Scholar]
- Deng, J.; Dong, W.; Socher, R.; Li, L.J.; Li, K.; Fei-Fei, L. Imagenet: A large-scale hierarchical image database. In Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 248–255. [Google Scholar]
- Gu, A.; Goel, K.; Ré, C. Efficiently modeling long sequences with structured state spaces. arXiv 2021, arXiv:2111.00396. [Google Scholar]
- Khanam, R.; Hussain, M. Yolov11: An overview of the key architectural enhancements. arXiv 2024, arXiv:2410.17725. [Google Scholar] [CrossRef]
- Wang, Z.; Li, C.; Xu, H.; Zhu, X.; Li, H. Mamba yolo: A simple baseline for object detection with state space model. In Proceedings of the AAAI Conference on Artificial Intelligence, Philadelphia, PA, USA, 25 February–4 March 2025; AAAI Press: Palo Alto, CA, USA, 2025; Volume 39, pp. 8205–8213. [Google Scholar]
- Jácome Galarza, L.; Realpe, M.; Viñán-Ludeña, M.S.; Calderón, M.F.; Jaramillo, S. Agritransformer: A transformer-based model with attention mechanisms for enhanced multimodal crop yield prediction. Electronics 2025, 14, 2466. [Google Scholar] [CrossRef]
- Nejad, S.M.M.; Abbasi-Moghadam, D.; Sharifi, A.; Farmonov, N.; Amankulova, K.; László, M. Multispectral crop yield prediction using 3D-convolutional neural networks and attention convolutional LSTM approaches. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 16, 254–266. [Google Scholar] [CrossRef]
- Lu, H.; Liu, L.; Li, Y.N.; Zhao, X.M.; Wang, X.Q.; Cao, Z.G. TasselNetV3: Explainable plant counting with guided upsampling and background suppression. IEEE Trans. Geosci. Remote Sens. 2021, 60, 4700515. [Google Scholar] [CrossRef]
- Zhang, P.; Zhang, S.; Wang, J.; Sun, X. Identifying rice lodging based on semantic segmentation architecture optimization with UAV remote sensing imaging. Comput. Electron. Agric. 2024, 227, 109570. [Google Scholar] [CrossRef]
- Dionisio, M.A.C.; Salazar, I.J.D.; Hortinela, C.C. EfficientNet-Lite 4-Based Classification System for Grading Philippine Strawberries. In Proceedings of the 2024 IEEE International Conference on Artificial Intelligence in Engineering and Technology (IICAIET), Kota Kinabalu, Sabah, 26–28 August 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 366–371. [Google Scholar]
- Liang, W.; Tan, J.; He, H.; Xu, H.; Li, J. Detection of small objects from UAV imagery via an improved Swin transformer. In Proceedings of the 2024 IGARSS 2024—2024 IEEE International Geoscience and Remote Sensing Symposium, Athens, Greece, 7–12 July 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 9134–9138. [Google Scholar]
- Mehta, S.; Rastegari, M. Mobilevit: Light-weight, general-purpose, and mobile-friendly vision transformer. arXiv 2021, arXiv:2110.02178. [Google Scholar]
- Rahman, M.M.; Tutul, A.A.; Nath, A.; Laishram, L.; Jung, S.K.; Hammond, T. Mamba in vision: A comprehensive survey of techniques and applications. arXiv 2024, arXiv:2410.03105. [Google Scholar] [CrossRef]
- Jin, T.; Bercea, G.T.; Le, T.D.; Chen, T.; Su, G.; Imai, H.; Negishi, Y.; Leu, A.; O’Brien, K.; Kawachiya, K.; et al. Compiling onnx neural network models using mlir. arXiv 2020, arXiv:2008.08272. [Google Scholar] [CrossRef]
- Yan, Y.; Song, F.; Sun, J. The application of UAV technology in maize crop protection strategies: A review. Comput. Electron. Agric. 2025, 237, 110679. [Google Scholar] [CrossRef]
- Gao, C.; He, B.; Guo, W.; Qu, Y.; Wang, Q.; Dong, W. SCS-YOLO: A real-time detection model for agricultural diseases—A case study of wheat fusarium head blight. Comput. Electron. Agric. 2025, 238, 110794. [Google Scholar] [CrossRef]
- Wang, J.; Wang, P.; Tian, H.; Tansey, K.; Liu, J.; Quan, W. A deep learning framework combining CNN and GRU for improving wheat yield estimates using time series remotely sensed multi-variables. Comput. Electron. Agric. 2023, 206, 107705. [Google Scholar] [CrossRef]
- Song, D.; Sun, H.; Ngumbi, E.; Kamruzzaman, M. Multispectral image reconstruction from RGB image for maize growth status monitoring based on window-adaptive spatial-spectral attention transformer. Comput. Electron. Agric. 2025, 239, 111062. [Google Scholar] [CrossRef]
| Parameter | Specification |
|---|---|
| Dimensions (unfolded) | mm |
| Weight (with batteries) | Approx. 6.3 kg |
| Max Takeoff Weight | 9.0 kg |
| Max Flight Time | 55 min (no payload) |
| Max Flight Speed | 23 m/s (S Mode) |
| Hovering Accuracy | Vertical: ±0.1 m; Horizontal: ±0.1 m (RTK enabled) |
| Operating Temperature | −20 °C to 50 °C |
| Ingress Protection | IP45 |
| Max Payload | 2.7 kg |
| Grade | Infection Ratio (%) | No. of Patches | Percentage (%) | Avg. Yield (kg/ha) |
|---|---|---|---|---|
| Healthy | 0 | 3547 | 28.0 | 11,250 ± 420 |
| Grade 1 | <10 | 3203 | 25.6 | 10,840 ± 510 |
| Grade 2 | 10–25 | 2819 | 22.4 | 9650 ± 680 |
| Grade 3 | 25–50 | 1459 | 14.4 | 7820 ± 850 |
| Grade 4 | >50 | 1046 | 9.6 | 5430 ± 940 |
| Total | - | 12,074 | 100.0 | - |
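The severity grades in the table above can be expressed as a simple mapping from per-patch infection ratio to grade label. The handling of the exact boundary values 10%, 25%, and 50% is an assumption, since the table's intervals overlap at their endpoints.

```python
def severity_grade(infection_ratio):
    """Map a patch's infected-area ratio (%) to one of the five grades.

    Thresholds follow the dataset grading table: 0 -> Healthy,
    <10 -> Grade 1, 10-25 -> Grade 2, 25-50 -> Grade 3, >50 -> Grade 4.
    Boundary values are assigned to the lower grade by assumption.
    """
    if infection_ratio == 0:
        return "Healthy"
    if infection_ratio < 10:
        return "Grade 1"
    if infection_ratio <= 25:
        return "Grade 2"
    if infection_ratio <= 50:
        return "Grade 3"
    return "Grade 4"
```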
| Method | P (%) | R (%) | F1 (%) | mAP@0.5 (%) | mAP@0.5:0.95 (%) | Params (M) | FLOPs (G) | FPS |
|---|---|---|---|---|---|---|---|---|
| ConvNeXt | ||||||||
| EfficientNet-Lite4 | ||||||||
| Swin Transformer | ||||||||
| MobileViT | ||||||||
| VMamba | ||||||||
| AgriTransformer | ||||||||
| YOLOv11 | ||||||||
| YOLO-Mamba | ||||||||
| SymbioMamba (Ours) |
| Method | RMSE (kg/ha) | MAE (kg/ha) | Params (M) | FLOPs (G) | FPS | |
|---|---|---|---|---|---|---|
| ConvNeXt | ||||||
| EfficientNet-Lite4 | ||||||
| Swin Transformer | ||||||
| MobileViT | ||||||
| VMamba | ||||||
| AgriTransformer | ||||||
| ConvLSTM | ||||||
| TasselNetV3 | ||||||
| SymbioMamba (Ours) |
| Model Variant | P (%) | R (%) | mAP@0.5 (%) |
|---|---|---|---|
| SymbioMamba w/o Macro Stream | |||
| SymbioMamba w/o Micro Stream | |||
| SymbioMamba w/o PBCI | |||
| SymbioMamba w/o TACAD | |||
| SymbioMamba (Full) |
| Model Variant | RMSE (kg/ha) | |
|---|---|---|
| SymbioMamba w/o Macro Stream | ||
| SymbioMamba w/o Micro Stream | ||
| SymbioMamba w/o PBCI | ||
| SymbioMamba w/o TACAD | ||
| SymbioMamba (Full) |
| Method | Disease mAP@0.5 (%): Common Rust | Disease mAP@0.5 (%): Banded Blight | Disease mAP@0.5 (%): Weed/Rain Interference | Yield: Normal | Yield: Environmental Stress | FPS |
|---|---|---|---|---|---|---|
| YOLOv11 | ||||||
| VMamba | ||||||
| AgriTransformer | ||||||
| SymbioMamba (Ours) | ||||||
| Category | Parameter/Metric | Observed Value |
|---|---|---|
| Generalization | Cross-Sensor mAP@0.5 | 87.2% |
| Generalization | Cross-Regional F1-score | 86.4% |
| Energy Profile | Peak Power Consumption | 45–55 W |
| Energy Profile | Battery Endurance Retention | >90% |
| Deployment Latency | Inference Speed | 38.2 FPS |
| Deployment Latency | End-to-End Latency | 26.2 ms |
Wang, Z.; Wang, Y.; Zhou, B.; Yan, X.; Guo, P.; Yang, H.; Song, Y. SymbioMamba: An Efficient Dual-Stream State-Space Framework for Real-Time Maize Disease and Yield Analysis on UAV Platforms. Agriculture 2026, 16, 801. https://doi.org/10.3390/agriculture16070801
