Article

From Quality Grading to Defect Recognition: A Dual-Pipeline Deep Learning Approach for Automated Mango Assessment

Department of Computer Science and Information Engineering, National Dong Hwa University, Hualien 974301, Taiwan
* Author to whom correspondence should be addressed.
Electronics 2026, 15(3), 549; https://doi.org/10.3390/electronics15030549
Submission received: 29 December 2025 / Revised: 24 January 2026 / Accepted: 26 January 2026 / Published: 27 January 2026

Abstract

Mango is a high-value agricultural commodity, and accurate, efficient grading of appearance quality and inspection of surface defects are critical for export-oriented markets. This study proposes a dual-pipeline deep learning framework for automated mango assessment, in which surface defect classification and quality grading are jointly implemented within a unified inspection system. Defect assessment is formulated as a multi-label classification problem over five surface defect categories, eliminating the costly bounding-box annotations required by conventional object detection models. To address the severe class imbalance commonly encountered in agricultural datasets, a copy–paste image synthesis strategy is employed to augment scarce defect samples. For quality grading, mangoes are categorized into three quality levels. Unlike conventional CNN-based approaches that rely solely on spatial-domain information, the proposed framework fuses spatial-domain and frequency-domain representations at the decision level to enhance grading stability. In addition, image preprocessing is investigated, showing that adaptive contrast enhancement effectively emphasizes the surface textures critical for quality discrimination. Experimental evaluations demonstrate that the proposed framework outperforms existing detection-based approaches in both defect classification and quality grading. The proposed classification-oriented system provides an efficient and practical integrated solution for automated mango assessment.
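For readers who want a concrete picture of the grading pipeline described in the abstract, the following minimal sketch illustrates one way its pieces could fit together. It assumes PyTorch and OpenCV, a CLAHE-based adaptive contrast enhancement step, small placeholder CNN backbones, and an equal-weight decision-level fusion; none of these specifics (architectures, CLAHE settings, fusion weights) are taken from the paper.

```python
# Illustrative sketch (not the authors' code): adaptive contrast enhancement,
# then decision-level fusion of a spatial-domain CNN branch and a
# frequency-domain CNN branch over three quality grades.
import cv2
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F


def enhance_contrast(bgr: np.ndarray) -> np.ndarray:
    """Adaptive contrast enhancement (CLAHE on the L channel) to emphasize surface texture."""
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))  # assumed settings
    return cv2.cvtColor(cv2.merge((clahe.apply(l), a, b)), cv2.COLOR_LAB2BGR)


class SmallCNN(nn.Module):
    """Placeholder backbone; the paper's actual grading networks are not reproduced here."""
    def __init__(self, in_ch: int, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))


class DualDomainGrader(nn.Module):
    """Decision-level fusion: weighted average of the two branches' softmax outputs."""
    def __init__(self, w_spatial: float = 0.5):
        super().__init__()
        self.spatial = SmallCNN(in_ch=3)
        self.frequency = SmallCNN(in_ch=1)
        self.w = w_spatial  # fusion weight; an assumption, not taken from the paper

    def forward(self, rgb: torch.Tensor) -> torch.Tensor:
        # Frequency branch input: log-magnitude spectrum of the grayscale image.
        gray = rgb.mean(dim=1, keepdim=True)
        spectrum = torch.log1p(torch.abs(torch.fft.fftshift(torch.fft.fft2(gray))))
        p_spatial = F.softmax(self.spatial(rgb), dim=1)
        p_freq = F.softmax(self.frequency(spectrum), dim=1)
        return self.w * p_spatial + (1.0 - self.w) * p_freq  # fused grade probabilities


if __name__ == "__main__":
    img = enhance_contrast(np.random.randint(0, 255, (224, 224, 3), dtype=np.uint8))
    x = torch.from_numpy(img).permute(2, 0, 1).float().unsqueeze(0) / 255.0
    print(DualDomainGrader()(x))  # shape (1, 3): probabilities for the three quality grades
```

The point illustrated here is that fusion happens at the decision level (combining per-branch class probabilities) rather than at the feature level, which keeps the spatial and frequency branches independently trainable and replaceable.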
Keywords: deep learning; convolutional neural networks; mango quality grading; image processing; machine learning

Share and Cite

MDPI and ACS Style

Lin, S.; Chiu, H. From Quality Grading to Defect Recognition: A Dual-Pipeline Deep Learning Approach for Automated Mango Assessment. Electronics 2026, 15, 549. https://doi.org/10.3390/electronics15030549

AMA Style

Lin S, Chiu H. From Quality Grading to Defect Recognition: A Dual-Pipeline Deep Learning Approach for Automated Mango Assessment. Electronics. 2026; 15(3):549. https://doi.org/10.3390/electronics15030549

Chicago/Turabian Style

Lin, Shinfeng, and Hongting Chiu. 2026. "From Quality Grading to Defect Recognition: A Dual-Pipeline Deep Learning Approach for Automated Mango Assessment." Electronics 15, no. 3: 549. https://doi.org/10.3390/electronics15030549

APA Style

Lin, S., & Chiu, H. (2026). From Quality Grading to Defect Recognition: A Dual-Pipeline Deep Learning Approach for Automated Mango Assessment. Electronics, 15(3), 549. https://doi.org/10.3390/electronics15030549
