Article
Peer-Review Record

AI-Assisted Cotton Grading: Active and Semi-Supervised Learning to Reduce the Image-Labelling Burden

Sensors 2023, 23(21), 8671; https://doi.org/10.3390/s23218671
by Oliver J. Fisher 1,2,*, Ahmed Rady 1,3, Aly A. A. El-Banna 4, Haitham H. Emaish 5 and Nicholas J. Watson 1,6
Reviewer 1: Anonymous
Reviewer 2:
Reviewer 3: Anonymous
Submission received: 23 September 2023 / Revised: 20 October 2023 / Accepted: 21 October 2023 / Published: 24 October 2023
(This article belongs to the Special Issue Artificial Intelligence and Sensor Technologies in Agri-Food)

Round 1

Reviewer 1 Report

The authors used active and semi-supervised learning to reduce the image-labelling burden for AI-assisted cotton grading; the idea is novel. Here are some comments:

1. Please revise the Introduction: too much space is devoted to cotton grading; consider expanding the part on reducing the labelling burden.

2. In Section 2.1, were all the samples harvested in the same year?

3. Please detail the machine-learning part: for example, was N-fold cross-validation used, and how were the parameters set?
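For context, the N-fold (k-fold) cross-validation procedure the reviewer asks about can be sketched in plain Python as follows. This is a minimal illustration of the technique, not the authors' actual setup; the function names are hypothetical.

```python
from statistics import mean

def k_fold_indices(n_samples, k):
    """Split indices 0..n_samples-1 into k near-equal contiguous folds."""
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_validate(X, y, train_fn, score_fn, k=5):
    """Train on k-1 folds, score on the held-out fold, repeat k times,
    and return the mean score across folds."""
    folds = k_fold_indices(len(X), k)
    scores = []
    for test_idx in folds:
        train_idx = [j for f in folds if f is not test_idx for j in f]
        model = train_fn([X[j] for j in train_idx], [y[j] for j in train_idx])
        scores.append(score_fn(model, [X[j] for j in test_idx],
                               [y[j] for j in test_idx]))
    return mean(scores)
```

In practice a library routine (e.g. scikit-learn's `cross_val_score`) would be used, with hyperparameters chosen by a search over the training folds only.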


Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 2 Report

The paper demonstrates strong use of various machine-learning techniques for cotton grading, aiming to train more efficient models with limited labelled data or a constrained labelling budget. However, I believe that further exploration could enhance the quality of this work.


a) To broaden the scope of your research, consider incorporating diverse active learning sample selection criteria, including but not limited to uncertainty, density, and diversity. A notable example to draw inspiration from is the paper 'Active Learning based on Random Forest and Its Application to Terrain Classification' by Yingjie Gu, Dawid Zydek & Zhong Jin.
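Of the selection criteria the reviewer lists, the uncertainty criterion (least confidence) is the simplest to sketch: query the unlabelled samples whose top predicted class probability is lowest. A minimal illustration, with hypothetical names and not tied to any specific paper or implementation:

```python
def select_most_uncertain(pool_probs, n_queries):
    """Least-confidence active learning: rank unlabelled samples by
    1 - max(class probability) and return the n_queries most uncertain
    sample ids for labelling.

    pool_probs: {sample_id: [class probabilities summing to 1]}
    """
    uncertainty = {sid: 1.0 - max(probs) for sid, probs in pool_probs.items()}
    return sorted(uncertainty, key=uncertainty.get, reverse=True)[:n_queries]
```

Density- and diversity-based criteria would additionally weight each candidate by its similarity to the rest of the pool or its distance from already-labelled samples.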


b) Instead of depicting the absolute volume of seed data in Figures 2, 4, and 5, it might be more enlightening to represent the size of the seed data as a percentage of the total labelled data. This approach would facilitate comparative investigations.


c) I recommend exploring alternative feature extraction methods, such as leveraging various colour channels and grey-level co-occurrence matrices (GLCMs) to capture spatial features. This step has the potential to significantly enhance the depth of the research.
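The grey-level co-occurrence idea the reviewer suggests counts how often pairs of grey levels co-occur at a fixed pixel offset, then derives texture features from the normalised counts. A minimal pure-Python sketch (in practice a library such as scikit-image's `graycomatrix` would be used):

```python
from collections import Counter

def glcm(image, levels, offset=(0, 1)):
    """Normalised grey-level co-occurrence matrix for one pixel offset.
    image: 2-D list of integer grey levels in range(levels)."""
    dr, dc = offset
    rows, cols = len(image), len(image[0])
    counts = Counter()
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                counts[(image[r][c], image[r2][c2])] += 1
    total = sum(counts.values())
    return [[counts[(i, j)] / total for j in range(levels)] for i in range(levels)]

def glcm_contrast(P):
    """Contrast texture feature: sum over (i, j) of (i - j)^2 * P[i][j]."""
    return sum((i - j) ** 2 * p for i, row in enumerate(P) for j, p in enumerate(row))
```

Other standard GLCM features (energy, homogeneity, correlation) are computed analogously from the same matrix.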

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 3 Report

The article presents important research on semi-supervised and active learning capabilities to minimise effort when labelling cotton lint samples while maintaining high classification accuracy.

However, some minor corrections are necessary:

-In Figure 1, align the numbers I, II, III, ... in columns and keep Giza 86, 90, and 96 aligned so the reader can compare. If necessary, form two images, the first from I to IV and the second from V to IX.

-On line 174, also insert the citation number in Fisher et al. (2022).

-On line 188, cite the author and the citation number to complete the sentence. For example: "The image processing method developed in Fisher et al. (2022) [32] was followed to extract features..."

-On line 198, cite the author and number as on line 188.

-On line 256, check the word ‘retained’.

-Correct the page numbers from 12 of 18 through 18 of 18.

-In Figure 3, better specify the title of each axis.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf
