Deep Learning-Based Pine Nematode Trees’ Identification Using Multispectral and Visible UAV Imagery
Round 1
Reviewer 1 Report
Review comments
1. The method in this paper uses only mAP@0.5 as the accuracy evaluation metric, which is not rigorous enough and limits the applicability of the reported results.
2. Line 100 of this paper points out that recognition methods based on visible imagery alone or multispectral imagery alone are inevitably affected by the phenomena of "same object, different spectra" and "different objects, same spectrum", which lead to false and missed detections and reduce recognition accuracy; a fusion of visible and multispectral images is therefore adopted. However, the paper does not explain how using the two data sources solves these problems, nor does it demonstrate the complexity of the experimental environment. Since multispectral images carry richer spectral information, it is questionable why visible images are needed at all; could multispectral images alone achieve a better result?
3. In Lines 136-148, regarding the original images: a DJI MS600 Pro multispectral camera was used in this study, and the visible and multispectral images (as they are called in this paper) were pixel-normalized by standard reflectance, and so on. It therefore appears that the proposed method can only be used with this camera, and perhaps only in this study area, because the preprocessing described is not a standardized procedure. Was atmospheric correction applied?
4. Meanwhile, visible and multispectral images have different spatial resolutions and sizes, and they are collected from different heights and angles. How were the two kinds of images scaled and co-registered?
5. The disease cycle of the pine wood nematode is short. Were the visible and multispectral images collected at the same time? Please provide the specific flight parameters. Furthermore, the validation data are very inadequate.
6. Are the two data sources fed into the same algorithm as input? This should be made clearer.
7. There are many formatting errors in this paper; please correct them carefully.
8. Compared with other methods, no clear advantage was demonstrated.
9. The proposed method appears to be a patchwork of several existing methods, and no ablation experiments were performed.
Author Response
Please see the attachment
Author Response File: Author Response.docx
Reviewer 2 Report
Review of “drones-2214982” Deep Learning Based Pine Nematode Trees Identification Using Multispectral and Visible UAV Imagery by authors Bingxi Qin, Fenggang Sun, Weixing Shen, Bin Dong, Xinyu Huo and Peng Lan.
Although the manuscript deals with a threat to pine forests, pine wilt disease (PWD), which is a worldwide problem because it also kills trees in Europe and the United States, I am concerned that the paper is useful for monitoring unhealthy trees but not for the purpose stated in the title of the manuscript (i.e. detecting PWD specifically). It is not clear from the manuscript how many trees were monitored for the disease, or how. The manuscript uses very advanced statistical methods, but from the viewpoint of forest ecology it has major weaknesses; I therefore suggest a major revision of the paper.
Abstract
The abstract is somewhat hard to read; there are too many abbreviations. The terms pine wilt disease and unmanned aerial vehicle are explained, but the remaining abbreviations are left for the reader to work out.
Introduction
The introduction lacks any significant information on the biology of the disease or its vectors. Is PWD the only mortality factor? No previous drought, no insects, only PWD?
This is probably also why the manuscript looks very good from the remote sensing point of view but not from the point of view of forest protection.
L.46-68 refer mostly to Chinese (or Asian?) works, although I am sure relevant work has also been done in Europe or the United States.
l.58: this is the first mention of the YOLO toolset; it should therefore be referenced, and since it is also the main tool used by the authors, it should be described at least briefly for background: what the tool is, how it works, etc.
l.83-94: again, mainly Chinese studies are listed.
95-102: Exactly, so how do you know the trees you "inspected" were infested by the PWD? You mention no laboratory work in the methods, so how can you be so sure it is PWD?
l.107: GhostNet without a reference? Why is the reference on l.237 and not at the first mention of the term?
l.109: CBAM and CA? What do these abbreviations mean?
l.110: what is the "transformer module"?
l.111: what is "BiFPN"?
l.113-121: I suggest rewriting this paragraph to read more like a scientific paper, with hypotheses that are going to be tested. I would therefore remove l.114-115 ("It is found that the reflectance in red, near infrared, infrared edge and red edge 750 nm bands differ significantly."), l.117-118 ("enables more efficient and accurate identification."), and l.120-121 ("the model balances accuracy and speed and is simple to detect.").
l.113: how do the authors know the trees were healthy? And how do they know they were affected by PWD? Trees that are infested/attacked can be stressed before the attack, and the disease may be only a secondary "symptom". You need to detect the nematode in the laboratory to be sure the tree was killed by PWD.
l.127: what pine?
L.155: 4500 images (3500 visible and 1000 multispectral; this was already mentioned in l.134).
Fig. 4 and 5: two classes are detected, infected and healthy, but Figure 15 lists more categories... There are usually guidelines for categorizing the trees, such as Abelleira, A.; Picoaga, A.; Mansilla, J.P.; Aguin, O. Detection of Bursaphelenchus xylophilus, causal agent of pine wilt disease on Pinus pinaster in Northwestern Spain. Plant Dis. 2011, 95, 776. You could check whether such guidelines exist for your area and your pine species and use some discoloration classes of the disease.
l.180-211 belong in the Results section.
l.213: the first reference for YOLO appears only here, although the tool was already mentioned several times earlier.
l.231: what is a "long detection process"? Milliseconds, hours, weeks, months? You used 4500 images from a 6 km2 area collected over two weeks (l.126-135), so this should be accounted for, at least in the discussion: is this a viable approach for forestry at scale, or perhaps only for smaller urban forests?
l.322-325: this belongs in the Methods section.
L.330-348: Methods, too.
L.320: although the chapter is titled "Results and Discussion", I see no discussion on l.321-440. The authors discuss only their own results. But what about the results of others? Some other models are mentioned on l.411-414, but what does this comparison mean?
Does this approach give us much faster or more precise results? What is the difference between your approach and that of, for example, Iordache et al. in Remote Sensing 2020? They also used multispectral images. So what is the difference between your approach and the approaches of other authors?
Fig. 15 reports "infected", "undetected late stage infected", "undetected early stage infected", "undetected middle stage infected" and "misidentified" trees, but these categories are not mentioned anywhere in the Methods section. Who performed the evaluation, and when? Fig. 4 and 5 mention only infected and healthy trees. Can you give the reflectance values of "middle stage infected" and "early stage infected" trees? "Late stage infected" means the disease is already visible in the field (on a picture, by the naked eye); what, then, are the other categories?
l.441: This study showed that it is possible to distinguish decaying trees from healthy ones. The authors showed that their computational workflow can process images at 0.064 s per photo (l.454). I am somewhat skeptical that this relates to PWD management: the field crew needs to cut down every decaying tree anyway, so the focus should be on trees that are already infected but not yet visible to the human eye.
The study used a very vague definition of PWD classification, mostly assessing the health status of trees via a remote sensing approach, which is viable and appealing of course, but I doubt its precision with respect to PWD specifically.
Author Response
Please see the attachment.
Author Response File: Author Response.docx
Round 2
Reviewer 1 Report
I agree to accept the manuscript.