Article
Peer-Review Record

A Fast Self-Learning Subspace Reconstruction Method for Non-Uniformly Sampled Nuclear Magnetic Resonance Spectroscopy

Appl. Sci. 2020, 10(11), 3939; https://doi.org/10.3390/app10113939
by Zhangren Tu 1, Huiting Liu 2, Jiaying Zhan 1 and Di Guo 1,*
Reviewer 1:
Reviewer 2: Anonymous
Submission received: 4 May 2020 / Revised: 29 May 2020 / Accepted: 2 June 2020 / Published: 5 June 2020
(This article belongs to the Special Issue Signal Processing and Machine Learning for Biomedical Data)

Round 1

Reviewer 1 Report

This paper proposes a fast self-learning method for subspace reconstruction of non-uniformly sampled nuclear magnetic resonance spectroscopy. Instead of using iterative singular value decomposition methods, the authors achieve fast, high-quality reconstruction with a self-learning subspace method. The results suggest that the proposed method yields high-fidelity reconstructed spectra while requiring less than 10% of the time of state-of-the-art methods. This paper is interesting and the proposed method seems mathematically solid. However, there are some concerns that need to be addressed to enhance the quality of the paper.
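For orientation, the iterative SVD schemes that the paper aims to replace typically follow a singular value thresholding (SVT) pattern. The sketch below is a minimal, generic illustration of that baseline idea (it is not the authors' implementation, nor necessarily their exact comparison method):

```python
import numpy as np

def svt_complete(Y, mask, tau=0.1, n_iter=200):
    """Generic iterative singular value thresholding (SVT) for
    low-rank matrix completion -- the kind of repeated-SVD scheme
    whose per-iteration SVD cost motivates SVD-free alternatives.
    Y: observed matrix with zeros at unsampled entries.
    mask: boolean array marking the sampled entries."""
    X = Y.copy()
    for _ in range(n_iter):
        # A full SVD at every iteration is the dominant cost.
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        s = np.maximum(s - tau, 0.0)   # soft-threshold singular values
        X = (U * s) @ Vt               # low-rank re-synthesis
        X[mask] = Y[mask]              # enforce consistency with samples
    return X
```

The SVD inside the loop is exactly the bottleneck that makes such schemes slow on large spectra, which is the motivation for an SVD-free subspace approach.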

(1) The definition of self-learning needs to be clarified. People tend to be more familiar with supervised learning, unsupervised learning, semi-supervised learning, etc. How does self-learning relate to these learning paradigms? The authors should add a section specifically defining self-learning and discussing this relationship. Moreover, why is self-learning, rather than another learning method, used for NMR spectrum subspace reconstruction?

(2) Some figures are not clearly presented. For example, what are the blue/white circles in Fig. 1, and what is "Core j"? More legends/captions should be added to facilitate the understanding of this figure.

(3) In Fig. 2, although the authors claim that there is a significant difference between the performance of their proposed method (Fig. 2(f)) and that of the other methods, I do not see much difference, at least visually, especially between Fig. 2(e) and Fig. 2(f). Even when comparing Fig. 2(c), (d) and (f), the difference seems very small. Could you please clarify and explain why?

(4) For Fig. 11, the authors claimed that their proposed method ran faster than the other methods. However, it seems that the method called LRHMF outperformed the proposed method in terms of speed, and the parallel cases appeared even more disadvantageous for the proposed method when compared with the other methods. Could you please explain these discrepancies? The authors also need to clarify this explicitly in the manuscript.

(5) For the method part, going from Eq. 6 to Eq. 7, I do not quite understand how the dual variable D was introduced. Also, why was the constraint in Eq. 6 converted into two terms in Eq. 7? The authors should elaborate on these points in detail.
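For context, the standard construction behind such a step is the augmented Lagrangian: a hard constraint is relaxed into a multiplier term plus a quadratic penalty. A generic sketch, with illustrative symbols $f$, $A$, $b$, $\mu$ that are not necessarily the paper's:

```latex
\min_{x} f(x) \quad \text{s.t.} \quad Ax = b
\;\;\Longrightarrow\;\;
\mathcal{L}_{\mu}(x, D) \;=\; f(x) \;+\; \langle D,\, Ax - b \rangle \;+\; \frac{\mu}{2}\,\lVert Ax - b \rVert_2^2
```

Here the dual variable $D$ enters through the multiplier term, and the former constraint reappears as two terms (the inner product and the penalty). If this is indeed the construction used between Eq. 6 and Eq. 7, it should be stated explicitly in the manuscript.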

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 2 Report

The paper presents a novel approach to the reconstruction of NMR spectroscopy data. However, I have a few suggestions that should be explained in more detail in the final version of the manuscript:

1) I understand that the benefit of the presented SLS method, in contrast to LRHM, lies in better reconstruction of low-intensity peaks. Nevertheless, the paper should describe how important those low-intensity peaks are in the spectra. In some cases they may originate from noise, in which case failing to reconstruct them is not a bad result.

2) Why are all results presented in the form of correlations? I suggest also putting absolute values in the appendix and commenting on the given results and differences (especially for the synthetic data).

3) Did the authors try running the methods on an ordinary personal computer instead of a computing server? What were the results in that situation?

4) In general, according to the presented results, the computation time is not better for the proposed method. I found one clear benefit: better reconstruction of low-intensity peaks. The conclusions should describe more fully what the advantages of the SLS method are.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Round 2

Reviewer 1 Report

The authors have largely addressed most of the concerns raised in the last round of review. However, some concerns have not been fully addressed.

For Q1, please add the responses to the revised manuscript, especially about the relationship between self-learning and unsupervised-learning. To better clarify, I would suggest adding more sentences discussing the differences between unsupervised learning, supervised learning and semi-supervised learning or transductive learning [1] as well as your self-learning.

[1] Wan, S., Mak, M. W., & Kung, S. Y. (2016). Transductive learning for multi-label protein subchloroplast localization prediction. IEEE/ACM Transactions on Computational Biology and Bioinformatics, 14(1), 212-224.

Again for Q3, please add the responses (especially the discussions about the results) to the revised manuscript.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 2 Report

Thank you for addressing the remarks I raised. I have just one additional concern: in the newly added Table 5, the emoticons and smiley faces do not look serious enough for a scientific paper. They could be replaced by a scale introduced by the authors ("good" or "bad" also do not look sufficiently scientific and professional).

Author Response

Please see the attachment.

Author Response File: Author Response.pdf
