Peer-Review Record

An Attention-Based Method for Remaining Useful Life Prediction of Rotating Machinery

Appl. Sci. 2023, 13(4), 2622; https://doi.org/10.3390/app13042622
by Yaohua Deng, Chengwang Guo, Zilin Zhang *, Linfeng Zou, Xiali Liu * and Shengyu Lin
Reviewer 1: Anonymous
Reviewer 2:
Reviewer 3: Anonymous
Submission received: 1 February 2023 / Revised: 16 February 2023 / Accepted: 16 February 2023 / Published: 17 February 2023

Round 1

Reviewer 1 Report

By considering the contribution of input data to the modeling output, this paper proposes a deep learning model that incorporates the attention mechanism into feature selection and extraction. An unsupervised clustering method is proposed for classifying the performance state evolution of rotating machinery; a similarity function is used to calculate the expected attention of the input data, from which an input-data extraction attention module is built. The module is then fused with a gated recurrent unit (GRU), a variant of the recurrent neural network, to construct an attention-GRU model that combines prediction calculation and weight calculation for RUL prediction.

The RUL prediction models for rotating machinery discussed in this paper are mainly constructed from a large amount of sample data.

Author Response

We thank Reviewer 1 for the summary and evaluation of this paper.

Reviewer 2 Report

This paper studies the life prediction of rotating components and discusses in detail the problems of feature extraction and data imbalance. The paper is scientifically rigorous, clearly reasoned, and has reference value.

- There are some grammatical mistakes; the authors should address them.

- The punctuation marks in the paper need to be checked.

- The legend in the figure is too small.

- The author needs to clearly explain the novelty and motivation of the proposed work.

- The role of the GRU network in the article and experiments needs further explanation.

- The conclusion should state the scope for future work.

Author Response

1. There are some grammatical mistakes; the authors should address them.

Response: We have asked a native English speaker in our field of research to polish our paper, and we hope the revision meets the standard for publication.

2. The punctuation marks in the paper need to be checked.

Response: Thanks for your kind advice. We have checked the punctuation marks throughout the article and corrected formatting inconsistencies.

3. The legend in the figure is too small.

Response: Thanks for the reminder. We have enlarged the legend in the article.

4. The author needs to clearly explain the novelty and motivation of the proposed work.

Response: Thank you for your professional advice. The novelty of our work is that it provides a solution for selective feature extraction with the attention mechanism. To clarify the novelty and motivation of our work, we have revised the Introduction. The original text in the last paragraph of the Introduction,

“Therefore, to achieve selectivity of feature extraction, we consider the contribution of input data to the modeling output and hence propose an attention-based RUL prediction method for rotating machinery. Specifically, the performance evolution of rotating machinery is classified into different states, and the attention mechanism is introduced to assign varied weights to the GRU inputs so that the model can extract more important information for RUL prediction and thereby achieve higher accuracy and stability in prediction.”

has been modified into the following:

“To address the problem that existing models overlook useful local target features and cannot extract features in a selective manner, we consider the contribution of input data to the modeling output and propose an attention-based RUL prediction method for rotating machinery. Specifically, the performance evolution of rotating machinery is classified into different states, and the attention mechanism is introduced to assign varied weights to the GRU inputs so that the model can extract more important information for RUL prediction, achieve accurate extraction of both global and local features of the target object, and thereby reach higher accuracy and stability in prediction.”

5. The role of the GRU network in the article and experiments needs further explanation.

Response: We have added explanations of the GRU network in Section 2.2.2. The original text in the first paragraph of Section 2.2.2,

“The RUL prediction attention-GRU model consists of two modules: a GRU and an attention module, where the GRU works for prediction, and the attention module calculates similarity weights (Figure 5).”

has been modified into the following:

“The RUL prediction attention-GRU model consists of two modules: a GRU and an attention module. The GRU module learns and analyzes the sequence trends and intrinsic relationships of the data, and it can independently determine whether to retain or discard a feature; it is mainly used for predictive calculations. The attention module computes similarity weights (Figure 5). The attention mechanism assigns larger weights to the more relevant parts than to the less relevant ones, so as to extract more useful information. Therefore, adding an attention layer to the GRU model can increase the contribution of important features to the prediction.”
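As a rough illustration of this weighting-then-prediction flow (this is not the authors' implementation; the scaled-dot-product similarity, the dimensions, and all parameter names below are assumptions), the attention-weighted GRU could be sketched in NumPy as follows:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_weights(inputs, query):
    """Similarity-based attention: scaled dot product of each input with a query."""
    scores = inputs @ query / np.sqrt(query.size)
    return softmax(scores)                      # weights are positive and sum to 1

def gru_step(x, h, W, U):
    """One standard GRU cell update; W and U hold the gate parameter matrices."""
    z = sigmoid(W["z"] @ x + U["z"] @ h)        # update gate
    r = sigmoid(W["r"] @ x + U["r"] @ h)        # reset gate
    h_cand = np.tanh(W["h"] @ x + U["h"] @ (r * h))
    return (1 - z) * h + z * h_cand

def attention_gru(inputs, query, W, U, hidden_dim):
    """Scale each input by its attention weight, then run the GRU over the sequence."""
    alpha = attention_weights(inputs, query)
    h = np.zeros(hidden_dim)
    for a, x in zip(alpha, inputs):
        h = gru_step(a * x, h, W, U)            # attention scales the GRU input
    return h, alpha

# toy run with random parameters (input dim 4, hidden dim 3, sequence length 5)
rng = np.random.default_rng(0)
d, hdim, T = 4, 3, 5
W = {k: rng.standard_normal((hdim, d)) * 0.1 for k in "zrh"}
U = {k: rng.standard_normal((hdim, hdim)) * 0.1 for k in "zrh"}
inputs = rng.standard_normal((T, d))
h, alpha = attention_gru(inputs, inputs.mean(axis=0), W, U, hdim)
```

Inputs that are more similar to the query receive larger weights, so they contribute more to the hidden state from which the RUL would ultimately be predicted.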

 

6. The conclusion should state the scope for future work.

Response: Thanks for your advice. We have provided the scope of future work in the Conclusions section of our revised manuscript. The original text in the last paragraph of the Conclusions section,

“The RUL prediction models for rotating machinery discussed in this paper are mainly constructed based on a large size of sample data. However, in reality, RUL prediction of equipment, especially under complex and severe working conditions, is principally a modeling problem with few available samples. Therefore, future work will focus on the RUL prediction of rotating machinery under the condition of few sample data. Given the complexity of such problems, the attention-GRU RUL prediction model proposed in this paper will face challenges in accuracy and efficiency. For this reason, the dual-channel attention mechanism will be the research focus for feature extraction in RUL prediction under the condition of insufficient sample data.”

has been modified into the following:

“The RUL prediction models for rotating machinery discussed in this paper are mainly constructed from a large amount of sample data. In real-world scenarios, however, RUL prediction of equipment, especially under complex and unfavorable working conditions, is principally a modeling problem with few available samples. Therefore, future work will focus on the RUL prediction of rotating machinery with limited availability of sample data. Given the complexity of such problems, the attention-GRU RUL prediction model proposed in this paper will face challenges in accuracy and efficiency. For this reason, models focusing on feature mining, such as the dual-channel attention mechanism, will be the research focus for feature extraction in RUL prediction when sample data are insufficient.”

Author Response File: Author Response.docx

Reviewer 3 Report

Please check attachment 

Comments for author File: Comments.pdf

Author Response

1. Check the punctuation and white spaces between characters.

Response: Thanks for your kind suggestion. We asked a native English speaker in our research team to correct the errors and polish the paper, and we hope the revision meets the standard for publication.

2. Figure 1 is not clear; please redraw it using effective tools. In Figures 8 and 10, the axis text and legend text are too small.

Response: Thank you. We have modified the figures you mentioned in our revised manuscript.

3. Discuss the future plans with respect to the research state of progress and its limitations.

Response: Thank you for your advice. We have described the limitations of the study and our plans for future research, as detailed in the last paragraph of the revised version.

4. The three data evolution stages in Figure 2 need to be explained in the text.

Response: In our revised manuscript, the three stages in Figure 2 are explained in detail in the first paragraph of Section 2.1:

“Among them, the ‘health state’ refers to the state of trouble-free operation of the rotating component, the ‘degradation-begun state’ refers to the operating state at the beginning of the early failure, and the ‘degradation-intensified state’ refers to the operating state after the failure progresses further.”

5. The function of the attention mechanism in the article is unclear. Please expand the description.

Response: We have added explanations of the attention mechanism in Section 2.2.2. The original text in the first paragraph of Section 2.2.2,

“The RUL prediction attention-GRU model consists of two modules: a GRU and an attention module, where the GRU works for prediction, and the attention module calculates similarity weights (Figure 5).”

has been modified into the following:

“The RUL prediction attention-GRU model consists of two modules: a GRU and an attention module. The GRU module learns and analyzes the sequence trends and intrinsic relationships of the data, and it can independently determine whether to retain or discard a feature; it is mainly used for predictive calculations. The attention module computes similarity weights (Figure 5). The attention mechanism assigns larger weights to the more relevant parts than to the less relevant ones, so as to extract more useful information. Therefore, adding an attention layer to the GRU model can increase the contribution of important features to the prediction.”

 

6. The contribution of the article needs to be made clearer in the previous paragraph.

Response: The contribution of the paper is now described in detail in the last paragraph of the Introduction. The original text in the last paragraph of the Introduction,

“Therefore, to achieve selectivity of feature extraction, we consider the contribution of input data to the modeling output and hence propose an attention-based RUL prediction method for rotating machinery. Specifically, the performance evolution of rotating machinery is classified into different states, and the attention mechanism is introduced to assign varied weights to the GRU inputs so that the model can extract more important information for RUL prediction and thereby achieve higher accuracy and stability in prediction.”

has been modified into the following:

“To address the problem that existing models overlook useful local target features and cannot extract features in a selective manner, we consider the contribution of input data to the modeling output and propose an attention-based RUL prediction method for rotating machinery. Specifically, the performance evolution of rotating machinery is classified into different states, and the attention mechanism is introduced to assign varied weights to the GRU inputs so that the model can extract more important information for RUL prediction, achieve accurate extraction of both global and local features of the target object, and thereby reach higher accuracy and stability in prediction.”

Author Response File: Author Response.docx
