Article
Peer-Review Record

Performance Study on Extractive Text Summarization Using BERT Models

Information 2022, 13(2), 67; https://doi.org/10.3390/info13020067
by Shehab Abdel-Salam * and Ahmed Rafea
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Submission received: 24 December 2021 / Revised: 21 January 2022 / Accepted: 26 January 2022 / Published: 28 January 2022
(This article belongs to the Special Issue Novel Methods and Applications in Natural Language Processing)

Round 1

Reviewer 1 Report

This paper studies the performance of deep learning models on text summarization through a series of experiments and proposes SqueezeBERTSumm, a trained summarization model based on SqueezeBERT that achieved competitive ROUGE scores. The paper is well written. The following comments could be considered to improve it further.
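For readers unfamiliar with the ROUGE scores the review refers to, a minimal pure-Python sketch of ROUGE-1 (unigram overlap) is given below. This is an illustrative reimplementation, not the authors' evaluation code; published results typically use a standard package such as `rouge-score`.

```python
from collections import Counter

def rouge_1(candidate: str, reference: str) -> dict:
    """ROUGE-1: unigram overlap between a candidate summary and a reference."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())          # matched unigram count
    precision = overlap / max(sum(cand.values()), 1)
    recall = overlap / max(sum(ref.values()), 1)
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return {"precision": precision, "recall": recall, "f1": f1}

scores = rouge_1("the cat sat on the mat", "the cat lay on the mat")
print(round(scores["f1"], 3))  # → 0.833
```

Here five of the six candidate unigrams match the reference, giving precision = recall = 5/6.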


  1. I recommend that the authors engage a language editor to improve the quality of the writing and to correct some grammatical errors. For example, the first letter of Section 1.1 on page 2 should be capitalized.
  2. The figures are not clear; see, for example, Figure 2. Vector graphics are recommended.
  3. The references should follow a consistent format. For example, the location and formatting of the publication year should be uniform.
  4. Formulas should be typeset, not inserted as images.
  5. Recently, gradient activation functions have been used to improve the performance of deep learning, e.g., "Activated gradients for deep neural networks" in IEEE TNNLS. The authors are encouraged to add some comments on this point in the revised manuscript.
  6. The experimental details are insufficient. It is necessary to report more of the hyperparameter values used in the experiments; without these details, readers cannot reproduce the results.
  7. The description of the proposed network structure is not clear enough; please strengthen it.

Author Response

Please see the attachment covering all points mentioned by the reviewer.

Author Response File: Author Response.pdf

Reviewer 2 Report

I found the paper to be overall well written and much of it to be well described. The manuscript addresses text summarization through a set of experiments with different modifications of the BERT model.

The study therefore addresses a topic of relevance and general interest to the readers of the journal. I explain my concerns in more detail below.

The title sounds quite general and does not fully represent the article's contents. Additional specificity should be added to distinguish the paper.

The originality/novelty of the work is somewhat questionable. Would you mind specifying precisely what the novelty is? In its current form, it sounds vague.

The Abstract does not summarize the contents clearly. Please elaborate on the proposed methodology and its benefits.

Time measurements are missing from all the tables (1-6), yet runtime is a crucial aspect to report.

The limitations of the introduced approach are not mentioned. Are there any?

Would you mind sharing a link to the code?

English should be polished once again; there are some grammar mistakes in the text.

Figures 1-3 must be enlarged.


Author Response

Please see the attachment covering all points mentioned by the reviewer.

Author Response File: Author Response.pdf

Round 2

Reviewer 1 Report

The authors have addressed the issues raised last time by the reviewer.
