Peer-Review Record

Keyword-Aware Transformers Network for Chinese Open-Domain Conversation Generation

Electronics 2023, 12(5), 1228; https://doi.org/10.3390/electronics12051228
by Yang Zhou 1, Chenjiao Zhi 1, Feng Xu 2, Weiwei Cui 3, Huaqiong Wang 4, Aihong Qin 4, Xiaodiao Chen 2,4, Yaqi Wang 4,* and Xingru Huang 3,*
Reviewer 1:
Reviewer 2: Anonymous
Reviewer 3:
Submission received: 7 February 2023 / Revised: 23 February 2023 / Accepted: 28 February 2023 / Published: 4 March 2023
(This article belongs to the Special Issue Natural Language Processing and Information Retrieval)

Round 1

Reviewer 1 Report

The authors proposed a Keyword-Aware Transformer Network (KAT) approach that can fuse keyword information into the context and jointly train the keyword extraction and conversation generation tasks. The paper has an interesting structure, but the English is very poor, and the content needs substantial improvement. Some comments follow:

1. The related work must be extended, and a table comparing the works is needed;

2. The task formulation must be removed as a separate section and its content merged into the approach;

3. The experiments are interesting, but a discussion section is needed.

Author Response

Dear reviewer:

Thank you for your comment. We appreciate your feedback and have addressed your concerns in detail in a response letter that we have uploaded as an attachment. Please let us know if you have any further questions or concerns, and we would be happy to address them. Thank you for your time and attention to our work.

Best regards,

Xingru Huang

Author Response File: Author Response.pdf

Reviewer 2 Report

This manuscript focuses on the KAT for Chinese open-domain conversation generation. There are many defects in it.
1. "Keyword-Aware Transformers Network" is used in the title, but "Keyword-Aware Transformer Network" is used in lines 20 and 57.
2. In line 25, "Coherent" should be replaced by "Coherence".
3. In line 126, is "u^i" an utterance or a word?
4. In lines 126-127, what is the relationship between "N" and "n"?
5. In line 127, what is the relationship between "M" and "m"?
6. In line 128, is "r^L" an utterance or a word?
7. In line 129, what is "U"? Is "U" an utterance or a word?
8. The definition of "C" in Eq. (1) is different from the definition of "C" in line 126.
9. In Eq. (2), what is "w"?
10. In Eqs. (1) and (2), what is the relationship between "u^" and "u^1" to "u^(N-1)"?
11. In Eq. (3), "R" has a subscript "t-1", but "r" has a superscript "j-1".
12. The definition of "R_(t-1)" in Eq. (3) is similar to the definition of "r" in lines 127-128. What is the relationship between "R_(t-1)" and "r"?
13. In Eq. (3), what is the definition of the function "f"?
14. The definition of "H" in Eq. (4) is the same as the definition of "C" in line 126. But "C" is in the function "f" in Eq. (4). What is wrong?
15. In lines 127 and 141, what is the relationship between "utterance" and "sentence"?
16. In Eq. (5), what do "Q", "K", and "V" mean?
17. In Eq. (5), what does the subscript "i" mean?
18. In the upper row of Eq. (5), the left side does not have "i", but the right side has "i" for every term. It is not reasonable.
19. In the lower row of Eq. (5), K is calculated from K. It is not reasonable.
20. In the lower row of Eq. (5), what does the operator "*" mean?
21. In line 153, the first "d_k" should be replaced by "d_q".
22. In line 153, what are "d_model", "d_q", "d_k", and "d_v"?
23. In line 154, what is "W^Z_i"?
24. In Eq. (6), K is calculated from K. It is not reasonable.
25. In Eq. (6), the definition of "K" is different from the definition of "K" in the lower row of Eq. (5). Why?
26. In Eq. (6), what does the matrix "W^i_KW" mean?
27. In Eq. (6), the left side does not have "i", but the right side has "i". It is not reasonable.
28. The function "SelfKeywordAttention" in Eq. (9) depends on "W^Q_i", "W^K_i", and "V", but the function "SelfKeywordAttention" in Eq. (7) does not.
29. In Eq. (8), what does "W_Z" mean?
30. In line 158, the set to which "W^Q_i" belongs differs from the one given in line 153. What is wrong?
31. In line 158, the set to which "W^Z_i" belongs differs from the one given in line 154. What is wrong?
32. The parameters "W" and "e" in Fig. 2 are different from the parameters "w" and "u" in Eqs. (1) and (2).
33. In Eq. (11), "~r^j" is different from "~r_j" in the next line.
34. In Eq. (12), what are "λ1" and "λ2"?
35. In Eq. (14), what are "r" and "c"?
36. The symbol "w_n" in Eq. (15) and the symbol "w_m" in Eq. (2) are different. Modification is necessary.
37. In line 243, both "λ1" and "λ2" are equal to 1. Why isn't Eq. (12) written as "L = L1 + L2"?
38. In line 300, "pre" and "we" should be replaced by "Pre" and "We", respectively.
39. In line 303, "pre" should be replaced by "Pre".
40. In line 335, "setting 2" should be replaced by "Setting 2".
41. In line 364, "keyword" should be replaced by "Keyword".
In conclusion, this manuscript is not written clearly. Major revision is necessary.
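[Editor's note: comments 16-22 above concern the scaled dot-product attention of Eq. (5). A minimal sketch of the standard formulation the reviewer appears to have in mind (hypothetical function name; this assumes the paper follows the original Transformer definition, which is not confirmed by the record):]

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard scaled dot-product attention.

    Q: queries (seq_len, d_q), K: keys (seq_len, d_k), V: values (seq_len, d_v),
    with d_q = d_k. The 1/sqrt(d_k) factor keeps dot products from growing
    with the key dimension. Each head i would first project the input with
    learned matrices W_i^Q, W_i^K, W_i^V before calling this function.
    """
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (seq_len, seq_len)
    # numerically stable softmax over the key axis
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V                            # (seq_len, d_v)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))                   # 4 tokens, d_model = 8
out = scaled_dot_product_attention(x, x, x)       # self-attention: Q = K = V = x
print(out.shape)                                  # (4, 8)
```

Under this formulation, Q, K, and V are the query, key, and value matrices, and d_q, d_k, d_v are their per-head dimensions, which answers comments 16 and 22 in the standard setting.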

Author Response

Dear reviewer:

Thank you for your comment. We appreciate your feedback and have addressed your concerns in detail in a response letter that we have uploaded as an attachment. Please let us know if you have any further questions or concerns, and we would be happy to address them. Thank you for your time and attention to our work.

Best regards,

Xingru Huang


Author Response File: Author Response.pdf

Reviewer 3 Report

Line 41. The authors state that "they still suffer from suboptimal performance". What does "suboptimal" mean here?

Line 76. The authors need to justify the statement "joint modeling approach for both keywords predicting and dialogue generation tasks".

How does the proposed approach differ from existing work in performance?

Section 3 is very short; I propose combining it with Section 4 (Approach).

The authors must represent variables in mathematical formulas, such as "t-1".

In Eq. 2, the authors need to fix the token variable: is it "w" or "u"?

Line 146. What are the dominant keyword attributes in each sentence? What is the difference between "dominant" and "relevant" in the proposed approach?

Some symbols in Eqs. 6-9 are not described. For example, in Eq. 6, what does "tanh" mean?

In Eq. 12, the authors need to define λ1 and λ2 and their possible values.

I suggest clarifying all steps of the proposed approach through example sentences. Please provide the inputs and outputs of each step.

No quantitative results are given in the Abstract.

From the quantitative results in Table 2, it is observed that the proposed method is evaluated twice; the authors need to distinguish between the two entries.

The authors need to move the case study first and reorganize the tables.
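[Editor's note: both this reviewer and Reviewer 2 (comments 34 and 37) ask about the λ1/λ2 weighting in Eq. (12). A minimal sketch of such a weighted joint objective (hypothetical function name; this assumes the two terms are scalar losses for generation and keyword prediction, as the review discussion suggests):]

```python
def joint_loss(loss_gen, loss_kw, lam1=1.0, lam2=1.0):
    """Weighted joint objective L = lam1 * L1 + lam2 * L2.

    With lam1 = lam2 = 1, as the authors report in line 243 of the
    manuscript, this reduces to the plain sum L = L1 + L2.
    """
    return lam1 * loss_gen + lam2 * loss_kw

# With the default weights the two losses are simply summed.
print(joint_loss(2.0, 1.0))  # 3.0
```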

Author Response

Dear reviewer:

Thank you for your comment. We appreciate your feedback and have addressed your concerns in detail in a response letter that we have uploaded as an attachment. Please let us know if you have any further questions or concerns, and we would be happy to address them. Thank you for your time and attention to our work.

Best regards,

Xingru Huang

Author Response File: Author Response.pdf

Round 2

Reviewer 1 Report

The paper was accurately revised, and it can be accepted.

Reviewer 2 Report

This revised manuscript is ready to be published.

Reviewer 3 Report

The authors have addressed all my comments and questions in this paper.

My decision is accept.
