by Aggrey Keya, Pauline Gitonga, Daniel Wanjohi, et al.

Reviewer 1: Anonymous
Reviewer 2: Anonymous

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

After carefully reviewing the manuscript "Brucellosis; Point-of-care Diagnostics; Febrile Brucella Agglutination Test; Diagnostic Validation," it can be concluded that the material is fully consistent with the journal's scope and will be highly useful to scientists involved in laboratory diagnostics, including the testing and validation of analytical systems. We also note a significant strength of this study—the clear and concise practical recommendations in the conclusion.

I believe the article can be accepted after minor revisions to the data processing and statistical analysis section:

1) Authors should justify the choice of statistical analysis methods used and provide references to current literature describing recommendations for the use of these methods in the validation and comparative analysis of diagnostic test systems.

2) More recent publications on the use of McNemar's test should be included and referenced in the text (lines 132–136). Because standards may have changed since the method was first described in 1947, it is important to demonstrate that this statistical tool remains relevant in up-to-date research.

Author Response

Comment 1: After carefully reviewing the manuscript "Brucellosis; Point-of-care Diagnostics; Febrile Brucella Agglutination Test; Diagnostic Validation," it can be concluded that the material is fully consistent with the journal's scope and will be highly useful to scientists involved in laboratory diagnostics, including the testing and validation of analytical systems. We also note a significant strength of this study—the clear and concise practical recommendations in the conclusion.

Response 1.  Thank you for your kind appraisal.

Comment 2. Authors should justify the choice of statistical analysis methods used and provide references to current literature describing recommendations for the use of these methods in the validation and comparative analysis of diagnostic test systems.

Response 2. Excellent comment - thank you.  We revised the section on statistical analyses rather extensively and added several new references.  See lines 132-149.

Comment 3. More recent publications on the use of McNemar's test should be included and referenced in the text (lines 132–136). Because standards may have changed since the method was first described in 1947, it is important to demonstrate that this statistical tool remains relevant in up-to-date research.

Response 3. Point well received. We added additional text and citation [20] supporting the current use of the 75-year-old McNemar's test; see lines 134-136.
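For readers following the statistical discussion, McNemar's test reduces to a simple computation on the discordant cells of the paired 2×2 table comparing the two tests. A minimal sketch in Python, using only the standard library; the discordant counts below are hypothetical and are not the study's data:

```python
import math

def mcnemar(b: int, c: int) -> tuple[float, float]:
    """McNemar's chi-square statistic (without continuity correction)
    for paired binary results; b and c are the two discordant cell
    counts (positive on one test, negative on the other).
    Returns (statistic, two-sided p-value from chi-square with 1 df)."""
    stat = (b - c) ** 2 / (b + c)
    # Survival function of the chi-square distribution with 1 df:
    # P(X >= stat) = erfc(sqrt(stat / 2))
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Hypothetical discordant counts: 12 samples positive only on the FBAT,
# 4 positive only on the reference ELISA.
stat, p = mcnemar(12, 4)
print(f"chi-square = {stat:.2f}, p = {p:.4f}")  # chi-square = 4.00, p = 0.0455
```

The concordant cells (both tests agree) do not enter the statistic, which is why McNemar's test, rather than an ordinary chi-square test of independence, is the standard choice for paired diagnostic comparisons.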

Reviewer 2 Report

Comments and Suggestions for Authors

On page 2, “the sensitivity and specificity of this test system was not provided by the manufacturer”, the authors should discuss how this might affect the results/conclusion.


The manuscript should clarify the selection criteria for the 200 archived serum samples.


The authors state in the abstract, “… none of the FBAT test kits proved to have acceptable sensitivity and specificity …”; consider clarifying what is considered “acceptable”.


Consider adding the manufacturer’s claims (as listed in Table 1) to Table 3 to highlight the manuscript’s message. Also, the ELISA is probably not 100% sensitive/specific; might some FBAT “false positives” or “false negatives” actually be due to ELISA misclassification rather than to the FBAT?


The title and discussion refer to the need for “antigen-based alternatives,” yet the suggested alternatives include PCR and LAMP. Consider rephrasing this terminology or clarifying.


Minor comments

Data Availability Statement appears to be a placeholder


The acronym RBT was used without definition

Author Response

Comment 1. On page 2, “the sensitivity and specificity of this test system was not provided by the manufacturer”, the authors should discuss how this might affect the results/conclusion.

Response 1. Excellent point. We expanded on that as a limitation of our study on lines 220-221.

Comment 2. The manuscript should clarify the selection criteria for the 200 archived serum samples.

Response 2. Agreed; those 200 samples were the entirety of the archived collection available; the text was modified to state this on line 74.

Comment 3. The authors state in abstract, “… none of the FBAT test kits proved to have acceptable sensitivity and specificity …”, consider clarifying what’s considered “acceptable”.

Response 3. Very good comment. We added text and a citation [25] indicating what is considered acceptable sensitivity and specificity on line 204.
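As background to this exchange, sensitivity and specificity follow directly from the validation 2×2 table against the reference ELISA, and are usually reported with confidence intervals. A minimal sketch using Wilson score intervals; the counts below are hypothetical and are not the study's data:

```python
import math

def wilson_ci(k: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

def se_sp(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity TP/(TP+FN) and specificity TN/(TN+FP), each with a Wilson CI."""
    se, sp = tp / (tp + fn), tn / (tn + fp)
    return (se, wilson_ci(tp, tp + fn)), (sp, wilson_ci(tn, tn + fp))

# Hypothetical 2x2 counts against the reference ELISA (not the study's data)
(se, se_ci), (sp, sp_ci) = se_sp(tp=80, fn=20, tn=90, fp=10)
print(f"Se = {se:.2f} ({se_ci[0]:.2f}-{se_ci[1]:.2f})")
print(f"Sp = {sp:.2f} ({sp_ci[0]:.2f}-{sp_ci[1]:.2f})")
```

The Wilson interval is chosen here because it behaves better than the naive Wald interval when a proportion is near 0 or 1, as is common for diagnostic specificity.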

Comment 4. Consider adding the manufacturer’s claims (as listed in Table 1) to Table 3 to highlight the manuscript’s message. Also, the ELISA is probably not 100% sensitive/specific; might some FBAT “false positives” or “false negatives” actually be due to ELISA misclassification rather than to the FBAT?

Response 4. This is a good suggestion, but on consideration we decided it would be redundant to present the same information in both Tables 1 and 3. We opted instead to add a footnote to Table 3 indicating that the claimed sensitivity and specificity for each kit are described in Table 1.

Comment 5. The title and discussion refer to the need for “antigen-based alternatives,” yet the suggested alternatives include PCR and LAMP. Consider rephrasing this terminology or clarifying.

Response 5.  Outstanding suggestion. We modified the title and used the term "molecular" on line 234 to clarify this issue.

Comment 6. Data Availability Statement appears to be a placeholder

Response 6. Thank you for catching this! We replaced the MDPI placeholder with appropriate text (lines 284-286).

Comment 7. The acronym RBT was used without definition

Response 7. Another good catch - thank you. We eliminated the acronym RBT and replaced it with "Rose-Bengal test".