Viruses | Comment | Open Access

18 April 2022

Comment on Wang et al. Development of a Novel Double Antibody Sandwich ELISA for Quantitative Detection of Porcine Deltacoronavirus Antigen. Viruses 2021, 13, 2403

1 College of Computer and Control Engineering, Qiqihar University, Qiqihar 161006, China
2 Heilongjiang Provincial Key Laboratory of Resistance Gene Engineering and Protection of Biodiversity in Cold Areas, College of Life Science and Agriculture and Forestry, Qiqihar University, Qiqihar 161006, China
* Author to whom correspondence should be addressed.
This article belongs to the Special Issue State-of-the-Art Porcine Virus Research in China
We read with interest the article by Wang et al. [1], published in Viruses on 30 November 2021. The authors developed a double-antibody sandwich enzyme-linked immunosorbent assay (DAS-ELISA) for the detection of porcine deltacoronavirus (PDCoV), using a monoclonal antibody against the PDCoV N protein and an anti-PDCoV rabbit polyclonal antibody. They used kappa analysis to assess the consistency between the DAS-ELISA and real-time reverse transcription PCR (RT-qPCR) and obtained a kappa value of 0.827, indicating almost perfect agreement between the two methods.
Although the article provides valuable information, several substantive points that may lead to misinterpretation of the results need to be clarified. Kappa analysis comprises Cohen's and Fleiss' kappa. Generally, Cohen's and Fleiss' kappa are used to analyze intra- and inter-rater agreement, respectively [2]: Cohen's kappa is suitable for evaluating two raters, whereas Fleiss' kappa is suitable for more than two raters. In Cohen's kappa analysis, the weighted kappa should be used to calculate agreement when there are more than two categories [3]. For the comparison described in this article, Cohen's kappa is therefore the applicable statistic, and it is calculated as follows:
$$\kappa = \frac{\sum_{i=1}^{n}\left(p_{ii} - p_i q_i\right)}{1 - \sum_{i=1}^{n} p_i q_i}$$
where $p_{ii}$ is the observed proportion of agreement for category $i$, and $p_i$ and $q_i$ are the marginal sample frequencies of category $i$ for the two methods.
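As an illustration of the calculation, the following is a minimal sketch in Python (not the authors' code) that applies the formula above to a square contingency table; the 2 × 2 counts shown are hypothetical placeholders, not data from either study.

```python
# A minimal sketch, not the authors' code: Cohen's kappa for a square
# contingency table, applying the formula above. The 2 x 2 counts below are
# hypothetical placeholders, not data from either study.

def cohens_kappa(table):
    """Cohen's kappa from a square contingency table given as a list of rows."""
    total = sum(sum(row) for row in table)
    n = len(table)
    # Observed agreement: sum of the diagonal proportions p_ii
    p_o = sum(table[i][i] for i in range(n)) / total
    # Chance agreement: sum over categories of p_i * q_i (row and column marginals)
    p_e = sum(
        (sum(table[i]) / total) * (sum(row[i] for row in table) / total)
        for i in range(n)
    )
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: rows = RT-qPCR (+, -), columns = DAS-ELISA (+, -)
example = [[40, 5],
           [3, 52]]
print(round(cohens_kappa(example), 3))  # agreement beyond chance for this toy table
```

For a positive/negative comparison such as RT-qPCR versus DAS-ELISA, the contingency table is 2 × 2 and the unweighted form of Cohen's kappa shown above applies.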
The authors compared the two detection methods using two types of clinical samples, namely 205 fecal and 59 intestinal samples. Unlike the authors, we calculated the agreement between RT-qPCR and DAS-ELISA separately for the two sample types using the SPSS 18 statistical package (SPSS Inc., Chicago, IL, USA). The kappa values for the fecal and intestinal samples were 0.807 and 0.645, respectively. When the data from the two sample types were simply pooled, the kappa value was 0.781 (Table 1). All three values differ appreciably from the authors' reported kappa of 0.827. We would be grateful if the authors could explain their calculation in detail and clarify this discrepancy.
Table 1. Kappa values for the agreement between RT-qPCR and DAS-ELISA.
Given the applicability conditions of Cohen's kappa analysis, we suggest that kappa values be calculated separately when two or more types of samples are present, as sketched below. In our opinion, any conclusion about agreement must be supported by sound methodological and statistical practice. We emphasize the importance of rigor and of using the correct statistical approach in any scientific publication; otherwise, misinterpretation cannot be avoided.
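To make this suggestion concrete, the following sketch (assuming scikit-learn is available; it is not the authors' SPSS workflow) computes Cohen's kappa separately for two sample types and for the pooled data. The label vectors are hypothetical placeholders, not the fecal and intestinal results from either study.

```python
# A minimal sketch, not the authors' SPSS workflow, assuming scikit-learn is
# installed. It computes Cohen's kappa separately for two sample types and for
# the pooled data. The label vectors are hypothetical placeholders, not the
# fecal/intestinal results from either study (1 = positive, 0 = negative).
from sklearn.metrics import cohen_kappa_score

fecal_pcr,  fecal_elisa  = [1, 1, 1, 0, 0, 0, 1, 0], [1, 1, 0, 0, 0, 0, 1, 0]
intest_pcr, intest_elisa = [1, 1, 0, 0, 1, 0],       [1, 0, 0, 0, 1, 1]

print("fecal     :", round(cohen_kappa_score(fecal_pcr, fecal_elisa), 3))
print("intestinal:", round(cohen_kappa_score(intest_pcr, intest_elisa), 3))
# Pooling the sample types changes the marginal frequencies, so the pooled
# kappa is generally a third, different value.
print("pooled    :", round(cohen_kappa_score(fecal_pcr + intest_pcr,
                                             fecal_elisa + intest_elisa), 3))
```

Because pooling changes the marginal frequencies, the pooled kappa cannot be assumed to represent either sample type on its own, which is why we report the fecal, intestinal, and pooled values separately.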

Author Contributions

M.L. wrote the manuscript. C.Z. helped to draft the manuscript. C.Z. and T.Y. conducted data analysis. T.Y. contributed essential ideas and revised the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Wang, W.; Li, J.; Fan, B.; Zhang, X.; Guo, R.; Zhao, Y.; Zhou, J.; Zhou, J.; Sun, D.; Li, B. Development of a Novel Double Antibody Sandwich ELISA for Quantitative Detection of Porcine Deltacoronavirus Antigen. Viruses 2021, 13, 2403.
  2. Meseguer-Henarejos, A.B.; Sánchez-Meca, J.; López-Pina, J.A.; Carles-Hernández, R. Inter- and intra-rater reliability of the Modified Ashworth Scale: A systematic review and meta-analysis. Eur. J. Phys. Rehabil. Med. 2018, 54, 576–590.
  3. Tran, Q.D.; Demirhan, H.; Dolgun, A. Bayesian approaches to the weighted kappa-like inter-rater agreement measures. Stat. Methods Med. Res. 2021, 30, 2329–2351.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
