Correction

Correction: Brochet et al. A Quantitative Comparison between Shannon and Tsallis–Havrda–Charvat Entropies Applied to Cancer Outcome Prediction. Entropy 2022, 24, 436

1 LITIS, Quantif, University of Rouen, 76000 Rouen, France
2 Centre Henri Becquerel, 76038 Rouen, France
3 Société Aquilab, 59120 Lille, France
4 Département de Radiothérapie, Centre Oscar Lambret, 59000 Lille, France
5 Département de Radiothérapie, CLCC Francois Baclesse, 14000 Caen, France
6 Département de Radiothérapie, Centre Léon Berard, 69008 Lyon, France
* Author to whom correspondence should be addressed.
Entropy 2022, 24(5), 685; https://doi.org/10.3390/e24050685
Submission received: 28 April 2022 / Accepted: 29 April 2022 / Published: 13 May 2022
(This article belongs to the Special Issue Application of Entropy to Computer Vision and Medical Imaging)

Addition of Authors

Alexandre Huat, Sébastien Thureau, David Pasquier, Isabelle Gardin, Romain Modzelewski, David Gibon, Juliette Thariat and Vincent Grégoire were not included as authors in the original publication [1]. The corrected Author Contributions statement appears below. The authors apologize for any inconvenience caused and state that the scientific conclusions are unaffected. This correction was approved by the Academic Editor. The original publication has also been updated.

Author Contributions

Conceptualization, S.R. and J.L.-L.; methodology, S.R., T.B. and J.L.-L.; software, T.B. and J.L.-L.; validation, S.R.; formal analysis, T.B.; writing—original draft preparation, T.B.; writing—review and editing, J.L.-L. and S.R.; supervision, S.R.; project administration, S.R.; data curation, A.H., S.T., D.P., I.G., R.M., D.G., J.T., V.G. and P.V.; resources, S.T., D.P., I.G., R.M., J.T., V.G. and P.V.; medical expertise, P.V.; investigation, T.B., J.L.-L., A.H., S.T., D.P., I.G., R.M., D.G., J.T., V.G., P.V. and S.R. All authors have read and agreed to the published version of the manuscript.

Reference

1. Brochet, T.; Lapuyade-Lahorgue, J.; Vera, P.; Ruan, S. A Quantitative Comparison between Shannon and Tsallis–Havrda–Charvat Entropies Applied to Cancer Outcome Prediction. Entropy 2022, 24, 436. https://doi.org/10.3390/e24030436
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
