Correction published on 13 May 2022, see Entropy 2022, 24(5), 685.
Article

A Quantitative Comparison between Shannon and Tsallis–Havrda–Charvat Entropies Applied to Cancer Outcome Prediction

1 LITIS, Quantif, University of Rouen, 76000 Rouen, France
2 Centre Henri Becquerel, 76038 Rouen, France
3 Société Aquilab, 59120 Lille, France
4 Département de Radiothérapie, Centre Oscar Lambret, 59000 Lille, France
5 Département de Radiothérapie, CLCC Francois Baclesse, 14000 Caen, France
6 Département de Radiothérapie, Centre Léon Berard, 69008 Lyon, France
* Author to whom correspondence should be addressed.
Academic Editor: Anne Humeau-Heurtier
Entropy 2022, 24(4), 436; https://doi.org/10.3390/e24040436
Received: 7 February 2022 / Revised: 18 March 2022 / Accepted: 18 March 2022 / Published: 22 March 2022 / Corrected: 13 May 2022
(This article belongs to the Special Issue Application of Entropy to Computer Vision and Medical Imaging)
In this paper, we quantitatively compare loss functions based on the parameterized Tsallis–Havrda–Charvat entropy and the classical Shannon entropy for training a deep network on the small datasets typically encountered in medical applications. Shannon cross-entropy is widely used as a loss function for most neural networks applied to the segmentation, classification and detection of images. Tsallis–Havrda–Charvat cross-entropy is a parameterized cross-entropy with parameter α, of which Shannon entropy is the particular case α=1. In this work, we compare these two entropies through a medical application: predicting recurrence in patients with head–neck and lung cancers after treatment. Based on both CT images and patient information, a multitask deep neural network is proposed that performs a recurrence prediction task, using cross-entropy as a loss function, together with an image reconstruction task. The influence of the parameter α on the final prediction results is studied. The experiments are conducted on two datasets comprising 580 patients in total, of whom 434 suffered from head–neck cancers and 146 from lung cancers. The results show that Tsallis–Havrda–Charvat entropy can achieve better prediction accuracy for some values of α.
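The relationship between the two losses described in the abstract can be sketched numerically. The snippet below uses one common parameterization of the Tsallis–Havrda–Charvat cross-entropy, H_α(p, q) = Σᵢ pᵢ(1 − qᵢ^(α−1))/(α − 1), which recovers the Shannon cross-entropy −Σᵢ pᵢ ln qᵢ in the limit α → 1; the function name and exact form are illustrative assumptions, not necessarily the paper's implementation.

```python
import numpy as np

def thc_cross_entropy(p, q, alpha, eps=1e-12):
    """Tsallis-Havrda-Charvat cross-entropy between a target distribution p
    and a predicted distribution q (one common parameterization).

    For alpha -> 1 this converges to the Shannon cross-entropy -sum(p*log q),
    since (1 - q**(alpha-1)) / (alpha - 1) -> -log(q).
    """
    q = np.clip(q, eps, 1.0)  # avoid log(0) / 0**negative
    if abs(alpha - 1.0) < 1e-9:
        return float(-np.sum(p * np.log(q)))
    return float(np.sum(p * (1.0 - q ** (alpha - 1.0))) / (alpha - 1.0))

# One-hot ground truth vs. a softmax-like prediction.
p = np.array([1.0, 0.0])
q = np.array([0.8, 0.2])

shannon = thc_cross_entropy(p, q, alpha=1.0)    # -ln(0.8) ~ 0.2231
near_shannon = thc_cross_entropy(p, q, alpha=1.001)
sharper = thc_cross_entropy(p, q, alpha=2.0)    # penalizes errors differently
```

Varying α reshapes how confident-but-wrong predictions are penalized, which is the knob the paper studies on its two cancer datasets.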
Keywords: deep neural networks; Shannon entropy; Tsallis–Havrda–Charvat entropy; generalized entropies; recurrence prediction; head–neck cancer; lung cancer
[Figure 1]
MDPI and ACS Style

Brochet, T.; Lapuyade-Lahorgue, J.; Huat, A.; Thureau, S.; Pasquier, D.; Gardin, I.; Modzelewski, R.; Gibon, D.; Thariat, J.; Grégoire, V.; Vera, P.; Ruan, S. A Quantitative Comparison between Shannon and Tsallis–Havrda–Charvat Entropies Applied to Cancer Outcome Prediction. Entropy 2022, 24, 436. https://doi.org/10.3390/e24040436

AMA Style

Brochet T, Lapuyade-Lahorgue J, Huat A, Thureau S, Pasquier D, Gardin I, Modzelewski R, Gibon D, Thariat J, Grégoire V, Vera P, Ruan S. A Quantitative Comparison between Shannon and Tsallis–Havrda–Charvat Entropies Applied to Cancer Outcome Prediction. Entropy. 2022; 24(4):436. https://doi.org/10.3390/e24040436

Chicago/Turabian Style

Brochet, Thibaud, Jérôme Lapuyade-Lahorgue, Alexandre Huat, Sébastien Thureau, David Pasquier, Isabelle Gardin, Romain Modzelewski, David Gibon, Juliette Thariat, Vincent Grégoire, Pierre Vera, and Su Ruan. 2022. "A Quantitative Comparison between Shannon and Tsallis–Havrda–Charvat Entropies Applied to Cancer Outcome Prediction" Entropy 24, no. 4: 436. https://doi.org/10.3390/e24040436


