Special Issue "Application of Entropy to Computer Vision and Medical Imaging"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Entropy and Biology".

Deadline for manuscript submissions: 31 August 2022

Special Issue Editors

Dr. Su Ruan
Guest Editor
Department of Medicine, University of Rouen Normandy, 76130 Mont-Saint-Aignan, France
Interests: image processing; pattern recognition; medical image analysis; information fusion
Dr. Jérôme Lapuyade-Lahorgue
Guest Editor
Department of Medicine, University of Rouen Normandy, 76130 Mont-Saint-Aignan, France
Interests: Bayesian and statistical inference; information theory; deep learning; signal processing; medical image analysis

Special Issue Information

Dear Colleagues,

Shannon entropy was originally introduced to quantify the minimum number of bits needed to encode a signal without loss of information; it is the asymptotic limit of the compression ratio achieved by the Huffman algorithm. Shannon entropy is also linked to the amount of disorder in random signals. Since Shannon’s work, generalizations of entropy (Rényi, Havrda–Charvat) as well as various applications have emerged. In statistics and in machine learning, different entropies have been used to model uncertainty in data and in parameter estimation, and they can also be used to evaluate the amount of information in data. From entropies, one can define divergences, which are used as “distances” between probability distributions. In deep learning, these entropies typically serve as loss functions for probabilistic neural networks.
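For readers less familiar with these quantities, the standard definitions referred to above are summarized below; the Havrda–Charvat (Tsallis) family is shown in one common normalization, and conventions vary across the literature.

```latex
% Shannon entropy of a discrete distribution p = (p_1, ..., p_n)
H(p) = -\sum_{i=1}^{n} p_i \log p_i

% Havrda--Charvat (Tsallis) entropy with parameter \alpha > 0, \alpha \neq 1;
% it recovers the Shannon entropy in the limit \alpha \to 1
H_{\alpha}(p) = \frac{1}{\alpha - 1} \left( 1 - \sum_{i=1}^{n} p_i^{\alpha} \right)

% Kullback--Leibler divergence, the prototypical ``distance'' between distributions
D_{\mathrm{KL}}(p \,\|\, q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}
```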

This Special Issue is devoted to applications of probabilistic neural networks for computer vision and medical image analysis.

This Special Issue will accept unpublished original papers and comprehensive reviews focused on (but not restricted to) the following research areas:

- Modeling new loss functions in neural networks;

- Use of entropies and information measures for uncertainty quantification in a neural network;

- Choice of relevant entropy depending on the data and the task;

- Influence of activation functions on the choice of entropy;

- Axioms behind the choice of entropy;

- Entropy measures for the evaluation of image quality;

- Applications to medical image analysis and computer vision.

Dr. Su Ruan
Dr. Jérôme Lapuyade-Lahorgue
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website, then completing the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Shannon entropy
  • generalized entropies
  • uncertainty quantification
  • deep learning
  • medical image analysis
  • computer vision
  • amount of information

Published Papers (3 papers)


Research


Article
A Quantitative Comparison between Shannon and Tsallis–Havrda–Charvat Entropies Applied to Cancer Outcome Prediction
Entropy 2022, 24(4), 436; https://doi.org/10.3390/e24040436 - 22 Mar 2022
Abstract
In this paper, we propose to quantitatively compare loss functions based on parameterized Tsallis–Havrda–Charvat entropy and classical Shannon entropy for the training of a deep network in the case of small datasets which are usually encountered in medical applications. Shannon cross-entropy is widely used as a loss function for most neural networks applied to the segmentation, classification and detection of images. Shannon entropy is a particular case of Tsallis–Havrda–Charvat entropy. In this work, we compare these two entropies through a medical application for predicting recurrence in patients with head–neck and lung cancers after treatment. Based on both CT images and patient information, a multitask deep neural network is proposed to perform a recurrence prediction task using cross-entropy as a loss function and an image reconstruction task. Tsallis–Havrda–Charvat cross-entropy is a parameterized cross-entropy with the parameter α. Shannon entropy is a particular case of Tsallis–Havrda–Charvat entropy for α=1. The influence of this parameter on the final prediction results is studied. In this paper, the experiments are conducted on two datasets including in total 580 patients, of whom 434 suffered from head–neck cancers and 146 from lung cancers. The results show that Tsallis–Havrda–Charvat entropy can achieve better performance in terms of prediction accuracy with some values of α.
(This article belongs to the Special Issue Application of Entropy to Computer Vision and Medical Imaging)
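As a rough, illustrative sketch of the kind of loss compared in this article (the exact formulation and the α values studied are given in the paper itself; the parameterization below is one common convention and is an assumption on my part, not the authors' implementation):

```python
import numpy as np

def shannon_cross_entropy(y_true, y_pred, eps=1e-12):
    """Classical Shannon cross-entropy for one-hot targets."""
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.sum(y_true * np.log(y_pred), axis=-1).mean()

def thc_cross_entropy(y_true, y_pred, alpha=1.5, eps=1e-12):
    """Tsallis-Havrda-Charvat cross-entropy with parameter alpha.

    Illustrative parameterization only: it reduces to the Shannon
    cross-entropy in the limit alpha -> 1, as stated in the abstract.
    """
    y_pred = np.clip(y_pred, eps, 1.0)
    if np.isclose(alpha, 1.0):
        return shannon_cross_entropy(y_true, y_pred, eps)
    return ((1.0 - np.sum(y_true * y_pred ** (alpha - 1.0), axis=-1))
            / (alpha - 1.0)).mean()

# Toy example: two samples, three classes
y_true = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
y_pred = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]])
print(shannon_cross_entropy(y_true, y_pred))         # ~0.29
print(thc_cross_entropy(y_true, y_pred, alpha=1.5))  # ~0.27
```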

Article
Reconstructing Binary Signals from Local Histograms
Entropy 2022, 24(3), 433; https://doi.org/10.3390/e24030433 - 21 Mar 2022
Abstract
In this paper, we consider the representation power of local overlapping histograms for discrete binary signals. We give an algorithm, linear in signal size and factorial in window size, for producing the set of signals that share a given sequence of densely overlapping histograms. We also state the number of unique signals for a given set of histograms and give bounds on the number of metameric classes, where a metameric class is a set of more than one signal sharing the same set of densely overlapping histograms.
(This article belongs to the Special Issue Application of Entropy to Computer Vision and Medical Imaging)
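As a small, self-contained illustration of the objects in this abstract (my own sketch, not the authors' reconstruction algorithm): for a binary signal, each local window histogram is fully determined by the count of ones in the window, and two distinct signals can share the entire sequence of densely overlapping histograms, i.e., form a metameric class.

```python
import numpy as np

def local_histograms(signal, window):
    """Sequence of densely overlapping (stride-1) local histograms.

    For a binary signal each window histogram is determined by its
    count of ones, so those counts are returned.
    """
    signal = np.asarray(signal)
    return np.array([signal[i:i + window].sum()
                     for i in range(len(signal) - window + 1)])

# Two distinct signals with identical histogram sequences (a metameric pair)
a = np.array([1, 0, 1])
b = np.array([0, 1, 0])
print(local_histograms(a, window=2))  # [1 1]
print(local_histograms(b, window=2))  # [1 1]
```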

Other


Correction
Correction: Brochet et al. A Quantitative Comparison between Shannon and Tsallis–Havrda–Charvat Entropies Applied to Cancer Outcome Prediction. Entropy 2022, 24, 436
Entropy 2022, 24(5), 685; https://doi.org/10.3390/e24050685 - 13 May 2022
Abstract
Alexandre Huat, Sébastien Thureau, David Pasquier, Isabelle Gardin, Romain Modzelewski, David Gibon, Juliette Thariat and Vincent Grégoire were not included as authors in the original publication [...]
(This article belongs to the Special Issue Application of Entropy to Computer Vision and Medical Imaging)