Special Issue "Applications of Information Theory to Epidemiology"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 31 January 2020.

Special Issue Editor

Prof. Gareth Hughes
Guest Editor
Scotland's Rural College, Crop and Soil Systems Research Group, Edinburgh, United Kingdom
Interests: quantitative epidemiology of plant disease; decision-making in crop protection; applications of information theory to epidemiology

Special Issue Information

Dear Colleagues,

Epidemiological applications of information theory can be traced back at least as far as the 1970s. The work of W.I. Card (in collaboration with I.J. Good) on diagnostic decision-making in terms of entropy reduction, and the work of C.E. Metz and colleagues on an information theoretic approach to the interpretation of receiver operating characteristic (ROC) curve data, are examples of early applications. Almost half a century on, these examples still typify the way that information theory has been used by many epidemiologists and diagnosticians to inform both our understanding of disease risk and our decision-making in managing that risk. At the same time, new applications are appearing, not least in the pages of Entropy.

This Special Issue looks both back at the way information theory has already contributed to our epidemiological understanding of disease risk, and forward to new contributions. Medical and botanical applications are predominant at the moment, but the increasing availability of individual and household data to social geographers and commercial sociologists seems likely to present new opportunities for information theoretic applications. We welcome research work on all aspects of information theoretic applications in the study of epidemiology and disease risk for this Special Issue.

Prof. Gareth Hughes
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for the submission of manuscripts are available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Medical epidemiology
  • Botanical epidemiology
  • Social geography
  • Disease risk factors
  • Calibration and validation of risk algorithms
  • Diagnostic decision-making

Published Papers (2 papers)


Research


Open Access Article
Dynamics of Ebola Disease in the Framework of Different Fractional Derivatives
Entropy 2019, 21(3), 303; https://doi.org/10.3390/e21030303 - 21 Mar 2019
Cited by 7
Abstract
In recent years, the world has witnessed the arrival of deadly infectious diseases that have taken many lives across the globe. To fight back against these diseases or control their spread, mankind relies on modeling and medicine to control, cure, and predict their behavior. In the case of Ebola, we observe spread that follows a fading memory process and also shows crossover behavior. Therefore, to capture this kind of spread one needs to use differential operators that possess crossover properties and fading memory. We analyze the Ebola disease model by considering three differential operators, namely the Caputo, Caputo–Fabrizio, and Atangana–Baleanu operators. We present brief details and some mathematical analysis for each operator applied to the Ebola model, together with a numerical approach for the solution under each operator. Further, numerical results for each operator with various values of the fractional order parameter α are presented, and the suggested operators are compared on the Ebola disease model in the form of graphics. We show that by decreasing the value of the fractional order parameter α, the number of individuals infected by Ebola decreases efficiently, and we conclude that for disease elimination the Atangana–Baleanu operator is more useful than the other two.
(This article belongs to the Special Issue Applications of Information Theory to Epidemiology)
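The abstract above compares three fractional differential operators. As a rough illustration of what a discretized fractional derivative involves, the sketch below implements the classical L1 finite-difference scheme for the Caputo derivative of order 0 < α < 1 and checks it against the known closed form for f(t) = t²; the grid, test function, and tolerances are our choices, not taken from the paper:

```python
import math
import numpy as np

def caputo_l1(f_vals, dt, alpha):
    """Approximate the Caputo fractional derivative of order 0 < alpha < 1
    at the final grid point, using the standard L1 finite-difference scheme
    (illustrative sketch only)."""
    n = len(f_vals) - 1
    k = np.arange(n)
    b = (k + 1) ** (1 - alpha) - k ** (1 - alpha)  # L1 weights b_k
    diffs = f_vals[1:] - f_vals[:-1]               # forward differences of f
    # b_0 multiplies the difference nearest the evaluation point, hence [::-1]
    return (dt ** -alpha / math.gamma(2 - alpha)) * np.sum(b * diffs[::-1])

alpha, T = 0.7, 1.0
t = np.linspace(0.0, T, 2001)
approx = caputo_l1(t ** 2, t[1] - t[0], alpha)
# Closed form: the Caputo derivative of t^2 is 2 t^(2-alpha) / Gamma(3-alpha)
exact = 2 * T ** (2 - alpha) / math.gamma(3 - alpha)
```

The L1 scheme converges at rate O(Δt^(2−α)), so on this grid the approximation agrees with the closed form to well under one percent.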

Review


Open Access Review
A Review of the Application of Information Theory to Clinical Diagnostic Testing
Entropy 2020, 22(1), 97; https://doi.org/10.3390/e22010097 - 14 Jan 2020
Abstract
The fundamental information theory functions of entropy, relative entropy, and mutual information are directly applicable to clinical diagnostic testing. This is a consequence of the fact that an individual’s disease state and diagnostic test result are random variables. In this paper, we review the application of information theory to the quantification of diagnostic uncertainty, diagnostic information, and diagnostic test performance. An advantage of information theory functions over more established test performance measures is that they can be used when multiple disease states are under consideration as well as when the diagnostic test can yield multiple or continuous results. Since more than one diagnostic test is often required to help determine a patient’s disease state, we also discuss the application of the theory to situations in which more than one diagnostic test is used. The total diagnostic information provided by two or more tests can be partitioned into meaningful components.
(This article belongs to the Special Issue Applications of Information Theory to Epidemiology)
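The quantities this review discusses are straightforward to compute for the simplest case of a binary disease state and a binary test result. The sketch below uses a purely illustrative joint distribution (the probabilities are ours, not drawn from the paper): the prior entropy H(D) measures diagnostic uncertainty before testing, and the mutual information I(D;T) measures how much of that uncertainty the test removes:

```python
import numpy as np

# Hypothetical joint distribution P(disease state, test result) for a
# binary disease and a binary test (rows: D-, D+; cols: T-, T+).
# These numbers are illustrative only.
p = np.array([[0.81, 0.09],
              [0.02, 0.08]])

def H(dist):
    """Shannon entropy in bits, ignoring zero-probability cells."""
    dist = dist[dist > 0]
    return -np.sum(dist * np.log2(dist))

p_d = p.sum(axis=1)  # marginal P(disease state): prior prevalence
p_t = p.sum(axis=0)  # marginal P(test result)

# Mutual information I(D;T) = H(D) + H(T) - H(D,T): the reduction in
# diagnostic uncertainty about D provided by observing T.
mi = H(p_d) + H(p_t) - H(p.ravel())
```

For this joint distribution the prior uncertainty H(D) is about 0.47 bits and the test supplies about 0.16 bits of diagnostic information; I(D;T) can never exceed min(H(D), H(T)), which is the sense in which a test can only partly resolve the diagnosis.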

Planned Papers

The list below represents only planned manuscripts. Some of these manuscripts have not yet been received by the Editorial Office. Papers submitted to MDPI journals are subject to peer review.

Author: Prof. Waldemar Koczkodaj
Affiliation: Laurentian University

Author: Prof. William Benish
Affiliation: Case Western Reserve University

Authors: Neil McRoberts and Allysha Choudhury
Affiliation: University of California, Davis
