Open Access Article

Probabilistic Ensemble of Deep Information Networks

Department of Electronics and Telecommunications, Politecnico di Torino, 10100 Torino, Italy
Author to whom correspondence should be addressed.
Entropy 2020, 22(1), 100
Received: 22 November 2019 / Revised: 10 January 2020 / Accepted: 13 January 2020 / Published: 14 January 2020
(This article belongs to the Special Issue Information–Theoretic Approaches to Computational Intelligence)
We describe a classifier made of an ensemble of decision trees designed using information theory concepts. In contrast to algorithms such as C4.5 or ID3, each tree is built from the leaves toward the root. Each tree consists of nodes trained independently of one another to minimize a local cost function (the information bottleneck). A trained tree outputs the estimated probabilities of the classes given the input datum, and the outputs of many trees are combined to decide the class. We show that the system provides accuracy comparable to that of a standard tree classifier, while offering advantages in modularity, reduced complexity, and memory requirements.
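The abstract states that each tree outputs estimated class probabilities and that the outputs of many trees are combined to decide the class, without specifying the combination rule here. As an illustrative sketch only (assuming a simple average of per-tree class posteriors followed by an argmax, which is one common ensemble rule and not necessarily the paper's exact method):

```python
import numpy as np

def combine_tree_outputs(tree_probs):
    """Combine per-tree class-probability estimates into one decision.

    tree_probs: array of shape (n_trees, n_classes), where row t holds
    the estimated probabilities P(class | input) produced by tree t.
    Here the per-tree estimates are averaged and the most probable
    class is returned (a hypothetical, simple combination rule).
    """
    tree_probs = np.asarray(tree_probs, dtype=float)
    mean_probs = tree_probs.mean(axis=0)  # ensemble posterior estimate
    return int(np.argmax(mean_probs)), mean_probs

# Three hypothetical trees scoring one input over two classes:
probs = [[0.7, 0.3],
         [0.6, 0.4],
         [0.4, 0.6]]
label, posterior = combine_tree_outputs(probs)
# label -> 0, since the averaged posterior favors the first class
```

Other combination rules (e.g., a product of the per-tree posteriors, which corresponds to summing log-probabilities) fit the same interface.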
Keywords: information theory; information bottleneck; classifier; decision tree; ensemble
Franzese, G.; Visintin, M. Probabilistic Ensemble of Deep Information Networks. Entropy 2020, 22, 100.
