Entropy 2001, 3(4), 280-292; doi:10.3390/e3040280
Article

A Possible Extension of Shannon's Information Theory

Takuya Yamano
Received: 7 August 2001; Accepted: 21 October 2001 / Published: 21 November 2001
Abstract: As a possible generalization of Shannon's information theory, we review a formalism based on a non-logarithmic information content parametrized by a real number q, which exhibits nonadditivity of the associated uncertainty. Moreover, we show that establishing a suitable concept of mutual information is essential to this generalization.
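
To illustrate the nonadditivity mentioned in the abstract, the following is a minimal sketch using one common convention for the q-deformed (Tsallis) formalism; the exact parametrization and definitions used in the paper may differ. The q-logarithm replaces the ordinary logarithm in the information content, and the associated uncertainty of two independent sources composes with a cross term that vanishes only in the Shannon limit q → 1:

% Hedged sketch, assuming the standard Tsallis convention (not necessarily the paper's exact notation).
% q-deformed logarithm underlying the non-logarithmic information content I_q(p) = ln_q(1/p):
\[
  \ln_q x \;=\; \frac{x^{1-q}-1}{1-q}, \qquad \lim_{q\to 1}\ln_q x \;=\; \ln x .
\]
% For independent subsystems A and B, the associated uncertainty S_q is nonadditive;
% the cross term disappears at q = 1, recovering Shannon's additivity S(A,B) = S(A) + S(B):
\[
  S_q(A,B) \;=\; S_q(A) + S_q(B) + (1-q)\,S_q(A)\,S_q(B).
\]

In this convention, q > 1 makes the joint uncertainty subadditive and q < 1 makes it superadditive, which is why a consistent definition of mutual information requires care once additivity is lost.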
Keywords: information theory; Tsallis entropy; nonadditivity; source coding theorem; mutual information
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


MDPI and ACS Style

Yamano, T. A Possible Extension of Shannon's Information Theory. Entropy 2001, 3, 280-292.

AMA Style

Yamano T. A Possible Extension of Shannon's Information Theory. Entropy. 2001; 3(4):280-292.

Chicago/Turabian Style

Yamano, Takuya. 2001. "A Possible Extension of Shannon's Information Theory." Entropy 3, no. 4: 280-292.


Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.