Entropy 2008, 10(4), 765-775; doi:10.3390/e10040765
Article

Non-linear Information Inequalities

Terence Chan and Alex Grant
Received: 24 May 2008 / Accepted: 9 December 2008 / Published: 22 December 2008
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract: We construct non-linear information inequalities from Matúš' infinite series of linear information inequalities. Each single non-linear inequality is sufficiently strong to prove that the closure of the set of all entropy functions is not polyhedral for four or more random variables, a fact that was already established using the series of linear inequalities. To the best of our knowledge, they are the first non-trivial examples of non-linear information inequalities.
Keywords: entropy; entropy function; non-linear information inequality; non-Shannon-type information inequality
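
The construction described in the abstract can be illustrated schematically: if a linear combination of entropies that is quadratic in a free integer parameter s is known to be non-negative for every positive integer s, then minimizing over the admissible values of s yields a single inequality that is quadratic in the entropic quantities themselves. The LaTeX sketch below shows only this generic step; the symbols alpha, beta, gamma are placeholders for linear entropy expressions and do not reproduce the paper's actual inequality, which is built from Matúš' series.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Schematic sketch only: $\alpha$, $\beta$, $\gamma$ are placeholders for
% linear combinations of (conditional) entropies and mutual informations;
% this is not the inequality derived in the paper.
Suppose that for every positive integer $s$,
\begin{equation*}
  \alpha s^{2} + \beta s + \gamma \ge 0, \qquad \alpha \ge 0 .
\end{equation*}
If $\beta \ge -\alpha$, the instance $s = 1$ already gives the linear bound
$\gamma \ge -\alpha - \beta$.  Otherwise $\beta < -\alpha$ forces $\alpha > 0$
(else the left-hand side would be unbounded below in $s$), the real minimizer
$s^{*} = -\beta/(2\alpha) > \tfrac12$ has a nearest positive integer $s_{0}$
with $|s_{0} - s^{*}| \le \tfrac12$, and evaluating the quadratic at $s_{0}$
gives
\begin{equation*}
  0 \le \alpha (s_{0} - s^{*})^{2} + \gamma - \frac{\beta^{2}}{4\alpha}
    \le \frac{\alpha}{4} + \gamma - \frac{\beta^{2}}{4\alpha},
\end{equation*}
that is, the non-linear constraint $\beta^{2} \le 4\alpha\gamma + \alpha^{2}$.
\end{document}

In the paper itself, the role of this quadratic family is played by Matúš' series of linear information inequalities, and the resulting non-linear inequality is what certifies the non-polyhedral behaviour stated in the abstract.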


MDPI and ACS Style

Chan, T.; Grant, A. Non-linear Information Inequalities. Entropy 2008, 10, 765-775.

AMA Style

Chan T, Grant A. Non-linear Information Inequalities. Entropy. 2008; 10(4):765-775.

Chicago/Turabian Style

Chan, Terence, and Alex Grant. 2008. "Non-linear Information Inequalities." Entropy 10, no. 4: 765-775.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland