Information-Theoretic Analysis of Memoryless Deterministic Systems
Abstract: The information loss in deterministic, memoryless systems is investigated by evaluating the conditional entropy of the input random variable given the output random variable. It is shown that for a large class of systems the information loss is finite, even if the input has a continuous distribution. For systems with infinite information loss, a relative measure is defined and shown to be related to Rényi information dimension. As deterministic signal processing can only destroy information, it is important to know how this information loss affects the solution of inverse problems. Hence, we connect the probability of perfectly reconstructing the input to the information lost in the system via Fano-type bounds. The theoretical results are illustrated by example systems commonly used in discrete-time, nonlinear signal processing and communications.
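For a discrete input, the information loss the abstract describes can be computed directly: if Y = g(X) is a deterministic function of X, then H(Y|X) = 0, so the conditional entropy H(X|Y) reduces to H(X) − H(Y). A minimal sketch (the uniform input and the magnitude function are illustrative choices, not taken from the paper):

```python
import math
from collections import defaultdict

def entropy(pmf):
    """Shannon entropy in bits of a probability mass function (dict value -> prob)."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def information_loss(px, g):
    """Information loss H(X|Y) of a deterministic, memoryless map Y = g(X).

    Since g is deterministic, H(Y|X) = 0 and hence H(X|Y) = H(X) - H(Y).
    """
    py = defaultdict(float)
    for x, p in px.items():
        py[g(x)] += p  # push the input pmf through g to get the output pmf
    return entropy(px) - entropy(dict(py))

# Illustrative example: uniform input through the non-injective magnitude function.
px = {x: 0.2 for x in (-2, -1, 0, 1, 2)}
loss = information_loss(px, abs)  # 0.8 bits: the sign of nonzero inputs is lost
```

Here the loss is finite even though g is not invertible, matching the abstract's point that non-injective systems can still lose only a bounded amount of information; for continuous-valued inputs the paper's relative measure and Rényi information dimension take over.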
Geiger, B.C.; Kubin, G. Information-Theoretic Analysis of Memoryless Deterministic Systems. Entropy 2016, 18, 410.