Open Access Article
Entropy 2014, 16(9), 4892-4910; doi:10.3390/e16094892

On Shannon’s Formula and Hartley’s Rule: Beyond the Mathematical Coincidence

Olivier Rioul 1,* and José Carlos Magossi 2

1 Télécom ParisTech, Institut Mines-Télécom, CNRS LTCI, 46 Rue Barrault, 75013 Paris, France
2 School of Technology (FT), University of Campinas (Unicamp), Rua Paschoal Marmo 1.888, 13484-370 Limeira, São Paulo, Brazil

This paper is an extended version of our paper published in Proceedings of the MaxEnt 2014 Conference on Bayesian Inference and Maximum Entropy Methods in Science and Engineering.
* Author to whom correspondence should be addressed.
Received: 23 July 2014 / Revised: 18 August 2014 / Accepted: 28 August 2014 / Published: 10 September 2014
(This article belongs to the Special Issue Information, Entropy and Their Geometric Structures)

Abstract

In the information theory community, the following “historical” statements are generally well accepted: (1) Hartley did put forth his rule twenty years before Shannon; (2) Shannon’s formula, as a fundamental tradeoff between transmission rate, bandwidth, and signal-to-noise ratio, came as a surprise in 1948; (3) Hartley’s rule is inexact, while Shannon’s formula is characteristic of the additive white Gaussian noise channel; (4) Hartley’s rule is an imprecise relation that is not an appropriate formula for the capacity of a communication channel. We show that all four of these statements are somewhat wrong. In fact, a careful calculation shows that “Hartley’s rule” coincides with Shannon’s formula. We explain this mathematical coincidence by deriving the necessary and sufficient conditions on an additive noise channel such that its capacity is given by Shannon’s formula, and we construct a sequence of such channels that makes the link between the uniform (Hartley) and Gaussian (Shannon) channels.
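
For concreteness, the calculation behind this coincidence can be sketched as follows, in our own notation (an illustration consistent with the abstract’s claim; the paper’s exact notation may differ). Assume M = 1 + A/Δ equiprobable signal levels spaced 2Δ apart over [−A, A], with additive noise uniform on [−Δ, Δ], so that adjacent levels remain distinguishable. Hartley’s rule gives C′ = log₂ M = log₂(1 + A/Δ), while the signal and noise powers are P = (M² − 1)Δ²/3 and N = Δ²/3, whence

\[
\frac{P}{N} = M^2 - 1 = \frac{A}{\Delta}\left(\frac{A}{\Delta} + 2\right),
\qquad
\frac{1}{2}\log_2\left(1 + \frac{P}{N}\right)
= \frac{1}{2}\log_2 M^2
= \log_2\left(1 + \frac{A}{\Delta}\right) = C'.
\]

That is, Hartley’s rule is exactly Shannon’s formula C = ½ log₂(1 + P/N) evaluated at the signal-to-noise ratio of this uniform-noise channel.
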
Keywords: Shannon’s formula; Hartley’s rule; additive noise channel; differential entropy; channel capacity; signal-to-noise ratio; pulse-amplitude modulation (PAM); additive white Gaussian noise (AWGN) channel; uniform noise channel; characteristic function; uniform B-spline function; uniform sum distribution; central limit theorem

This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).

Cite This Article

MDPI and ACS Style

Rioul, O.; Magossi, J.C. On Shannon’s Formula and Hartley’s Rule: Beyond the Mathematical Coincidence. Entropy 2014, 16, 4892-4910.
