Open Access Article
Entropy 2001, 3(1), 1-11; doi:10.3390/e3010001

Some Observations on the Concepts of Information-Theoretic Entropy and Randomness

Department of Mathematics, Iowa State University, Ames, IA 50011, USA
Received: 15 February 2000 / Accepted: 11 January 2001 / Published: 1 February 2001
(This article belongs to the Special Issue Gibbs Paradox and Its Resolutions)

Abstract

Certain aspects of the history, derivation, and physical application of the information-theoretic entropy concept are discussed. Pre-dating Shannon, the concept is traced back to Pauli. A derivation from first principles is given, without use of approximations. The concept depends on the underlying degree of randomness. In physical applications, this translates to dependence on the experimental apparatus available. An example illustrates how this dependence affects Prigogine's proposal for the use of the Second Law of Thermodynamics as a selection principle for the breaking of time symmetry. The dependence also serves to yield a resolution of the so-called "Gibbs Paradox." Extension of the concept from the discrete to the continuous case is discussed. The usual extension is shown to be dimensionally incorrect. Correction introduces a reference density, leading to the concept of Kullback entropy. Practical relativistic considerations suggest a possible proper reference density.
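The two quantities at the center of the abstract — the discrete information-theoretic entropy and the Kullback entropy obtained once a reference density is introduced — can be sketched numerically. The following is a minimal illustration only (the function names and the choice of base-2 logarithms are this sketch's assumptions, not the paper's notation):

```python
import math

def shannon_entropy(p, base=2):
    """Discrete information-theoretic entropy H(p) = -sum_i p_i log p_i.
    Terms with p_i = 0 contribute nothing, by the usual convention."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def kullback_entropy(p, m, base=2):
    """Kullback (relative) entropy of p with respect to a reference
    distribution m: D(p || m) = sum_i p_i log(p_i / m_i).
    In the continuous case, the reference density m is what renders
    the log argument dimensionless, as the abstract notes."""
    return sum(pi * math.log(pi / mi, base)
               for pi, mi in zip(p, m) if pi > 0)

# A uniform distribution over four outcomes carries log2(4) = 2 bits,
# and its Kullback entropy relative to itself vanishes.
p_uniform = [0.25, 0.25, 0.25, 0.25]
h = shannon_entropy(p_uniform)      # 2.0 bits
d = kullback_entropy(p_uniform, p_uniform)  # 0.0
```

The sketch also makes the dimensional point concrete: in `shannon_entropy` the argument of the logarithm is a bare probability, which is fine for the discrete case but becomes a quantity with units when naively replaced by a probability *density*; dividing by a reference density, as in `kullback_entropy`, restores a dimensionless argument.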
Keywords: information-theoretic entropy; Shannon entropy; Martin-Löf randomness; self-delimiting algorithmic complexity; thermodynamic entropy; second law of thermodynamics; selection principle; wave equation; Gibbs paradox; dimensional analysis; Kullback entropy; cross-entropy; reference density; improper prior
This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).

Share & Cite This Article

MDPI and ACS Style

Smith, J.D. Some Observations on the Concepts of Information-Theoretic Entropy and Randomness. Entropy 2001, 3, 1-11.


Entropy EISSN 1099-4300. Published by MDPI AG, Basel, Switzerland.