Entropy 2014, 16(5), 2408-2432; doi:10.3390/e16052408
Article

General H-theorem and Entropies that Violate the Second Law

A.N. Gorban
Department of Mathematics, University of Leicester, University Road, Leicester, LE1 7RH, UK
Received: 9 March 2014 / Revised: 15 April 2014 / Accepted: 24 April 2014 / Published: 29 April 2014

Abstract

The H-theorem states that the entropy production is nonnegative and, therefore, the entropy of a closed system should not decrease in time. In information processing, the entropy production is nonnegative for random transformations of signals (the information processing lemma). Originally, the H-theorem and the information processing lemma were proved for the classical Boltzmann-Gibbs-Shannon entropy and for the corresponding divergence (the relative entropy). Many new entropies and divergences have been proposed during the last decades, and for all of them the H-theorem is needed. This note proposes a simple and general criterion to check whether the H-theorem is valid for a convex divergence H and demonstrates that some popular divergences obey no H-theorem. We consider systems with n states A_i that obey first-order kinetics (the master equation). A convex function H is a Lyapunov function for all master equations with a given equilibrium if and only if its conditional minima properly describe the equilibria of the pair transitions A_i ⇌ A_j. This theorem does not depend on the principle of detailed balance and is valid for general Markov kinetics. Elementary analysis of pair equilibria demonstrates that popular Bregman divergences, such as the Euclidean distance or the Itakura-Saito distance in the space of distributions, cannot be universal Lyapunov functions for first-order kinetics and can increase in Markov processes. Therefore, they violate the second law and the information processing lemma. In particular, for these measures of information (divergences), random manipulation of data may add information to the data. The main results are extended to nonlinear generalized mass action law kinetic equations.
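The abstract's central claim, that a Bregman divergence such as the squared Euclidean distance to equilibrium can grow under Markov kinetics while the classical relative entropy cannot, is easy to check numerically. The following sketch is the editor's illustration, not code from the paper: the three-state master equation, its rate constants and the initial distribution are assumptions made for the example, chosen so that every active pair transition satisfies detailed balance with respect to the equilibrium π = (0.1, 0.8, 0.1).

```python
# Numerical sketch (editor's illustration, not code from the paper):
# a three-state master equation whose equilibrium is pi = (0.1, 0.8, 0.1).
# Along the relaxation, the classical relative entropy D(p || pi) decreases
# monotonically (H-theorem), while the squared Euclidean distance
# sum_i (p_i - pi_i)^2, a Bregman divergence, transiently increases.
import numpy as np

pi = np.array([0.1, 0.8, 0.1])        # positive equilibrium distribution

# k[i, j] = rate constant of the elementary transition A_j -> A_i.
# Each active pair obeys detailed balance: k[i, j] * pi[j] == k[j, i] * pi[i].
k = np.zeros((3, 3))
k[0, 1], k[1, 0] = 1.0, 8.0           # fast exchange A_1 <-> A_2
k[2, 1], k[1, 2] = 0.01, 0.08         # slow exchange A_2 <-> A_3

def rhs(p):
    """Master equation: dp_i/dt = sum_j (k[i, j] p_j - k[j, i] p_i)."""
    return k @ p - k.sum(axis=0) * p

def kl_divergence(p):
    return float(np.sum(p * np.log(p / pi)))    # classical divergence

def sq_euclidean(p):
    return float(np.sum((p - pi) ** 2))         # Bregman divergence in question

p = np.array([0.01, 0.49, 0.50])      # initial distribution, far from pi
dt = 1e-3                             # explicit Euler time step
report_steps = {0, 100, 200, 300, 500, 1000, 2000, 5000, 20000, 50000}

for step in range(50001):
    if step in report_steps:
        print(f"t = {step * dt:6.1f}   KL = {kl_divergence(p):.4f}"
              f"   ||p - pi||^2 = {sq_euclidean(p):.4f}")
    p = p + dt * rhs(p)               # forward Euler step of the master equation

# Expected output: the KL column decreases at every reported time, while the
# Euclidean column rises above its initial value during the fast relaxation
# (t of order 0.1-0.3) before decaying towards zero as p approaches pi.
```

The separation of time scales (a fast pair A_1 ⇌ A_2 and a slow pair A_2 ⇌ A_3) makes the non-monotonicity easy to see: on the segment p_1 + p_2 = const with p_3 fixed away from its equilibrium value, the conditional minimum of the squared Euclidean distance does not coincide with the pair equilibrium p_1/p_2 = π_1/π_2, which is exactly the failure detected by the criterion stated in the abstract.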
Keywords: Markov process; Lyapunov function; non-classical entropy; information processing; quasiconvexity; directional convexity; Schur convexity
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

Gorban, A.N. General H-theorem and Entropies that Violate the Second Law. Entropy 2014, 16, 2408-2432.

