Open Access: this article is freely available.
Entropy: The Markov Ordering Approach
Department of Mathematics, University of Leicester, Leicester, UK
Institute of Space and Information Technologies, Siberian Federal University, Krasnoyarsk, Russia
Department of Resource Economics, University of California, Berkeley, CA, USA
* Author to whom correspondence should be addressed.
Received: 1 March 2010; in revised form: 30 April 2010 / Accepted: 4 May 2010 / Published: 7 May 2010
Abstract: The focus of this article is on entropy and Markov processes. We study the properties of functionals which are invariant with respect to monotonic transformations and analyze two invariant “additivity” properties: (i) existence of a monotonic transformation which makes the functional additive with respect to the joining of independent systems and (ii) existence of a monotonic transformation which makes the functional additive with respect to the partitioning of the space of states. All Lyapunov functionals for Markov chains which have properties (i) and (ii) are derived. We describe the most general ordering of the distribution space, with respect to which all continuous-time Markov processes are monotonic (the Markov order). The solution differs significantly from the ordering given by the inequality of entropy growth. For inference, this approach results in a convex compact set of conditionally “most random” distributions.
Keywords: Markov process; Lyapunov function; entropy functionals; attainable region; MaxEnt; inference
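The central monotonicity property discussed in the abstract can be illustrated numerically: relative entropy (Kullback–Leibler divergence) to the stationary distribution is a Lyapunov function for a Markov chain, i.e., it is nonincreasing along the chain's trajectories. A minimal sketch, assuming an arbitrary illustrative row-stochastic matrix `P` (not taken from the paper):

```python
import numpy as np

# An illustrative row-stochastic transition matrix (hypothetical example).
P = np.array([[0.8, 0.1, 0.1],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Stationary distribution pi: left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for strictly positive q."""
    return float(np.sum(p * np.log(p / q)))

# Evolve a distribution p_{t+1} = p_t P and record D(p_t || pi).
p = np.array([0.9, 0.05, 0.05])
divs = []
for _ in range(20):
    divs.append(kl(p, pi))
    p = p @ P

# Lyapunov property: the divergence never increases along the trajectory.
assert all(d_next <= d_prev + 1e-12 for d_prev, d_next in zip(divs, divs[1:]))
```

The same monotonicity holds for every Csiszár–Morimoto divergence, which is exactly the family of Lyapunov functionals the article analyzes; plain KL divergence is used here only as the most familiar member.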
Cite This Article
MDPI and ACS Style
Gorban, A.N.; Gorban, P.A.; Judge, G. Entropy: The Markov Ordering Approach. Entropy 2010, 12, 1145-1193.

AMA Style
Gorban AN, Gorban PA, Judge G. Entropy: The Markov Ordering Approach. Entropy. 2010; 12(5):1145-1193.

Chicago/Turabian Style
Gorban, Alexander N.; Gorban, Pavel A.; Judge, George. 2010. "Entropy: The Markov Ordering Approach." Entropy 12, no. 5: 1145-1193.