Licensee: Molecular Diversity Preservation International, Basel, Switzerland. This article is an open-access article distributed under the terms and conditions of the Creative Commons Attribution license.

Spontaneous transitions of an isolated system from one macroscopic state to another (relaxation processes) are accompanied by a change of entropy. Following Jaynes’ MaxEnt formalism, it is shown that practically all possible microscopic developments of a system within a fixed time interval are accompanied by the maximum possible entropy change. In other words, relaxation processes are accompanied by maximum entropy production.

E.T. Jaynes, claiming that our knowledge of the initial microscopic state of a system is not a matter of physics but of information theory, introduced information entropy to describe the behavior of macroscopic systems [

MaxEnt can be equally applied to equilibrium and nonequilibrium processes. It was applied by a number of authors [

Dewar considered the ensemble of trajectories (paths) in phase space. He maximized the information entropy over paths, with macroscopic quantities as constraints (MaxEnt), and found that the path probability is proportional to the exponential of the entropy production, when entropy production is expressed as the sum of products of fluxes and conjugate forces. Thus, the most probable evolution of a system is accompanied by the maximum possible entropy production. Some criticism of Dewar’s work can be found in references [
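In schematic form (the symbols here are ours, not necessarily Dewar’s original notation, and numerical factors are indicative only), this result reads:

```latex
p_\Gamma \;\propto\; \exp\!\left(\frac{\tau\,\sigma_\Gamma}{2 k_B}\right),
\qquad
\sigma \;=\; \sum_i J_i X_i ,
```

where $p_\Gamma$ is the probability of path $\Gamma$, $\sigma_\Gamma$ the entropy production along it, $\tau$ the time interval, and $J_i$, $X_i$ the fluxes and their conjugate forces.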

Niven [

Both Dewar and Niven assumed the existence of a stationary state, local equilibrium, and entropy production defined as the sum of the products of thermodynamic forces and conjugate fluxes.

We noted in paper II that in a stationary nonequilibrium state the subject must interfere with the system in order to maintain such a state. In this paper, entitled paper III, we apply the MaxEnt formalism to relaxation processes. Our approach is free of the above-mentioned assumptions. Assuming that changes of entropy within a fixed interval of time are distributed so as to achieve maximum information entropy, we find that the evolution of an isolated system is accompanied by the maximum possible entropy production.

The paper is organized in the following manner.

E.T. Jaynes [

We note that information entropy is a property of any probability distribution and has, in principle, nothing to do with thermodynamic entropy, which is a property of the state of a macroscopic system (see page 351 in reference [
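For reference, Jaynes’ information entropy of a discrete distribution $\{p_i\}$, and the generic MaxEnt solution under a single mean-value constraint on a quantity $f$, take the standard forms:

```latex
S_I = -\sum_i p_i \ln p_i,
\qquad
p_i = \frac{1}{Z(\lambda)}\, e^{-\lambda f_i},
\qquad
Z(\lambda) = \sum_i e^{-\lambda f_i},
```

where the multiplier $\lambda$ is fixed by the constraint $\langle f \rangle = \sum_i p_i f_i$.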

The relaxation process is the spontaneous approach of an isolated system toward its equilibrium state.

The initial problem in applying MaxEnt is the choice of events of interest. In the case of an isolated system, the initial microscopic state of the system defines the “phase space trajectories” (paths). Assuming that the initial microscopic states do not share a common path, there is a one-to-one correspondence between initial states and paths.

Time evolution of the macroscopic state of a system is defined with the set

We can define the entropy of an isolated system by means of the microcanonical ensemble using either Boltzmann’s ,
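The two standard definitions, which coincide in the microcanonical ensemble where all $W$ accessible microstates are equally probable, are:

```latex
S_B = k_B \ln W,
\qquad
S_G = -k_B \sum_i p_i \ln p_i
\;\xrightarrow{\;p_i = 1/W\;}\; k_B \ln W .
```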

In order to resolve this paradox, Zurek [

Each measuring process is defined by the measuring device and the system, whose state is determined by the value of the measured quantity. A measurement of any physical quantity is accompanied by uncertainty. The resulting uncertainty is a superposition of the uncertainties introduced by the measuring device and by the system. The uncertainty due to the measuring instrument is within the resolution of that instrument. We too exploit the coarse-grained approach, but in a way different from that described above. We take the resolution of the measuring devices

The initial state is defined by a subensemble with fixed values

The mean value of entropy is the result of averaging its value over subensembles defined with

We assume that changes of entropy satisfy the MaxEnt distribution. Then the measured (mean) value of the change of entropy is the only constraint in addition to the normalization of probabilities:
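Under the stated constraints, the maximization problem takes the form (our notation; $\Gamma$ labels paths and $\Delta S_\Gamma$ is the entropy change along path $\Gamma$):

```latex
\max_{\{p_\Gamma\}} \left(-\sum_\Gamma p_\Gamma \ln p_\Gamma\right)
\quad\text{subject to}\quad
\sum_\Gamma p_\Gamma = 1,
\qquad
\sum_\Gamma p_\Gamma\, \Delta S_\Gamma = \overline{\Delta S}.
```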

Information entropy in Eq. (

The constrained maximization of information entropy (

The outcome of the maximization of
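The character of this distribution can be checked numerically. The sketch below (ours, not the paper’s own computation; the values of `dS` and the target mean are arbitrary toy inputs) builds the MaxEnt distribution $p_i \propto e^{\lambda \Delta S_i}$ over a discrete set of entropy changes, fixing the multiplier by bisection, and shows that for a positive multiplier the largest entropy change is the most probable one:

```python
import math

def maxent_dist(dS, mean_target, lo=-50.0, hi=50.0, tol=1e-12):
    """Return (lam, p) with p_i proportional to exp(lam*dS_i) and mean <dS> = mean_target.

    The mean of dS under p(lam) increases monotonically with lam,
    so the multiplier can be found by simple bisection.
    """
    def probs(lam):
        m = max(lam * s for s in dS)              # stabilize the exponentials
        w = [math.exp(lam * s - m) for s in dS]
        Z = sum(w)
        return [wi / Z for wi in w]

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        mean = sum(pi * s for pi, s in zip(probs(mid), dS))
        if mean < mean_target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return lam, probs(lam)

# Toy entropy changes (arbitrary units) over a fixed time interval.
dS = [0.0, 0.5, 1.0, 1.5, 2.0]
lam, p = maxent_dist(dS, mean_target=1.6)
# With a prescribed mean above the uniform average (1.0), lam > 0 and the
# largest entropy change carries the largest probability.
```

The only inputs are the measured mean and normalization, mirroring the constraints used in the text.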

We return to the relaxation of the system. It follows from the foregoing expression that the maximum possible change of entropy is the most probable one. For a very short interval of time
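In the short-time limit the entropy change along a path reduces to its entropy production rate times the interval, so maximizing the entropy change is equivalent to maximizing the entropy production (our notation):

```latex
\Delta S_\Gamma \;\approx\; \sigma_\Gamma\, \Delta t
\quad\Longrightarrow\quad
\max_\Gamma \Delta S_\Gamma \;\Longleftrightarrow\; \max_\Gamma \sigma_\Gamma .
```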

In short, relaxation processes in isolated systems are in accordance with the MEP principle. An isolated system will develop in such a way as to produce the maximum possible entropy. This is an addition to the second law of thermodynamics. While the second law states only that an isolated system will approach the state of largest entropy, the MEP principle states that the system will approach that state with the maximum possible speed [

The basic assumption of this paper is that the entropy change in relaxation processes obeys the MaxEnt formalism. Therefore, the largest possible increase in entropy is the most probable one. This means that a system evolves with maximum entropy production. In this way, the MEP principle becomes an addition to the second law of thermodynamics. While the second law requires that an isolated system will find itself in the state of maximum entropy, the MEP principle asserts that the system reaches this state as quickly as possible.

The present work was supported by the bilateral research project of Slovenia-Croatia Cooperation in Science and Technology, 2009-2010 and Croatian Ministry of Science grant No. 177-1770495-0476 to DJ.