Special Issue "An Informational Theoretical Approach to the Entropy of Liquids and Solutions"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (30 June 2019).

Special Issue Editor

Prof. Dr. Arieh Ben-Naim
Guest Editor
Department of Physical Chemistry, The Hebrew University of Jerusalem, Edmond J. Safra Campus Givat Ram, Jerusalem 91904, Israel
Interests: theory of liquids and solutions; theories of water and aqueous solutions; theoretical problems in biochemistry and biophysics

Special Issue Information

Dear Colleagues,

The statistical mechanical theory of liquids has lagged far behind the theory of either gases or solids [Ben-Naim (2006)]. Recently, information theory has been used to derive and interpret the entropy of an ideal gas of simple particles (i.e., non-interacting, structureless particles). Starting with Shannon’s measure of information (SMI), one can derive the entropy function of an ideal gas, the same function derived in 1912 by Sackur and Tetrode.
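For reference, the Sackur–Tetrode expression mentioned above may be written, for a monatomic ideal gas of N particles of mass m in a volume V at temperature T, in the common form:

```latex
S = N k_B \left[\ln\!\left(\frac{V}{N}\left(\frac{2\pi m k_B T}{h^2}\right)^{3/2}\right) + \frac{5}{2}\right]
```

In the SMI-based derivation, this expression splits naturally into a locational term, a momentum term, and corrections arising from the uncertainty principle (through h) and from particle indistinguishability (through the division by N!).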

This new derivation of the same entropy function, based on the SMI, has several advantages, as listed in Ben-Naim (2008, 2017). Here we mention two. First, it provides a simple interpretation of the various terms in this entropy function. Second, and more important for our purpose, the derivation may be extended to any system of interacting particles, including liquids and solutions, which is the central topic of this Special Issue of Entropy.

The main idea is that once one adds intermolecular interactions between the particles, one also adds correlations between the particles. These correlations may be cast in terms of mutual information (MI). Hence, we can start with the informational theoretical interpretation of the entropy of an ideal gas, and then add a correction due to correlations in the form of the MI between the locations of the particles. This process preserves the interpretation of the entropy of liquids and solutions as a measure of information (or as an average uncertainty about the locations of the particles).

It is well known that the entropy of a liquid, any liquid for that matter, is lower than the entropy of the corresponding gas. This is manifested in the positive slope of the P(T) coexistence curve in the phase diagram of any substance. Traditionally, this fact is interpreted in terms of order and disorder: the lower entropy of the liquid is attributed to a higher degree of order compared with that of the gas.
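The link between the slope of the coexistence curve and the entropy difference is the Clausius–Clapeyron relation (standard thermodynamics, restated here only for context):

```latex
\frac{dP}{dT}
  = \frac{\Delta S_{\mathrm{vap}}}{\Delta V_{\mathrm{vap}}}
  = \frac{S_{\mathrm{gas}} - S_{\mathrm{liquid}}}{V_{\mathrm{gas}} - V_{\mathrm{liquid}}} > 0
```

Since the gas always occupies the larger volume, the denominator is positive, and a positive slope therefore forces the entropy of the gas to exceed that of the liquid.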

However, unlike the transition from a solid to either a liquid or a gas, where the order–disorder interpretation works well, the same interpretation does not work for the liquid–gas transition. It is hard, if not impossible, to argue that the liquid is more “ordered” than the gaseous phase.

The purpose of this issue is to examine some specific liquids and to compare their entropies with the extent of the intermolecular interactions between their molecules.

One outstanding liquid, known to be a structured liquid, is water. It will be discussed in a paper by Ben-Naim (this issue), along with heavy water and aqueous solutions of simple solutes such as argon and methane.

Authors are invited to contribute articles to this issue of Entropy discussing the entropy of other liquids, either pure substances or mixtures of different components.

Prof. Arieh Ben-Naim
Guest Editor

References

Ben-Naim, A. (2006). A Molecular Theory of Solutions. Oxford University Press: Oxford, UK.

Ben-Naim, A. (2008). A Farewell to Entropy: Statistical Thermodynamics Based on Information. World Scientific: Singapore.

Ben-Naim, A. (2017). Information Theory, Part I: An Introduction to the Fundamental Concept. World Scientific: Singapore.

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, authors can proceed to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Entropy
  • Liquids
  • Solutions
  • Information theory

Published Papers (4 papers)


Editorial


Open Access Editorial
An Informational Theoretical Approach to the Entropy of Liquids and Solutions
Entropy 2018, 20(7), 514; https://doi.org/10.3390/e20070514 - 09 Jul 2018
Cited by 4
Abstract
It is well known that the statistical mechanical theory of liquids has been lagging far behind the theory of either gases or solids; see, for example, Ben-Naim (2006), Fisher (1964), Guggenheim (1952), Hansen and McDonald (1976), Hill (1956), Temperley, Rowlinson and Rushbrooke (1968), and O’Connell (1971). Information theory was recently used to derive and interpret the entropy of an ideal gas of simple particles (i.e., non-interacting and structure-less particles). Starting with Shannon’s measure of information (SMI), one can derive the entropy function of an ideal gas, the same function as derived by Sackur (1911) and Tetrode (1912). This new derivation of the same entropy function, based on the SMI, has several advantages, as listed in Ben-Naim (2008, 2017). Here we mention two: First, it provides a simple interpretation of the various terms in this entropy function. Second, and more important for our purpose, this derivation may be extended to any system of interacting particles, including liquids and solutions. The main idea is that once one adds intermolecular interactions between the particles, one also adds correlations between the particles. These correlations may be cast in terms of mutual information (MI). Hence, we can start with the informational theoretical interpretation of the entropy of an ideal gas. Then, we add a correction due to correlations in the form of MI between the locations of the particles. This process preserves the interpretation of the entropy of liquids and solutions in terms of a measure of information (or as an average uncertainty about the locations of the particles). It is well known that the entropy of liquids, any liquids for that matter, is lower than the entropy of a gas. Traditionally, this fact is interpreted in terms of order–disorder: the lower entropy of the liquid is interpreted in terms of a higher degree of order compared with that of the gas.
However, unlike the transition from a solid to either a liquid or a gaseous phase, where the order–disorder interpretation works well, the same interpretation would not work for the liquid–gas transition. It is hard, if not impossible, to argue that the liquid phase is more “ordered” than the gaseous phase. In this article, we interpret the lower entropy of liquids in terms of the SMI. One outstanding liquid, known to be a structured liquid, is water [Ben-Naim (2009, 2011)]. In addition, heavy water, as well as aqueous solutions of simple solutes such as argon or methane, will be discussed in this article.

Research


Open Access Article
First Principles Calculation of the Entropy of Liquid Aluminum
Entropy 2019, 21(2), 131; https://doi.org/10.3390/e21020131 - 31 Jan 2019
Abstract
The information required to specify a liquid structure equals, in suitable units, its thermodynamic entropy. Hence, an expansion of the entropy in terms of multi-particle correlation functions can be interpreted as a hierarchy of information measures. Utilizing first principles molecular dynamics simulations, we simulate the structure of liquid aluminum to obtain its density, pair and triplet correlation functions, allowing us to approximate the experimentally measured entropy and relate the excess entropy to the information content of the correlation functions. We discuss the accuracy and convergence of the method.
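For context, the leading (two-body) term of the multi-particle correlation expansion referred to in this abstract is commonly written in terms of the pair correlation function g(r); the notation below is the standard textbook form, not quoted from the article:

```latex
\frac{S_2}{N k_B} = -\frac{\rho}{2} \int \left[ g(r)\ln g(r) - g(r) + 1 \right] \mathrm{d}\mathbf{r}
```

Higher-order terms in the hierarchy involve the triplet and higher correlation functions, which is why the simulation extracts both the pair and the triplet functions.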

Open Access Article
Definition and Time Evolution of Correlations in Classical Statistical Mechanics
Entropy 2018, 20(12), 898; https://doi.org/10.3390/e20120898 - 23 Nov 2018
Cited by 2
Abstract
The study of dense gases and liquids requires consideration of the interactions between the particles and the correlations created by these interactions. In this article, the N-variable distribution function which maximizes the uncertainty (Shannon’s information entropy) and admits as marginals a set of (N−1)-variable distribution functions is, by definition, free of N-order correlations. This way of defining correlations is valid for stochastic systems described by discrete or continuous variables, for equilibrium or non-equilibrium states, and correlations of the different orders can be defined and measured. This allows building the grand-canonical expressions of the uncertainty valid for either a dilute or a dense gas system. At equilibrium, for both kinds of systems, the uncertainty becomes identical to the expression of the thermodynamic entropy. Two interesting by-products are also provided by the method: (i) the Kirkwood superposition approximation; (ii) a series of generalized superposition approximations. A theorem on the temporal evolution of the relevant uncertainty for molecular systems governed by two-body forces is proved, and a conjecture closely related to this theorem sheds new light on the origin of the irreversibility of molecular systems. In this respect, the irreplaceable role played by the three-body interactions is highlighted.
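The Kirkwood superposition approximation that appears as a by-product expresses the triplet correlation function as a product of pair correlation functions; in its standard form (assumed here, not quoted from the article):

```latex
g^{(3)}(\mathbf{r}_1,\mathbf{r}_2,\mathbf{r}_3) \approx g^{(2)}(\mathbf{r}_1,\mathbf{r}_2)\, g^{(2)}(\mathbf{r}_1,\mathbf{r}_3)\, g^{(2)}(\mathbf{r}_2,\mathbf{r}_3)
```

The generalized superposition approximations mentioned in the abstract extend this factorization idea to higher-order correlation functions.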
Open Access Article
The Entropy of Deep Eutectic Solvent Formation
Entropy 2018, 20(7), 524; https://doi.org/10.3390/e20070524 - 12 Jul 2018
Cited by 3
Abstract
The standard entropies S°298 of deep eutectic solvents (DESs), which are liquid binary mixtures of a hydrogen bond acceptor component and a hydrogen bond donor component, are calculated from their molecular volumes, derived from their densities or crystal structures. These values are compared with those of the components—pro-rated according to the DES composition—to obtain the standard entropies of DES formation ΔfS. These quantities are positive, due to the increased number and kinds of hydrogen bonds present in the DESs relative to those in the components. The ΔfS values are also compared with the freezing point depressions of the DESs ΔfusT/K, but no general conclusions on their mutual relationship could be drawn.