Table of Contents

Entropy, Volume 10, Issue 1 (March 2008), Pages 1-18

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
Displaying articles 1-3

Research


Open Access Article: An Algorithmic Complexity Interpretation of Lin's Third Law of Information Theory
Entropy 2008, 10(1), 6-14; doi:10.3390/entropy-e10010006
Received: 28 February 2008 / Revised: 16 March 2008 / Accepted: 19 March 2008 / Published: 20 March 2008
Cited by 8 | PDF Full-text (183 KB)
Abstract
Instead of static entropy, we assert that the Kolmogorov complexity of a static structure such as a solid is the proper measure of disorder (or chaoticity). A static structure in a surrounding perfectly random universe acts as an interfering entity that introduces local disruption in randomness. This is modeled by a selection rule R which selects a subsequence of the random input sequence that hits the structure. Through the inequality that relates stochasticity and chaoticity of random binary sequences, we maintain that Lin's notion of stability corresponds to the stability of the frequency of 1s in the selected subsequence. This explains why more complex static structures are less stable. Lin's third law is represented as the inevitable change that static structures undergo toward conforming to the universe's perfect randomness.
(This article belongs to the Special Issue Symmetry and Entropy)
Open Access Article: Gibbs’ Paradox and the Definition of Entropy
Entropy 2008, 10(1), 15-18; doi:10.3390/entropy-e10010015
Received: 10 December 2007 / Accepted: 14 March 2008 / Published: 20 March 2008
Cited by 19 | PDF Full-text (118 KB)
Abstract
Gibbs’ Paradox is shown to arise from an incorrect traditional definition of the entropy that has unfortunately become entrenched in physics textbooks. Among its flaws, the traditional definition predicts a violation of the second law of thermodynamics when applied to colloids. By adopting Boltzmann’s definition of the entropy, the violation of the second law is eliminated, the properties of colloids are correctly predicted, and Gibbs’ Paradox vanishes.
(This article belongs to the Special Issue Gibbs Paradox and Its Resolutions)

Other


Open Access Commentary: Gibbs Paradox and the Concepts of Information, Symmetry, Similarity and Their Relationship
Entropy 2008, 10(1), 1-5; doi:10.3390/entropy-e10010001
Received: 13 March 2008 / Published: 17 March 2008
Cited by 14 | PDF Full-text (90 KB) | HTML Full-text | XML Full-text
Abstract
We are publishing volume 10 of Entropy. When I was a chemistry student I was fascinated by thermodynamic problems, particularly the Gibbs paradox. It has now been more than 10 years since I actively published on this topic [1-4]. During this decade, the globalized Information Society has been developing very quickly based on the Internet, and the term "information" is widely used. But what is information? What is its relationship with entropy and other concepts like symmetry, distinguishability and stability? What is the situation of entropy research in general? As the Editor-in-Chief of Entropy, I feel it is time to offer some comments, present my own opinions on this matter and point out a major flaw in related studies. [...]
(This article belongs to the Special Issue Gibbs Paradox and Its Resolutions)

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
entropy@mdpi.com
Tel.: +41 61 683 77 34
Fax: +41 61 302 89 18
Editorial Board
Contact Details | Submit to Entropy