Entropy, Volume 10, Issue 1 (March 2008) – 3 articles, Pages 1-18

Research

Article
An Algorithmic Complexity Interpretation of Lin's Third Law of Information Theory
by Joel Ratsaby
Entropy 2008, 10(1), 6-14; https://doi.org/10.3390/entropy-e10010006 - 20 Mar 2008
Cited by 12 | Viewed by 10038
Abstract
Instead of static entropy we assert that the Kolmogorov complexity of a static structure, such as a solid, is the proper measure of disorder (or chaoticity). A static structure in a surrounding perfectly-random universe acts as an interfering entity which introduces local disruption in randomness. This is modeled by a selection rule R which selects a subsequence of the random input sequence that hits the structure. Through the inequality that relates stochasticity and chaoticity of random binary sequences, we maintain that Lin’s notion of stability corresponds to the stability of the frequency of 1s in the selected subsequence. This explains why more complex static structures are less stable. Lin’s third law is represented as the inevitable change that static structures undergo toward conforming to the universe’s perfect randomness.
(This article belongs to the Special Issue Symmetry and Entropy)
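The abstract above leans on the stochasticity property of random binary sequences: under an admissible (von Mises/Church-style) selection rule, whose decision to select a bit depends only on the preceding bits, the frequency of 1s in the selected subsequence tracks that of the whole sequence. The following is a minimal illustrative sketch, not code from the paper; the particular rule and parameters are assumptions chosen for demonstration:

```python
import random

random.seed(0)
n = 100_000
bits = [random.randint(0, 1) for _ in range(n)]

# A Church-style selection rule: select each bit that immediately
# follows a 1. The decision for position i uses only the prefix
# bits[:i], never the bit being selected.
selected = [bits[i] for i in range(1, n) if bits[i - 1] == 1]

freq_all = sum(bits) / n
freq_sel = sum(selected) / len(selected)
print(f"whole sequence: {freq_all:.3f}, selected subsequence: {freq_sel:.3f}")
```

For a random input both frequencies stay near 0.5; a selected subsequence whose frequency of 1s drifted away from the whole-sequence value would signal structure, which is the notion of stability the abstract connects to Lin's third law.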

Article
Gibbs’ Paradox and the Definition of Entropy
by Robert H. Swendsen
Entropy 2008, 10(1), 15-18; https://doi.org/10.3390/entropy-e10010015 - 20 Mar 2008
Cited by 39 | Viewed by 16219
Abstract
Gibbs’ Paradox is shown to arise from an incorrect traditional definition of the entropy that has unfortunately become entrenched in physics textbooks. Among its flaws, the traditional definition predicts a violation of the second law of thermodynamics when applied to colloids. By adopting Boltzmann’s definition of the entropy, the violation of the second law is eliminated, the properties of colloids are correctly predicted, and Gibbs’ Paradox vanishes.
(This article belongs to the Special Issue Gibbs Paradox and Its Resolutions)
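The claim can be checked numerically. The sketch below is illustrative, not Swendsen's derivation: it uses only the configurational part of the entropy with k_B = 1, both simplifying assumptions. With the traditional non-extensive S = N ln V, removing the partition between two identical samples of the same gas predicts a spurious mixing entropy of 2N ln 2; with Boltzmann's S = ln(V^N / N!), only an O(ln N) remainder survives, which vanishes per particle in the thermodynamic limit.

```python
from math import lgamma, log

def S_traditional(N, V):
    # Traditional (non-extensive) configurational entropy, k_B = 1
    return N * log(V)

def S_boltzmann(N, V):
    # Boltzmann counting for N identical particles: S = ln(V**N / N!)
    return N * log(V) - lgamma(N + 1)

N, V = 1000, 2.0
# Entropy change on removing the partition between two identical
# boxes of the same gas (each N particles in volume V):
dS_trad = S_traditional(2 * N, 2 * V) - 2 * S_traditional(N, V)
dS_boltz = S_boltzmann(2 * N, 2 * V) - 2 * S_boltzmann(N, V)
print(f"traditional dS = {dS_trad:.1f} (spurious 2N ln 2 = {2 * N * log(2):.1f})")
print(f"Boltzmann   dS = {dS_boltz:.3f} (O(ln N) Stirling remainder)")
```

The N! term is exactly what restores extensivity, which is how the second-law violation mentioned in the abstract is avoided.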

Other

Commentary
Gibbs Paradox and the Concepts of Information, Symmetry, Similarity and Their Relationship
by Shu-Kun Lin
Entropy 2008, 10(1), 1-5; https://doi.org/10.3390/entropy-e10010001 - 17 Mar 2008
Cited by 18 | Viewed by 24262
Abstract
We are publishing volume 10 of Entropy. When I was a chemistry student I was fascinated by thermodynamic problems, particularly the Gibbs paradox. It has now been more than 10 years since I actively published on this topic [1-4]. During this decade, the globalized Information Society has been developing very quickly based on the Internet, and the term information is widely used, but what is information? What is its relationship with entropy and other concepts like symmetry, distinguishability and stability? What is the situation of entropy research in general? As the Editor-in-Chief of Entropy, I feel it is time to offer some comments, present my own opinions in this matter and point out a major flaw in related studies. [...]
(This article belongs to the Special Issue Gibbs Paradox and Its Resolutions)