Special Issue "Entropy Transformations in Nonequilibrium and Other Complex Systems"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Non-equilibrium Phenomena".

Deadline for manuscript submissions: closed (26 April 2022) | Viewed by 3178

Special Issue Editor

Prof. Dr. Vladimir Aristov
Guest Editor
Dorodnicyn Computing Center, Federal Research Center “Computer Science and Control” of Russian Academy of Sciences, 119333 Moscow, Russia
Interests: direct methods for solving the Boltzmann equation; the study of nonlinear, nonclassical effects in nonequilibrium flows; analytical and numerical methods in different areas of gas dynamics and rarefied flows, in particular turbulent phenomena; kinetic methods in statistical physics for the description of dissipative structures

Special Issue Information

Dear Colleagues,

This Special Issue will present current research on entropy transformation in various fields, especially for complex open nonequilibrium systems.
The transformation of entropy (and information) in complex, especially nonequilibrium, systems is still an important problem in kinetic theory, hydrodynamics, nonlinear physics, biology, genetics, neuroscience, etc. Entropy, Lyapunov exponents, and other similar theoretical tools are used in the qualitative and quantitative analysis of the complexity of dynamics. Nonequilibrium and complex close-to-equilibrium processes (including nonequilibrium and particularly unstable flows) attract the attention of many researchers around the world.

The Special Issue focuses on, but is not limited to, research and applied work on complex processes in which significant entropy transformation occurs. The complexity of these systems can be estimated using various definitions of entropy and information, and by theoretical development of these notions. Dissipative structures (open nonequilibrium systems, in the terminology of various authors) can demonstrate fundamentally new physical properties and relationships. Dissipative structures described by kinetic methods may be of particular interest.

Boltzmann’s kinetic theory, including the notion of statistical nonequilibrium entropy and the H-theorem, offers an important new path in the study of complex systems and flows. The Boltzmann and other kinetic equations can be used to simulate processes that deviate significantly from the predictions of nonequilibrium thermodynamics. At the same time, modern kinetic CFD (computational fluid dynamics) methods are able to adequately describe both near-to-equilibrium and far-from-equilibrium flows.
The transformation of entropy, its extremal properties in open systems, and the conditions under which entropy decreases require special analysis. This is interesting from the perspective of the hypothesis that biological structures are strongly nonequilibrium. There is a significant difference in the description of living systems between the use of thermodynamic and statistical definitions of entropy. The formulation of the second law of thermodynamics based on statistical entropy is important because of the nonclassical heat transfer in some nonequilibrium flows.

Other research topics are also welcome, such as informational transformations associated with the structures of the genome and their role in the organization of complex biological organisms; descriptions of the structure of the brain, represented by neuron-like networks, using statistical and kinetic methods; and complexity theory in this and similar areas.
This Issue will accept unpublished original papers, short communications, and appropriate reviews that pertain (but are not restricted) to the following research areas:

  • Development of the theoretical apparatus of entropy and its application for complex systems.
  • Nonequilibrium processes and flows in different media.
  • Comparison of the use of thermodynamic and kinetic approaches for entropy transformations.
  • Dissipative structures and extremes of entropy generation.
  • Methods for the description of nonequilibrium processes in biological systems.
  • Complexity and transformation of information in genetics.
  • Statistical modeling in complex neural networks.
  • Entropy and information in many-particle systems.
  • Transitions between order and disorder, including entropy and information transformations.

Prof. Dr. Vladimir Aristov
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Thermodynamical and statistical entropy
  • Entropy transformation
  • Thermodynamics of nonequilibrium processes
  • Kinetic theory
  • H-function, information and negentropy
  • Dissipative structures
  • Biology
  • Genetics
  • Complex networks

Published Papers (3 papers)


Research

Article
The Use of the Statistical Entropy in Some New Approaches for the Description of Biosystems
Entropy 2022, 24(2), 172; https://doi.org/10.3390/e24020172 - 24 Jan 2022
Cited by 2 | Viewed by 909
Abstract
Some problems of describing biological systems with the use of entropy as a measure of their complexity are considered. Entropy is studied both for the organism as a whole and for its parts, down to the molecular level. Correlation of the actions of various parts of the whole organism, intercellular interactions and control, as well as cooperativity at the microlevel, lead to a more complex structure and lower statistical entropy. For a multicellular organism, the entropy is much lower than that of the same mass of a colony of unicellular organisms. Cooperativity always reduces the entropy of the system; a simple example of ligand binding to a macromolecule carrying two reaction centers shows how entropy is consistent with the ambiguity of the outcome in the Bernoulli trial scheme. Particular attention is paid to the qualitative and quantitative relationship between the entropy of the system and the cooperativity of ligand binding to macromolecules. A kinetic model of metabolism, corresponding to Schrödinger’s concept of the maintenance of biosystems by “negentropy feeding”, is proposed. This model allows the nonequilibrium local entropy to be calculated and compared with the local equilibrium entropy inherent in non-living matter. Full article
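The abstract's claim that cooperativity reduces entropy can be illustrated with a minimal sketch: compare the Shannon entropy of the occupancy states of a two-site macromolecule under independent (Bernoulli) binding and under strongly cooperative binding. The specific cooperative weights below are illustrative assumptions, not taken from the paper.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two reaction centers, each occupied with probability p (Bernoulli trial scheme).
p = 0.5

# Independent (non-cooperative) binding: P(k sites bound) is binomial, k = 0, 1, 2.
independent = [(1 - p) ** 2, 2 * p * (1 - p), p ** 2]

# Strongly cooperative binding: the singly-bound intermediate is suppressed
# (illustrative weights, renormalized to sum to 1; hypothetical values).
weights = [(1 - p) ** 2, 0.02, p ** 2]
cooperative = [w / sum(weights) for w in weights]

h_ind = shannon_entropy(independent)
h_coop = shannon_entropy(cooperative)
assert h_coop < h_ind  # cooperativity lowers the entropy of the occupancy states
```

With fewer accessible intermediate states, the cooperative distribution is more sharply peaked, hence the lower entropy.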
(This article belongs to the Special Issue Entropy Transformations in Nonequilibrium and Other Complex Systems)

Article
Entropy Analysis of Protein Sequences Reveals a Hierarchical Organization
Entropy 2021, 23(12), 1647; https://doi.org/10.3390/e23121647 - 07 Dec 2021
Viewed by 862
Abstract
Background: Analyzing the local sequence content of proteins, we earlier found that amino acid residue frequencies differ at various distances between positions in the sequence, suggesting the existence of structural units. Methods: We used the informational entropy of protein sequences to find that the structural unit of proteins is a block of adjacent amino acid residues—an “information unit”. The ANIS (ANalysis of Informational Structure) method uses these information units to reveal hierarchically organized Elements of the Information Structure (ELIS) in amino acid sequences. Results: The developed mathematical apparatus gives a stable description of the structural unit even under significant variation of the parameters. The optimal length of the information unit is five, and the number of allowed substitutions is one. Examples are given of applying the method to the design of protein molecules, the analysis of intermolecular interactions, and the study of the mechanisms by which protein molecular machines function. Conclusions: The ANIS method makes it possible not only to analyze native proteins but also to design artificial polypeptide chains with a given spatial organization and, possibly, function. Full article
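The underlying quantity — informational entropy over blocks of adjacent residues — can be sketched as follows. This is a simplified illustration of block entropy with the abstract's window length of five, not an implementation of the ANIS method itself.

```python
import math
from collections import Counter

def block_entropy(sequence, k=5):
    """Shannon entropy (bits) of the distribution of length-k blocks
    read along the sequence with a sliding window."""
    blocks = [sequence[i:i + k] for i in range(len(sequence) - k + 1)]
    counts = Counter(blocks)
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A repetitive sequence carries less block entropy than a varied one.
repetitive = "ABCAB" * 20
varied = "ACDEFGHIKLMNPQRSTVWY" * 5  # the 20 amino-acid letters, cycled
assert block_entropy(repetitive) < block_entropy(varied)
```

ANIS additionally allows substitutions within a block (one, per the abstract) and builds a hierarchy of elements on top of such statistics; the sketch covers only the entropy measure.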
(This article belongs to the Special Issue Entropy Transformations in Nonequilibrium and Other Complex Systems)

Article
Influence of Environmental Parameters on the Stability of the DNA Molecule
Entropy 2021, 23(11), 1446; https://doi.org/10.3390/e23111446 - 31 Oct 2021
Cited by 1 | Viewed by 784
Abstract
Fluctuations in viscosity within the cell nucleus have wide limits. When a DNA molecule passes from a region of high viscosity to a region of low viscosity, open states, denaturation bubbles, and unweaving of the DNA strands can occur. The molecule is stabilized by energy dissipation due to interaction with the environment. Separate sections of a DNA molecule in a twisted state can experience supercoiling stress, which is due, among other things, to complex entropic effects caused by interaction with the solvent. In this work, based on the numerical solution of a mechanical mathematical model for the interferon alpha 17 gene and a fragment of a Drosophila gene, we analyzed the influence of the viscosity of the external environment on the dynamics and stability of the DNA molecule. It is shown that an increase in viscosity leads to rapid stabilization of the angular vibrations of the nitrogenous bases, while a decrease in viscosity changes the dynamics of the DNA: the rate of change of the angular deviations of the nitrogenous bases increases, and the angular deformations of the DNA strands grow at each moment of time. These processes lead to DNA instability, which increases with time. Thus, the paper considers the influence of the viscosity of the external environment on the dissipation of the vibrational energy of the DNA nitrogenous bases. Furthermore, studying the molecular dynamics of physiological processes, on the basis of the described model, at different values of the rheological parameters of the nucleoplasm will allow a deeper understanding of the nonequilibrium physics of active matter in a living cell. Full article
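The qualitative effect described — higher viscosity damps the angular vibrations of the bases faster — can be reproduced with a toy damped pendulum model. This is a stand-in sketch with assumed parameters, not the paper's mechanical model of the interferon alpha 17 gene.

```python
import math

def simulate_angle(gamma, omega2=1.0, theta0=0.5, dt=0.001, steps=20000):
    """Integrate a damped angular oscillator theta'' = -omega2*sin(theta) - gamma*theta',
    a toy stand-in for the angular vibration of a nitrogenous base;
    gamma models the viscosity of the surrounding medium.
    Returns the peak angular deviation over the second half of the run."""
    theta, vel = theta0, 0.0
    amplitude = 0.0
    for step in range(steps):
        acc = -omega2 * math.sin(theta) - gamma * vel
        vel += acc * dt
        theta += vel * dt
        if step > steps // 2:
            amplitude = max(amplitude, abs(theta))
    return amplitude

# Higher viscosity (stronger damping) stabilizes the vibrations faster,
# leaving a smaller residual amplitude.
assert simulate_angle(gamma=2.0) < simulate_angle(gamma=0.1)
```

The real model couples many bases along the strands (a sine-Gordon-type chain); the single-oscillator sketch only shows the role of the viscous term.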
(This article belongs to the Special Issue Entropy Transformations in Nonequilibrium and Other Complex Systems)
