Special Issue "Thermodynamics of Information Processing"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: 30 June 2019

Special Issue Editor

Guest Editor
Dr. Susanne Still

Department of Information and Computer Sciences, and Department of Physics and Astronomy, University of Hawaii at Mānoa, Honolulu, HI, USA
Interests: information theory, physics of computation, far-from-equilibrium thermodynamics, machine learning, theoretical biophysics

Special Issue Information

Dear Colleagues,

The energetics of information processing systems, both man-made and natural, are subject to the laws of thermodynamics. The second law of thermodynamics, in particular, limits the thermodynamic efficiency with which information can be processed. This point was first raised in Leo Szilard's discussion of Maxwell's Demon. Later, Rolf Landauer showed that the second law provides a fundamental limit to computing, in that it gives a lower bound on the heat generated when one bit of information is erased. While Szilard's and Landauer's reasoning is based solely on equilibrium thermodynamics, many information processing systems, especially living systems, operate away from thermodynamic equilibrium. Living systems embody particularly interesting information processing capabilities, including the ability to learn about, adapt to, and, to a certain degree, control their environment.

A great deal of recent effort has gone into understanding the fundamental limits for systems operating arbitrarily far from equilibrium, driven in part by experimental advances in manipulating small systems. As technology approaches the nanoscale, increased attention has been directed towards quantum thermodynamics. While quantum effects are believed to play a role in some living systems, biological information processing can largely be understood classically. Ultimately, the hope is that physical limits to information processing will inform general building principles, thus deepening our understanding of living systems, as well as paving the way for a new generation of nano-devices and machine learning methods.
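
For concreteness, Landauer's bound can be written in one line: erasing a single bit of information into a thermal reservoir at temperature T dissipates heat

    Q \ge k_B T \ln 2 \approx 2.87 \times 10^{-21}\,\mathrm{J} \quad \text{at } T = 300\,\mathrm{K},

where k_B is Boltzmann's constant. Every erasure mechanism, however implemented, is held to this floor by the second law.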

This Special Issue seeks contributions advancing our understanding of the thermodynamics of information processing, with a special focus on, but not limited to, living systems and learning. We welcome theoretical as well as experimental work, original research or review papers, and particularly seek high-quality theoretical contributions illuminating the general thermodynamic limits of information processing.

Dr. Susanne Still
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • thermodynamics of information processing
  • far-from-equilibrium thermodynamics
  • stochastic thermodynamics
  • quantum thermodynamics
  • measurement
  • living systems
  • machine learning
  • adaptation
  • predictive inference
  • physics of computation

Published Papers (6 papers)


Research

Open Access Article: Thermodynamics of Majority-Logic Decoding in Information Erasure
Entropy 2019, 21(3), 284; https://doi.org/10.3390/e21030284
Received: 10 January 2019 / Revised: 25 February 2019 / Accepted: 11 March 2019 / Published: 15 March 2019
Abstract
We investigate the performance of majority-logic decoding in both reversible and finite-time information erasure processes performed on macroscopic bits that contain N microscopic binary units. While we show that for reversible erasure protocols single-unit transformations are more efficient than majority-logic decoding, the latter is found to offer several benefits for finite-time erasure processes: both the minimal erasure duration for a given erasure error and the minimal erasure error for a given erasure duration are reduced, compared to a single unit. Remarkably, majority-logic decoding is also more efficient in both the small-erasure-error and fast-erasure regimes. These benefits are also preserved under the optimal erasure protocol that minimizes the dissipated heat. Our work therefore shows that majority-logic decoding can lift the precision-speed-efficiency trade-off in information erasure processes.
(This article belongs to the Special Issue Thermodynamics of Information Processing)
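
As a back-of-the-envelope companion to the abstract (a sketch of ours, not code from the paper), the reliability gain of majority-logic readout can be counted directly: if each of N microscopic units independently ends in the wrong state with probability p, the macroscopic bit is misread only when more than half of the units fail (odd N avoids ties).

    # Illustrative sketch: misread probability of a majority-logic readout
    # over N independent binary units, each wrong with probability p.
    from math import comb

    def majority_error(N: int, p: float) -> float:
        """Probability that more than half of the N units are in the wrong state."""
        return sum(comb(N, k) * p**k * (1 - p)**(N - k)
                   for k in range(N // 2 + 1, N + 1))

    p = 0.1  # hypothetical per-unit erasure error
    for N in (1, 3, 5, 11):
        print(f"N = {N:2d}: majority-readout error = {majority_error(N, p):.2e}")

With p = 0.1, the misread probability falls from 1e-1 for a single unit to roughly 3e-4 for N = 11, the kind of precision gain the trade-off above refers to.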

Open Access Article: A Programmable Mechanical Maxwell’s Demon
Entropy 2019, 21(1), 65; https://doi.org/10.3390/e21010065
Received: 21 November 2018 / Revised: 22 December 2018 / Accepted: 9 January 2019 / Published: 14 January 2019
Abstract
We introduce and investigate a simple and explicitly mechanical model of Maxwell’s demon—a device that interacts with a memory register (a stream of bits), a thermal reservoir (an ideal gas) and a work reservoir (a mass that can be lifted or lowered). Our device is similar to one that we have briefly described elsewhere, but it has the additional feature that it can be programmed to recognize a chosen reference sequence, for instance, the binary representation of π. If the bits in the memory register match those of the reference sequence, then the device extracts heat from the thermal reservoir and converts it into work to lift a small mass. Conversely, the device can operate as a generalized Landauer’s eraser (or copier), harnessing the energy of a dropping mass to write the chosen reference sequence onto the memory register, replacing whatever information may previously have been stored there. Our model can be interpreted either as a machine that autonomously performs a conversion between information and energy, or else as a feedback-controlled device that is operated by an external agent. We derive generalized second laws of thermodynamics for both pictures. We illustrate our model with numerical simulations, as well as analytical calculations in a particular, exactly solvable limit.
(This article belongs to the Special Issue Thermodynamics of Information Processing)
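
The information-to-work exchange rate behind such devices can be illustrated with a toy ledger (our illustration under idealized Szilard/Landauer assumptions, not the authors' mechanical model): a register bit that matches the reference sequence is effectively known to the device, and a known bit can be traded for at most k_B T ln 2 of work, while resetting an unknown bit to its reference value costs at least that much.

    # Toy ledger (illustrative only): Szilard/Landauer exchange rate between
    # register bits and work at temperature T. Not the paper's mechanical model.
    import math
    import random

    kB, T = 1.380649e-23, 300.0           # Boltzmann constant (J/K), temperature (K)
    W_bit = kB * T * math.log(2)          # ~2.87e-21 J per bit

    random.seed(0)
    reference = [0, 1, 0, 0, 1, 1, 0, 1]                  # chosen reference sequence
    incoming = [random.randint(0, 1) for _ in reference]  # bits on the memory register

    matches = sum(r == b for r, b in zip(reference, incoming))

    # Engine mode: each matched (known) bit is worth at most W_bit of extracted work.
    print(f"engine mode: {matches} matched bits -> work out <= {matches * W_bit:.2e} J")
    # Eraser mode: overwriting a fully random register with the reference removes
    # one bit of entropy per symbol, costing at least W_bit per symbol.
    print(f"eraser mode: {len(incoming)} bits rewritten -> work in >= {len(incoming) * W_bit:.2e} J")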

Open Access Article: Analysis of Heat Dissipation and Reliability in Information Erasure: A Gaussian Mixture Approach
Entropy 2018, 20(10), 749; https://doi.org/10.3390/e20100749
Received: 18 June 2018 / Revised: 20 September 2018 / Accepted: 28 September 2018 / Published: 30 September 2018
Abstract
This article analyzes the effect of imperfections in physically realizable memory. Motivated by the realization of a bit as a Brownian particle within a double-well potential, we investigate the energetics of an erasure protocol under a Gaussian mixture model. We obtain sharp quantitative entropy bounds that not only give rigorous justification for heuristics utilized in prior works, but also provide a guide toward the minimal scale at which an erasure protocol can be performed. We also compare the results obtained with the mean escape times from double wells to ensure reliability of the memory. The article quantifies the effect of the overlap of two Gaussians on the loss of interpretability of the state of a one-bit memory, the required heat dissipation in partially successful erasures, and the reliability of information stored in a memory bit.
(This article belongs to the Special Issue Thermodynamics of Information Processing)
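
To make the role of Gaussian overlap concrete (an illustrative sketch of ours, not the paper's derivation): model the two memory states as Gaussians centered at +/- d/2 with common width sigma, read the bit out by thresholding at zero, and the misinterpretation probability is the tail weight that crosses the threshold.

    # Illustrative sketch: readout error of a one-bit memory modeled as a
    # mixture of two Gaussians at +/- d/2 with width sigma, thresholded at 0.
    import math

    def readout_error(d: float, sigma: float) -> float:
        """Tail probability that a state centered at +d/2 is read as the other state."""
        return 0.5 * math.erfc(d / (2 * math.sqrt(2) * sigma))

    sigma = 1.0
    for d in (1.0, 2.0, 4.0, 8.0):  # hypothetical well separations
        print(f"d/sigma = {d:.0f}: readout error = {readout_error(d, sigma):.3e}")

Shrinking d relative to sigma trades interpretability (and erasure reliability) against the scale of the device, which is the regime the entropy bounds above quantify.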

Open Access Article: Energy Dissipation and Information Flow in Coupled Markovian Systems
Entropy 2018, 20(9), 707; https://doi.org/10.3390/e20090707
Received: 30 June 2018 / Revised: 11 September 2018 / Accepted: 13 September 2018 / Published: 14 September 2018
Abstract
A stochastic system under the influence of a stochastic environment is correlated with both present and future states of the environment. Such a system can be seen as implicitly implementing a predictive model of future environmental states. The non-predictive model complexity has been shown to lower-bound the thermodynamic dissipation. Here we explore these statistical and physical quantities at steady state in simple models. We show that under quasi-static driving this model complexity saturates the dissipation. Beyond the quasi-static limit, we demonstrate a lower bound on the ratio of this model complexity to total dissipation, which is realized in the limit of weak driving.
(This article belongs to the Special Issue Thermodynamics of Information Processing)
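
For readers unfamiliar with the bound the abstract invokes, a standard rendering from the thermodynamics-of-prediction literature (our notation, not necessarily the paper's) is

    \beta \, \langle W_{\mathrm{diss}} \rangle \ \ge \ I[s_t; x_t] - I[s_t; x_{t+1}],

where s_t is the system state, x_t the environmental driving signal, and \beta = 1/k_B T: dissipation is bounded below by the nonpredictive part of the system's instantaneous memory, that is, the information it holds about the present signal that carries no predictive value for the next one.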

Open Access Article: Writing, Proofreading and Editing in Information Theory
Entropy 2018, 20(5), 368; https://doi.org/10.3390/e20050368
Received: 5 April 2018 / Revised: 4 May 2018 / Accepted: 12 May 2018 / Published: 15 May 2018
Abstract
Information is a physical entity amenable to description by an abstract theory. The concepts associated with the creation and post-processing of information have not, however, been mathematically established, despite being broadly used in many fields of knowledge. Here, inspired by how information is managed in biomolecular systems, we introduce writing (entailing any bit-string generation) and revision (comprising proofreading and editing) in information chains. Our formalism expands the thermodynamic analysis of stochastic chains made up of material subunits to abstract strings of symbols. We introduce a non-Markovian treatment of operational rules over the symbols of the chain that parallels the physical interactions responsible for memory effects in material chains. Our theory underlies any communication system, ranging from human languages and computer science to gene evolution.
(This article belongs to the Special Issue Thermodynamics of Information Processing)

Open Access Article: Minimum and Maximum Entropy Distributions for Binary Systems with Known Means and Pairwise Correlations
Entropy 2017, 19(8), 427; https://doi.org/10.3390/e19080427
Received: 27 June 2017 / Revised: 8 August 2017 / Accepted: 18 August 2017 / Published: 21 August 2017
Abstract
Maximum entropy models are increasingly being used to describe the collective activity of neural populations with measured mean neural activities and pairwise correlations, but the full space of probability distributions consistent with these constraints has not been explored. We provide upper and lower bounds on the entropy for the minimum entropy distribution over arbitrarily large collections of binary units with any fixed set of mean values and pairwise correlations. We also construct specific low-entropy distributions for several relevant cases. Surprisingly, the minimum entropy solution has entropy scaling logarithmically with system size for any set of first- and second-order statistics consistent with arbitrarily large systems. We further demonstrate that some sets of these low-order statistics can only be realized by small systems. Our results show how only small amounts of randomness are needed to mimic low-order statistical properties of highly entropic distributions, and we discuss some applications for engineered and biological information transmission systems.
(This article belongs to the Special Issue Thermodynamics of Information Processing)
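
As a hands-on companion (our sketch with made-up target statistics, not the paper's code), the maximum-entropy end of the range studied here can be computed directly for a small binary system: fit the pairwise exponential-family (Ising-type) model to the prescribed means and correlations by dual ascent, then read off its entropy.

    # Illustrative sketch: brute-force maximum-entropy fit for n binary units
    # with prescribed means <x_i> and pairwise correlations <x_i x_j>.
    import itertools
    import numpy as np

    n = 3
    states = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)

    target_means = np.array([0.5, 0.5, 0.5])     # hypothetical targets
    pairs = [(0, 1), (0, 2), (1, 2)]
    target_corrs = np.array([0.3, 0.3, 0.3])

    def maxent_distribution(h, J):
        """p(x) proportional to exp(sum_i h_i x_i + sum_{i<j} J_ij x_i x_j)."""
        logits = states @ h
        for k, (i, j) in enumerate(pairs):
            logits += J[k] * states[:, i] * states[:, j]
        p = np.exp(logits - logits.max())  # subtract max for numerical stability
        return p / p.sum()

    h, J = np.zeros(n), np.zeros(len(pairs))
    for _ in range(20000):  # dual ascent: nudge parameters until moments match
        p = maxent_distribution(h, J)
        h += 0.2 * (target_means - p @ states)
        corrs = np.array([p @ (states[:, i] * states[:, j]) for i, j in pairs])
        J += 0.2 * (target_corrs - corrs)

    p = maxent_distribution(h, J)
    print(f"maximum-entropy value: {-(p @ np.log2(p)):.4f} bits")

The paper's point is the other end of the range: the minimum-entropy distribution consistent with the same constraints can sit far below this value, scaling only logarithmically with system size.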
