Generalized Landauer Bound for Information Processing: Proof and Applications
Abstract
1. Introduction
2. Generalized Landauer Bound and Applications
2.1. Setup
2.2. The Generalized Landauer Bound
- Conditional Energy Bound: a lower bound on the energy change in the environment, averaged over the M processes, for the case in which the process applied is conditioned on the initial encoding state (see Note 6); a schematic form is sketched after this list.
- Generalized Landauer Bound: the corresponding unconditional lower bound on the average energy change in the environment (see Section 3.2 and Note 8).
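For orientation only, a schematic rendering of the conditional bound is given below. It is consistent with Note 7 (which identifies the averaged quantities as the self-entropy changes of the M individual encoding states) and with Partovi-type bounds of the kind proved in Ref. [14] for an environment initially in thermal equilibrium at temperature T and initially uncorrelated with the system; the symbols $p_i$, $\hat{\rho}_i$, and $\hat{\rho}'_i$ are supplied here for illustration, and the precise statements, including the unconditional (GLB) counterpart, are those proved in Section 3:

\[
\langle \Delta E_{\mathcal{E}} \rangle \;\geq\; -\,k_{B} T \sum_{i=1}^{M} p_{i}\, \Delta S_{i},
\qquad
\Delta S_{i} \equiv S(\hat{\rho}'_{i}) - S(\hat{\rho}_{i}),
\]

where the $p_i$ are the relative frequencies of the M processes, $S(\cdot)$ is the von Neumann entropy (in nats), and $\hat{\rho}_i$ and $\hat{\rho}'_i$ are the initial and final states of the i-th encoding.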
2.3. Specialization: Energy Bounds for Landauer Erasure and Logical Operations
2.3.1. Landauer Erasure
- Landauer Bound for Erasure (General Encodings): a lower bound on the average energy change in the environment for an M-input Landauer erasure operation, weighted by the relative frequencies with which the inputs are encoded, valid for general (not necessarily distinguishable) encoding states.
- Landauer Bound for Erasure (Distinguishable Encodings): the corresponding lower bound for mutually distinguishable encoding states; the familiar single-bit special case is recorded below.
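As a point of reference, the familiar special case is worth recording. For erasure of a single bit encoded in two distinguishable, equiprobable states, with the environment a thermal reservoir at temperature T, a bound of this kind presumably reduces to Landauer's original result [1]:

\[
\langle \Delta E_{\mathcal{E}} \rangle \;\geq\; k_{B} T \ln 2 \;\approx\; 2.9 \times 10^{-21}~\mathrm{J} \;\approx\; 0.018~\mathrm{eV} \quad \text{at } T = 300~\mathrm{K},
\]

and, for M equiprobable and mutually distinguishable inputs, to $k_{B} T \ln M$.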
2.3.2. Logically Irreversible Operations
- Landauer Bound for Logical Operations (Distinguishable Encodings): a lower bound on the average energy change in the environment for an M-input, N-output logical operation with distinguishable encodings, weighted by the relative frequencies with which the inputs are encoded; a heuristic form is sketched below.
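Read together with Note 12, the bound for deterministic logical operations with distinguishable encodings can be expected to take the familiar information-loss form sketched below; this is a heuristic rendering for orientation, not the precise statement proved in the paper:

\[
\langle \Delta E_{\mathcal{E}} \rangle \;\geq\; k_{B} T \ln 2 \; H(X|Y) \;=\; k_{B} T \ln 2 \,\bigl[ H(X) - H(Y) \bigr],
\]

where X and Y are the random variables describing the operation's inputs and outputs, $H(\cdot)$ is the Shannon entropy in bits, and the second equality holds for deterministic operations (Note 12). For a logically reversible operation $H(X|Y) = 0$ and the right-hand side vanishes.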
3. Trial-Averaging Proof of the GLB
3.1. Proof of the Conditional Energy Bound
3.2. Proof of the GLB (or Unconditional Energy Bound)
4. Summary
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Landauer, R. Irreversibility and heat generation in the computing process. IBM J. Res. Dev. 1961, 5, 183–191.
- Leff, H.S.; Rex, A.F. (Eds.) Maxwell’s Demon 2: Entropy, Classical and Quantum Information, Computing; Institute of Physics Publishing: Bristol, UK, 2003.
- Lent, C.S.; Orlov, A.O.; Porod, W.; Snider, G.L. (Eds.) Energy Limits in Computation: A Review of Landauer’s Principle, Theory and Experiments; Springer Nature: Cham, Switzerland, 2019.
- Maroney, O. Information Processing and Thermodynamic Entropy. Stanford Encyclopedia of Philosophy, Fall 2009 ed.; Zalta, E.N., Ed.; Metaphysics Research Lab, Stanford University: Stanford, CA, USA, 2009. Available online: https://plato.stanford.edu/cgi-bin/encyclopedia/archinfo.cgi?entry=information-entropy (accessed on 10 August 2022).
- Anderson, N.G. Conditional Erasure and the Landauer Limit. In Energy Limits in Computation: A Review of Landauer’s Principle, Theory and Experiments; Lent, C.S., Orlov, A.O., Porod, W., Snider, G.L., Eds.; Springer Nature: Cham, Switzerland, 2019; pp. 65–100.
- Norton, J.D. Waiting for Landauer. Stud. Hist. Philos. Mod. Phys. 2011, 42, 184–198.
- Bennett, C.H. Logical reversibility of computation. IBM J. Res. Dev. 1973, 17, 525–532.
- Frank, M.P.; Shukla, K. Quantum foundations of classical reversible computing. Entropy 2021, 23, 701.
- Lent, C.S.; Liu, M.; Lu, Y. Bennett clocking of quantum-dot cellular automata and the limits to binary logic scaling. Nanotechnology 2006, 17, 4240.
- Holevo, A.S. Bounds for the quantity of information transmitted by a quantum communication channel. Probl. Peredachi Informatsii 1973, 9, 3–11.
- Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
- Ladyman, J.; Robertson, K. Landauer defended: Reply to Norton. Stud. Hist. Philos. Mod. Phys. 2013, 44, 263–271.
- Lopez-Suarez, M.; Neri, I.; Gammaitoni, L. Sub-kBT micro-electromechanical irreversible logic gate. Nat. Commun. 2016, 7, 12068.
- Partovi, M.H. Quantum thermodynamics. Phys. Lett. A 1989, 137, 440–444.
- Norton, J.D. Eaters of the lotus: Landauer’s principle and the return of Maxwell’s demon. Stud. Hist. Philos. Mod. Phys. 2005, 36, 375–411.
- Myrvold, W.C. Shakin’ All Over: Proving Landauer’s Principle without Neglect of Fluctuations. Br. J. Philos. Sci. 2021, forthcoming.
- Anderson, N.G. Information erasure in quantum systems. Phys. Lett. A 2008, 372, 5552–5555.
- Anderson, N.G. Landauer’s limit and the physicality of information. Eur. Phys. J. B 2018, 91, 156.
- Anderson, N.G. Information as a physical quantity. Inf. Sci. 2017, 415–416, 397–413.
- Bedingham, D.J.; Maroney, O.J.E. The thermodynamic cost of quantum operations. New J. Phys. 2016, 18, 113050.
- Anderson, N.G. On the physical implementation of logical transformations: Generalized L-machines. Theor. Comp. Sci. 2010, 411, 4179–4199.
Notes
2. Five examples of exchanges from the literature, spanning more than three decades, are collected in Ref. [5] (as Refs. [4]–[25] of that work).
4. Such scenarios do not, however, accommodate systems like standard CMOS implementations of logic gates, in which physical encodings of the logical inputs persist in part of the system as the output states are generated and rendered in another part—i.e., in which the inputs are not physically overwritten by the outputs—and the outputs persist only as long as the inputs are retained. This is not to suggest that there is no “Landauer cost” to information processing in standard CMOS; it is only to say that a formal proof of Landauer’s bound in such a setting would require modification of the present formalism (not to mention inclusion of additional features such as external particle sources and sinks). Along with these additional formal complexities, any such proof would face the very same fundamental issues that arise—and that we seek to illuminate—within the simpler scenarios considered in this paper and in most studies of Landauer’s Principle.
5. The transformation of the local state of the system resulting from this process, which generally is not unitary (unlike the evolution of the global state of the system and environment together), is described by a quantum operation, or completely positive trace-preserving map.
6. The adjective “conditional” recognizes that the process potential is conditioned upon the initial state, i.e., that the potential applied during the process depends upon which of the M encoding states the system is initially prepared in.
7. The entropy differences that are averaged over in the bound (8) are changes in the self-entropies of the M individual encoding states—not any entropic measure associated with information encoded in the M-ary ensemble of encoding states. The individual entropy changes can be positive, negative, or zero—as can their ensemble average—whether information is gained, lost, or preserved in the process.
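As a numerical illustration of Note 7, the sketch below (Python/NumPy, with arbitrarily chosen qubit encoding states that are not taken from the paper) computes the self-entropy change of each encoding state and their frequency-weighted average, showing that the individual changes and their average need not all have the same sign.

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -Tr[rho ln rho], in nats."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-np.sum(evals * np.log(evals)))

# Hypothetical M = 2 qubit encoding states (illustrative only).
rho_initial = [np.diag([1.0, 0.0]),       # pure encoding state |0><0|
               np.diag([0.5, 0.5])]       # maximally mixed encoding state
rho_final   = [np.diag([1.0, 0.0]),       # unchanged by its (conditioned) process
               np.diag([0.9, 0.1])]       # partially purified by its process
p = [0.5, 0.5]                            # relative frequencies of the two processes

dS = [von_neumann_entropy(rf) - von_neumann_entropy(ri)
      for ri, rf in zip(rho_initial, rho_final)]
avg_dS = sum(pi * dSi for pi, dSi in zip(p, dS))

print("Individual self-entropy changes (nats):", np.round(dS, 4))
print("Frequency-weighted average (nats):     ", round(avg_dS, 4))
# The first state is unchanged (dS = 0) and the second is partially purified
# (dS < 0), so the average is negative here; in general the individual
# changes and their average can take either sign.
```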
8. We note as an aside that, in conventional information processing devices, the primitive operations are performed unconditionally. In the reversible computing paradigm, however, elementary operations are performed conditionally.
9. One may naturally wonder whether this external copy could be avoided by including, at an initial stage of the process, a measurement that would determine the initial state, and then acting accordingly. This does not solve the problem, however, as the outcome of the measurement must be rendered in a physical system outside of the device on which the measurement is performed, which itself constitutes creation of an external copy. (See the exchange between Norton [6] and Ladyman and Robertson [12] on this issue, and [5] for related remarks.) This solution is problematic for other reasons as well, including the possible indistinguishability of the initial states by any measurement for some ensembles and, in some quantum settings, unavoidable modification of the initial states by the measurement process.
10. To be clear, M and N here are the respective numbers of distinct inputs and outputs of the information processing operation—not the numbers of digits used to represent the inputs and outputs in any particular number system. Consider, for example, the Boolean function implemented by a binary full adder, which has a 3-bit input and a 2-bit output. For this operation, the number of distinct inputs (or input vectors) is M = 2^3 = 8 and the number of distinct outputs (or output vectors) is N = 2^2 = 4. Note more generally that N ≤ 4 for Boolean operations with 2 output bits, since some such operations—unlike the full adder—do not generate outputs corresponding to all possible combinations of two bit values.
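A short script makes Note 10 concrete by enumerating the full adder's input vectors and collecting its distinct output vectors (the helper full_adder and the code as a whole are illustrative, not taken from the paper):

```python
from itertools import product

def full_adder(a, b, cin):
    """Return the (carry, sum) output bits of a 1-bit full adder."""
    total = a + b + cin
    return (total >> 1, total & 1)

inputs = list(product((0, 1), repeat=3))            # all 3-bit input vectors
outputs = {full_adder(*bits) for bits in inputs}    # distinct 2-bit output vectors

M, N = len(inputs), len(outputs)
print(f"M = {M} distinct inputs, N = {N} distinct outputs")   # M = 8, N = 4
# For other Boolean operations with 2 output bits, N can be smaller than 4,
# since not every combination of the two output bits need be produced.
```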
11. Logically reversible operations include noiseless communication and memory.
12. The conditional entropy H(X|Y), which is the Shannon information in X that is not in Y, or the input information that is lost in the input-output mapping from X to Y, generally is not equal to the difference between the Shannon entropies of the correlated random variables X and Y. In general, they are related as H(X|Y) = H(X,Y) − H(Y), where H(X,Y) is the joint entropy of X and Y. However, for the mappings that describe the input-output relations of deterministic logical operations, H(X,Y) = H(X), so H(X|Y) = H(X) − H(Y).
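The identities in Note 12 are easy to check numerically. The sketch below (illustrative code; the two-input AND gate and the equiprobable input distribution are chosen for the example, not taken from the paper) computes H(X), H(Y), H(X,Y), and H(X|Y) for a deterministic operation and confirms that H(X|Y) = H(X,Y) − H(Y) = H(X) − H(Y).

```python
from collections import Counter
from itertools import product
from math import log2

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Deterministic logical operation: two-input AND, with equiprobable inputs.
xs = list(product((0, 1), repeat=2))      # X: four equiprobable input vectors
p_x = 1.0 / len(xs)

joint = Counter()                         # joint distribution p(x, y)
for x in xs:
    y = x[0] & x[1]
    joint[(x, y)] += p_x

p_y = Counter()                           # marginal distribution p(y)
for (x, y), prob in joint.items():
    p_y[y] += prob

H_X  = shannon_entropy([p_x] * len(xs))   # H(X) = 2 bits
H_Y  = shannon_entropy(p_y.values())      # H(Y) ~ 0.811 bits
H_XY = shannon_entropy(joint.values())    # H(X,Y) = H(X) for a deterministic map

H_X_given_Y = H_XY - H_Y                  # H(X|Y) = H(X,Y) - H(Y)
print(f"H(X|Y) = {H_X_given_Y:.4f} bits; H(X) - H(Y) = {H_X - H_Y:.4f} bits")
```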
13. Note that in Ref. [5] there are mathematical typesetting errors in Equations (9), (11) and (12), in the unnumbered equations appearing before Equation (4) and between Equations (11) and (12), and in the inline equation appearing immediately after Equation (12). These typos, which did not affect the results of that work, have been corrected in all analogous expressions appearing in this work. The author thanks Sam Fletcher for bringing these errors to his attention.
14. Hence our terminological preference for “Landauer bound” over “Landauer limit” in this work. While they are used interchangeably, the latter seems more likely to carry a connotation of achievability.
15. Effective confinement of physical states to designated state subspaces is essential for reliable encoding of information in system states, but absolute confinement for all time is not required for reliable processing of encoded information. Rather, reliable information processing requires barriers high enough to confine system states to designated subspaces with exceedingly high probability over timescales that matter for the processing and intermediate retention of information. Indeed, the reliable encoding and retention of information in conventional computing devices is achieved via confinement of electrons to selected spatial regions (and thus state subspaces) by barriers that are penetrable—but that are “sufficiently impenetrable”.
16. Note that this paper does not address concerns related to the neglect of thermal fluctuations in proofs of Landauer’s Principle—another prominent point of contention—as the extent to which such concerns impact the present approach is unclear. A recent summary of such concerns, and a statistical mechanical treatment of Landauer’s Principle that aims to address them, can be found in [16].
17. These include the “referential approach”—see Equation (18) of [17] (with Equation (20) of that work) and Equation (18) of [18] (with Equations (1) and (6) of that work)—which is this author’s preferred approach to obtaining dissipation bounds in information processing scenarios for reasons described in Refs. [18,19].
19. For example, the bound (12) for erasure with distinguishable encoding states appeared as Equation (34) in Ref. [17], and the bound (16) for ideal logical operations appeared as Equation (68) in Ref. [21] (appropriately specialized to ideal operations, with the efficacy measures fixed at unity) and as Equation (11) in Ref. [5]. However, in these cases and all others of which we are aware—with the exception of [5] (the precursor to this work)—the now-controversial use of surrogate density operators to represent ensembles of states in the determination of dissipation bounds was employed and presumed to be valid.