Open Access Article
Entropy 2017, 19(6), 273; doi:10.3390/e19060273

Multiscale Information Theory and the Marginal Utility of Information

1. Department of Mathematics, Emmanuel College, Boston, MA 02115, USA
2. Program for Evolutionary Dynamics, Harvard University, Cambridge, MA 02138, USA
3. Department of Physics, University of Massachusetts-Boston, Boston, MA 02125, USA
4. New England Complex Systems Institute, Cambridge, MA 02139, USA
* Author to whom correspondence should be addressed.
Received: 28 February 2017 / Revised: 26 May 2017 / Accepted: 9 June 2017 / Published: 13 June 2017
(This article belongs to the Special Issue Complexity, Criticality and Computation (C³))

Abstract

Complex systems display behavior at a range of scales. Large-scale behaviors can emerge from the correlated or dependent behavior of individual small-scale components. To capture this observation in a rigorous and general way, we introduce a formalism for multiscale information theory. Dependent behavior among system components results in overlapping or shared information. A system’s structure is revealed in the sharing of information across the system’s dependencies, each of which has an associated scale. Counting information according to its scale yields the quantity of scale-weighted information, which is conserved when a system is reorganized. In the interest of flexibility we allow information to be quantified using any function that satisfies two basic axioms. Shannon information and vector space dimension are examples. We discuss two quantitative indices that summarize system structure: an existing index, the complexity profile, and a new index, the marginal utility of information. Using simple examples, we show how these indices capture the multiscale structure of complex systems in a quantitative way.
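The two ideas the abstract highlights, information counted by scale and the complexity profile, can be illustrated in the simplest case of two components, where the profile reduces to C(1) = H(X0, X1) (all information present in the system) and C(2) = I(X0; X1) (information shared by both components), and summing C(k) over scales recovers H(X0) + H(X1), the conservation of scale-weighted information mentioned above. The sketch below is an illustrative assumption, not the paper's general formalism: it uses Shannon entropy as the information function and a made-up joint distribution for two correlated bits.

```python
# Minimal sketch (illustrative, not from the paper): the two-component
# complexity profile with Shannon entropy as the information function.
# The joint distribution below is an invented toy example.
import math

def shannon_entropy(joint, variables):
    """Shannon entropy (bits) of the marginal over `variables`,
    where `joint` maps outcome tuples to probabilities."""
    marginal = {}
    for outcome, p in joint.items():
        key = tuple(outcome[v] for v in variables)
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marginal.values() if p > 0)

# Toy system: two partially correlated bits X0 and X1.
joint = {(0, 0): 0.4, (1, 1): 0.4, (0, 1): 0.1, (1, 0): 0.1}

h0 = shannon_entropy(joint, [0])      # H(X0)
h1 = shannon_entropy(joint, [1])      # H(X1)
h01 = shannon_entropy(joint, [0, 1])  # H(X0, X1)
mutual = h0 + h1 - h01                # I(X0; X1), shared information

# Two-component complexity profile:
#   C(1) = information at scale >= 1 = H(X0, X1)
#   C(2) = information shared by both components = I(X0; X1)
C = {1: h01, 2: mutual}

# Conservation check: summing the profile over scales recovers the
# total of the individual component entropies.
assert abs(sum(C.values()) - (h0 + h1)) < 1e-9
print(f"C(1) = {C[1]:.3f} bits, C(2) = {C[2]:.3f} bits")
```

For larger systems the paper assigns information to scales across all of the system's dependencies; the two-variable case above is only the base case, included here to make the conservation property concrete.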
Keywords: complexity; complex systems; entropy; information; scale
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Allen, B.; Stacey, B.C.; Bar-Yam, Y. Multiscale Information Theory and the Marginal Utility of Information. Entropy 2017, 19, 273.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.