
Table of Contents

Information, Volume 3, Issue 2 (June 2012), Pages 175-255

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view the papers in PDF format, click on the "PDF Full-text" link and open them with the free Adobe Reader.

Research

Jump to: Other

Open Access Article Beyond Bayes: On the Need for a Unified and Jaynesian Definition of Probability and Information within Neuroscience
Information 2012, 3(2), 175-203; doi:10.3390/info3020175
Received: 9 December 2011 / Revised: 3 March 2012 / Accepted: 9 April 2012 / Published: 20 April 2012
Cited by 10 | PDF Full-text (403 KB) | HTML Full-text | XML Full-text
Abstract
It has been proposed that the general function of the brain is inference, which corresponds quantitatively to the minimization of uncertainty (or the maximization of information). However, there has been a lack of clarity about exactly what this means. Efforts to quantify information have been in agreement that it depends on probabilities (through Shannon entropy), but there has long been a dispute about the definition of probabilities themselves. The “frequentist” view is that probabilities are (or can be) essentially equivalent to frequencies, and that they are therefore properties of a physical system, independent of any observer of the system. E.T. Jaynes developed the alternate “Bayesian” definition, in which probabilities are always conditional on a state of knowledge through the rules of logic, as expressed in the maximum entropy principle. In doing so, Jaynes and others provided the objective means for deriving probabilities, as well as a unified account of information and logic (knowledge and reason). However, neuroscience literature virtually never specifies any definition of probability, nor does it acknowledge any dispute concerning the definition. Although there has recently been tremendous interest in Bayesian approaches to the brain, even in the Bayesian literature it is common to find probabilities that are purported to come directly and unconditionally from frequencies. As a result, scientists have mistakenly attributed their own information to the neural systems they study. Here I argue that the adoption of a strictly Jaynesian approach will prevent such errors and will provide us with the philosophical and mathematical framework that is needed to understand the general function of the brain. Accordingly, our challenge becomes the identification of the biophysical basis of Jaynesian information and logic. 
I begin to address this issue by suggesting how we might identify a probability distribution over states of one physical system (an “object”) conditional only on the biophysical state of another physical system (an “observer”). The primary purpose in doing so is not to characterize information and inference in exquisite, quantitative detail, but to be as clear and precise as possible about what it means to perform inference and how the biophysics of the brain could achieve this goal. Full article
(This article belongs to the Special Issue Information and Energy/Matter)
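The abstract's appeal to Shannon entropy and Jaynes's maximum entropy principle can be illustrated with a minimal sketch (not drawn from the article itself; the distributions below are invented for illustration): with no constraints beyond normalization, the maximum entropy distribution over a finite set of outcomes is the uniform one, and any other assignment of probabilities encodes additional knowledge.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Jaynes's maximum entropy principle, with no constraint beyond
# normalization: the uniform distribution maximizes entropy, so it
# is the least-committal assignment given total ignorance.
uniform = [0.25] * 4           # maximal uncertainty over 4 outcomes
skewed = [0.7, 0.1, 0.1, 0.1]  # reflects extra knowledge, lower entropy

print(shannon_entropy(uniform))  # → 2.0 bits
print(shannon_entropy(skewed))   # → ~1.357 bits
```

Here "information" in the abstract's sense is the reduction in entropy relative to the maximally uncertain (uniform) distribution, and it is conditional on the observer's state of knowledge rather than a property of the system alone.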

Open Access Article Physical Computation as Dynamics of Form that Glues Everything Together
Information 2012, 3(2), 204-218; doi:10.3390/info3020204
Received: 1 March 2012 / Revised: 14 April 2012 / Accepted: 18 April 2012 / Published: 26 April 2012
Cited by 7 | PDF Full-text (284 KB) | HTML Full-text | XML Full-text
Abstract
A framework is proposed in which matter relates to energy as structure relates to process and as information relates to computation. In this scheme, matter corresponds to structure, which corresponds to information. Energy corresponds to the ability to carry out a process, which corresponds to computation. The relationship between the two complementary parts of each dichotomous pair (matter/energy, structure/process, information/computation) is analogous to the relationship between being and becoming, where being is the persistence of an existing structure and becoming is the emergence of a new structure through the process of interactions. This approach presents a unified view built on two fundamental ontological categories: information and computation. Conceptualizing the physical world as an intricate tapestry of protoinformation networks evolving through processes of natural computation helps build more coherent models of nature, connecting the non-living and living worlds. It presents a suitable basis for incorporating current developments in the understanding of biological/cognitive/social systems as generated by the complexification of physicochemical processes through the self-organization of molecules into dynamic adaptive complex systems by morphogenesis, adaptation and learning, all of which are understood as information processing. Full article
(This article belongs to the Special Issue Information: Its Different Modes and Its Relation to Meaning)
Open Access Article Information and Physics
Information 2012, 3(2), 219-223; doi:10.3390/info3020219
Received: 6 March 2012 / Revised: 9 May 2012 / Accepted: 10 May 2012 / Published: 11 May 2012
Cited by 1 | PDF Full-text (176 KB) | HTML Full-text | XML Full-text
Abstract
In this paper I discuss the question: what comes first, physics or information? The two have had a long-standing, symbiotic relationship for almost a hundred years out of which we have learnt a great deal. Information theory has enriched our interpretations of quantum physics, and, at the same time, offered us deep insights into general relativity through the study of black hole thermodynamics. Whatever the outcome of this debate, I argue that physicists will be able to benefit from continuing to explore connections between the two. Full article
(This article belongs to the Special Issue Information and Energy/Matter)
Open Access Article The World Within Wikipedia: An Ecology of Mind
Information 2012, 3(2), 229-255; doi:10.3390/info3020229
Received: 22 May 2012 / Revised: 11 June 2012 / Accepted: 12 June 2012 / Published: 18 June 2012
Cited by 1 | PDF Full-text (324 KB)
Abstract
Human beings inherit an informational culture transmitted through spoken and written language. A growing body of empirical work supports the mutual influence between language and categorization, suggesting that our cognitive-linguistic environment both reflects and shapes our understanding. By implication, artifacts that manifest this cognitive-linguistic environment, such as Wikipedia, should represent language structure and conceptual categorization in a way consistent with human behavior. We use this intuition to guide the construction of a computational cognitive model, situated in Wikipedia, that generates semantic association judgments. Our unsupervised model combines information at the language structure and conceptual categorization levels to achieve state-of-the-art correlation with human ratings on semantic association tasks including WordSimilarity-353, semantic feature production norms, word association, and false memory. Full article
(This article belongs to the Special Issue Cognition and Communication)
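The paper's own model is not reproduced here, but the evaluation protocol the abstract mentions (scoring a model's semantic similarities against human ratings, as in WordSimilarity-353) can be sketched with invented toy word vectors; the vectors, word pairs, and ratings below are hypothetical, not from the article:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den

def spearman(xs, ys):
    """Spearman rank correlation (assumes no tied values)."""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0] * len(vals)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical co-occurrence vectors and human similarity judgments.
vectors = {"cat": [3, 0, 1], "dog": [2, 1, 1], "car": [0, 3, 0]}
pairs = [("cat", "dog"), ("cat", "car"), ("dog", "car")]
human = [9.0, 2.0, 3.0]  # e.g., 0-10 similarity ratings

model = [cosine(vectors[a], vectors[b]) for a, b in pairs]
print(spearman(model, human))  # → 1.0 (model ranks the pairs as humans do)
```

Benchmarks such as WordSimilarity-353 report exactly this kind of rank correlation between model scores and averaged human judgments, which is why Spearman's rho is the usual headline number.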

Other

Jump to: Research

Open Access Book Review Mark Burgin’s Theory of Information
Information 2012, 3(2), 224-228; doi:10.3390/info3020224
Received: 12 April 2012 / Revised: 30 May 2012 / Accepted: 30 May 2012 / Published: 1 June 2012
Cited by 1 | PDF Full-text (37 KB) | HTML Full-text | XML Full-text
Abstract A review of a major, definitive source book on the foundations of information theory is presented. Full article
(This article belongs to the Section Information Theory and Methodology)

Journal Contact

MDPI AG
Information Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
information@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18