Open Access Article
Entropy 2017, 19(4), 143; doi:10.3390/e19040143

Paradigms of Cognition

Department of Mathematical Sciences, University of Copenhagen, Universitetsparken 5, 2100 Copenhagen, Denmark
This paper is an extended version of our paper published in the 36th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Ghent, Belgium, 10–15 July 2016.
Received: 19 December 2016 / Revised: 23 February 2017 / Accepted: 10 March 2017 / Published: 27 March 2017
(This article belongs to the Special Issue Selected Papers from MaxEnt 2016)

Abstract

An abstract, quantitative theory which connects elements of information, key ingredients of the cognitive process, is developed. Seemingly unrelated results are thereby unified. As an indication of this, consider results in classical probabilistic information theory involving information projections and so-called Pythagorean inequalities. These bear a certain resemblance to classical results in geometry associated with Pythagoras’ name. By appealing to the abstract theory presented here, one obtains a common point of reference for these results. In fact, the new theory provides a general framework for the treatment of a multitude of global optimization problems across a range of disciplines such as geometry, statistics and statistical physics. Several applications are given; among them, an “explanation” of Tsallis entropy is suggested. For this, as well as for the general development of the abstract underlying theory, emphasis is placed on interpretations and associated philosophical considerations. Technically, game theory is the key tool.
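For orientation only, here is a minimal sketch, in standard notation not taken from the paper, of two notions the abstract invokes: the Pythagorean inequality satisfied by information projections, and the Tsallis entropy whose “explanation” is discussed.

% Standard background formulas; P, Q, C and the index q are illustrative notation, not the paper's.
\[
  D(P \,\|\, Q) \;\geq\; D(P \,\|\, P^{*}) \;+\; D(P^{*} \,\|\, Q)
  \qquad \text{for all } P \in \mathcal{C},
\]
where $P^{*} = \arg\min_{P \in \mathcal{C}} D(P \,\|\, Q)$ is the information projection of $Q$ onto a convex set $\mathcal{C}$ and $D$ denotes the Kullback–Leibler divergence. The Tsallis entropy of order $q \neq 1$ is
\[
  S_{q}(P) \;=\; \frac{1}{q-1}\Bigl(1 - \sum_{i} p_{i}^{\,q}\Bigr),
\]
which recovers the Shannon entropy $-\sum_{i} p_{i} \log p_{i}$ in the limit $q \to 1$.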
Keywords: entropy; divergence; redundancy; information triples; proper effort functions; fundamental inequality; Jensen-Shannon divergence; core; Bregman construction; Tsallis entropy

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Topsøe, F. Paradigms of Cognition. Entropy 2017, 19, 143.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
