Table of Contents

Information, Volume 1, Issue 1 (September 2010), Pages 1-59

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view the papers in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open them.

Editorial

Open Access Editorial: Information – A New Open Access Scientific Journal on Information Science, Information Technology, Data, Knowledge and Communication
Information 2010, 1(1), 1-2; doi:10.3390/info1010001
Received: 16 June 2010 / Published: 23 June 2010
PDF Full-text (31 KB) | HTML Full-text | XML Full-text
Abstract
We plan to expand our Open Access publishing project to encompass additional fundamental areas in science and technology and to provide publication opportunities for scientists working in these areas. To achieve these goals, we are in the process of launching new journals. [...] Full article

Research

Open Access Article: Using Information Theory to Study Efficiency and Capacity of Computers and Similar Devices
Information 2010, 1(1), 3-12; doi:10.3390/info1010003
Received: 6 July 2010 / Revised: 22 July 2010 / Accepted: 5 August 2010 / Published: 12 August 2010
Cited by 2 | PDF Full-text (232 KB) | HTML Full-text | XML Full-text
Abstract
We address the problem of estimating the efficiency and capacity of computers. The main goal of our approach is to give a method for comparing the capacity of different computers, which can have different sets of instructions, different kinds of memory, a different number of cores (or processors), etc. We define the efficiency and capacity of computers and suggest a method for their estimation based on the analysis of processor instructions and their execution times. We show how the suggested method can be applied to estimate computer capacity; in particular, this gives a new look at the organization of a computer's memory. The obtained results can be of interest for practical applications. Full article
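As a purely illustrative aside (not taken from the article): the abstract above describes estimating a computer's capacity from its processor instructions and their execution times. A minimal sketch in that spirit is Shannon's classical capacity formula for a noiseless channel whose symbols have unequal durations, C = log2(X0) with X0 the root of sum_i X0^(-t_i) = 1, applied here to a hypothetical instruction set; whether this coincides with the authors' actual method is an assumption.

    # Hedged sketch, not the paper's method: capacity of a hypothetical instruction
    # set via Shannon's formula for a noiseless channel with symbols of unequal
    # durations: C = log2(X0), where X0 > 1 solves sum_i X0**(-t_i) = 1.
    import math

    def capacity_bits_per_cycle(instruction_times):
        """Return log2(X0) for the root X0 > 1 of sum(X**(-t)) = 1.

        Assumes at least two instructions, so that the root exceeds 1.
        """
        def f(x):
            return sum(x ** (-t) for t in instruction_times) - 1.0

        lo, hi = 1.0 + 1e-12, 2.0
        while f(hi) > 0.0:        # widen the bracket until the sign changes
            hi *= 2.0
        for _ in range(200):      # plain bisection; f is strictly decreasing for x > 1
            mid = (lo + hi) / 2.0
            if f(mid) > 0.0:
                lo = mid
            else:
                hi = mid
        return math.log2((lo + hi) / 2.0)

    # Hypothetical timings (clock cycles) for a five-instruction machine.
    times = [1, 1, 2, 3, 4]
    print(f"capacity ≈ {capacity_bits_per_cycle(times):.3f} bits per cycle")

Under this reading, a machine whose instructions all take one cycle would have capacity log2(number of instructions) bits per cycle, and slower instructions contribute correspondingly less.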

Open Access Article: New Information Measures for the Generalized Normal Distribution
Information 2010, 1(1), 13-27; doi:10.3390/info1010013
Received: 24 June 2010 / Revised: 4 August 2010 / Accepted: 5 August 2010 / Published: 20 August 2010
Cited by 2 | PDF Full-text (576 KB) | HTML Full-text | XML Full-text
Abstract
We introduce a three-parameter generalized normal distribution, which belongs to the Kotz-type distribution family, to study generalized entropy-type measures of information. For this generalized normal, the Kullback-Leibler information is evaluated; it extends the well-known result for the normal distribution and plays an important role in the introduced generalized information measure. These generalized entropy-type measures of information are also evaluated and presented. Full article
(This article belongs to the Special Issue What Is Information?)
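For orientation (not reproduced from the article): the "well-known result for the normal distribution" that the abstract above extends is the Kullback-Leibler divergence between two univariate normals, which in standard notation reads

    \[
      D_{\mathrm{KL}}\!\left(\mathcal{N}(\mu_1,\sigma_1^2)\,\middle\|\,\mathcal{N}(\mu_2,\sigma_2^2)\right)
        = \ln\frac{\sigma_2}{\sigma_1}
        + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2\sigma_2^2}
        - \frac{1}{2}.
    \]

The article's three-parameter generalization of this expression is given in the paper itself and is not restated here.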
Open Access Article: A Paradigm Shift in Biology?
Information 2010, 1(1), 28-59; doi:10.3390/info1010028
Received: 16 July 2010 / Accepted: 6 September 2010 / Published: 13 September 2010
Cited by 4 | PDF Full-text (908 KB) | HTML Full-text | XML Full-text
Abstract
All new developments in biology deal with the issue of the complexity of organisms, often pointing out the necessity to update our current understanding. However, it is impossible to think about a change of paradigm in biology without introducing new explanatory mechanisms. I shall introduce the mechanisms of teleonomy and teleology as viable explanatory tools. Teleonomy is the ability of organisms to build themselves through internal forces and processes (in the expression of the genetic program) and not external ones, implying a freedom relative to the exterior; however, the organism is able to integrate internal and external constraints in a process of co-adaptation. Teleology is the mechanism through which an organism exercises informational control over another system in order to establish an equivalence class and select specific information for its metabolic needs. Finally, I shall examine some interesting processes in phylogeny, ontogeny, and epigeny in which these two mechanisms are involved. Full article
(This article belongs to the Special Issue What Is Information?)

Journal Contact

MDPI AG
Information Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
information@mdpi.com
Tel.: +41 61 683 77 34
Fax: +41 61 302 89 18