Information 2010, 1(1), 3-12; doi:10.3390/info1010003
Article

Using Information Theory to Study Efficiency and Capacity of Computers and Similar Devices

Boris Ryabko
Received: 6 July 2010; in revised form: 22 July 2010 / Accepted: 5 August 2010 / Published: 12 August 2010
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract: We address the problem of estimating the efficiency and capacity of computers. The main goal of our approach is to give a method for comparing the capacities of different computers, which may have different instruction sets, different kinds of memory, different numbers of cores (or processors), etc. We define the efficiency and capacity of a computer and suggest a method for estimating them, based on an analysis of processor instructions and their execution times. We show how the suggested method can be applied to estimate computer capacity; in particular, this analysis offers a new view of the organization of computer memory. The results obtained may be of interest for practical applications.
Keywords: computer capacity; computer efficiency; information theory; Shannon entropy; channel capacity
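One natural way to make the abstract's idea concrete, suggested by the keywords above, is Shannon's capacity of a discrete noiseless channel: if a computer admits instructions x_1, ..., x_n with execution times tau(x_1), ..., tau(x_n), and every sequence of instructions is admissible, Shannon's classical result gives the capacity as log_2 X_0, where X_0 is the largest real root of the characteristic equation X^{-tau(x_1)} + ... + X^{-tau(x_n)} = 1. The following Python sketch solves this equation by bisection; it is illustrative only, not the paper's code, and the toy instruction timings and the function name computer_capacity are hypothetical.

import math
from typing import Iterable

def computer_capacity(exec_times: Iterable[float]) -> float:
    """Capacity in bits per unit time for a hypothetical instruction set.

    Solves Shannon's characteristic equation
        sum(X ** -t for t in exec_times) == 1
    for its largest real root X0 >= 1 by bisection and returns log2(X0).
    """
    times = list(exec_times)
    if not times or any(t <= 0 for t in times):
        raise ValueError("execution times must be positive")

    def f(x: float) -> float:
        # Strictly decreasing in x for x > 0, so bisection applies.
        return sum(x ** -t for t in times) - 1.0

    lo, hi = 1.0, 2.0      # f(1) = n - 1 >= 0 brackets the root from below
    while f(hi) > 0.0:     # expand until the root is bracketed from above
        hi *= 2.0
    for _ in range(100):   # bisection
        mid = (lo + hi) / 2.0
        if f(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return math.log2((lo + hi) / 2.0)

# Toy example: five instructions taking 1, 1, 2, 2 and 4 clock cycles.
# Two unit-time instructions alone would give exactly 1 bit per cycle.
print(computer_capacity([1, 1, 2, 2, 4]))  # ~1.47 bits per cycle

Under this formula, adding instructions always increases the capacity, while slower instructions contribute less; comparing two computers then reduces to comparing the largest roots of their characteristic equations.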
Information EISSN 2078-2489. Published by MDPI AG, Basel, Switzerland.