This article is freely available.
Using Information Theory to Study Efficiency and Capacity of Computers and Similar Devices
Boris Ryabko
Institute of Computational Technologies, Siberian Branch of the Russian Academy of Sciences, and Siberian State University of Telecommunications and Informatics, Novosibirsk 630000, Russia
Received: 6 July 2010 / Revised: 22 July 2010 / Accepted: 5 August 2010 / Published: 12 August 2010
Abstract: We address the problem of estimating the efficiency and capacity of computers. The main goal of our approach is to provide a method for comparing the capacity of different computers, which may have different instruction sets, different kinds of memory, a different number of cores (or processors), etc. We define the efficiency and capacity of computers and suggest a method for estimating them based on the analysis of processor instructions and their execution times. We show how the suggested method can be applied to estimate computer capacity. In particular, this analysis gives a new look at the organization of computer memory. The obtained results may be of interest for practical applications.
Keywords: computer capacity; computer efficiency; information theory; Shannon entropy; channel capacity
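The abstract's central idea, treating a computer as a discrete noiseless channel whose "symbols" are processor instructions with given execution times, can be sketched numerically. By Shannon's capacity formula for such a channel, the capacity equals log2 of the largest real root X of the characteristic equation sum_i X^(-tau_i) = 1, where tau_i is the execution time of instruction i. The minimal sketch below solves that equation by bisection; the instruction set and its timings are purely hypothetical, chosen only to illustrate the computation.

```python
import math

def channel_capacity(durations):
    """Capacity (bits per time unit) of a discrete noiseless channel
    whose symbols have the given integer execution times.

    Solves sum_i x**(-tau_i) = 1 for its largest real root x0 > 0
    and returns log2(x0), following Shannon's formula."""
    # f(x) = sum x**(-tau) - 1 is strictly decreasing for x > 0,
    # so the root can be bracketed and found by bisection.
    f = lambda x: sum(x ** -t for t in durations) - 1
    lo, hi = 1e-9, float(len(durations)) + 1.0  # f(lo) > 0 > f(hi) when all tau >= 1
    for _ in range(200):
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.log2(hi)

# Hypothetical instruction set: four instructions taking 1, 2, 2 and 3
# time units respectively.
print(channel_capacity([1, 2, 2, 3]))
```

As a sanity check, a device with two instructions of unit duration has capacity log2(2) = 1 bit per time unit; slower or fewer instructions lower the root of the characteristic equation and hence the capacity.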
Cite This Article
MDPI and ACS Style
Ryabko, B. Using Information Theory to Study Efficiency and Capacity of Computers and Similar Devices. Information 2010, 1, 3-12.