Special Issue "Data Compression"
A special issue of Algorithms (ISSN 1999-4893).
Deadline for manuscript submissions: closed (30 September 2009).
Data compression is the operation of converting an input data file into a smaller file. This operation is important for two reasons:

1. People like to accumulate data. No matter how big a storage device is, sooner or later it fills up.
2. People hate to wait for data transfers. We constantly upload and download files, and long, slow transfers are frustrating.

How can data be compressed? The same amount of information can be represented in fewer bits because the original data representation is not the shortest possible; it is deliberately long in order to simplify processing. We say that data representations have redundancies. Data is compressed by locating these redundancies and reducing or eliminating them. The field of data compression therefore tries to understand the sources of redundancy in different types of data and to find clever methods of eliminating them. Today, after decades of research, there are hundreds of algorithms and dozens of implementations that can reduce the size of all types of digital data. It is my hope that this issue of Algorithms will make a significant contribution to this effort.
Dr. David Salomon
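To make the idea of redundancy elimination concrete, here is a minimal sketch of run-length encoding in Python, one of the simplest compression schemes: runs of repeated symbols are a redundancy, and replacing each run with a (symbol, count) pair removes it. The function names rle_encode and rle_decode are illustrative, not from any particular library, and the sketch ignores the practical question of how the pairs themselves are serialized into bits.

```python
# Run-length encoding (RLE): an illustrative sketch of redundancy
# elimination, not any specific published implementation.
from itertools import groupby

def rle_encode(data: str) -> list[tuple[str, int]]:
    """Collapse each run of repeated symbols into a (symbol, count) pair."""
    return [(symbol, len(list(run))) for symbol, run in groupby(data)]

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    """Expand (symbol, count) pairs back into the original string."""
    return "".join(symbol * count for symbol, count in pairs)

if __name__ == "__main__":
    original = "AAAABBBCCDAA"       # runs of repeated symbols are redundant
    encoded = rle_encode(original)  # [('A', 4), ('B', 3), ('C', 2), ('D', 1), ('A', 2)]
    assert rle_decode(encoded) == original
    print(encoded)
```

Real compressors go much further than this, for example by assigning shorter variable-length codes to more frequent symbols, but the principle is the same: identify the redundancy and encode around it.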
Keywords:
- data compression
- data coding
- source coding
- information theory
- data redundancy
- variable-length codes