Open Access Article

Finite-Length Analyses for Source and Channel Coding on Markov Chains

by Masahito Hayashi 1,2,3,4,*,‡ and Shun Watanabe 5,‡
1 Shenzhen Institute for Quantum Science and Engineering, Southern University of Science and Technology, Shenzhen 518055, China
2 Graduate School of Mathematics, Nagoya University, Nagoya 464-8602, Japan
3 Center for Quantum Computing, Peng Cheng Laboratory, Shenzhen 518000, China
4 Centre for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117542, Singapore
5 Department of Computer and Information Sciences, Tokyo University of Agriculture and Technology, Koganei-shi, Tokyo 184-8588, Japan
* Author to whom correspondence should be addressed.
This paper is an extended version of the conference paper presented at the 51st Allerton Conference and 2014 Information Theory and Applications Workshop, San Diego, CA, USA, 9–14 February 2014.
‡ These authors contributed equally to this work.
Entropy 2020, 22(4), 460
Received: 9 March 2020 / Revised: 2 April 2020 / Accepted: 4 April 2020 / Published: 18 April 2020
(This article belongs to the Special Issue Finite-Length Information Theory)
Abstract: We derive finite-length bounds for two problems with Markov chains: source coding with side-information where the source and side-information are a joint Markov chain and channel coding for channels with Markovian conditional additive noise. For this purpose, we point out two important aspects of finite-length analysis that must be argued when finite-length bounds are proposed. The first is the asymptotic tightness, and the other is the efficient computability of the bound. Then, we derive finite-length upper and lower bounds for the coding length in both settings such that their computational complexity is low. We argue the first of the above-mentioned aspects by deriving the large deviation bounds, the moderate deviation bounds, and second-order bounds for these two topics and show that these finite-length bounds achieve the asymptotic optimality in these senses. Several kinds of information measures for transition matrices are introduced for the purpose of this discussion.
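To make the second-order notion concrete, the sketch below computes the entropy rate of a small Markov source and the Gaussian (second-order) approximation of the optimal coding length, n·H + sqrt(n·V)·Φ⁻¹(ε). This is an illustration, not the paper's bounds: the transition matrix is arbitrary, and the variance V is the simple per-transition variance of the self-information, which omits the covariance terms that the true Markov dispersion includes.

```python
import numpy as np
from statistics import NormalDist

# Two-state Markov transition matrix (rows sum to 1); illustrative values.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

# Entropy rate H = sum_i pi_i * sum_j P_ij * log2(1 / P_ij).
H = -sum(pi[i] * P[i, j] * np.log2(P[i, j])
         for i in range(2) for j in range(2) if P[i, j] > 0)

# Simplified variance of the self-information -log2 P(X_{t+1} | X_t);
# the exact Markov dispersion also has covariance terms, omitted here.
V = sum(pi[i] * P[i, j] * (-np.log2(P[i, j]) - H) ** 2
        for i in range(2) for j in range(2) if P[i, j] > 0)

# Second-order (Gaussian) approximation of the optimal coding length
# at block length n and error probability eps.
n, eps = 10_000, 1e-3
approx_len = n * H + np.sqrt(n * V) * NormalDist().inv_cdf(eps)
print(f"entropy rate H = {H:.4f} bits/symbol")
print(f"second-order length approx = {approx_len:.1f} bits")
```

Because Φ⁻¹(ε) is negative for ε < 1/2, the approximation sits slightly below n·H, reflecting the finite-length gain over the first-order rate.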
Keywords: channel coding; Markov chain; finite-length analysis; source coding

Figure 1

MDPI and ACS Style

Hayashi, M.; Watanabe, S. Finite-Length Analyses for Source and Channel Coding on Markov Chains. Entropy 2020, 22, 460.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

