Special Issue "Multiuser Information Theory II"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 1 February 2020.

Special Issue Editor

Prof. Dr. S. Sandeep Pradhan
Guest Editor
Department of Electrical Engineering and Computer Science, University of Michigan, 1301 Beal Avenue, Ann Arbor, MI 48109, USA
Interests: distributed compression in sensor networks; multiple description source coding; multi-user channel coding; group codes for network communication; network capacity problems

Special Issue Information

Dear Colleagues,

The previous Special Issue, “Multi-User Information Theory” (2017), led to the publication of high-quality papers on several new research directions and received considerable attention from the community. In 2019, we continue this Special Issue.

Possible topics include, but are not limited to, the following:

  • non-asymptotic performance characterizations
  • multi-letter coding techniques
  • non-Shannon-type inequalities
  • multi-user quantum information and coding
  • applications of information theory to new emerging areas such as cybersecurity, privacy, distributed data storage, bioinformatics and learning
  • new source models based on big data
  • new channel models based on mmWave technology

The goal of this issue is to develop new bridges between information theory and other fields, such as abstract algebra, ergodic theory, quantum physics, theory of random graphs, theory of communication complexity, information geometry and additive combinatorics, and thereby contribute to furthering collaborations between researchers working in these communities.

Prof. Dr. S. Sandeep Pradhan
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (15 papers)


Research

Open Access Article
A Strong Converse Theorem for Hypothesis Testing Against Independence over a Two-Hop Network
Entropy 2019, 21(12), 1171; https://doi.org/10.3390/e21121171 - 29 Nov 2019
Abstract
By proving a strong converse theorem, we strengthen the weak converse result by Salehkalaibar, Wigger and Wang (2017) concerning hypothesis testing against independence over a two-hop network with communication constraints. Our proof follows by combining two recently proposed techniques for proving strong converse theorems, namely the strong converse technique via reverse hypercontractivity by Liu, van Handel, and Verdú (2017) and the strong converse technique by Tyagi and Watanabe (2018), in which the authors used a change-of-measure technique and replaced hard Markov constraints with soft information costs. The techniques used in our paper can also be applied to prove strong converse theorems for other multiterminal hypothesis testing against independence problems.
(This article belongs to the Special Issue Multiuser Information Theory II)
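As background for the two-hop result above (this benchmark is from the earlier single-hop literature, not a contribution of the paper): in the Ahlswede–Csiszár testing-against-independence problem, with a rate-$R$ noiseless link from the $X$-observer to the decision center observing $Y$, the optimal type-II error exponent is

```latex
\theta(R) \;=\; \max_{P_{U\mid X}\,:\; I(U;X)\le R} I(U;Y),
```

where $U$ is an auxiliary random variable. Strong converse results of the kind proved in the paper show that such an exponent cannot be exceeded even when the type-I error probability is only required to stay below a fixed $\epsilon < 1$.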

Open Access Article
On the Optimality of Interference Decoding Schemes for K-User Gaussian Interference Channels
Entropy 2019, 21(11), 1053; https://doi.org/10.3390/e21111053 - 28 Oct 2019
Abstract
The sum capacity of the general K-user Gaussian Interference Channel (GIC) is known only when the channel coefficients are such that treating interference as noise (TIN) is optimal. The Han-Kobayashi (HK) scheme is an extensively studied coding scheme for the K-user interference channel (IC). Simple HK schemes are HK schemes with Gaussian signaling, no time sharing and no private-common power splitting. The class of simple HK (S-HK) schemes includes the TIN scheme and schemes that involve various levels of interference decoding and cancellation at each receiver. For the 2-user GIC, simple HK schemes are sufficient to achieve all known sum capacity results—sum capacity under mixed, strong and noisy interference conditions. We derive channel conditions under which simple HK schemes achieve sum capacity for general K-user Gaussian ICs. For the K-user GIC, these results generalize existing sum capacity results for the TIN scheme to the class of simple HK schemes.
(This article belongs to the Special Issue Multiuser Information Theory II)
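For orientation (standard notation assumed here, not taken from the paper): with direct gains $h_{kk}$, cross gains $h_{jk}$, transmit powers $P_k$ and unit-variance noise, the TIN baseline that these results generalize achieves the sum rate

```latex
R_{\mathrm{TIN}} \;=\; \sum_{k=1}^{K} \frac{1}{2}\log\!\left(1+\frac{h_{kk}^{2}P_{k}}{1+\sum_{j\ne k} h_{jk}^{2}P_{j}}\right),
```

obtained by each receiver decoding only its own message and lumping all interference into the noise.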

Open Access Article
Secure Degrees of Freedom in Networks with User Misbehavior
Entropy 2019, 21(10), 945; https://doi.org/10.3390/e21100945 - 26 Sep 2019
Abstract
We investigate the secure degrees of freedom (s.d.o.f.) of three new channel models: broadcast channel with combating helpers, interference channel with selfish users, and multiple access wiretap channel with deviating users. The goal of introducing these channel models is to investigate various malicious interactions that arise in networks, including active adversaries. That is in contrast with the common assumption in the literature that the users follow a certain protocol altruistically and transmit both message-carrying and cooperative jamming signals in an optimum manner. In the first model, over a classical broadcast channel with confidential messages (BCCM), there are two helpers, each associated with one of the receivers. In the second model, over a classical interference channel with confidential messages (ICCM), there is a helper and users are selfish. By casting each problem as an extensive-form game and applying recursive real interference alignment, we show that, for the first model, the combating intentions of the helpers are neutralized and the full s.d.o.f. is retained; for the second model, selfishness precludes secure communication and no s.d.o.f. is achieved. In the third model, we consider the multiple access wiretap channel (MAC-WTC), where multiple legitimate users wish to have secure communication with a legitimate receiver in the presence of an eavesdropper. We consider the case when a subset of users deviate from the optimum protocol that attains the exact s.d.o.f. of this channel. We consider two kinds of deviation: when some of the users stop transmitting cooperative jamming signals, and when a user starts sending intentional jamming signals. For the first scenario, we investigate possible responses of the remaining users to counteract such deviation. For the second scenario, we use an extensive-form game formulation for the interactions of the deviating and well-behaving users. We prove that a deviating user can drive the s.d.o.f. to zero; however, the remaining users can exploit its intentional jamming signals as cooperative jamming signals against the eavesdropper and achieve an optimum s.d.o.f.
(This article belongs to the Special Issue Multiuser Information Theory II)

Open Access Article
Distributed Recovery of a Gaussian Source in Interference with Successive Lattice Processing
Entropy 2019, 21(9), 845; https://doi.org/10.3390/e21090845 - 30 Aug 2019
Cited by 1
Abstract
A scheme for recovery of a signal by distributed listeners in the presence of Gaussian interference is constructed by exhausting an “iterative power reduction” property. An upper bound for the scheme’s achieved mean-squared-error distortion is derived. The strategy exposes a parameter search problem, which, when solved, causes the scheme to outperform others of its kind. Performance of a blocklength-one scheme is simulated and is seen to improve over plain source coding without compression in the presence of many interferers, and experiences fewer outages over ensembles of channels.
(This article belongs to the Special Issue Multiuser Information Theory II)

Open Access Article
Structural Characteristics of Two-Sender Index Coding
Entropy 2019, 21(6), 615; https://doi.org/10.3390/e21060615 - 21 Jun 2019
Abstract
This paper studies index coding with two senders. In this setup, source messages are distributed among the senders possibly with common messages. In addition, there are multiple receivers, with each receiver having some messages a priori, known as side-information, and requesting one unique message such that each message is requested by only one receiver. Index coding in this setup is called two-sender unicast index coding (TSUIC). The main goal is to find the shortest aggregate normalized codelength, which is expressed as the optimal broadcast rate. In this work, firstly, for a given TSUIC problem, we form three independent sub-problems each consisting only of a subset of the messages, based on whether the messages are available only in one of the senders or in both senders. Then, we express the optimal broadcast rate of the TSUIC problem as a function of the optimal broadcast rates of those independent sub-problems. In this way, we discover the structural characteristics of TSUIC. For the proofs of our results, we utilize confusion graphs and coding techniques used in single-sender index coding. To adapt the confusion graph technique in TSUIC, we introduce a new graph-coloring approach that is different from normal graph coloring, which we call two-sender graph coloring, and propose a way of grouping the vertices to analyze the number of colors used. We further determine a class of TSUIC instances where a certain type of side-information can be removed without affecting their optimal broadcast rates. Finally, we generalize the results of a class of TSUIC problems to multiple senders.
(This article belongs to the Special Issue Multiuser Information Theory II)
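The saving that index coding offers can be seen in a toy single-sender instance (a generic textbook example, not an instance from the paper): three receivers each request one bit and hold the other two bits as side-information, so a single XOR broadcast serves all three requests.

```python
# Toy single-sender index coding instance: receiver i requests bit x[i]
# and already holds the other two bits as side-information.  Broadcasting
# the single XOR x[0] ^ x[1] ^ x[2] satisfies all three requests, so one
# coded transmission replaces three uncoded ones.
x = [1, 0, 1]                          # source bits
coded = x[0] ^ x[1] ^ x[2]             # the one broadcast symbol

def decode(coded_bit, side_bits):
    """XOR out the known side-information bits to recover the request."""
    out = coded_bit
    for bit in side_bits:
        out ^= bit
    return out

recovered = [decode(coded, [x[j] for j in range(3) if j != i])
             for i in range(3)]
print(recovered)   # each receiver recovers its requested bit
```

In the two-sender setting studied by the paper, the messages available for such XOR combinations are constrained by which sender holds which message, which is exactly what the sub-problem decomposition captures.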

Open Access Article
Exponential Strong Converse for One Helper Source Coding Problem
Entropy 2019, 21(6), 567; https://doi.org/10.3390/e21060567 - 5 Jun 2019
Cited by 1
Abstract
We consider the one helper source coding problem posed and investigated by Ahlswede, Körner and Wyner. Two correlated sources are separately encoded and are sent to a destination where the decoder wishes to decode one of the two sources with an arbitrarily small error probability of decoding. In this system, for rate pairs outside the achievable rate region, the error probability of decoding goes to one as the source block length n goes to infinity. This implies that we have a strong converse theorem for the one helper source coding problem. In this paper, we provide a much stronger version of this strong converse theorem for the one helper source coding problem. We prove that the error probability of decoding tends to one exponentially and derive an explicit lower bound of this exponent function.
(This article belongs to the Special Issue Multiuser Information Theory II)
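For reference, the region whose converse is sharpened here is the classical Ahlswede–Körner–Wyner region (standard notation, assumed here): with the main encoder observing $X$ and the helper observing $Y$, a rate pair $(R_1, R_2)$ is achievable if and only if

```latex
R_1 \ge H(X \mid U), \qquad R_2 \ge I(Y;U)
```

for some auxiliary random variable $U$ satisfying the Markov chain $U - Y - X$. The paper's exponential strong converse quantifies how fast the error probability tends to one outside this region.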

Open Access Article
Information Theoretic Security for Shannon Cipher System under Side-Channel Attacks
Entropy 2019, 21(5), 469; https://doi.org/10.3390/e21050469 - 5 May 2019
Cited by 2
Abstract
In this paper, we propose a new theoretical security model for Shannon cipher systems under side-channel attacks, where the adversary is not only allowed to collect ciphertexts by eavesdropping the public communication channel but is also allowed to collect the physical information leaked by the devices on which the cipher system is implemented, such as running time, power consumption, electromagnetic radiation, etc. Our model is very robust as it does not depend on the kind of physical information leaked by the devices. We also prove that in the case of one-time pad encryption, we can strengthen the secrecy/security of the cipher system by using an appropriate affine encoder. More precisely, we prove that for any distribution of the secret keys and any measurement device used for collecting the physical information, we can derive an achievable rate region for reliability and security such that if we compress the ciphertext using an affine encoder with a rate within the achievable rate region, then: (1) anyone with a secret key will be able to decrypt and decode the ciphertext correctly, but (2) any adversary who obtains the ciphertext and also the side physical information will not be able to obtain any information about the hidden source as long as the leaked physical information is encoded with a rate within the rate region. We derive our result by adapting the framework of the one helper source coding problem posed and investigated by Ahlswede and Körner (1975) and Wyner (1975). For reliability and security, we obtain our result by combining the result of Csiszár (1982) on universal coding for a single source using linear codes and the exponential strong converse theorem of Oohama (2015) for the one helper source coding problem.
(This article belongs to the Special Issue Multiuser Information Theory II)
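The one-time pad baseline that the paper's affine-encoder construction strengthens can be sketched in a few lines (a generic illustration, not the paper's scheme): with a uniformly random key as long as the message, the ciphertext is statistically independent of the plaintext, which is the perfect-secrecy guarantee the side-channel model then stresses.

```python
import secrets

# One-time pad on bytes: XOR with a uniformly random key of the same
# length.  Encryption and decryption are the same XOR operation.
def otp(data: bytes, key: bytes) -> bytes:
    assert len(key) == len(data), "key must be as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))   # fresh uniform key, used once
ct = otp(msg, key)
assert otp(ct, key) == msg            # the pad inverts itself
```

The paper's point is that this guarantee can fail once the adversary also sees physical leakage from the implementation, and that compressing the ciphertext with a suitable affine encoder restores security against such leakage.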

Open Access Article
Exponential Strong Converse for Successive Refinement with Causal Decoder Side Information
Entropy 2019, 21(4), 410; https://doi.org/10.3390/e21040410 - 17 Apr 2019
Abstract
We consider the k-user successive refinement problem with causal decoder side information and derive an exponential strong converse theorem. The rate-distortion region for the problem can be derived as a straightforward extension of the two-user case by Maor and Merhav (2008). We show that for any rate-distortion tuple outside the rate-distortion region of the k-user successive refinement problem with causal decoder side information, the joint excess-distortion probability approaches one exponentially fast. Our proof follows by judiciously adapting the recently proposed strong converse technique by Oohama using the information spectrum method, the variational form of the rate-distortion region and Hölder’s inequality. The lossy source coding problem with causal decoder side information considered by El Gamal and Weissman is a special case (k = 1) of the current problem. Therefore, the exponential strong converse theorem for the El Gamal and Weissman problem follows as a corollary of our result.
(This article belongs to the Special Issue Multiuser Information Theory II)

Open Access Article
A Monotone Path Proof of an Extremal Result for Long Markov Chains
Entropy 2019, 21(3), 276; https://doi.org/10.3390/e21030276 - 13 Mar 2019
Abstract
We prove an extremal result for long Markov chains based on the monotone path argument, generalizing an earlier work by Courtade and Jiao.
(This article belongs to the Special Issue Multiuser Information Theory II)

Open Access Article
MIMO Gaussian State-Dependent Channels with a State-Cognitive Helper
Entropy 2019, 21(3), 273; https://doi.org/10.3390/e21030273 - 12 Mar 2019
Abstract
We consider the problem of channel coding over multiterminal state-dependent channels in which neither transmitters nor receivers but only a helper node has a non-causal knowledge of the state. Such channel models arise in many emerging communication schemes. We start by investigating the parallel state-dependent channel with the same but differently scaled state corrupting the receivers. A cognitive helper knows the state in a non-causal manner and wishes to mitigate the interference that impacts the transmission between two transmit–receive pairs. Outer and inner bounds are derived. In our analysis, the channel parameters are partitioned into various cases, and segments on the capacity region boundary are characterized for each case. Furthermore, we show that for a particular set of channel parameters, the capacity region is entirely characterized. In the second part of this work, we address a similar scenario, but now each channel is corrupted by an independent state. We derive an inner bound using a coding scheme that integrates single-bin Gel’fand–Pinsker coding and Marton’s coding for the broadcast channel. We also derive an outer bound and further partition the channel parameters into several cases for which parts of the capacity region boundary are characterized.
(This article belongs to the Special Issue Multiuser Information Theory II)

Open Access Article
Universality of Logarithmic Loss in Successive Refinement
Entropy 2019, 21(2), 158; https://doi.org/10.3390/e21020158 - 8 Feb 2019
Cited by 2
Abstract
We establish a universal property of logarithmic loss in the successive refinement problem. If the first decoder operates under logarithmic loss, we show that any discrete memoryless source is successively refinable under an arbitrary distortion criterion for the second decoder. Based on this result, we propose a low-complexity lossy compression algorithm for any discrete memoryless source.
(This article belongs to the Special Issue Multiuser Information Theory II)
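For readers unfamiliar with the distortion measure: under logarithmic loss the reconstruction $\hat{x}$ is a probability distribution over the source alphabet rather than a single symbol, and the per-letter distortion is

```latex
d(x,\hat{x}) \;=\; \log \frac{1}{\hat{x}(x)},
```

so achieving low distortion forces the decoder to place high probability on the true source symbol. This "soft" reconstruction is what makes logarithmic loss at the first decoder compatible with an arbitrary distortion criterion at the second.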

Open Access Article
On Continuous-Time Gaussian Channels
Entropy 2019, 21(1), 67; https://doi.org/10.3390/e21010067 - 14 Jan 2019
Cited by 1
Abstract
A continuous-time white Gaussian channel can be formulated using a white Gaussian noise, and a conventional way for examining such a channel is the sampling approach based on the Shannon–Nyquist sampling theorem, where the original continuous-time channel is converted to an equivalent discrete-time channel, to which a great variety of established tools and methodology can be applied. However, one of the key issues of this scheme is that continuous-time feedback and memory cannot be incorporated into the channel model. It turns out that this issue can be circumvented by considering the Brownian motion formulation of a continuous-time white Gaussian channel. Nevertheless, as opposed to the white Gaussian noise formulation, a link that establishes the information-theoretic connection between a continuous-time channel under the Brownian motion formulation and its discrete-time counterparts has long been missing. This paper fills this gap by establishing causality-preserving connections between continuous-time Gaussian feedback/memory channels and their associated discrete-time versions in the forms of sampling and approximation theorems, which we believe will play important roles in the long run for further developing continuous-time information theory. As an immediate application of the approximation theorem, we propose the so-called approximation approach to examine continuous-time white Gaussian channels in the point-to-point or multi-user setting. It turns out that the approximation approach, complemented by relevant tools from stochastic calculus, can enhance our understanding of continuous-time Gaussian channels in terms of giving alternative and strengthened interpretation to some long-held folklore, recovering “long-known” results from new perspectives, and rigorously establishing new results predicted by the intuition that the approximation approach carries. More specifically, using the approximation approach complemented by relevant tools from stochastic calculus, we first derive the capacity regions of continuous-time white Gaussian multiple access channels and broadcast channels, and we then analyze how feedback affects their capacity regions: feedback will increase the capacity regions of some continuous-time white Gaussian broadcast channels and interference channels, while it will not increase the capacity regions of continuous-time white Gaussian multiple access channels.
(This article belongs to the Special Issue Multiuser Information Theory II)
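The two formulations contrasted in the abstract can be written side by side (standard notation, assumed here rather than taken from the paper): the white Gaussian noise form and the Brownian motion (integrated) form,

```latex
Y'(t) = X(t) + N(t) \qquad \text{versus} \qquad Y(t) = \int_0^t X(s)\,\mathrm{d}s + B(t), \quad 0 \le t \le T,
```

where $N$ is white Gaussian noise and $B$ is a standard Brownian motion. The latter is mathematically rigorous (white noise is not a bona fide stochastic process), and it is for this formulation that the paper establishes causality-preserving discrete-time sampling and approximation theorems.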
Open Access Article
Robust Signaling for Bursty Interference
Entropy 2018, 20(11), 870; https://doi.org/10.3390/e20110870 - 12 Nov 2018
Abstract
This paper studies a bursty interference channel, where the presence/absence of interference is modeled by a block-i.i.d. Bernoulli process that stays constant for a duration of T symbols (referred to as coherence block) and then changes independently to a new state. We consider both a quasi-static setup, where the interference state remains constant during the whole transmission of the codeword, and an ergodic setup, where a codeword spans several coherence blocks. For the quasi-static setup, we study the largest rate of a coding strategy that provides reliable communication at a basic rate and allows an increased (opportunistic) rate when there is no interference. For the ergodic setup, we study the largest achievable rate. We study how non-causal knowledge of the interference state, referred to as channel-state information (CSI), affects the achievable rates. We derive converse and achievability bounds for (i) local CSI at the receiver side only; (ii) local CSI at the transmitter and receiver side; and (iii) global CSI at all nodes. Our bounds allow us to identify when interference burstiness is beneficial and in which scenarios global CSI outperforms local CSI. The joint treatment of the quasi-static and ergodic setup further allows for a thorough comparison of these two setups.
(This article belongs to the Special Issue Multiuser Information Theory II)

Open Access Article
Distributed Joint Source-Channel Coding Using Quasi-Uniform Systematic Polar Codes
Entropy 2018, 20(10), 806; https://doi.org/10.3390/e20100806 - 22 Oct 2018
Cited by 2
Abstract
This paper proposes a distributed joint source-channel coding (DJSCC) scheme using polar-like codes. In the proposed scheme, each distributed source encodes its source message with a quasi-uniform systematic polar code (QSPC) or a punctured QSPC, and only transmits parity bits over its independent channel. These systematic codes play the role of both source compression and error protection. For infinite code length, we show that the proposed scheme approaches the information-theoretical limit by the technique of joint source-channel polarization with side information. For finite code lengths, the simulation results verify that the proposed scheme outperforms the distributed separate source-channel coding (DSSCC) scheme using polar codes and the DJSCC scheme using classic systematic polar codes.
(This article belongs to the Special Issue Multiuser Information Theory II)
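The polarization phenomenon underlying polar-like codes can be illustrated on a toy example (this generic binary erasure channel recursion is textbook material, not the paper's QSPC construction): one polarization step maps a BEC with erasure probability e to the synthetic pair (2e - e², e²), and recursing drives the synthetic channels toward being either noiseless or useless.

```python
# Channel polarization on a BEC(eps): each polarization step maps an
# erasure probability e to the pair (2*e - e**2, e**2).  After n steps
# there are 2**n synthetic channels whose erasure probabilities cluster
# near 0 (good channels, used for information bits) or 1 (bad channels,
# frozen).  The average erasure probability is preserved at every step.
def polarize(eps, n):
    channels = [eps]
    for _ in range(n):
        channels = [z for e in channels for z in (2 * e - e * e, e * e)]
    return channels

chans = polarize(0.5, 10)                         # 1024 synthetic channels
good = sum(1 for e in chans if e < 1e-3)          # nearly noiseless
bad = sum(1 for e in chans if e > 1 - 1e-3)       # nearly useless
print(len(chans), good, bad)
```

Since the fraction of good channels approaches the channel capacity as n grows, assigning information bits to good indices and freezing the rest is what lets polar constructions, including the systematic variants used in the paper, approach the information-theoretic limits.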

Open Access Article
Multilevel Diversity Coding with Secure Regeneration: Separate Coding Achieves the MBR Point
Entropy 2018, 20(10), 751; https://doi.org/10.3390/e20100751 - 30 Sep 2018
Cited by 1
Abstract
The problem of multilevel diversity coding with secure regeneration (MDC-SR) is considered, which includes the problems of multilevel diversity coding with regeneration (MDC-R) and secure regenerating code (SRC) as special cases. Two outer bounds are established, showing that separate coding can achieve the minimum-bandwidth-regeneration (MBR) point of the achievable normalized storage-capacity repair-bandwidth trade-off regions for the general MDC-SR problem. The core of the new converse results is an exchange lemma, which can be established using Han’s subset inequality.
(This article belongs to the Special Issue Multiuser Information Theory II)
