Open Access Article

Distributed Hypothesis Testing with Privacy Constraints

1 Department of Electrical and Computer Engineering, College of Engineering, University of Tehran, Tehran 14171614418, Iran
2 Department of Electrical and Computer Engineering, National University of Singapore, Singapore 117583, Singapore
* Author to whom correspondence should be addressed.
Entropy 2019, 21(5), 478; https://doi.org/10.3390/e21050478
Received: 6 February 2019 / Revised: 8 March 2019 / Accepted: 20 March 2019 / Published: 7 May 2019
We revisit the distributed hypothesis testing (or hypothesis testing with communication constraints) problem from the viewpoint of privacy. Instead of observing the raw data directly, the transmitter observes a sanitized or randomized version of it. We impose an upper bound on the mutual information between the raw and randomized data. Under this scenario, the receiver, which is also provided with side information, is required to make a decision on whether the null or alternative hypothesis is in effect. We first provide a general lower bound on the type-II exponent for an arbitrary pair of hypotheses. Next, we show that if the distribution under the alternative hypothesis is the product of the marginals of the distribution under the null (i.e., testing against independence), then the exponent is known exactly. Moreover, we show that the strong converse property holds. Using ideas from Euclidean information theory, we also provide an approximate expression for the exponent when the communication rate is low and the privacy level is high. Finally, we illustrate our results with a binary and a Gaussian example.
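The privacy constraint described above bounds the mutual information between the raw observation and its sanitized version. The following is a minimal numerical sketch of that constraint for a binary source passed through a binary symmetric randomizer; the mechanism, function names, and parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def h2(p):
    # Binary entropy in bits, with the convention 0*log2(0) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def sanitized_mi(p, delta):
    # Mutual information I(X; X_bar) in bits when X ~ Bernoulli(p) is
    # sanitized as X_bar = X XOR N with N ~ Bernoulli(delta), i.e. a
    # binary symmetric randomization (an assumed example mechanism).
    q = p * (1 - delta) + (1 - p) * delta  # P(X_bar = 1)
    return h2(q) - h2(delta)
```

Increasing the crossover probability `delta` toward 1/2 drives `I(X; X_bar)` to zero (maximum privacy, no utility), while `delta = 0` leaks the source entropy `h2(p)` entirely; the privacy level in the problem corresponds to an upper bound on this quantity.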
Keywords: hypothesis testing; privacy; mutual information; testing against independence; zero-rate communication
Figure 1
MDPI and ACS Style

Gilani, A.; Belhadj Amor, S.; Salehkalaibar, S.; Tan, V.Y.F. Distributed Hypothesis Testing with Privacy Constraints. Entropy 2019, 21, 478.

