
Open Access Article

Distributed Hypothesis Testing with Privacy Constraints

1. Department of Electrical and Computer Engineering, College of Engineering, University of Tehran, Tehran 14171614418, Iran
2. Department of Electrical and Computer Engineering, National University of Singapore, Singapore 117583, Singapore
* Author to whom correspondence should be addressed.
Entropy 2019, 21(5), 478; https://doi.org/10.3390/e21050478
Received: 6 February 2019 / Revised: 8 March 2019 / Accepted: 20 March 2019 / Published: 7 May 2019
PDF [432 KB, uploaded 14 May 2019]

Abstract

We revisit the distributed hypothesis testing (or hypothesis testing with communication constraints) problem from the viewpoint of privacy. Instead of observing the raw data directly, the transmitter observes a sanitized or randomized version of it. We impose an upper bound on the mutual information between the raw and randomized data. Under this scenario, the receiver, which is also provided with side information, is required to make a decision on whether the null or alternative hypothesis is in effect. We first provide a general lower bound on the type-II exponent for an arbitrary pair of hypotheses. Next, we show that if the distribution under the alternative hypothesis is the product of the marginals of the distribution under the null (i.e., testing against independence), then the exponent is known exactly. Moreover, we show that the strong converse property holds. Using ideas from Euclidean information theory, we also provide an approximate expression for the exponent when the communication rate is low and the privacy level is high. Finally, we illustrate our results with a binary and a Gaussian example.
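To make the privacy constraint concrete, the following minimal sketch (not the paper's construction; a generic illustration) considers a binary source sanitized by a binary symmetric channel with crossover probability delta, and numerically finds the least randomization that drives the mutual information I(X; X~) between the raw and randomized data below a given privacy budget T. All names (`h2`, `mi_bsc`, `T`) are hypothetical.

```python
import math

def h2(p):
    # Binary entropy in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mi_bsc(p, delta):
    # I(X; X~) when X ~ Bernoulli(p) is passed through a binary
    # symmetric channel with crossover probability delta.
    q = p * (1 - delta) + (1 - p) * delta   # P(X~ = 1)
    return h2(q) - h2(delta)

# Privacy constraint: I(X; X~) <= T. Larger delta (more randomization)
# means a smaller mutual information, i.e., stronger privacy.
p, T = 0.5, 0.1
delta = 0.0
while mi_bsc(p, delta) > T:
    delta += 1e-4
print(f"smallest crossover meeting I(X; X~) <= {T}: delta = {delta:.4f}")
```

For a uniform source, I(X; X~) = 1 - h2(delta), so the sweep stops near delta = 0.316; at delta = 0.5 the output is independent of the input and the mutual information (and hence any leakage) vanishes, at the cost of the transmitter's observation becoming useless for testing.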
Keywords: hypothesis testing; privacy; mutual information; testing against independence; zero-rate communication
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Share & Cite This Article

MDPI and ACS Style

Gilani, A.; Belhadj Amor, S.; Salehkalaibar, S.; Tan, V.Y.F. Distributed Hypothesis Testing with Privacy Constraints. Entropy 2019, 21, 478.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.