Distributed Hypothesis Testing with Privacy Constraints
Abstract

We revisit the distributed hypothesis testing (or hypothesis testing with communication constraints) problem from the viewpoint of privacy. Instead of observing the raw data directly, the transmitter observes a sanitized or randomized version of it. We impose an upper bound on the mutual information between the raw and randomized data. Under this scenario, the receiver, which is also provided with side information, is required to make a decision on whether the null or alternative hypothesis is in effect. We first provide a general lower bound on the type-II exponent for an arbitrary pair of hypotheses. Next, we show that if the distribution under the alternative hypothesis is the product of the marginals of the distribution under the null (i.e., testing against independence), then the exponent is known exactly. Moreover, we show that the strong converse property holds. Using ideas from Euclidean information theory, we also provide an approximate expression for the exponent when the communication rate is low and the privacy level is high. Finally, we illustrate our results with a binary and a Gaussian example.
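The privacy constraint described above bounds the mutual information between the raw data and its sanitized version. As a minimal illustrative sketch (not the paper's construction), the snippet below computes this mutual information for a hypothetical binary setting in which X ~ Bernoulli(p) is randomized by a binary symmetric channel with crossover probability delta; pushing delta toward 1/2 drives the leaked information toward zero, i.e., stronger privacy.

```python
import math

def h2(p):
    """Binary entropy (in bits) of a Bernoulli(p) source."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information_bsc(p, delta):
    """I(X; X_tilde) when X ~ Bern(p) is randomized by a BSC(delta).

    I(X; X_tilde) = H(X_tilde) - H(X_tilde | X) = h2(q) - h2(delta),
    where q = P(X_tilde = 1). (Illustrative sanitization mechanism,
    not the scheme from the paper.)
    """
    q = p * (1 - delta) + (1 - p) * delta
    return h2(q) - h2(delta)

# More randomization (delta -> 1/2) means less leaked information,
# i.e., a tighter effective privacy level.
for delta in (0.0, 0.1, 0.25, 0.5):
    print(f"delta={delta:.2f}: I(X; X_tilde) = "
          f"{mutual_information_bsc(0.5, delta):.4f} bits")
```

For a uniform source (p = 1/2) the leakage simplifies to 1 - h2(delta) bits, so a privacy budget on I(X; X_tilde) translates directly into a minimum randomization level delta.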
Cite This Article
Gilani, A.; Belhadj Amor, S.; Salehkalaibar, S.; Tan, V.Y.F. Distributed Hypothesis Testing with Privacy Constraints. Entropy 2019, 21, 478.