# Information Extraction Under Privacy Constraints

## Abstract


## 1. Introduction

#### 1.1. Our Model and Main Contributions

- We study lower and upper bounds on ${g}_{\epsilon}(X;Y)$. The lower bound, in particular, establishes a multiplicative bound on $I(Y;Z)$ for any optimal privacy filter. Specifically, we show that for a given $(X,Y)$ and $\epsilon >0$ there exists a channel $Q:Y\to Z$ such that $I(X;Z)\le \epsilon$ and$$I(Y;Z)\ge \lambda (X;Y)\,\epsilon$$
- We propose an information-theoretic setting in which ${g}_{\epsilon}(X;Y)$ appears as a natural upper bound for the achievable rate in the so-called "dependence dilution" coding problem. Specifically, we examine the joint-encoder version of an amplification-masking tradeoff, a setting recently introduced by Courtade [26], and we show that the dual of ${g}_{\epsilon}(X;Y)$ upper bounds the masking rate. We also present an estimation-theoretic motivation for the privacy measure ${\rho}_{m}^{2}(X;Z)\le \epsilon$. In fact, by imposing ${\rho}_{m}^{2}(X;Z)\le \epsilon$, we require that an adversary who observes Z cannot efficiently estimate $f\left(X\right)$ for any function f. This is reminiscent of semantic security [27] in the cryptography community: an encryption mechanism is said to be semantically secure if the adversary's advantage in correctly guessing any function of the private data, given an observation of the mechanism's output (i.e., the ciphertext), is negligible. This justifies the use of maximal correlation as a measure of privacy. The use of mutual information as a privacy measure can also be justified via Fano's inequality: $I(X;Z)\le \epsilon$ can be shown to imply$$\Pr(\widehat{X}(Z)\ne X)\ge \frac{H(X)-\epsilon -1}{\log |\mathcal{X}|},$$and hence the adversary's probability of correctly guessing X is bounded away from one.
- We also study the rate of increase ${g}_{0}^{\prime}(X;Y)$ of ${g}_{\epsilon}(X;Y)$ at $\epsilon =0$ and show that this rate can characterize the behavior of ${g}_{\epsilon}(X;Y)$ for any $\epsilon \ge 0$ provided that ${g}_{0}(X;Y)=0$. This again has connections with the results of [25]. Letting$$\Gamma (R):=\max_{\substack{P_{Z|Y}:\; X\,-\!\circ\!-\,Y\,-\!\circ\!-\,Z \\ I(Y;Z)\le R}} I(X;Z)$$
- Finally, we generalize the rate-privacy function to the continuous case where X and Y are both continuous and show that some of the properties of ${g}_{\epsilon}(X;Y)$ in the discrete case do not carry over to the continuous case. In particular, we assume that the privacy filter belongs to a family of additive noise channels followed by an M-level uniform scalar quantizer and give asymptotic bounds as $M\to \infty $ for the rate-privacy function.
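The filter family mentioned in the last bullet (additive noise followed by an M-level uniform scalar quantizer) can be sketched concretely. The following is an illustrative implementation, not the paper's construction: the Gaussian noise, the quantizer range, and the helper name `additive_noise_quantizer_filter` are our own choices for demonstration.

```python
import numpy as np

def additive_noise_quantizer_filter(y, noise_std, M, lo, hi, seed=None):
    """Privacy filter sketch: add Gaussian noise to the observation Y,
    then apply an M-level uniform scalar quantizer on [lo, hi]."""
    rng = np.random.default_rng(seed)
    noisy = y + rng.normal(0.0, noise_std, size=np.shape(y))
    # Uniform quantizer: clip to the range, map each value to one of
    # M equal-width cells, and output the cell midpoint.
    width = (hi - lo) / M
    idx = np.clip(np.floor((np.clip(noisy, lo, hi - 1e-12) - lo) / width), 0, M - 1)
    return lo + (idx + 0.5) * width

# Example: filter 1000 samples of a standard Gaussian Y with M = 8 levels.
rng = np.random.default_rng(0)
y = rng.normal(size=1000)
z = additive_noise_quantizer_filter(y, noise_std=0.5, M=8, lo=-4.0, hi=4.0, seed=1)
```

The released variable Z takes at most M distinct values, so its entropy (and hence the utility rate) is bounded by $\log M$; the asymptotic analysis in the paper studies the regime $M\to \infty$.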

#### 1.2. Organization

## 2. Utility-Privacy Measures: Definitions and Properties

#### 2.1. Mutual Information as Privacy Measure

**Lemma 1.** For a given joint distribution P defined over $\mathcal{X}\times \mathcal{Y}$, the mapping $\epsilon \mapsto \frac{{g}_{\epsilon}(X;Y)}{\epsilon}$ is non-increasing on $\epsilon \in (0,\infty )$ and ${g}_{\epsilon}(X;Y)$ lies between two straight lines as follows:
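Since ${g}_{\epsilon}(X;Y)$ is a maximum of $I(Y;Z)$ over filters ${P}_{Z|Y}$ satisfying $I(X;Z)\le \epsilon$, it can be explored numerically for small alphabets. The sketch below is an illustrative randomized lower estimate (random search over filters), not the paper's method; the helper names and the doubly symmetric binary example are our own.

```python
import numpy as np

def mutual_information(p_joint):
    """I(A;B) in bits for a joint pmf given as a 2-D array."""
    pa = p_joint.sum(axis=1, keepdims=True)
    pb = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0
    return float((p_joint[mask] * np.log2(p_joint[mask] / (pa @ pb)[mask])).sum())

def g_eps_lower_bound(p_xy, eps, n_filters=2000, n_z=3, seed=0):
    """Randomized lower estimate of g_eps(X;Y): sample random filters
    P_{Z|Y}, keep those with I(X;Z) <= eps, and record the best I(Y;Z)."""
    rng = np.random.default_rng(seed)
    n_y = p_xy.shape[1]
    best = 0.0
    for _ in range(n_filters):
        p_z_given_y = rng.dirichlet(np.ones(n_z), size=n_y)  # rows: P_Z|Y(.|y)
        p_yz = p_xy.sum(axis=0)[:, None] * p_z_given_y       # joint pmf of (Y, Z)
        p_xz = p_xy @ p_z_given_y                            # joint pmf of (X, Z)
        if mutual_information(p_xz) <= eps:
            best = max(best, mutual_information(p_yz))
    return best

# Example: X ~ Bernoulli(1/2) observed through a BSC(0.1), privacy level eps = 0.1.
p_xy = np.array([[0.45, 0.05], [0.05, 0.45]])
lb = g_eps_lower_bound(p_xy, eps=0.1)
```

Any feasible filter yields a lower bound on ${g}_{\epsilon}(X;Y)$, which is how the estimate above should be read; the exact value requires solving the optimization.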

**Lemma 2.**

**Proof.**

**Remark 1.**

**Corollary 3.**

**Proof.**

**Remark 2.**

#### 2.2. Maximal Correlation as Privacy Measure

**Definition 4.** Given random variables X and Y, the maximal correlation ${\rho}_{m}(X;Y)$ is defined as follows (recall that the correlation coefficient between U and V is defined as $\rho (U;V):=\frac{\operatorname{cov}(U;V)}{{\sigma}_{U}{\sigma}_{V}}$, where $\operatorname{cov}(U;V)$, ${\sigma}_{U}$, and ${\sigma}_{V}$ are the covariance between U and V and the standard deviations of U and V, respectively):
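For finite alphabets, maximal correlation admits a well-known spectral characterization (not restated in this extract): ${\rho}_{m}(X;Y)$ equals the second-largest singular value of the matrix with entries $P(x,y)/\sqrt{P(x)P(y)}$. The sketch below uses that characterization; the helper name is ours.

```python
import numpy as np

def maximal_correlation(p_xy):
    """rho_m(X;Y) for a finite-alphabet joint pmf, via the standard
    characterization as the second-largest singular value of
    Q(x, y) = P(x, y) / sqrt(P(x) P(y))."""
    px = p_xy.sum(axis=1)
    py = p_xy.sum(axis=0)
    q = p_xy / np.sqrt(np.outer(px, py))
    s = np.linalg.svd(q, compute_uv=False)  # sorted descending; s[0] == 1
    return float(s[1]) if len(s) > 1 else 0.0

# Doubly symmetric binary source: uniform X through a BSC(a) gives
# rho_m(X;Y) = |1 - 2a|; here a = 0.1, so rho_m = 0.8.
p_xy = np.array([[0.45, 0.05], [0.05, 0.45]])
rho = maximal_correlation(p_xy)
```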

**Lemma 5.**

**Proof.**

**Lemma 6.**

**Proof.**

**Corollary 7.**

**Proof.**

**Lemma 8.**

**Proof.**

#### 2.3. Non-Trivial Filters For Perfect Privacy

**Definition 9.** The random variable X is said to be weakly independent of Y if the rows of the transition matrix ${P}_{X|Y}$, i.e., the set of vectors $\{{P}_{X|Y}(\cdot |y),\ y\in \mathcal{Y}\}$, are linearly dependent.

**Lemma 10.**

**Proof.**

**Remark 3.**

**Remark 4.**

**Corollary 11.**

## 3. Operational Interpretations of the Rate-Privacy Function

#### 3.1. Dependence Dilution

**Theorem 12.**

**Lemma 13.** Given a pair of random variables $(U,V)$ defined over $\mathcal{U}\times \mathcal{V}$ for finite $\mathcal{V}$ and arbitrary $\mathcal{U}$, any list decoder $g:\mathcal{U}\to {2}^{\mathcal{V}}$, $U\mapsto g\left(U\right)$, of fixed list size m (i.e., $|g(u)|=m,\ \forall u\in \mathcal{U}$) satisfies
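The lemma's inequality itself is displayed in the paper; the list-decoding setup it concerns can be sketched as follows. This illustration (our own helper and toy distribution) computes the error probability of the optimal fixed-size-m list decoder, which lists the m most likely values of V for each observation u.

```python
import numpy as np

def list_decode_error(p_uv, m):
    """Error probability Pr(V not in g(U)) of the optimal list decoder
    of fixed list size m: for each u, list the m most likely values of V."""
    p_u = p_uv.sum(axis=1, keepdims=True)
    p_v_given_u = np.divide(p_uv, p_u, out=np.zeros_like(p_uv), where=p_u > 0)
    # For each u, the success probability is the mass of the top-m values of V.
    top_m = np.sort(p_v_given_u, axis=1)[:, -m:]
    correct = float((p_u[:, 0] * top_m.sum(axis=1)).sum())
    return 1.0 - correct

# Toy joint pmf over |U| = 2, |V| = 3.
p_uv = np.array([[0.3, 0.1, 0.1], [0.05, 0.4, 0.05]])
err1 = list_decode_error(p_uv, m=1)  # ordinary MAP decoding
err2 = list_decode_error(p_uv, m=2)  # larger lists can only help
```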

**Lemma 14.**

**Proof.**

**Proof of Theorem 12.**

**Corollary 15.**

**Remark 5.**

#### 3.2. MMSE Estimation of Functions of Private Information

**Definition 16.**

**Definition 17.** A joint distribution ${P}_{UV}$ satisfies a Poincaré inequality with constant $c\le 1$ if for all $f:\mathcal{U}\to \mathbb{R}$

**Theorem 18.**

## 4. Observation Channels for Minimal and Maximal ${\mathbf{g}}_{\epsilon}(\mathbf{X};\mathbf{Y})$

#### 4.1. Conditions for Minimal ${g}_{\epsilon}(X;Y)$

**Lemma 19.**

**Proof.**

**Remark 6.**

**Lemma 20.**

**Proof.**

**Theorem 21.**

**Proof.**

**Corollary 22.**

- (i) Y is uniformly distributed;
- (ii) $D\left({P}_{X|Y}(\cdot |y)\,\|\,{P}_{X}(\cdot )\right)$ is constant for all $y\in \mathcal{Y}$.
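Both conditions are directly checkable for a given joint distribution. As a sanity check (our own helpers and example, not from the paper), a uniform input observed through a binary symmetric channel satisfies them by symmetry:

```python
import numpy as np

def kl_bits(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    mask = p > 0
    return float((p[mask] * np.log2(p[mask] / q[mask])).sum())

def check_conditions(p_xy):
    """Check (i) Y uniform and (ii) D(P_{X|Y}(.|y) || P_X) constant in y."""
    px = p_xy.sum(axis=1)
    py = p_xy.sum(axis=0)
    uniform = bool(np.allclose(py, 1.0 / len(py)))
    divs = [kl_bits(p_xy[:, j] / py[j], px) for j in range(len(py))]
    constant = bool(np.allclose(divs, divs[0]))
    return uniform, constant

# Uniform X through a BSC(0.1): both (i) and (ii) hold by symmetry.
p_xy = np.array([[0.45, 0.05], [0.05, 0.45]])
uniform, constant = check_conditions(p_xy)
```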

**Proof.**

**Example 1.**

**Example 2.**

#### 4.2. Special Observation Channels

#### 4.2.1. Observation Channels With Symmetric Reverse

**Lemma 23.**

**Proof.**

**Corollary 24.**

**Theorem 25.**

- (i) ${g}_{\epsilon}(X;Y)=\epsilon \frac{H\left(Y\right)}{I(X;Y)}$ for $0\le \epsilon \le I(X;Y)$;
- (ii) the initial efficiency of privacy-constrained information extraction is$$ {g}_{0}^{\prime}(X;Y)=\frac{-\log {P}_{Y}(y)}{D\left({P}_{X|Y}(\cdot |y)\,\|\,{P}_{X}(\cdot )\right)},\quad \forall y\in \mathcal{Y}.$$
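The two parts are consistent: on this channel family the slope $H(Y)/I(X;Y)$ of part (i) should match the per-y expression of part (ii) for every y. The sketch below verifies this numerically for uniform X through a BSC(0.1), whose reverse channel is symmetric; the example and helper names are ours, and logarithms are taken in bits (the ratio is base-independent).

```python
import numpy as np

def entropy_bits(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def kl_bits(p, q):
    mask = p > 0
    return float((p[mask] * np.log2(p[mask] / q[mask])).sum())

p_xy = np.array([[0.45, 0.05], [0.05, 0.45]])  # uniform X through BSC(0.1)
px, py = p_xy.sum(axis=1), p_xy.sum(axis=0)
h_y = entropy_bits(py)
# I(X;Y) = sum_y P_Y(y) D(P_{X|Y}(.|y) || P_X).
i_xy = sum(kl_bits(p_xy[:, j] / py[j], px) * py[j] for j in range(len(py)))
slope = h_y / i_xy  # part (i): g_eps(X;Y) = slope * eps
# Part (ii), evaluated at each y; the theorem asserts it is constant in y.
g0_prime = [-np.log2(py[j]) / kl_bits(p_xy[:, j] / py[j], px) for j in range(len(py))]
```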

**Proof.**

**Corollary 26.**

**Proof.**

#### 4.2.2. Erasure Observation Channel

**Lemma 27.**

**Proof.**

**Example 3.**

## 5. Rate-Privacy Function for Continuous Random Variables

#### 5.1. General Properties of the Rate-Privacy Function

- (a) there exist constants ${C}_{1}>0$, $p>1$, and a bounded function ${C}_{2}:\mathbb{R}\to \mathbb{R}$ such that$${f}_{Y}(y)\le {C}_{1}{|y|}^{-p},\qquad {f}_{Y|X}(y|x)\le {C}_{2}(x){|y|}^{-p};$$
- (b) $\mathbb{E}\left[{X}^{2}\right]$ and $\mathbb{E}\left[{Y}^{2}\right]$ are both finite;
- (c) the differential entropy of $(X,Y)$ satisfies $h(X,Y)>-\infty$;
- (d) $H(\lfloor Y\rfloor )<\infty$, where $\lfloor a\rfloor$ denotes the largest integer ℓ such that $\ell \le a$.
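These regularity conditions are mild; Gaussian pairs, for instance, satisfy all four. As an illustrative check of condition (d) (our own computation, not from the paper), $H(\lfloor Y\rfloor )$ for a standard Gaussian Y can be evaluated from the bin probabilities $\Pr(\lfloor Y\rfloor =k)=\Phi (k+1)-\Phi (k)$:

```python
import math

def std_normal_cdf(x):
    """CDF of the standard Gaussian via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Bin probabilities Pr(floor(Y) = k) for a standard Gaussian Y; the tails
# beyond |k| = 30 carry negligible mass.
probs = [std_normal_cdf(k + 1) - std_normal_cdf(k) for k in range(-30, 30)]
# Entropy of the integer part, in bits: finite, so condition (d) holds.
h_floor = -sum(p * math.log2(p) for p in probs if p > 0)
```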

**Theorem 28.**

**Proof.**

**Theorem 29.**

**Proof.**

#### 5.2. Gaussian Information

**Lemma 30.**

**Proof.**

**Theorem 31.**

**Proof.**

**Remark 7.**

**Theorem 32.**

**Proof.**

## 6. Conclusions

## Acknowledgments

## Author Contributions

## Conflicts of Interest

## Appendix A. Proof of Lemma 19

## Appendix B. Completion of Proof of Theorem 25

**Lemma 33.**

**Proof.**

## Appendix C. Proof of Theorems 28 and 29

**Lemma 34.**

**Proof.**

**Lemma 35.**

**Proof.**

**Lemma 36.**

**Proof.**

**Proof of Theorem 29.**

**Theorem 37.** If U is an absolutely continuous random variable with density ${f}_{U}\left(x\right)$ and if $H(\lfloor U\rfloor )<\infty$, then

**Lemma 38.**

**Lemma 39.**

**Proof.**

**Lemma 40.**

**Proof.**

**Lemma 41.**

**Proof.**

**Lemma 42.**

**Proof.**

**Proof of Theorem 28.**

## References

- Asoodeh, S.; Alajaji, F.; Linder, T. Notes on information-theoretic privacy. In Proceedings of the 52nd Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA, 30 September–3 October 2014; pp. 1272–1278.
- Asoodeh, S.; Alajaji, F.; Linder, T. On maximal correlation, mutual information and data privacy. In Proceedings of the IEEE 14th Canadian Workshop on Information Theory (CWIT), St. John's, NL, Canada, 6–9 July 2015; pp. 27–31.
- Warner, S.L. Randomized Response: A Survey Technique for Eliminating Evasive Answer Bias. J. Am. Stat. Assoc. **1965**, 60, 63–69.
- Blum, A.; Ligett, K.; Roth, A. A learning theory approach to non-interactive database privacy. In Proceedings of the Fortieth Annual ACM Symposium on the Theory of Computing, Victoria, BC, Canada, 17–20 May 2008; pp. 1123–1127.
- Dinur, I.; Nissim, K. Revealing information while preserving privacy. In Proceedings of the Twenty-Second Symposium on Principles of Database Systems, San Diego, CA, USA, 9–11 June 2003; pp. 202–210.
- Rubinstein, P.B.; Bartlett, L.; Huang, J.; Taft, N. Learning in a large function space: Privacy-preserving mechanisms for SVM learning. J. Priv. Confid. **2012**, 4, 65–100.
- Duchi, J.C.; Jordan, M.I.; Wainwright, M.J. Privacy aware learning. 2014; arXiv:1210.2085.
- Dwork, C.; McSherry, F.; Nissim, K.; Smith, A. Calibrating noise to sensitivity in private data analysis. In Proceedings of the Third Conference on Theory of Cryptography (TCC'06), New York, NY, USA, 5–7 March 2006; pp. 265–284.
- Dwork, C. Differential privacy: A survey of results. In Theory and Applications of Models of Computation, Proceedings of the 5th International Conference, TAMC 2008, Xi'an, China, 25–29 April 2008; Agrawal, M., Du, D., Duan, Z., Li, A., Eds.; Springer: Berlin/Heidelberg, Germany, 2008; Lecture Notes in Computer Science, Volume 4978, pp. 1–19.
- Dwork, C.; Lei, J. Differential privacy and robust statistics. In Proceedings of the 41st Annual ACM Symposium on the Theory of Computing, Bethesda, MD, USA, 31 May–2 June 2009; pp. 437–442.
- Kairouz, P.; Oh, S.; Viswanath, P. Extremal mechanisms for local differential privacy. 2014; arXiv:1407.1338v2.
- Calmon, F.P.; Varia, M.; Médard, M.; Christiansen, M.M.; Duffy, K.R.; Tessaro, S. Bounds on inference. In Proceedings of the 51st Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA, 2–4 October 2013; pp. 567–574.
- Yamamoto, H. A source coding problem for sources with additional outputs to keep secret from the receiver or wiretappers. IEEE Trans. Inf. Theory **1983**, 29, 918–923.
- Sankar, L.; Rajagopalan, S.; Poor, H. Utility-privacy tradeoffs in databases: An information-theoretic approach. IEEE Trans. Inf. Forensics Secur. **2013**, 8, 838–852.
- Tandon, R.; Sankar, L.; Poor, H. Discriminatory lossy source coding: Side information privacy. IEEE Trans. Inf. Theory **2013**, 59, 5665–5677.
- Calmon, F.; Fawaz, N. Privacy against statistical inference. In Proceedings of the 50th Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA, 1–5 October 2012; pp. 1401–1408.
- Rebollo-Monedero, D.; Forne, J.; Domingo-Ferrer, J. From t-closeness-like privacy to postrandomization via information theory. IEEE Trans. Knowl. Data Eng. **2010**, 22, 1623–1636.
- Makhdoumi, A.; Salamatian, S.; Fawaz, N.; Médard, M. From the information bottleneck to the privacy funnel. In Proceedings of the IEEE Information Theory Workshop (ITW), Hobart, Australia, 2–5 November 2014; pp. 501–505.
- Tishby, N.; Pereira, F.C.; Bialek, W. The information bottleneck method. 2000; arXiv:physics/0004057.
- Calmon, F.P.; Makhdoumi, A.; Médard, M. Fundamental limits of perfect privacy. In Proceedings of the IEEE Int. Symp. Inf. Theory (ISIT), Hong Kong, China, 14–19 June 2015; pp. 1796–1800.
- Wyner, A.D. The Wire-Tap Channel. Bell Syst. Tech. J. **1975**, 54, 1355–1387.
- Makhdoumi, A.; Fawaz, N. Privacy-utility tradeoff under statistical uncertainty. In Proceedings of the 51st Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA, 2–4 October 2013; pp. 1627–1634.
- Li, C.T.; El Gamal, A. Maximal correlation secrecy. 2015; arXiv:1412.5374.
- Ahlswede, R.; Gács, P. Spreading of sets in product spaces and hypercontraction of the Markov operator. Ann. Probab. **1976**, 4, 925–939.
- Anantharam, V.; Gohari, A.; Kamath, S.; Nair, C. On maximal correlation, hypercontractivity, and the data processing inequality studied by Erkip and Cover. 2014; arXiv:1304.6133v1.
- Courtade, T. Information masking and amplification: The source coding setting. In Proceedings of the IEEE Int. Symp. Inf. Theory (ISIT), Boston, MA, USA, 1–6 July 2012; pp. 189–193.
- Goldwasser, S.; Micali, S. Probabilistic encryption. J. Comput. Syst. Sci. **1984**, 28, 270–299.
- Rockafellar, R.T. Convex Analysis; Princeton University Press: Princeton, NJ, USA, 1997.
- Csiszár, I.; Körner, J. Information Theory: Coding Theorems for Discrete Memoryless Systems; Cambridge University Press: Cambridge, UK, 2011.
- Shulman, N.; Feder, M. The uniform distribution as a universal prior. IEEE Trans. Inf. Theory **2004**, 50, 1356–1362.
- Rudin, W. Real and Complex Analysis, 3rd ed.; McGraw Hill: New York, NY, USA, 1987.
- Gebelein, H. Das statistische Problem der Korrelation als Variations- und Eigenwertproblem und sein Zusammenhang mit der Ausgleichungsrechnung. Zeitschrift für Angewandte Mathematik und Mechanik **1941**, 21, 364–379. (In German)
- Hirschfeld, H.O. A connection between correlation and contingency. Camb. Philos. Soc. **1935**, 31, 520–524.
- Rényi, A. On measures of dependence. Acta Mathematica Academiae Scientiarum Hungarica **1959**, 10, 441–451.
- Linfoot, E.H. An informational measure of correlation. Inf. Control **1957**, 1, 85–89.
- Csiszár, I. Information-type measures of difference of probability distributions and indirect observation. Studia Scientiarum Mathematicarum Hungarica **1967**, 2, 229–318.
- Zhao, L. Common Randomness, Efficiency, and Actions. Ph.D. Thesis, Stanford University, Stanford, CA, USA, 2011.
- Berger, T.; Yeung, R. Multiterminal source encoding with encoder breakdown. IEEE Trans. Inf. Theory **1989**, 35, 237–244.
- Kim, Y.H.; Sutivong, A.; Cover, T. State amplification. IEEE Trans. Inf. Theory **2008**, 54, 1850–1859.
- Merhav, N.; Shamai, S. Information rates subject to state masking. IEEE Trans. Inf. Theory **2007**, 53, 2254–2261.
- Ahlswede, R.; Körner, J. Source coding with side information and a converse for degraded broadcast channels. IEEE Trans. Inf. Theory **1975**, 21, 629–637.
- Kim, Y.H.; El Gamal, A. Network Information Theory; Cambridge University Press: Cambridge, UK, 2012.
- Asoodeh, S.; Alajaji, F.; Linder, T. Lossless secure source coding: Yamamoto's setting. In Proceedings of the 53rd Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA, 30 September–2 October 2015.
- Raginsky, M. Logarithmic Sobolev inequalities and strong data processing theorems for discrete channels. In Proceedings of the IEEE Int. Symp. Inf. Theory (ISIT), Istanbul, Turkey, 7–12 July 2013; pp. 419–423.
- Geng, Y.; Nair, C.; Shamai, S.; Wang, Z.V. On broadcast channels with binary inputs and symmetric outputs. IEEE Trans. Inf. Theory **2013**, 59, 6980–6989.
- Sutskover, I.; Shamai, S.; Ziv, J. Extremes of information combining. IEEE Trans. Inf. Theory **2005**, 51, 1313–1325.
- Alajaji, F.; Chen, P.N. Information Theory for Single User Systems, Part I. Course Notes, Queen's University. Available online: http://www.mast.queensu.ca/math474/it-lecture-notes.pdf (accessed on 4 March 2015).
- Chayat, N.; Shamai, S. Extension of an entropy property for binary input memoryless symmetric channels. IEEE Trans. Inf. Theory **1989**, 35, 1077–1079.
- Oohama, Y. Gaussian multiterminal source coding. IEEE Trans. Inf. Theory **1997**, 43, 2254–2261.
- Cover, T.M.; Thomas, J.A. Elements of Information Theory; Wiley: New York, NY, USA, 2006.
- Linder, T.; Zamir, R. On the asymptotic tightness of the Shannon lower bound. IEEE Trans. Inf. Theory **1994**, 40, 2026–2031.
- Rényi, A. On the dimension and entropy of probability distributions. Acta Mathematica Academiae Scientiarum Hungarica **1959**, 10, 193–215.

**Figure 2.** Privacy filter that achieves the lower bound in (4), where ${Z}_{\delta}$ is the output of an erasure privacy filter with the erasure probability specified in (5).

**Figure 4.** Optimal privacy filter for ${P}_{Y|X}=BSC\left(\alpha \right)$ with uniform X, where $\delta (\epsilon ,\alpha )$ is specified in (40).

**Figure 5.** Optimal privacy filter for ${P}_{Y|X}=BEC\left(\delta \right)$, where $\delta (\epsilon ,\alpha )$ is specified in (42).

**Figure 6.** The privacy filter associated with (A1) and (A2) with $k=1$. We have ${P}_{Z|Y}(\cdot |1)=\mathsf{Bernoulli}\left(\delta \right)$ and ${P}_{Z|Y}(\cdot |y)=\mathsf{Bernoulli}\left(0\right)$ for $y\in \{2,3,\cdots ,n\}$.

© 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Asoodeh, S.; Diaz, M.; Alajaji, F.; Linder, T.
Information Extraction Under Privacy Constraints. *Information* **2016**, *7*, 15.
https://doi.org/10.3390/info7010015
