# On Extractable Shared Information


## Abstract


## 1. Introduction

**(LM)** $SI(S;{X}_{1},{X}_{2})\ge SI(f(S);{X}_{1},{X}_{2})$ for any function $f$. (left monotonicity)
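Mutual information itself satisfies this kind of monotonicity by the data-processing inequality. The following is a minimal numerical sketch (the toy distribution and the `mi` helper are illustrative, not from the paper): coarse-graining $S$ can only lower the information it carries about another variable.

```python
import math

def mi(joint):
    """Mutual information I(A;B) in bits from a joint pmf {(a, b): p}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# Toy joint: S uniform on {0,1,2,3} and X = S (a noiseless channel).
joint_SX = {(s, s): 0.25 for s in range(4)}

# Coarse-graining f(S) = S mod 2 can only destroy information about X.
f = lambda s: s % 2
joint_fSX = {}
for (s, x), p in joint_SX.items():
    joint_fSX[(f(s), x)] = joint_fSX.get((f(s), x), 0.0) + p

print(mi(joint_SX), mi(joint_fSX))  # 2.0 1.0, so I(f(S);X) <= I(S;X)
```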

## 2. Properties of Information Decompositions

#### 2.1. The Williams–Beer Axioms

- **(S)** $SI(S;{X}_{1},\dots ,{X}_{k})$ is symmetric under permutations of ${X}_{1},\dots ,{X}_{k}$. (Symmetry)
- **(SR)** $SI(S;{X}_{1})=I(S;{X}_{1})$. (Self-redundancy)
- **(M)** $SI(S;{X}_{1},\dots ,{X}_{k-1},{X}_{k})\le SI(S;{X}_{1},\dots ,{X}_{k-1})$, with equality if ${X}_{i}=f({X}_{k})$ for some $i<k$ and some function $f$. (Monotonicity)
- **(RM)** $SI(S;{X}_{1},\dots ,{X}_{k})\ge SI(S;{f}_{1}({X}_{1}),\dots ,{f}_{k}({X}_{k}))$ for all functions ${f}_{1},\dots ,{f}_{k}$. (right monotonicity)

#### 2.2. The Copy Example and the Identity Axiom

**(Id)** $SI(COPY({X}_{1},{X}_{2});{X}_{1},{X}_{2})=I({X}_{1};{X}_{2})$. (Identity)

Similarly, the measure of bivariate shared information proposed in [1] satisfies **(Id)**. However, **(Id)** is incompatible with a nonnegative information decomposition according to the Williams–Beer axioms for $k\ge 3$ [2].
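The right-hand side of **(Id)** is an ordinary mutual information, so the value the axiom prescribes is easy to evaluate. A small sketch for a pair of correlated bits (the distribution and the `mi` helper are illustrative, not from the paper):

```python
import math

def mi(joint):
    """Mutual information I(A;B) in bits from a joint pmf {(a, b): p}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# Correlated bits (illustrative): P(X1 = X2) = 3/4.
p_x1x2 = {(0, 0): 3/8, (0, 1): 1/8, (1, 0): 1/8, (1, 1): 3/8}

# (Id) prescribes SI(COPY(X1,X2); X1, X2) = I(X1;X2).
rhs = mi(p_x1x2)
print(rhs)  # ~0.1887 bits
```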

#### 2.3. The Blackwell Property and Property **(*)**

**(BP)** For a given joint distribution ${P}_{S{X}_{1}{X}_{2}}$, $UI(S;{X}_{1}\backslash {X}_{2})$ vanishes if and only if there exists a random variable ${X}_{1}^{\prime}$ such that $S-{X}_{2}-{X}_{1}^{\prime}$ is a Markov chain and ${P}_{S{X}_{1}^{\prime}}={P}_{S{X}_{1}}$. (Blackwell property)

**(*)** $SI$ and $UI$ depend only on the marginal distributions ${P}_{S{X}_{1}}$ and ${P}_{S{X}_{2}}$ of the pairs $(S,{X}_{1})$ and $(S,{X}_{2})$.

Property (*) is closely related to **(BP)**, which likewise depends only on the channels $S\to {X}_{1}$ and $S\to {X}_{2}$, and thus on ${P}_{S{X}_{1}}$ and ${P}_{S{X}_{2}}$. Most information decompositions proposed so far satisfy property (*).

## 3. Extractable Information Measures

- Most information measures satisfy $IM(O;{X}_{1},\dots ,{X}_{k})=0$ when $O$ is a constant random variable. In this case, $\overline{IM}(S;{X}_{1},\dots ,{X}_{k})\ge 0$: for example, even though the coinformation can be negative, the extractable coinformation never is.
- Suppose that $IM$ satisfies left monotonicity. Then, $\overline{IM}=IM$. For example, entropy H and mutual information I satisfy left monotonicity, and so $\overline{H}=H$ and $\overline{I}=I$. Similarly, as shown in [2], the measure of unique information $\tilde{UI}$ defined in [1] satisfies left monotonicity, and so $\overline{\tilde{UI}}=\tilde{UI}$.
- In fact, $\overline{IM}$ is the smallest left monotonic information measure that is at least as large as $IM$.
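For small alphabets, $\overline{IM}$ can be evaluated by brute force: it suffices to enumerate functions $f$ whose codomain is no larger than the alphabet of $S$. The following sketch (helper names are ours) does this for the coinformation on the Xor example, where the coinformation is $-1$ but its extractable version is $0$:

```python
import itertools
import math
from collections import defaultdict

def mi(joint):
    """I(A;B) in bits from a joint pmf {(a, b): p}."""
    pa, pb = defaultdict(float), defaultdict(float)
    for (a, b), p in joint.items():
        pa[a] += p
        pb[b] += p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

def cmi(joint3):
    """I(A;B|C) in bits from a joint pmf {(a, b, c): p}."""
    pc = defaultdict(float)
    for (a, b, c), p in joint3.items():
        pc[c] += p
    total = 0.0
    for c0, w in pc.items():
        cond = {(a, b): p / w for (a, b, c), p in joint3.items() if c == c0}
        total += w * mi(cond)
    return total

def coinfo(joint3):
    """Coinformation I(S;X1;X2) = I(S;X1) - I(S;X1|X2)."""
    m = defaultdict(float)
    for (s, x1, x2), p in joint3.items():
        m[(s, x1)] += p
    return mi(dict(m)) - cmi(joint3)

def extractable(im, joint3):
    """sup_f IM(f(S); X1, X2), enumerating all maps f on the S-alphabet."""
    s_vals = sorted({s for (s, _, _) in joint3})
    best = -math.inf
    for f in itertools.product(range(len(s_vals)), repeat=len(s_vals)):
        push = defaultdict(float)
        for (s, x1, x2), p in joint3.items():
            push[(f[s_vals.index(s)], x1, x2)] += p
        best = max(best, im(dict(push)))
    return best

# Xor: S = X1 ^ X2 with independent uniform inputs.
xor = {(x1 ^ x2, x1, x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}
print(coinfo(xor), extractable(coinfo, xor))  # -1.0 0.0
```

The maximum is attained by a constant function $f$, which yields zero; any injective $f$ reproduces the negative coinformation of Xor.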

**Lemma 1.**

**Proof.**

**(PLM)**$IM(S;{X}_{1},{X}_{2})\ge IM({S}^{\prime};{X}_{1},{X}_{2})$ whenever ${S}^{\prime}$ is independent of ${X}_{1}$ given S. (probabilistic left monotonicity)

## 4. Extractable Shared Information

**Lemma 2.**

**Proof.**

**Lemma 3.**

- If $SI$ satisfies **(*)**, then $\overline{SI}$ also satisfies **(*)**.
- If $SI$ is right monotonic, then $\overline{SI}$ is also right monotonic.

**Proof.**

**Lemma 4.**

**Proof.**

**Theorem 1.**

**Proof.**

- S and ${X}_{1}$ are independent given $f(S)$.
- The channel $f(S)\to {X}_{1}$ is a garbling of the channel $f(S)\to {X}_{2}$.
- The channel $S\to {X}_{1}$ is not a garbling of the channel $S\to {X}_{2}$.
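Whether one channel is a garbling of another, i.e., whether $K_{1}=K_{2}\lambda$ for some row-stochastic matrix $\lambda$, is a linear feasibility problem. A sketch using scipy's LP solver (the function name `is_garbling` and the example channels are ours, not from the paper):

```python
import numpy as np
from scipy.optimize import linprog

def is_garbling(K1, K2):
    """True iff channel K1 (shape |S| x |X1|) factors as K2 @ L for some
    row-stochastic matrix L (shape |X2| x |X1|), i.e. S -> X1 is a
    garbling of S -> X2.  Phrased as an LP feasibility problem."""
    K1 = np.asarray(K1, dtype=float)
    K2 = np.asarray(K2, dtype=float)
    nS, n1 = K1.shape
    n2 = K2.shape[1]
    A_eq, b_eq = [], []
    # Equality constraints K2 @ L = K1, entry by entry.
    for s in range(nS):
        for x1 in range(n1):
            row = np.zeros(n2 * n1)
            for x2 in range(n2):
                row[x2 * n1 + x1] = K2[s, x2]
            A_eq.append(row)
            b_eq.append(K1[s, x1])
    # Each row of L must sum to 1 (entries are bounded in [0, 1]).
    for x2 in range(n2):
        row = np.zeros(n2 * n1)
        row[x2 * n1:(x2 + 1) * n1] = 1.0
        A_eq.append(row)
        b_eq.append(1.0)
    res = linprog(c=np.zeros(n2 * n1), A_eq=np.array(A_eq), b_eq=b_eq,
                  bounds=[(0.0, 1.0)] * (n2 * n1))
    return res.status == 0

# A noisy observation of S is a garbling of a perfect observation ...
print(is_garbling([[0.9, 0.1], [0.2, 0.8]], [[1.0, 0.0], [0.0, 1.0]]))  # True
# ... but a perfect observation is not a garbling of pure noise.
print(is_garbling([[1.0, 0.0], [0.0, 1.0]], [[0.5, 0.5], [0.5, 0.5]]))  # False
```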

**Corollary 1.**

**Proof.**

## 5. Left Monotonic Information Decompositions

**Proposition 1.**

**Proof.**

## 6. Examples

Therefore, $\overline{\tilde{SI}}$ is the smallest left monotonic measure of shared information that satisfies property **(*)**.

## 7. Conclusions

## Acknowledgments

## Author Contributions

## Conflicts of Interest

## Appendix A. Counterexample in Theorem 1

| $f(s)$ | $s$ | ${x}_{1}$ | ${x}_{2}$ | ${P}_{f(S)S{X}_{1}{X}_{2}}$ |
|---|---|---|---|---|
| 0 | 0 | 0 | 0 | 1/4 |
| 0 | 1 | 0 | 1 | 1/4 |
| 0 | 0 | 1 | 0 | 1/8 |
| 0 | 1 | 1 | 0 | 1/8 |
| 1 | 2 | 1 | 1 | 1/4 |
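A quick numerical sanity check of the table (the `mi`/`cmi` helpers are ours): the distribution is normalized, and $S$ is independent of ${X}_{1}$ given $f(S)$, the conditional independence used in the construction of Theorem 1.

```python
import math
from collections import defaultdict

def mi(joint):
    """I(A;B) in bits from a joint pmf {(a, b): p}."""
    pa, pb = defaultdict(float), defaultdict(float)
    for (a, b), p in joint.items():
        pa[a] += p
        pb[b] += p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

def cmi(joint3):
    """I(A;B|C) in bits from a joint pmf {(a, b, c): p}."""
    pc = defaultdict(float)
    for (a, b, c), p in joint3.items():
        pc[c] += p
    total = 0.0
    for c0, w in pc.items():
        cond = {(a, b): p / w for (a, b, c), p in joint3.items() if c == c0}
        total += w * mi(cond)
    return total

# The appendix distribution, keyed by (f(s), s, x1, x2).
P = {(0, 0, 0, 0): 1/4, (0, 1, 0, 1): 1/4, (0, 0, 1, 0): 1/8,
     (0, 1, 1, 0): 1/8, (1, 2, 1, 1): 1/4}
assert abs(sum(P.values()) - 1.0) < 1e-12

# S and X1 are independent given f(S), i.e. I(S; X1 | f(S)) = 0.
j3 = defaultdict(float)
for (fs, s, x1, x2), p in P.items():
    j3[(s, x1, fs)] += p
print(cmi(dict(j3)))  # 0.0 up to rounding
```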

## References

1. Bertschinger, N.; Rauh, J.; Olbrich, E.; Jost, J.; Ay, N. Quantifying unique information. *Entropy* **2014**, *16*, 2161–2183.
2. Rauh, J.; Bertschinger, N.; Olbrich, E.; Jost, J. Reconsidering unique information: Towards a multivariate information decomposition. In Proceedings of the 2014 IEEE International Symposium on Information Theory (ISIT), Honolulu, HI, USA, 29 June–4 July 2014; pp. 2232–2236.
3. Williams, P.; Beer, R. Nonnegative Decomposition of Multivariate Information. arXiv **2010**, arXiv:1004.2515v1.
4. Harder, M.; Salge, C.; Polani, D. A bivariate measure of redundant information. *Phys. Rev. E* **2013**, *87*, 012130.
5. Bertschinger, N.; Rauh, J.; Olbrich, E.; Jost, J. Shared Information—New Insights and Problems in Decomposing Information in Complex Systems. In Proceedings of the European Conference on Complex Systems 2012, Brussels, Belgium, 2–7 September 2012; pp. 251–269.
6. Griffith, V.; Koch, C. Quantifying Synergistic Mutual Information. In *Guided Self-Organization: Inception*; Prokopenko, M., Ed.; Springer: Berlin/Heidelberg, Germany, 2014; Volume 9, pp. 159–190.
7. Wibral, M.; Priesemann, V.; Kay, J.W.; Lizier, J.T.; Phillips, W.A. Partial information decomposition as a unified approach to the specification of neural goal functions. *Brain Cogn.* **2017**, *112*, 25–38.
8. Bell, A.J. The Co-Information Lattice. In Proceedings of the Fourth International Workshop on Independent Component Analysis and Blind Signal Separation (ICA 03), Nara, Japan, 1–4 April 2003; pp. 921–926.
9. McGill, W. Multivariate information transmission. *IRE Trans. Inf. Theory* **1954**, *4*, 93–111.
10. Blackwell, D. Equivalent Comparisons of Experiments. *Ann. Math. Stat.* **1953**, *24*, 265–272.
11. Maurer, U.; Wolf, S. The intrinsic conditional mutual information and perfect secrecy. In Proceedings of the 1997 IEEE International Symposium on Information Theory, Ulm, Germany, 29 June–4 July 1997.
12. Griffith, V.; Chong, E.K.P.; James, R.G.; Ellison, C.J.; Crutchfield, J.P. Intersection Information Based on Common Randomness. *Entropy* **2014**, *16*, 1985–2000.
13. Rauh, J.; Banerjee, P.K.; Olbrich, E.; Jost, J.; Bertschinger, N.; Wolpert, D. Coarse-graining and the Blackwell order. arXiv **2017**, arXiv:1701.07805.

| $f$ | ${I}_{min}$ | ${\overline{I}}_{min}$ | $\tilde{SI}$ | $\overline{\tilde{SI}}$ |
|---|---|---|---|---|
| Copy | 1 | 1 | 0 | 1/2 |
| And/Or | $\frac{3}{4}\log \frac{4}{3}$ | $\frac{3}{4}\log \frac{4}{3}$ | $\frac{3}{4}\log \frac{4}{3}$ | |
| Xor | 0 | 0 | 0 | 0 |
| Sum | 1/2 | 1/2 | 1/2 | 1/2 |
| ${X}_{1}$ | 0 | 0 | 0 | 0 |
| ${f}_{1}$ | 1/2 | 1/2 | 0 | 0 |

© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Rauh, J.; Banerjee, P.K.; Olbrich, E.; Jost, J.; Bertschinger, N.
On Extractable Shared Information. *Entropy* **2017**, *19*, 328.
https://doi.org/10.3390/e19070328
