# A Novel Approach to the Partial Information Decomposition

## Abstract


## 1. Introduction

## 2. Notation and Preliminaries

## 3. Background on the Partial Information Decomposition (PID)

- Synergy $S(X_1;\dots;X_n \to Y)$, the information found in the joint outcome of all sources, but not in any of their individual outcomes. Synergy is defined as [17] $$S(X_1;\dots;X_n \to Y) = I(Y;X_1,\dots,X_n) - I_{\cup}(X_1;\dots;X_n \to Y).$$
- Unique information in source $X_i$, $U(X_i \to Y \mid X_1;\dots;X_n)$, the non-redundant information in each particular source. Unique information is defined as $$U(X_i \to Y \mid X_1;\dots;X_n) = I(Y;X_i) - I_{\cap}(X_1;\dots;X_n \to Y).$$
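For concreteness, this bookkeeping can be carried out numerically. The sketch below (standard library only) uses the AND gate from Section 6, and, purely as an illustrative stand-in, the simple "minimum mutual information" redundancy $\min_i I(Y;X_i)$ combined with the inclusion-exclusion principle; the paper's own measures $I_{\cap}^{\prec}$ and $I_{\cup}^{\prec}$ are defined differently.

```python
from itertools import product
from math import log2

# Joint distribution of (X1, X2, Y) for the AND gate with uniform,
# independent binary sources: Y = X1 AND X2.
joint = {(x1, x2, x1 & x2): 0.25 for x1, x2 in product([0, 1], repeat=2)}

def marginal(dist, idxs):
    out = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in idxs)
        out[key] = out.get(key, 0.0) + p
    return out

def mutual_information(dist, idxs_a, idxs_b):
    pa, pb = marginal(dist, idxs_a), marginal(dist, idxs_b)
    pab = marginal(dist, idxs_a + idxs_b)
    return sum(p * log2(p / (pa[k[:len(idxs_a)]] * pb[k[len(idxs_a):]]))
               for k, p in pab.items() if p > 0)

i1 = mutual_information(joint, (0,), (2,))     # I(Y; X1) ≈ 0.311 bits
i2 = mutual_information(joint, (1,), (2,))     # I(Y; X2) ≈ 0.311 bits
i12 = mutual_information(joint, (0, 1), (2,))  # I(Y; X1, X2) ≈ 0.811 bits

# Illustrative stand-in redundancy (NOT the paper's measure):
redundancy = min(i1, i2)
union = i1 + i2 - redundancy   # union via the inclusion-exclusion principle
synergy = i12 - union          # S(X1; X2 -> Y), per the definition above
unique1 = i1 - redundancy      # U(X1 -> Y | X1; X2), per the definition above
```

With this stand-in redundancy, the AND gate yields synergy 0.5 bits and zero unique information, matching the decomposition's accounting identities.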

## 4. Part I: Redundancy and Union Information from an Ordering Relation

#### 4.1. Introduction

- Monotonicity of mutual information: $A \sqsubseteq B \Rightarrow I(A;Y) \le I(B;Y)$ (less informative sources have less mutual information).
- Reflexivity: $A \sqsubseteq A$ for all $A$ (each source is at least as informative as itself).
- For all sources $X_i$, $O \sqsubseteq X_i \sqsubseteq (X_1,\dots,X_n)$, where $O$ indicates a constant random variable with a single outcome and $(X_1,\dots,X_n)$ indicates all sources considered jointly (each source is more informative than a trivial source and less informative than all sources considered jointly).
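The first property can be spot-checked numerically: if $A = f(B)$ is a deterministic coarse-graining of $B$, then $A$ is less informative than $B$ and should satisfy $I(A;Y) \le I(B;Y)$. A minimal stdlib sketch:

```python
from math import log2

# Joint distribution p(b, y) with Y = B, B uniform on {0,1,2,3}: B is fully
# informative about Y. Coarse-graining A = B mod 2 is a deterministic
# garbling, hence a "less informative" source under the ordering relation.
p_by = {(b, b): 0.25 for b in range(4)}
p_ay = {}
for (b, y), p in p_by.items():
    a = b % 2                      # A = f(B)
    p_ay[(a, y)] = p_ay.get((a, y), 0.0) + p

def mi(joint):
    # I(A;Y) from a joint dict {(a, y): prob}
    pa, py = {}, {}
    for (a, y), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * log2(p / (pa[a] * py[y]))
               for (a, y), p in joint.items() if p > 0)

# Monotonicity: A ⊑ B implies I(A;Y) <= I(B;Y) (here 1 bit vs. 2 bits)
assert mi(p_ay) <= mi(p_by)
```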

#### 4.2. Axiomatic Derivation

**Theorem 1.**

- Symmetry: $I_{\cap}(X_1;\dots;X_n \to Y)$ is invariant to permutations of $X_1,\dots,X_n$.
- Self-redundancy: $I_{\cap}(X_1 \to Y) = I(Y;X_1)$.
- Monotonicity: $I_{\cap}(X_1;\dots;X_n \to Y) \le I_{\cap}(X_1;\dots;X_{n-1} \to Y)$.
- Order equality: $I_{\cap}(X_1;\dots;X_n \to Y) = I_{\cap}(X_1;\dots;X_{n-1} \to Y)$ if $X_i \sqsubseteq X_n$ for some $i < n$.
- Existence: there is some $Q$ such that $I_{\cap}(X_1;\dots;X_n \to Y) = I(Y;Q)$ and $Q \sqsubseteq X_i$ for all $i$.

**Theorem 2.**

- Symmetry: $I_{\cup}(X_1;\dots;X_n \to Y)$ is invariant to permutations of $X_1,\dots,X_n$.
- Self-union: $I_{\cup}(X_1 \to Y) = I(Y;X_1)$.
- Monotonicity: $I_{\cup}(X_1;\dots;X_n \to Y) \ge I_{\cup}(X_1;\dots;X_{n-1} \to Y)$.
- Order equality: $I_{\cup}(X_1;\dots;X_n \to Y) = I_{\cup}(X_1;\dots;X_{n-1} \to Y)$ if $X_n \sqsubseteq X_i$ for some $i < n$.
- Existence: there is some $Q$ such that $I_{\cup}(X_1;\dots;X_n \to Y) = I(Y;Q)$ and $X_i \sqsubseteq Q$ for all $i$.

#### 4.3. Inclusion-Exclusion Principle

**Lemma 1.**

#### 4.4. Relation to Prior Work

#### 4.5. Further Generalizations

- Shannon information theory (beyond mutual information). In Section 4.1, $\varphi$ was the mutual information between each random variable and some target $Y$. This can be generalized by choosing a different “amount of information” function $\varphi$, so that redundancy and union information are quantified in terms of other measures of statistical dependence. Among many other options, possible choices of $\varphi$ include Pearson’s correlation (for continuous random variables) and measures of statistical dependency based on f-divergences [52], Bregman divergences [53], and Fisher information [54].
- Shannon information theory (without a fixed target). The PID can also be defined for a different setup than the typical one considered in the literature. For example, consider a situation where the sources are channels $\kappa_{X_1|Y},\dots,\kappa_{X_n|Y}$, while the marginal distribution over the target $Y$ is left unspecified. Here one may take $\Omega$ as the set of channels, $\varphi$ as the channel capacity $\varphi(\kappa_{A|Y}) := \max_{P_Y} I_{P_Y \kappa_{A|Y}}(A;Y)$, and $\sqsubseteq$ as some ordering relation on channels [24].
- Algorithmic information theory. The PID can be defined for other notions of information, such as the ones used in Algorithmic Information Theory (AIT) [55]. In AIT, “information” is not defined in terms of statistical uncertainty, but rather in terms of the program length necessary to generate strings. For example, one may take $\Omega$ as the set of finite strings, $\sqsubseteq$ as algorithmic conditional independence ($a \sqsubseteq b$ iff $K(y|b) - K(y|b,a) \le \mathrm{const}$, where $K(\cdot|\cdot)$ is the conditional Kolmogorov complexity), and $\varphi(a) := K(y) - K(y|a)$ as the “algorithmic mutual information” with some target string $y$. (This setup is closely related to the notion of algorithmic “common information” [47].)
- Quantum information theory. As a final example, the PID can be defined in the context of quantum information theory. For example, one may take $\Omega$ as the set of quantum channels, $\sqsubseteq$ as the quantum Blackwell order [56,57,58], and $\varphi(\Phi) = \mathcal{I}(\rho, \Phi)$, where $\mathcal{I}$ is the Ohya mutual information for some target density matrix $\rho$ under channel $\Phi \in \Omega$ [59].
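For the channel-based setup above, the capacity $\varphi(\kappa_{A|Y})$ can be computed with the standard Blahut-Arimoto iteration. Below is a minimal plain-Python sketch (the function name `channel_capacity` and the iteration count are illustrative choices, not from the paper):

```python
from math import log2

def channel_capacity(W, iters=200):
    """Blahut-Arimoto estimate of the capacity (in bits) of a discrete
    memoryless channel given as a row-stochastic matrix W[x][y] = P(y|x)."""
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx                   # input distribution, start uniform
    cap = 0.0
    for _ in range(iters):
        # output distribution induced by the current input distribution
        q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        # relative entropy D(W(.|x) || q) for each input symbol x
        d = [sum(W[x][y] * log2(W[x][y] / q[y])
                 for y in range(ny) if W[x][y] > 0) for x in range(nx)]
        cap = sum(p[x] * d[x] for x in range(nx))   # current I(X;Y)
        # multiplicative update, then renormalize
        w = [p[x] * 2 ** d[x] for x in range(nx)]
        Z = sum(w)
        p = [v / Z for v in w]
    return cap

# Binary symmetric channel with crossover 0.1: capacity = 1 - h(0.1) ≈ 0.531
cap_bsc = channel_capacity([[0.9, 0.1], [0.1, 0.9]])
```

For symmetric channels the uniform input is already optimal, so the iteration converges immediately; for general channels it converges monotonically to capacity.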

## 5. Part II: Blackwell Redundancy and Union Information

#### 5.1. The Blackwell Order

#### 5.2. Blackwell Redundancy

**Theorem 3.**

#### 5.3. Blackwell Union Information

**Theorem 4.**

#### 5.4. Relation to Prior Work

#### 5.5. Continuity of Blackwell Redundancy and Union Information

**Theorem 5.**

#### 5.6. Behavior on the COPY Gate

**Theorem 6.**

## 6. Examples and Comparisons to Previous Measures

#### 6.1. Qualitative Comparison

- Has it been defined for more than 2 sources?
- Does it obey the Monotonicity axiom from Section 4.2?
- Is it compatible with the inclusion-exclusion principle (IEP) in the bivariate case, such that union information as defined in Equation (14) obeys $I_{\cup}(X_1;X_2 \to Y) \le I(X_1,X_2;Y)$?
- Does it obey the Independent identity property, Equation (4)?
- Does it obey the Blackwell property (possibly in its multivariate form, Theorem 3)?
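Properties like these can be spot-checked numerically on sampled distributions. As a toy illustration (using the simple "minimum mutual information" redundancy $\min_i I(Y;X_i)$ as a stand-in measure, not one of the paper's), the Monotonicity axiom holds because adding a source can only lower the minimum:

```python
from math import log2
import random

random.seed(0)

def rand_joint(nx, ny):
    # a random joint distribution over (X, Y), normalized to sum to 1
    w = [[random.random() for _ in range(ny)] for _ in range(nx)]
    z = sum(map(sum, w))
    return [[v / z for v in row] for row in w]

def mi(j):
    px = [sum(row) for row in j]
    py = [sum(j[x][y] for x in range(len(j))) for y in range(len(j[0]))]
    return sum(j[x][y] * log2(j[x][y] / (px[x] * py[y]))
               for x in range(len(j)) for y in range(len(j[0])) if j[x][y] > 0)

# pairwise mutual informations I(Y; X_i) for three hypothetical sources
mis = [mi(rand_joint(2, 2)) for _ in range(3)]

# Monotonicity for the MMI stand-in: I(X1;X2;X3 -> Y) <= I(X1;X2 -> Y)
assert min(mis) <= min(mis[:2])
```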

#### 6.2. Quantitative Comparison

Most previous measures were computed using the `dit` Python package [69]. To our knowledge, there have been no previous proposals for how to compute ${I}_{\cap}^{\mathrm{GH}}$. In fact, this measure involves maximizing a convex function subject to linear constraints, and it can be computed using methods similar to those used for ${I}_{\cap}^{\prec}$. We provide code for computing ${I}_{\cap}^{\mathrm{GH}}$ at [64].
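The key fact behind such computations is that a convex function over a polytope attains its maximum at an extreme point, so the search can be restricted to vertices rather than relying on local ascent. A toy illustration on the probability simplex, whose extreme points are the standard basis vectors (the quadratic objective here is a hypothetical stand-in, not the actual $I_{\cap}^{\mathrm{GH}}$ objective):

```python
# A convex function on a polytope is maximized at an extreme point.
# On the n-dimensional probability simplex, the extreme points are the
# standard basis vectors, so the maximum can be found by enumeration.

def f(x):
    # any convex function; here a simple quadratic (illustrative only)
    return sum(v * v for v in x)

n = 4
vertices = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
best = max(vertices, key=f)

# Any interior point of the simplex has a smaller (or equal) value:
interior = [1.0 / n] * n
assert f(best) >= f(interior)
```

Real solvers (e.g. branch-and-bound concave minimization, as in [63,71,72]) organize this vertex search efficiently instead of enumerating.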

- The AND gate, $Y = X_1 \text{ AND } X_2$, with $X_1$ and $X_2$ independent. (It is incorrectly stated in Refs. [18,49] that $I_{\cap}^{\mathrm{GH}}$ vanishes here; in fact $I_{\cap}^{\mathrm{GH}}(X_1;X_2 \to X_1 \text{ AND } X_2) \approx 0.123$, which corresponds to the maximum achieved in Equation (18) by $Q = X_1 \text{ OR } X_2$.)
- The SUM gate: $Y = X_1 + X_2$, with $X_1$ and $X_2$ independent.
- The UNQ gate: $Y = X_1$. Here $I_{\cap}^{\mathrm{Ince}}$ (marked with ∗) gave values that increased with the amount of correlation between $X_1$ and $X_2$ but were typically larger than $I(X_1;X_2)$.
- The COPY gate: $Y = (X_1, X_2)$. Here, our redundancy measure equals the Gács-Körner common information between $X_1$ and $X_2$, as discussed in Section 5.6. The same holds for the redundancy measures $I_{\cap}^{\mathrm{GH}}$ and $I_{\cap}^{\wedge}$, which can be shown using a slight modification of the proof of Theorem 6. For this gate, $I_{\cap}^{\mathrm{Ince}}$ (marked with ∗) gave the same values as for the UNQ gate, which increased with the amount of correlation between $X_1$ and $X_2$ but were typically larger than $I(X_1;X_2)$.
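The $\approx 0.123$ value claimed for the AND gate can be spot-checked directly: if $Q = X_1 \text{ OR } X_2$ attains the maximum in Equation (18), then $I(Y;Q)$ should come out to about 0.123 bits. A minimal stdlib check:

```python
from itertools import product
from math import log2

# Joint distribution of (Q, Y) where Q = X1 OR X2, Y = X1 AND X2,
# with X1, X2 independent uniform bits.
pqy = {}
for x1, x2 in product([0, 1], repeat=2):
    q, y = x1 | x2, x1 & x2
    pqy[(q, y)] = pqy.get((q, y), 0.0) + 0.25

pq = {q: sum(p for (qq, _), p in pqy.items() if qq == q) for q in (0, 1)}
py = {y: sum(p for (_, yy), p in pqy.items() if yy == y) for y in (0, 1)}
i_qy = sum(p * log2(p / (pq[q] * py[y])) for (q, y), p in pqy.items() if p > 0)
# i_qy = h(1/4) - (3/4) h(1/3) ≈ 0.1226 bits, i.e. ≈ 0.123 as stated
```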

- Three-way AND gate: $Y = X_1 \text{ AND } X_2 \text{ AND } X_3$, where the sources are binary and uniformly and independently distributed.
- Three-way SUM gate: $Y = X_1 + X_2 + X_3$, where the sources are binary and uniformly and independently distributed.
- “Overlap” gate: we defined four independent uniformly distributed binary random variables, $A,B,C,D$. These were grouped into three sources ${X}_{1},{X}_{2},{X}_{3}$ as ${X}_{1}=(A,B)$, ${X}_{2}=(A,C)$, ${X}_{3}=(A,D)$. The target was the joint outcome of all three sources, $Y=({X}_{1},{X}_{2},{X}_{3})=((A,B),(A,C),(A,D))$. Note that the three sources overlap on a single random variable A, which suggests that the redundancy should be 1 bit.
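The 1-bit intuition for the overlap gate can be checked directly: any pair of sources shares exactly the single bit $A$, so their mutual information is 1 bit. A minimal stdlib check:

```python
from itertools import product
from math import log2

# Overlap gate: A, B, C, D independent uniform bits; X1=(A,B), X2=(A,C).
# The sources overlap only on A, so I(X1; X2) should be exactly 1 bit.
p = 1 / 16

def mi12():
    joint = {}
    for a, b, c, d in product([0, 1], repeat=4):
        k = ((a, b), (a, c))
        joint[k] = joint.get(k, 0.0) + p
    p1, p2 = {}, {}
    for (x1, x2), q in joint.items():
        p1[x1] = p1.get(x1, 0.0) + q
        p2[x2] = p2.get(x2, 0.0) + q
    return sum(q * log2(q / (p1[x1] * p2[x2])) for (x1, x2), q in joint.items())

assert abs(mi12() - 1.0) < 1e-9   # the sources overlap on A alone: 1 bit
```

This matches the value $I_{\cap}^{\prec} = 1$ reported for this gate, whereas several other measures return 2 bits.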

## 7. Discussion and Future Work

## Funding

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## Appendix A. PID Axioms

- Symmetry: $I_{\cap}(X_1;\dots;X_n \to Y)$ is invariant to permutations of $X_1,\dots,X_n$.
- Self-redundancy: $I_{\cap}(X_1 \to Y) = I(Y;X_1)$.
- Monotonicity: $I_{\cap}(X_1;\dots;X_n \to Y) \le I_{\cap}(X_1;\dots;X_{n-1} \to Y)$.
- Deterministic equality: $I_{\cap}(X_1;\dots;X_n \to Y) = I_{\cap}(X_1;\dots;X_{n-1} \to Y)$ if $X_i = f(X_n)$ for some $i < n$ and deterministic function $f$.

- Symmetry: $I_{\cup}(X_1;\dots;X_n \to Y)$ is invariant to permutations of $X_1,\dots,X_n$.
- Self-union: $I_{\cup}(X_1 \to Y) = I(Y;X_1)$.
- Monotonicity: $I_{\cup}(X_1;\dots;X_n \to Y) \ge I_{\cup}(X_1;\dots;X_{n-1} \to Y)$.
- Deterministic equality: $I_{\cup}(X_1;\dots;X_n \to Y) = I_{\cup}(X_1;\dots;X_{n-1} \to Y)$ if $X_n = f(X_i)$ for some $i < n$ and deterministic function $f$.

## Appendix B. Uniqueness Proofs

**Proof of Theorem 1.**

**Proof of Theorem 2.**

## Appendix C. Computing ${I}_{\cap}^{\prec}$

**Theorem A1.**

**Proof.**

## Appendix D. Continuity of ${I}_{\cap}^{\prec}$

**Lemma A1.**

**Proof.**

**Proof of Theorem 5.**

**Corollary A1.**

**Proof.**

## Appendix E. Behavior of ${I}_{\cap}^{\prec}$ on Gaussian Random Variables

## Appendix F. Operational Interpretation of ${I}_{\cap}^{\mathrm{GH}}$

**Theorem A2.**

**Proof.**

## Appendix G. Equivalence of ${I}_{\cup}^{\prec}$ and ${I}_{\cup}^{\mathrm{BROJA}}$

**Theorem A3.**

**Proof.**

## Appendix H. Relation between ${I}_{\cap}^{\mathrm{WB}}$ and Our General Framework

## Appendix I. Miscellaneous Derivations

**Proof of Lemma 1.**

**Proof of Theorem 3.**

**Proof of Theorem 4.**

**Proof of Theorem 6.**

## References

1. Schneidman, E.; Bialek, W.; Berry, M.J. Synergy, Redundancy, and Independence in Population Codes. *J. Neurosci.* **2003**, 23, 11539–11553.
2. Daniels, B.C.; Ellison, C.J.; Krakauer, D.C.; Flack, J.C. Quantifying collectivity. *Curr. Opin. Neurobiol.* **2016**, 37, 106–113.
3. Tax, T.; Mediano, P.; Shanahan, M. The partial information decomposition of generative neural network models. *Entropy* **2017**, 19, 474.
4. Amjad, R.A.; Liu, K.; Geiger, B.C. Understanding individual neuron importance using information theory. *arXiv* **2018**, arXiv:1804.06679.
5. Lizier, J.; Bertschinger, N.; Jost, J.; Wibral, M. Information decomposition of target effects from multi-source interactions: Perspectives on previous, current and future work. *Entropy* **2018**, 20, 307.
6. Wibral, M.; Priesemann, V.; Kay, J.W.; Lizier, J.T.; Phillips, W.A. Partial information decomposition as a unified approach to the specification of neural goal functions. *Brain Cogn.* **2017**, 112, 25–38.
7. Timme, N.; Alford, W.; Flecker, B.; Beggs, J.M. Synergy, redundancy, and multivariate information measures: An experimentalist’s perspective. *J. Comput. Neurosci.* **2014**, 36, 119–140.
8. Chan, C.; Al-Bashabsheh, A.; Ebrahimi, J.B.; Kaced, T.; Liu, T. Multivariate Mutual Information Inspired by Secret-Key Agreement. *Proc. IEEE* **2015**, 103, 1883–1913.
9. Rosas, F.E.; Mediano, P.A.; Jensen, H.J.; Seth, A.K.; Barrett, A.B.; Carhart-Harris, R.L.; Bor, D. Reconciling emergences: An information-theoretic approach to identify causal emergence in multivariate data. *PLoS Comput. Biol.* **2020**, 16, e1008289.
10. Cang, Z.; Nie, Q. Inferring spatial and signaling relationships between cells from single cell transcriptomic data. *Nat. Commun.* **2020**, 11, 2084.
11. Williams, P.L.; Beer, R.D. Nonnegative decomposition of multivariate information. *arXiv* **2010**, arXiv:1004.2515.
12. Williams, P.L. Information dynamics: Its theory and application to embodied cognitive systems. Ph.D. Thesis, Indiana University, Bloomington, IN, USA, 2011.
13. Bertschinger, N.; Rauh, J.; Olbrich, E.; Jost, J.; Ay, N. Quantifying unique information. *Entropy* **2014**, 16, 2161–2183.
14. Quax, R.; Har-Shemesh, O.; Sloot, P. Quantifying synergistic information using intermediate stochastic variables. *Entropy* **2017**, 19, 85.
15. James, R.G.; Emenheiser, J.; Crutchfield, J.P. Unique information via dependency constraints. *J. Phys. A Math. Theor.* **2018**, 52, 014002.
16. Griffith, V.; Chong, E.K.; James, R.G.; Ellison, C.J.; Crutchfield, J.P. Intersection information based on common randomness. *Entropy* **2014**, 16, 1985–2000.
17. Griffith, V.; Koch, C. Quantifying synergistic mutual information. In *Guided Self-Organization: Inception*; Springer: Berlin/Heidelberg, Germany, 2014; pp. 159–190.
18. Griffith, V.; Ho, T. Quantifying redundant information in predicting a target random variable. *Entropy* **2015**, 17, 4644–4653.
19. Harder, M.; Salge, C.; Polani, D. Bivariate measure of redundant information. *Phys. Rev. E* **2013**, 87, 012130.
20. Ince, R. Measuring Multivariate Redundant Information with Pointwise Common Change in Surprisal. *Entropy* **2017**, 19, 318.
21. Finn, C.; Lizier, J. Pointwise Partial Information Decomposition Using the Specificity and Ambiguity Lattices. *Entropy* **2018**, 20, 297.
22. Shannon, C. The lattice theory of information. *Trans. IRE Prof. Group Inf. Theory* **1953**, 1, 105–107.
23. Shannon, C.E. A note on a partial ordering for communication channels. *Inf. Control* **1958**, 1, 390–397.
24. Cohen, J.; Kempermann, J.H.; Zbaganu, G. *Comparisons of Stochastic Matrices with Applications in Information Theory, Statistics, Economics and Population*; Springer Science & Business Media: Berlin/Heidelberg, Germany, 1998.
25. Le Cam, L. Sufficiency and approximate sufficiency. *Ann. Math. Stat.* **1964**, 35, 1419–1455.
26. Körner, J.; Marton, K. Comparison of two noisy channels. *Top. Inf. Theory* **1977**, 16, 411–423.
27. Torgersen, E. *Comparison of Statistical Experiments*; Cambridge University Press: Cambridge, UK, 1991; Volume 36.
28. Blackwell, D. Equivalent comparisons of experiments. *Ann. Math. Stat.* **1953**, 24, 265–272.
29. James, R.; Emenheiser, J.; Crutchfield, J. Unique information and secret key agreement. *Entropy* **2019**, 21, 12.
30. Whitelaw, T.A. *Introduction to Abstract Algebra*, 2nd ed.; Blackie & Son: London, UK, 1988.
31. Halmos, P.R. *Naive Set Theory*; Courier Dover Publications: Mineola, NY, USA, 2017.
32. McGill, W. Multivariate information transmission. *Trans. IRE Prof. Group Inf. Theory* **1954**, 4, 93–111.
33. Fano, R.M. *The Transmission of Information: A Statistical Theory of Communications*; Massachusetts Institute of Technology: Cambridge, MA, USA, 1961.
34. Reza, F.M. *An Introduction to Information Theory*; Dover Publications, Inc.: Mineola, NY, USA, 1961.
35. Ting, H.K. On the amount of information. *Theory Probab. Appl.* **1962**, 7, 439–447.
36. Yeung, R.W. A new outlook on Shannon’s information measures. *IEEE Trans. Inf. Theory* **1991**, 37, 466–474.
37. Bell, A.J. The co-information lattice. In Proceedings of the Fifth International Workshop on Independent Component Analysis and Blind Signal Separation: ICA, Nara, Japan, 1–4 April 2003.
38. Tilman. Examples of Common False Beliefs in Mathematics (Dimensions of Vector Spaces). MathOverflow, 2010. Available online: https://mathoverflow.net/q/23501 (accessed on 4 January 2022).
39. Rauh, J.; Bertschinger, N.; Olbrich, E.; Jost, J. Reconsidering unique information: Towards a multivariate information decomposition. In Proceedings of the 2014 IEEE International Symposium on Information Theory, Honolulu, HI, USA, 29 June–4 July 2014; pp. 2232–2236.
40. Rauh, J. Secret Sharing and Shared Information. *Entropy* **2017**, 19, 601.
41. Chicharro, D.; Panzeri, S. Synergy and Redundancy in Dual Decompositions of Mutual Information Gain and Information Loss. *Entropy* **2017**, 19, 71.
42. Ay, N.; Polani, D.; Virgo, N. Information decomposition based on cooperative game theory. *arXiv* **2019**, arXiv:1910.05979.
43. Rosas, F.E.; Mediano, P.A.; Rassouli, B.; Barrett, A.B. An operational information decomposition via synergistic disclosure. *J. Phys. A Math. Theor.* **2020**, 53, 485001.
44. Davey, B.A.; Priestley, H.A. *Introduction to Lattices and Order*; Cambridge University Press: Cambridge, UK, 2002.
45. Bertschinger, N.; Rauh, J. The Blackwell relation defines no lattice. In Proceedings of the 2014 IEEE International Symposium on Information Theory, Honolulu, HI, USA, 29 June–4 July 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 2479–2483.
46. Li, H.; Chong, E.K. On a connection between information and group lattices. *Entropy* **2011**, 13, 683–708.
47. Gács, P.; Körner, J. Common information is far less than mutual information. *Probl. Control Inf. Theory* **1973**, 2, 149–162.
48. Aumann, R.J. Agreeing to disagree. *Ann. Stat.* **1976**, 4, 1236–1239.
49. Banerjee, P.K.; Griffith, V. Synergy, Redundancy and Common Information. *arXiv* **2015**, arXiv:1509.03706v1.
50. Hexner, G.; Ho, Y. Information structure: Common and private (Corresp.). *IEEE Trans. Inf. Theory* **1977**, 23, 390–393.
51. Barrett, A.B. Exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems. *Phys. Rev. E* **2015**, 91, 052802.
52. Pluim, J.P.; Maintz, J.A.; Viergever, M.A. F-information measures in medical image registration. *IEEE Trans. Med. Imaging* **2004**, 23, 1508–1516.
53. Banerjee, A.; Merugu, S.; Dhillon, I.S.; Ghosh, J.; Lafferty, J. Clustering with Bregman divergences. *J. Mach. Learn. Res.* **2005**, 6, 1705–1749.
54. Brunel, N.; Nadal, J.P. Mutual information, Fisher information, and population coding. *Neural Comput.* **1998**, 10, 1731–1757.
55. Li, M.; Vitányi, P. *An Introduction to Kolmogorov Complexity and Its Applications*; Springer: Berlin/Heidelberg, Germany, 2008; Volume 3.
56. Shmaya, E. Comparison of information structures and completely positive maps. *J. Phys. A Math. Gen.* **2005**, 38, 9717.
57. Chefles, A. The quantum Blackwell theorem and minimum error state discrimination. *arXiv* **2009**, arXiv:0907.0866.
58. Buscemi, F. Comparison of quantum statistical models: Equivalent conditions for sufficiency. *Commun. Math. Phys.* **2012**, 310, 625–647.
59. Ohya, M.; Watanabe, N. Quantum entropy and its applications to quantum communication and statistical physics. *Entropy* **2010**, 12, 1194–1245.
60. Rauh, J.; Banerjee, P.K.; Olbrich, E.; Jost, J.; Bertschinger, N.; Wolpert, D. Coarse-Graining and the Blackwell Order. *Entropy* **2017**, 19, 527.
61. Cover, T.M.; Thomas, J.A. *Elements of Information Theory*; John Wiley & Sons: Hoboken, NJ, USA, 2006.
62. Makur, A.; Polyanskiy, Y. Comparison of channels: Criteria for domination by a symmetric channel. *IEEE Trans. Inf. Theory* **2018**, 64, 5704–5725.
63. Benson, H.P. Concave minimization: Theory, applications and algorithms. In *Handbook of Global Optimization*; Springer: Berlin/Heidelberg, Germany, 1995; pp. 43–148.
64. Kolchinsky, A. Code for Computing I∩≺. 2022. Available online: https://github.com/artemyk/redundancy (accessed on 3 January 2022).
65. Banerjee, P.K.; Rauh, J.; Montúfar, G. Computing the unique information. In Proceedings of the 2018 IEEE International Symposium on Information Theory (ISIT), Vail, CO, USA, 17–22 June 2018; pp. 141–145.
66. Banerjee, P.K.; Olbrich, E.; Jost, J.; Rauh, J. Unique informations and deficiencies. In Proceedings of the 2018 56th Annual Allerton Conference on Communication, Control, and Computing (Allerton), Monticello, IL, USA, 2–5 October 2018; pp. 32–38.
67. Wolf, S.; Wultschleger, J. Zero-error information and applications in cryptography. In Proceedings of the Information Theory Workshop, San Antonio, TX, USA, 24–29 October 2004; pp. 1–6.
68. Bertschinger, N.; Rauh, J.; Olbrich, E.; Jost, J. Shared information—new insights and problems in decomposing information in complex systems. In Proceedings of the European Conference on Complex Systems 2012; Springer: Berlin/Heidelberg, Germany, 2013; pp. 251–269.
69. James, R.G.; Ellison, C.J.; Crutchfield, J.P. dit: A Python package for discrete information theory. *J. Open Source Softw.* **2018**, 3, 738.
70. Kovačević, M.; Stanojević, I.; Šenk, V. On the entropy of couplings. *Inf. Comput.* **2015**, 242, 369–382.
71. Horst, R. On the global minimization of concave functions. *Oper.-Res.-Spektrum* **1984**, 6, 195–205.
72. Pardalos, P.M.; Rosen, J.B. Methods for global concave minimization: A bibliographic survey. *SIAM Rev.* **1986**, 28, 367–379.
73. Williams, P.L.; Beer, R.D. Generalized measures of information transfer. *arXiv* **2011**, arXiv:1102.1507.
74. Dubins, L.E. On extreme points of convex sets. *J. Math. Anal. Appl.* **1962**, 5, 237–244.
75. Yeung, R.W. *A First Course in Information Theory*; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2012.
76. Lewis, A.D. Semicontinuity of Rank and Nullity and Some Consequences. 2009. Available online: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.709.7290&rep=rep1&type=pdf (accessed on 3 January 2022).
77. Hoffman, A.J. On Approximate Solutions of Systems of Linear Inequalities. *J. Res. Natl. Bur. Stand.* **1952**, 49, 174–176.
78. Daniel, J.W. On Perturbations in Systems of Linear Inequalities. *SIAM J. Numer. Anal.* **1973**, 10, 299–307.

**Figure 1.** Partial information decomposition of the information provided by two sources about a target. On the left, we show the decomposition induced by redundancy ${I}_{\cap}$, which leads to measures of unique information U. On the right, we show the decomposition induced by union information ${I}_{\cup}$, which leads to measures of synergy S and excluded information E.

**Figure 2.** Illustration of Theorem 5, which provides a sufficient condition for the local continuity of $I_{\cap}^{\prec}$. Consider two scenarios, both of which involve two sources $X_1$ and $X_2$ and a target $Y$ with cardinality $|\mathcal{Y}| = 3$. The blue areas indicate the simplex of probability distributions over $\mathcal{Y}$, with the marginal $P_Y$ and the pairwise conditionals $P_{Y|X_i = x_i}$ marked. On the left, both sources have $\mathrm{rank}\, P_{Y|X_i} = 3 = |\mathcal{Y}|$, so $I_{\cap}^{\prec}$ is locally continuous. On the right, both sources have $\mathrm{rank}\, P_{Y|X_i} = 2 < |\mathcal{Y}|$, so $I_{\cap}^{\prec}$ is not necessarily locally continuous. Note that $I_{\cap}^{\prec}$ is also continuous if only one source has $\mathrm{rank}\, P_{Y|X_i} = 3$.

**Table 1.** Comparison of different redundancy measures. Question marks (?) indicate properties that we could not easily establish.

| | $I_{\cap}^{\prec}$ | $I_{\cap}^{\mathrm{WB}}$ | $I_{\cap}^{\mathrm{MMI}}$ | $I_{\cap}^{\wedge}$ | $I_{\cap}^{\mathrm{GH}}$ | $I_{\cap}^{\mathrm{Ince}}$ | $I_{\cap}^{\mathrm{FL}}$ | $I_{\cap}^{\mathrm{BROJA}}$ | $I_{\cap}^{\mathrm{Harder}}$ | $I_{\cap}^{\mathrm{dep}}$ |
|---|---|---|---|---|---|---|---|---|---|---|
| More than 2 sources | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | | | |
| Monotonicity | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | | |
| IEP for bivariate case | ✓ | ✓ | ? | ? | ✓ | ✓ | ✓ | | | |
| Independent identity | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | | | |
| Blackwell property | ✓ | ✓ | ✓ | | | | | | | |
| Pairwise marginals | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | | | | |
| Target equality | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | | | | |

| Target | $I_{\cap}^{\prec}$ | $I_{\cap}^{\mathrm{WB}}$ | $I_{\cap}^{\mathrm{MMI}}$ | $I_{\cap}^{\wedge}$ | $I_{\cap}^{\mathrm{GH}}$ | $I_{\cap}^{\mathrm{Ince}}$ | $I_{\cap}^{\mathrm{FL}}$ | $I_{\cap}^{\mathrm{BROJA}}$, $I_{\cap}^{\mathrm{Harder}}$ | $I_{\cap}^{\mathrm{dep}}$ |
|---|---|---|---|---|---|---|---|---|---|
| $Y = X_1 \text{ AND } X_2$ | 0.311 | 0.311 | 0.311 | 0 | 0.123 | 0.104 | 0.561 | 0.311 | 0.082 |
| $Y = X_1 + X_2$ | 0.5 | 0.5 | 0.5 | 0 | 0 | 0 | 0.5 | 0.5 | 0.189 |
| $Y = X_1$ | $I(X_1;X_2)$ | $I(X_1;X_2)$ | $I(X_1;X_2)$ | $C(X_1 \wedge X_2)$ | $I(X_1;X_2)$ | ∗ | 1 | $I(X_1;X_2)$ | $I(X_1;X_2)$ |
| $Y = (X_1, X_2)$ | $C(X_1 \wedge X_2)$ | 1 | 1 | $C(X_1 \wedge X_2)$ | $C(X_1 \wedge X_2)$ | ∗ | 1 | $I(X_1;X_2)$ | $I(X_1;X_2)$ |

| Target | $I_{\cap}^{\prec}$ | $I_{\cap}^{\mathrm{WB}}$ | $I_{\cap}^{\mathrm{MMI}}$ | $I_{\cap}^{\wedge}$ | $I_{\cap}^{\mathrm{Ince}}$ | $I_{\cap}^{\mathrm{FL}}$ |
|---|---|---|---|---|---|---|
| $Y = X_1 \text{ AND } X_2 \text{ AND } X_3$ | 0.138 | 0.138 | 0.138 | 0 | 0.024 | 0.294 |
| $Y = X_1 + X_2 + X_3$ | 0.311 | 0.311 | 0.311 | 0 | 0 | 0.561 |
| $Y = ((A,B),(A,C),(A,D))$ | 1 | 2 | 2 | 1 | 1 | 2 |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2022 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Kolchinsky, A.
A Novel Approach to the Partial Information Decomposition. *Entropy* **2022**, *24*, 403.
https://doi.org/10.3390/e24030403
