Quantifying Unique Information
Abstract
We propose new measures of shared information, unique information and synergistic information that can be used to decompose the mutual information of a pair of random variables (Y, Z) with a third random variable X. Our measures are motivated by an operational idea of unique information, which suggests that shared information and unique information should depend only on the marginal distributions of the pairs (X, Y) and (X, Z). Although this invariance property has not been studied before, it is satisfied by other proposed measures of shared information. The invariance property does not uniquely determine our new measures, but it implies that the functions that we define are bounds to any other measures satisfying the same invariance property. We study properties of our measures and compare them to other candidate measures.
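The decomposition described in the abstract splits I(X; (Y, Z)) into shared, unique and synergistic parts. A classic motivating case is the XOR gate, where the joint pair (Y, Z) determines X perfectly while neither variable alone carries any information, so the mutual information is purely synergistic. The sketch below illustrates this with a brute-force mutual-information computation over the joint distribution; it is an illustrative example only, not an implementation of the optimization-based measures proposed in the paper, and the helper function names are my own.

```python
from itertools import product
import math

def mutual_information(joint, x_of, a_of):
    """Compute I(X; A) in bits from a joint distribution.

    joint: dict mapping outcomes to probabilities.
    x_of, a_of: functions extracting the X and A values from an outcome.
    """
    px, pa, pxa = {}, {}, {}
    for outcome, p in joint.items():
        x, a = x_of(outcome), a_of(outcome)
        px[x] = px.get(x, 0.0) + p
        pa[a] = pa.get(a, 0.0) + p
        pxa[(x, a)] = pxa.get((x, a), 0.0) + p
    return sum(p * math.log2(p / (px[x] * pa[a]))
               for (x, a), p in pxa.items() if p > 0)

# XOR example: Y, Z independent uniform bits, X = Y xor Z.
# Outcomes are triples (x, y, z), each with probability 1/4.
joint = {(y ^ z, y, z): 0.25 for y, z in product((0, 1), repeat=2)}

i_xy = mutual_information(joint, lambda o: o[0], lambda o: o[1])        # 0 bits
i_xz = mutual_information(joint, lambda o: o[0], lambda o: o[2])        # 0 bits
i_xyz = mutual_information(joint, lambda o: o[0], lambda o: (o[1], o[2]))  # 1 bit
```

Here I(X; Y) = I(X; Z) = 0 while I(X; (Y, Z)) = 1 bit, so under any decomposition of the kind the paper studies, the whole bit must be attributed to synergy.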
Cite This Article
Bertschinger, N.; Rauh, J.; Olbrich, E.; Jost, J.; Ay, N. Quantifying Unique Information. Entropy 2014, 16(4), 2161-2183.