# Dual Loomis-Whitney Inequalities via Information Theory

## Abstract


## 1. Introduction

- Volume lower bounds: In Theorem 3, we establish a new lower bound on the volume of a compact convex set in terms of the size of its slices. Just as Ball [12] extended the Loomis-Whitney inequality to projections in more general subspaces, our inequality also allows for slices parallel to subspaces that are not necessarily ${e}_{i}^{\perp}$. Another distinguishing feature of this bound is that unlike classical dual Loomis-Whitney inequalities, the lower bound is in terms of maximal slices; that is, the largest slice parallel to a given subspace. The key ideas we use are the Brascamp-Lieb inequality and entropy bounds for log-concave random variables.
- Surface area lower bounds: Theorem 7 contains our main result, which provides lower bounds on surface areas. Unlike the volume bounds, the surface area bounds are valid for the larger class of polyconvex sets, which consists of finite unions of compact, convex sets. Moreover, the surface area lower bound is not simply in terms of the maximal slice; instead, it uses all available slices along a particular hyperplane. As in the volume bounds, the slices used may be parallel to general $(n-1)$-dimensional subspaces and not just ${e}_{i}^{\perp}$. The key idea is motivated by a superadditivity property of Fisher information established in Carlen [24]. Instead of classical Fisher information, we develop superadditivity properties for a new notion of Fisher information, which we call the ${L}_{1}$-Fisher information. This superadditivity property, when restricted to uniform distributions over convex bodies, yields the lower bound in Theorem 7.
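For orientation, the classical Loomis-Whitney inequality [9] that these results dualize bounds volume from above by coordinate projections; the dual problem studied here replaces projections with slices and reverses the direction of the bound:

```latex
% Classical Loomis-Whitney inequality: projections control volume from above.
V_n(K)^{\,n-1} \;\le\; \prod_{i=1}^{n} V_{n-1}\!\big(P_{e_i^{\perp}} K\big).
% A dual inequality instead bounds V_n(K)^{n-1} from below using the
% (n-1)-dimensional slices K \cap (t e_i + e_i^{\perp}) in place of projections.
```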

**Notation:** For $n\ge 1$, let $[n]$ denote the set $\{1,2,\dots ,n\}$. For $K\subseteq {\mathbb{R}}^{n}$ and any subspace $E\subseteq {\mathbb{R}}^{n}$, the orthogonal projection of K on E is denoted by ${P}_{E}K$. The standard basis vectors in ${\mathbb{R}}^{n}$ are denoted by $\{{e}_{1},{e}_{2},\dots ,{e}_{n}\}$. We use the notation ${V}_{r}$ to denote the volume functional in ${\mathbb{R}}^{r}$. The boundary of K is denoted by $\partial K$ and its surface area is denoted by ${V}_{n-1}(\partial K)$. For a random variable X taking values in ${\mathbb{R}}^{n}$, the marginal of X along a subspace E is denoted by ${P}_{E}X$. In this paper, we shall consider random variables with bounded variances and whose densities lie in the convex set $\{f \mid {\int}_{{\mathbb{R}}^{n}}f(x)\log (1+f(x))\,dx<\infty \}$. The differential entropy of such random variables is well-defined and is given by $h(X)=-{\int}_{{\mathbb{R}}^{n}}f(x)\log f(x)\,dx$, where f denotes the density of X.
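A concrete instance of this definition underlies the volume bounds: if X is uniform on a compact set K, then $f=1/{V}_{n}(K)$ on K, so $h(X)=\log {V}_{n}(K)$, which is what lets entropy inequalities translate into volume inequalities. A minimal numerical sketch (the example square and grid size are arbitrary choices):

```python
import math

# For X uniform on K, the density is f = 1/V_n(K) on K, so
#   h(X) = -integral_K (1/V) log(1/V) dx = log V_n(K).
def entropy_uniform(volume):
    return math.log(volume)

# Riemann-sum check on the square K = [0, 2]^2 (arbitrary example; V_2(K) = 4).
V = 4.0
f = 1.0 / V                 # constant density on K
n_cells = 200 * 200         # grid cells covering K
cell_area = V / n_cells
h_numeric = sum(-f * math.log(f) * cell_area for _ in range(n_cells))
assert abs(h_numeric - entropy_uniform(V)) < 1e-9
```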

## 2. Volume Bounds

#### 2.1. Background on the Brascamp-Lieb Inequality

**Theorem 1.**

**Theorem 2.** Let K be a closed and bounded set in ${\mathbb{R}}^{n}$. Let ${E}_{i}$ and ${c}_{i}$ for $i\in [m]$ and ${M}_{g}$ be as in Theorem 1. Let ${P}_{{E}_{i}}K$ be the projection of K onto the subspace ${E}_{i}$, and let the dimension of ${E}_{i}$ be ${r}_{i}$, for $i\in [m]$. Then the volume of K may be upper-bounded as follows:
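As a quick sanity check of this projection bound in its classical Loomis-Whitney form: for an axis-aligned box it holds with equality. The side lengths below are arbitrary.

```python
# For an axis-aligned box K in R^3, classical Loomis-Whitney is tight:
#   V_3(K)^{3-1} = product of the three coordinate-plane projection areas.
a, b, c = 2.0, 3.0, 5.0
volume = a * b * c
projections = [a * b, b * c, c * a]   # areas of the projections onto e_i-perp
lhs = volume ** (3 - 1)
rhs = projections[0] * projections[1] * projections[2]
assert abs(lhs - rhs) < 1e-9          # equality for boxes; "<=" for general K
```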

**Proof.**

#### 2.2. Volume Bounds Using Slices

**Theorem 3.**

**Proof.**

## 3. Surface Area Bounds

#### 3.1. Superadditivity of Fisher Information

**Theorem 4.** For $p\in [1,\infty )$, let $f:{\mathbb{R}}^{m}\times {\mathbb{R}}^{n}\to \mathbb{R}$ be a function in ${L}^{p}({\mathbb{R}}^{m})\otimes {W}^{1,p}({\mathbb{R}}^{n})$. Define the marginal map M as

**Definition 1.**

**Lemma 1.**

**Proof.**

**Theorem 5.**

**Proof.**

#### 3.2. Surface Integral Form of the ${L}_{1}$-Fisher Information

- (a) The ${L}_{1}$-Fisher information ${I}_{1}(X)$ is well-defined for X and is given by a surface integral over $\partial K$; and
- (b) The quantity ${I}_{1}{(X)}_{{e}_{i}}$ may be calculated exactly given the sizes of all slices parallel to ${e}_{i}^{\perp}$, or may be lower-bounded using any finite number of slices parallel to ${e}_{i}^{\perp}$.
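To illustrate the mechanism behind claim (b) in spirit: if the coordinate ${L}_{1}$-Fisher information coincides with the total variation of the corresponding marginal density (as the surface-integral form suggests for well-behaved K; this identification is our reading, not a statement quoted from the text), then any finite set of slice sizes yields a lower bound, since variation over finitely many points never exceeds the full total variation. A hypothetical sketch with an invented triangular marginal:

```python
# Hypothetical illustration: variation of a density sampled at finitely many
# slice positions lower-bounds its total variation, so finitely many slices
# always give a valid lower bound.
def tv_lower_bound(f, points):
    """Total-variation lower bound from finitely many sample points."""
    vals = [f(t) for t in points]
    return sum(abs(vals[j + 1] - vals[j]) for j in range(len(vals) - 1))

# Triangular marginal density on [-1, 1]; its total variation is exactly 2.
tri = lambda t: max(0.0, 1.0 - abs(t))

coarse = tv_lower_bound(tri, [-2.0, 0.0, 2.0])
fine = tv_lower_bound(tri, [-2.0 + 0.1 * k for k in range(41)])
assert coarse <= 2.0 + 1e-9 and fine <= 2.0 + 1e-9  # never exceeds TV = 2
```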

**Definition 2.**

**Definition 3.**

**Theorem 6.**

**Proof of Theorem 6.**

**Lemma 2.** Let X be uniformly distributed over a compact measurable set $K\subseteq {\mathbb{R}}^{n}$. If there exists an integer L such that the intersection of K with any straight line can be divided into at most L disjoint closed intervals, then

**Lemma 3.** Let X be uniform over a polytope $K\in \mathcal{P}$. Then

#### 3.3. ${L}_{1}$-Fisher Information via Slices

**Lemma 4.** Let X be a continuous real-valued random variable with density ${f}_{X}$. Suppose we can find $-\infty ={a}_{0}<{a}_{1}<\dots <{a}_{M+1}=\infty $ such that (a) ${f}_{X}$ is continuous and monotonic on each open interval $({a}_{i},{a}_{i+1})$; and (b) for $i=0,\dots ,M$, the limits

**Corollary 1.**

**Corollary 2.**

**Corollary 3.**

**Proof.**

**Remark 1.**

#### 3.4. Procedure to Obtain Lower Bounds on the Surface Area

**Lemma 5.** Suppose $X=({X}_{1},\dots ,{X}_{n})$ is uniformly distributed over a polytope $K\in \mathcal{P}$. Let u be any unit vector and let ${f}_{X\cdot u}$ be the marginal density of $X\cdot u$. Then ${f}_{X\cdot u}(\cdot )$ satisfies the conditions in Lemma 4.

**Theorem 7.**

**Proof.**

**Lemma 6.** Let ${K}_{1},\dots ,{K}_{m}\subseteq {\mathbb{R}}^{n}$ be compact sets. For each $i\in [m]$, let ${\{{K}_{i}^{k}\}}_{k\ge 1}$ be a sequence of compact approximations converging to ${K}_{i}$ in Hausdorff distance, such that ${K}_{i}\subseteq {K}_{i}^{k}$ for all $k\ge 1$. Then it holds that

## 4. Conclusions

## Author Contributions

## Funding

## Conflicts of Interest

## Appendix A. Proof of Lemma 2

- 1. If the support of ${f}_{k}$ has length at least $\epsilon$, then$$\begin{array}{c}\hfill {\int}_{\mathbb{R}}\frac{|{f}_{k}({x}_{1})-{f}_{k}({x}_{1}-\epsilon)|}{\epsilon}\,d{x}_{1}=2\cdot \frac{1}{{V}_{n}(K)}\cdot \frac{\epsilon}{\epsilon}=\frac{2}{{V}_{n}(K)}.\end{array}$$

- 2. If the support of ${f}_{k}$ has length ${\epsilon}^{\prime}<\epsilon$, then$$\begin{array}{c}\hfill {\int}_{\mathbb{R}}\frac{|{f}_{k}({x}_{1})-{f}_{k}({x}_{1}-\epsilon)|}{\epsilon}\,d{x}_{1}=2\cdot \frac{1}{{V}_{n}(K)}\cdot \frac{{\epsilon}^{\prime}}{\epsilon}\le \frac{2}{{V}_{n}(K)}.\end{array}$$
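Both cases can be verified numerically: ${f}_{k}$ equals $1/{V}_{n}(K)$ on an interval of length s and is zero elsewhere, so the shifted-difference integral evaluates to $(2/{V}_{n}(K))\cdot \min (s,\epsilon)/\epsilon$. The values of V and $\epsilon$ below are arbitrary choices for the sketch.

```python
# Midpoint-rule check of the two cases in the proof sketch above.
V = 3.0      # stands in for V_n(K)
eps = 0.5

def shifted_diff_integral(s, grid=60000, lo=-2.0, hi=4.0):
    """Approximate integral of |f_k(x) - f_k(x - eps)| / eps over R,
    where f_k = 1/V on [0, s] and 0 elsewhere."""
    dx = (hi - lo) / grid
    f = lambda x: 1.0 / V if 0.0 <= x <= s else 0.0
    return sum(abs(f(lo + (i + 0.5) * dx) - f(lo + (i + 0.5) * dx - eps)) / eps * dx
               for i in range(grid))

# Case 1: support length s >= eps, so the integral equals 2/V.
assert abs(shifted_diff_integral(1.0) - 2.0 / V) < 1e-2
# Case 2: support length s' < eps, so the integral is 2 s'/(eps V) <= 2/V.
assert abs(shifted_diff_integral(0.2) - 2.0 * 0.2 / (eps * V)) < 1e-2
```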

## Appendix B. Proof of Lemma 3

## Appendix C. Proof of Lemma 4

## Appendix D. Proof of Lemma 5

**Remark**

**A1.**

## Appendix E. Example

## References

1. Gardner, R. Geometric Tomography; Cambridge University Press: Cambridge, UK, 1995; Volume 58.
2. Campi, S.; Gronchi, P. Estimates of Loomis–Whitney type for intrinsic volumes. Adv. Appl. Math. 2011, 47, 545–561.
3. Wulfsohn, D.; Gundersen, H.; Jensen, V.; Nyengaard, J. Volume estimation from projections. J. Microsc. 2004, 215, 111–120.
4. Wulfsohn, D.; Nyengaard, J.; Gundersen, H.; Jensen, V. Stereology for Biosystems Engineering. Available online: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.497.404&rep=rep1&type=pdf (accessed on 16 August 2019).
5. Shepherd, T.; Rankin, A.; Alderton, D. A Practical Guide to Fluid Inclusion Studies; Blackie Academic & Professional: Los Angeles, CA, USA, 1985.
6. Bakker, R.; Diamond, L. Estimation of volume fractions of liquid and vapor phases in fluid inclusions, and definition of inclusion shapes. Am. Mineral. 2006, 91, 635–657.
7. Connelly, R.; Ostro, S. Ellipsoids and lightcurves. Geometriae Dedicata 1984, 17, 87–98.
8. Ostro, S.; Connelly, R. Convex profiles from asteroid lightcurves. Icarus 1984, 57, 443–463.
9. Loomis, L.; Whitney, H. An inequality related to the isoperimetric inequality. Bull. Am. Math. Soc. 1949, 55, 961–962.
10. Burago, Y.; Zalgaller, V. Geometric Inequalities; Springer Science & Business Media: Berlin, Germany, 2013; Volume 285.
11. Bollobás, B.; Thomason, A. Projections of bodies and hereditary properties of hypergraphs. Bull. Lond. Math. Soc. 1995, 27, 417–424.
12. Ball, K. Shadows of convex bodies. Trans. Am. Math. Soc. 1991, 327, 891–901.
13. Ball, K. Convex geometry and functional analysis. Handb. Geom. Banach Spaces 2001, 1, 161–194.
14. Bennett, J.; Carbery, A.; Christ, M.; Tao, T. The Brascamp–Lieb inequalities: Finiteness, structure and extremals. Geom. Funct. Anal. 2008, 17, 1343–1415.
15. Balister, P.; Bollobás, B. Projections, entropy and sumsets. Combinatorica 2012, 32, 125–141.
16. Gyarmati, K.; Matolcsi, M.; Ruzsa, I. A superadditivity and submultiplicativity property for cardinalities of sumsets. Combinatorica 2010, 30, 163–174.
17. Madiman, M.; Tetali, P. Information inequalities for joint distributions, with interpretations and applications. IEEE Trans. Inf. Theory 2010, 56, 2699–2713.
18. Betke, U.; McMullen, P. Estimating the sizes of convex bodies from projections. J. Lond. Math. Soc. 1983, 2, 525–538.
19. Schneider, R. Convex Bodies: The Brunn–Minkowski Theory; Cambridge University Press: Cambridge, UK, 2014; Number 151.
20. Meyer, M. A volume inequality concerning sections of convex sets. Bull. Lond. Math. Soc. 1988, 20, 151–155.
21. Li, A.J.; Huang, Q. The dual Loomis–Whitney inequality. Bull. Lond. Math. Soc. 2016, 48, 676–690.
22. Liakopoulos, D.M. Reverse Brascamp–Lieb inequality and the dual Bollobás–Thomason inequality. Arch. Math. 2019, 112, 293–304.
23. Campi, S.; Gardner, R.; Gronchi, P. Reverse and dual Loomis–Whitney-type inequalities. Trans. Am. Math. Soc. 2016, 368, 5093–5124.
24. Carlen, E. Superadditivity of Fisher's information and logarithmic Sobolev inequalities. J. Funct. Anal. 1991, 101, 194–211.
25. Carlen, E.; Cordero-Erausquin, D. Subadditivity of the entropy and its relation to Brascamp–Lieb type inequalities. Geom. Funct. Anal. 2009, 19, 373–405.
26. Beckenbach, E.; Bellman, R. Inequalities; Springer Science & Business Media: Berlin, Germany, 2012; Volume 30.
27. Saumard, A.; Wellner, J. Log-concavity and strong log-concavity: A review. Stat. Surv. 2014, 8, 45.
28. Bobkov, S.; Madiman, M. The entropy per coordinate of a random vector is highly constrained under convexity conditions. IEEE Trans. Inf. Theory 2011, 57, 4940–4954.
29. Cover, T.M.; Thomas, J. Elements of Information Theory; John Wiley & Sons: New York, NY, USA, 2012.
30. Jog, V.; Anantharam, V. Intrinsic entropies of log-concave distributions. IEEE Trans. Inf. Theory 2017, 64, 93–108.
31. Dembo, A.; Cover, T.; Thomas, J. Information theoretic inequalities. IEEE Trans. Inf. Theory 1991, 37, 1501–1518.
32. Madiman, M.; Barron, A. Generalized entropy power inequalities and monotonicity properties of information. IEEE Trans. Inf. Theory 2007, 53, 2317–2329.
33. Ball, K. Volumes of sections of cubes and related problems. In Geometric Aspects of Functional Analysis; Springer: Berlin, Germany, 1989; pp. 251–260.
34. Sason, I.; Verdú, S. f-divergence inequalities. IEEE Trans. Inf. Theory 2016, 62, 5973–6006.
35. Klain, D.A.; Rota, G.C. Introduction to Geometric Probability; Cambridge University Press: Cambridge, UK, 1997.
36. Meschenmoser, D.; Spodarev, E. On the computation of intrinsic volumes. Preprint, 2012.
37. Lasserre, J. Volume of slices and sections of the simplex in closed form. Optim. Lett. 2015, 9, 1263–1269.

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Hao, J.; Jog, V.
Dual Loomis-Whitney Inequalities via Information Theory. *Entropy* **2019**, *21*, 809.
https://doi.org/10.3390/e21080809
