The Gromov–Wasserstein Distance: A Brief Overview

We recall the construction of the Gromov–Wasserstein distance and concentrate on quantitative aspects of the definition.


Introduction
Modeling datasets as metric spaces seems natural for some applications, and concepts revolving around the Gromov–Hausdorff distance, a notion of distance between compact metric spaces, provide a useful language for expressing properties of data and shape analysis methods. In many situations, however, this is not enough, and one must incorporate other sources of information into the model, with "weights" attached to each point being one of them. This gives rise to the idea of representing data as metric measure spaces, which are metric spaces endowed with a probability measure. In terms of a distance, the Gromov–Hausdorff metric is replaced with the Gromov–Wasserstein metric.

Notation and Background Concepts
The book by Burago et al. [1] is a valuable source for many concepts in metric geometry. We refer the reader to that book for any concepts not explicitly defined in these notes.
We let M denote the collection of all compact metric spaces and M_iso the collection of all isometry classes of elements of M.
Recall that for a given metric space (X, d_X) ∈ M, its diameter is defined as diam(X) := max_{x,x'∈X} d_X(x, x'). Similarly, the radius of X is defined as rad(X) := min_{x∈X} max_{x'∈X} d_X(x, x').
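For finite metric spaces, both quantities reduce to elementary max/min computations over the distance matrix. A minimal sketch (the four-point "star" space below is a made-up example: one center at distance one from three leaves, leaves at mutual distance two):

```python
import numpy as np

# Hypothetical 4-point metric space given by its symmetric distance matrix
# (point 0 is the "center"; points 1-3 are the "leaves").
D = np.array([
    [0.0, 1.0, 1.0, 1.0],
    [1.0, 0.0, 2.0, 2.0],
    [1.0, 2.0, 0.0, 2.0],
    [1.0, 2.0, 2.0, 0.0],
])

diam = D.max()              # diam(X) = max_{x,x'} d_X(x, x')
rad = D.max(axis=1).min()   # rad(X)  = min_x max_{x'} d_X(x, x')

print(diam, rad)  # 2.0 1.0
```

The radius is attained at the center point, which is why rad(X) < diam(X) here.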
For a fixed metric space (Z, d_Z), we let d_H^Z denote the Hausdorff distance between (closed) subsets of Z.
We will often refer to a metric space (X, d_X) by only X, but the notation for the underlying metric will be implicitly understood to be d_X. Recall that a map ϕ : X → Y between metric spaces is an isometric embedding if d_Y(ϕ(x), ϕ(x')) = d_X(x, x') for all x, x' ∈ X. The map ϕ is an isometry if it is a surjective isometric embedding.
Recall that given measurable spaces (X, Σ_X) and (Y, Σ_Y), a measure µ on (X, Σ_X) and a measurable map f : X → Y, the push-forward measure f_#µ is the measure on (Y, Σ_Y) given by f_#µ(A) := µ(f^{-1}(A)) for all A ∈ Σ_Y. A metric measure space (mm-space for short) is a triple (X, d_X, µ_X) where (X, d_X) is a compact metric space and µ_X is a Borel probability measure with full support: supp(µ_X) = X. We denote by M_w the collection of all mm-spaces. An isomorphism between X, Y ∈ M_w is any isometry ϕ : X → Y such that ϕ_# µ_X = µ_Y.

The Gromov-Hausdorff Distance
One says that a subset R ⊂ X × Y is a correspondence between sets X and Y whenever π_1(R) = X and π_2(R) = Y, where π_1 : X × Y → X and π_2 : X × Y → Y are the canonical projections. Let R(X, Y) denote the set of all correspondences between X and Y.
The Gromov–Hausdorff (GH) distance between compact metric spaces (X, d_X) and (Y, d_Y) is defined as:

    d_GH(X, Y) := (1/2) inf_R sup_{(x,y),(x',y') ∈ R} |d_X(x, x') − d_Y(y, y')|,    (1)

where R ranges over R(X, Y).
Example 1. The GH distance between any compact metric space X and the space with exactly one point is equal to (1/2) diam(X).
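Since a correspondence between finite sets is simply a subset of X × Y with full projections, the GH distance between very small finite metric spaces can be evaluated by brute force directly from the definition. A sketch (the enumeration is exponential in |X||Y|, so this is only for toy inputs; the function name is ours):

```python
import itertools
import numpy as np

def gh_distance(DX, DY):
    """Brute-force Gromov-Hausdorff distance between finite metric spaces,
    given as distance matrices, by enumerating all correspondences R."""
    nx, ny = len(DX), len(DY)
    pairs = list(itertools.product(range(nx), range(ny)))
    best = float("inf")
    for mask in range(1, 2 ** len(pairs)):
        R = [p for k, p in enumerate(pairs) if (mask >> k) & 1]
        # R must project onto all of X and all of Y to be a correspondence
        if {i for i, _ in R} != set(range(nx)) or {j for _, j in R} != set(range(ny)):
            continue
        # distortion of R: worst metric discrepancy over pairs of matched points
        dis = max(abs(DX[i][k] - DY[j][l]) for (i, j) in R for (k, l) in R)
        best = min(best, dis)
    return best / 2.0

# Example 1 in the two-point case: distance to the one-point space is diam/2
DX = np.array([[0.0, 1.0], [1.0, 0.0]])   # diam = 1
Dpt = np.array([[0.0]])
print(gh_distance(DX, Dpt))  # 0.5
```

The only correspondence with the one-point space matches every point of X to the single point, so the distortion equals diam(X), recovering Example 1.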
It turns out that (M, d_GH) is a nice space in that it has many compact subclasses.
Theorem 1 ([1]). Let N : [0, +∞) → N be a bounded function and D > 0. Let F(N, D) ⊂ M be any family of compact metric spaces such that diam(X) ≤ D for all X ∈ F(N, D), and such that for any ε > 0, any X ∈ F(N, D) admits an ε-net with at most N(ε) elements. Then, F(N, D) is pre-compact in the Gromov–Hausdorff topology.
Example 2. An important example of such families is given by the class M(n, κ, D) of closed n-dimensional Riemannian manifolds (X, g_X) with diameter bounded above by D > 0 and Ricci curvature bounded below by κ.
It then follows from the results above that classes F(N, D) such as the above are totally bounded for the Gromov–Hausdorff distance. This means that such classes are easy to organize in the sense of clustering or databases.
In many practical applications, one would like to take into account "weights" attached to points in a dataset. For example, the two metric spaces with the weights below (two points at unit distance, with weights (1/2, 1/2) and (3/4, 1/4), respectively) are isometric, but not isomorphic in the sense that no isometry respects the weights. The idea is that weights represent how much we trust a given "measurement" in practical applications. This leads to considering a more general collection of datasets and, in turn, an adapted notion of equality and a compatible metric over them. This naturally leads to regarding datasets as mm-spaces and then finding a notion of distance on M_w compatible with isomorphism of mm-spaces.

A Metric on M w
Let (X, d_X, µ_X) and (Y, d_Y, µ_Y) be two given mm-spaces. In our path to defining a distance between mm-spaces, we emulate the construction of the Gromov–Hausdorff distance and start by identifying a notion of correspondence between mm-spaces.
A probability measure µ over X × Y is called a coupling between µ_X and µ_Y if (π_1)_#µ = µ_X and (π_2)_#µ = µ_Y. We denote by U(µ_X, µ_Y) the collection of all couplings between µ_X and µ_Y.
Example 3. When Y = {p}, µ_Y = δ_p, and thus there is a unique coupling between X and Y: µ = µ_X ⊗ δ_p.

Example 4. Consider, for example, the spaces with two points each that we depicted above. In that case, µ_X can be identified with the vector (1/2, 1/2) and µ_Y with the vector (3/4, 1/4). In this case, one sees that the matrix:

    µ = ( 1/2  0
          1/4  1/4 )

induces a valid coupling. Now, given p ≥ 1, consider the function (x, y, x', y') ↦ |d_X(x, x') − d_Y(y, y')|^p and pick any µ ∈ U(µ_X, µ_Y). One then integrates this function against the measure µ ⊗ µ and infimizes over the choice of µ ∈ U(µ_X, µ_Y) to define the Gromov–Wasserstein distance of order p [3]:

    d_GW,p(X, Y) := (1/2) inf_{µ ∈ U(µ_X, µ_Y)} ( ∫∫_{X×Y} ∫∫_{X×Y} |d_X(x, x') − d_Y(y, y')|^p dµ(x, y) dµ(x', y') )^{1/p}.

Remark 1. This is an L^p analogue of Equation (1).
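In the finite setting, a coupling is simply a nonnegative matrix with prescribed marginals, and the order-p Gromov–Wasserstein functional is a quadratic form in that matrix. The following sketch checks the marginals of the matrix from Example 4 and evaluates the (un-optimized) cost of that fixed coupling; `gw_cost` is our own helper, not a library routine:

```python
import numpy as np

# Example 4: mu_X = (1/2, 1/2), mu_Y = (3/4, 1/4), candidate coupling mu[i, j].
mu = np.array([[0.50, 0.00],
               [0.25, 0.25]])

assert np.allclose(mu.sum(axis=1), [0.5, 0.5])    # first marginal  = mu_X
assert np.allclose(mu.sum(axis=0), [0.75, 0.25])  # second marginal = mu_Y

def gw_cost(DX, DY, mu, p):
    """Cost of a fixed coupling mu in the order-p GW functional:
    (1/2) * (sum_{i,j,k,l} |DX[i,k] - DY[j,l]|^p mu[i,j] mu[k,l])^(1/p)."""
    gap = np.abs(DX[:, None, :, None] - DY[None, :, None, :]) ** p
    return 0.5 * np.einsum("ijkl,ij,kl->", gap, mu, mu) ** (1.0 / p)

# Two-point spaces, both with interpoint distance one
DX = np.array([[0.0, 1.0], [1.0, 0.0]])
DY = np.array([[0.0, 1.0], [1.0, 0.0]])
print(gw_cost(DX, DY, mu, p=2))  # ~0.3062 for this (non-optimal) coupling
```

Minimizing this cost over all matrices with the given marginals is what the infimum in the definition does; here only the cost of one feasible coupling is evaluated.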

Theorem 3 ([3]). The Gromov–Wasserstein distance of order p ≥ 1 defines a proper distance on the collection of isomorphism classes of mm-spaces.
By standard compactness arguments, one can prove that the infimum above is always attained [3]. Let U_p^opt(X, Y) denote the set of all couplings in U(µ_X, µ_Y) that achieve the minimum. The structure of this set depends not only on µ_X and µ_Y, but also on d_X, d_Y and p.
Example 5. Consider the mm-space with exactly one point: ({*}, (0), δ_*). Then,

    d_GW,p(X, {*}) = (1/2) ( ∫∫_{X×X} d_X(x, x')^p dµ_X(x) dµ_X(x') )^{1/p},

and we define diam_p(X), the p-statistical diameter of X, as twice the right-hand side. Notice that lim_{p→∞} diam_p(X) is equal to the usual diameter of X (as a metric space).
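For a finite mm-space, the p-statistical diameter is a weighted L^p average of the entries of the distance matrix, and its convergence to the ordinary diameter as p grows can be observed directly. A small sketch (the three-point space is a made-up example; `diam_p` is our helper):

```python
import numpy as np

def diam_p(D, mu, p):
    """p-statistical diameter of a finite mm-space:
    (sum_{i,k} D[i,k]^p mu[i] mu[k])^(1/p) = 2 * d_GW,p(X, one-point space)."""
    return float(np.einsum("ik,i,k->", D ** p, mu, mu) ** (1.0 / p))

# Hypothetical three-point space on a line (points at 0, 1, 2), uniform weights
D = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])
mu = np.ones(3) / 3

for p in (1, 2, 10, 100):
    print(p, diam_p(D, mu, p))   # increases toward diam(X) = 2 as p grows
```

Since the underlying measure has full support, the L^p averages increase with p and converge to the essential supremum of d_X, which is diam(X).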

Question 1.
To what extent are we able to replicate the nice properties of (M, d_GH) in the context of (M_w, d_GW,p)? In particular, it is of interest to investigate whether this new space of datasets is complete and whether one can easily identify rich pre-compact classes.
Remark 2. Recall Example 2, where closed n-dimensional Riemannian manifolds were regarded as metric spaces.
One can, all the same, embed closed Riemannian manifolds into M_w via (X, g_X) ↦ (X, d_X, µ_X), where d_X is the geodesic distance induced by the metric tensor g_X and µ_X stands for the normalized volume measure on X. It is well known [4] that for ε > 0 small,

    µ_X(B_ε(x)) = (ω_n ε^n / vol(X)) (1 − (s_X(x) / (6(n+2))) ε^2 + O(ε^4)),

where s_X(x) is the scalar curvature of X at x, ω_n is the volume of the unit ball in R^n, and vol(X) is the total volume of X. Thus, a lower bound on µ_X(B_ε(x)) plays the role of a proxy for an upper bound on curvature.

Completeness
The space M_w with any p-Gromov–Wasserstein distance is not complete. Indeed, consider the following family of mm-spaces: ∆_n ∈ M_w, where ∆_n consists of n ∈ N points at distance one from each other, all with weights 1/n.
The claim will follow from the triangle inequality for d_GW,p and the following claim:

Claim 1. For all n, m ∈ N, d_GW,p(∆_n, ∆_{n·m}) ≤ (1/2) n^{−1/p}.

In order to verify the claim, we denote by {x_1, x_2, ..., x_n} the points of ∆_n and label the points in ∆_{n·m} by {y_11, ..., y_1m, y_21, ..., y_2m, ..., y_n1, ..., y_nm}. Consider the following coupling between µ_n and µ_{n·m}, the reference measures on ∆_n and ∆_{n·m}:

    µ(x_i, y_kj) := (1/(n·m)) δ_ik, for all i, k ∈ {1, ..., n} and j ∈ {1, ..., m}.

It is clear that this defines a valid coupling between µ_n and µ_{n·m}. Now, note that the only pairs charged by µ ⊗ µ on which the distances disagree are those of the form ((x_i, y_ij), (x_i, y_ij')) with j ≠ j', whence

    ∫∫ ∫∫ |d_{∆_n}(x, x') − d_{∆_{n·m}}(y, y')|^p dµ dµ = (m−1)/(n·m) ≤ 1/n,

so d_GW,p(∆_n, ∆_{n·m}) ≤ (1/2)(1/n)^{1/p} and the claim follows.
Claim 1 indicates that {∆_n}_{n∈N} constitutes a Cauchy sequence in M_w. However, a potential limit object for this sequence would have countably infinitely many points at distance one from each other. Such a space is not compact, and hence not in M_w; thus, d_GW,p is not a complete metric.
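The cost of the block-diagonal coupling used in the argument above can be checked numerically for small n and m (assuming the same uniform-weight spaces ∆_n; `gw_cost` is our own helper):

```python
import numpy as np

def delta(n):
    """Distance matrix of Delta_n: n points at mutual distance one."""
    return np.ones((n, n)) - np.eye(n)

def gw_cost(DX, DY, mu, p):
    """(1/2) * (sum |DX[i,k] - DY[j,l]|^p mu[i,j] mu[k,l])^(1/p)."""
    gap = np.abs(DX[:, None, :, None] - DY[None, :, None, :]) ** p
    return 0.5 * np.einsum("ijkl,ij,kl->", gap, mu, mu) ** (1.0 / p)

n, m, p = 3, 4, 2
Dn, Dnm = delta(n), delta(n * m)

# Coupling mu(x_i, y_kj) = delta_ik / (n*m): point i of Delta_n is spread
# uniformly over the i-th block of m points of Delta_{n*m}.
mu = np.zeros((n, n * m))
for i in range(n):
    mu[i, i * m:(i + 1) * m] = 1.0 / (n * m)

cost = gw_cost(Dn, Dnm, mu, p)
bound = 0.5 * (1.0 / n) ** (1.0 / p)
print(cost, bound)  # the cost respects the bound (1/2) n^(-1/p)
```

For n = 3, m = 4, p = 2 the integral equals (m−1)/(n·m) = 1/4, giving a coupling cost of exactly 1/4, comfortably below the bound of roughly 0.289.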

Other Properties: Geodesics and Alexandrov Curvature
Very recently, Sturm [5] pointed out that M_w is a geodesic space when endowed with any d_GW,p, p ≥ 1. This means that given any two spaces X, Y ∈ M_w, there exists a curve in M_w joining them whose length is exactly d_GW,p(X, Y).
Sturm further proved that the completion M̄_w of the space M_w with metric d_GW,2 satisfies the following ([5]): the metric space (M̄_w, d_GW,2) is an Alexandrov space of curvature ≥ 0.
Amongst the consequences of this property is the fact that one can conceive of gradient flows on the space of all mm-spaces [5].

The Metric d GW,p in Applications
Applications of the notion of Gromov–Wasserstein distance arise in shape and data analysis. In shape analysis, the main application is shape matching under invariances. Many easily computable lower bounds for the GW distance have been discussed in [3,6]. All of them lead to solving linear programming optimization problems (for which there are polynomial time algorithms) or can be computed via explicit formulas. As an example, consider the following invariant of an mm-space (X, d_X, µ_X): the distribution of distances

    f_X := (d_X)_# (µ_X ⊗ µ_X),

a probability measure on R_+. This invariant simply encodes the distribution of pairwise distances on the dataset X, and it is defined by analogy with the so-called shape distributions that are well known in computer graphics [7]. Then, one has:

Proposition 2 ([3,6]). Let X, Y ∈ M_w be any two mm-spaces and p ≥ 1. Then,

    d_GW,p(X, Y) ≥ (1/2) W_p(f_X, f_Y),

where W_p denotes the Wasserstein distance of order p between probability measures on R_+. This invariant is also related to the work of Boutin and Kemper [8] and Brinkman and Olver [9].
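For finite mm-spaces and p = 1, a bound of this type reduces to a one-dimensional Wasserstein computation between two weighted histograms of pairwise distances, which SciPy's `wasserstein_distance` handles directly. A sketch on made-up two-point spaces (`distance_distribution` is our helper):

```python
import numpy as np
from scipy.stats import wasserstein_distance

def distance_distribution(D, mu):
    """Push-forward of mu x mu under d_X: all pairwise-distance values
    together with their probability masses."""
    return D.ravel(), np.outer(mu, mu).ravel()

# Two-point spaces: X has distance 1 and uniform weights,
# Y has distance 2 and weights (3/4, 1/4).
DX = np.array([[0.0, 1.0], [1.0, 0.0]]); muX = np.array([0.5, 0.5])
DY = np.array([[0.0, 2.0], [2.0, 0.0]]); muY = np.array([0.75, 0.25])

vX, wX = distance_distribution(DX, muX)
vY, wY = distance_distribution(DY, muY)

# p = 1 instance of the bound: d_GW,1(X, Y) >= (1/2) W_1(f_X, f_Y)
lower_bound = 0.5 * wasserstein_distance(vX, vY, wX, wY)
print(lower_bound)  # 0.25
```

The whole computation is a comparison of one-dimensional cumulative distribution functions, which is why these bounds are so cheap relative to the full quadratic problem.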
Other lower bounds which can be computed in time polynomial in the number of points in the underlying mm-spaces have been reported in [3]. As a primary example, the local shape distributions of shapes provide a lower bound which is strictly stronger than the one in the Proposition above. In more detail, consider for a given mm-space (X, d_X, µ_X) the invariant

    m_X(x) := (d_X(x, ·))_# µ_X,  x ∈ X,

the local distribution of distances at x. Then, for mm-spaces X and Y, consider the cost function c_{X,Y} : X × Y → R_+ given by:

    c_{X,Y}(x, y) := W_p(m_X(x), m_Y(y)).

One then has:

Proposition 3 ([3,6]). Let X, Y ∈ M_w be any two mm-spaces and p ≥ 1. Then,

    d_GW,p(X, Y) ≥ (1/2) inf_µ ( ∫∫_{X×Y} (c_{X,Y}(x, y))^p dµ(x, y) )^{1/p},

where µ ranges in U(µ_X, µ_Y).
Remark 4. Solving for the infimum above leads to a mass transportation problem for which there exist efficient linear programming techniques.
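A sketch of that pipeline for p = 1: build the cost matrix from local distributions with SciPy's one-dimensional `wasserstein_distance`, then solve the transport LP with `linprog`. The two-point spaces and the helper `tlb` are our own illustrations, not the paper's code:

```python
import numpy as np
from scipy.optimize import linprog
from scipy.stats import wasserstein_distance

def tlb(DX, muX, DY, muY):
    """Order-1 version of the lower bound: c(x, y) = W_1(m_X(x), m_Y(y)),
    then min_mu <c, mu> over couplings mu, solved as a linear program."""
    nx, ny = len(DX), len(DY)
    c = np.zeros((nx, ny))
    for i in range(nx):
        for j in range(ny):
            # m_X(x_i): values DX[i] weighted by muX (likewise for Y)
            c[i, j] = wasserstein_distance(DX[i], DY[j], muX, muY)
    # Equality constraints on mu: row sums = muX, column sums = muY
    A = []
    for i in range(nx):
        row = np.zeros((nx, ny)); row[i, :] = 1.0; A.append(row.ravel())
    for j in range(ny):
        col = np.zeros((nx, ny)); col[:, j] = 1.0; A.append(col.ravel())
    res = linprog(c.ravel(), A_eq=np.array(A),
                  b_eq=np.concatenate([muX, muY]), bounds=(0, None))
    return 0.5 * res.fun

DX = np.array([[0.0, 1.0], [1.0, 0.0]]); muX = np.array([0.5, 0.5])
DY = np.array([[0.0, 2.0], [2.0, 0.0]]); muY = np.array([0.75, 0.25])
print(tlb(DX, muX, DY, muY))  # 0.3125
```

On these toy spaces the bound evaluates to 0.3125, larger than the 0.25 obtained from global distance distributions alone, consistent with the claim that the local version is stronger.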
Remark 5. It is possible to define a notion of spectral Gromov–Wasserstein distance which operates at the level of compact Riemannian manifolds without boundary and is based on the comparison of heat kernels. This notion permits inter-relating many pre-existing shape matching methods and suggests some others [12].

Discussion and Outlook
The Gromov–Hausdorff distance offers a useful language for expressing different tasks in shape and data analysis. Its origins are in the work of Gromov on synthetic geometry. For finite metric spaces, the Gromov–Hausdorff distance leads to solving NP-hard combinatorial optimization problems. Related constructions are the Gromov–Wasserstein distances, which operate on metric measure spaces [3,10]. In contrast to the Gromov–Hausdorff distance, the computation of Gromov–Wasserstein distances leads to solving quadratic optimization problems on continuous variables.