
The Gromov–Wasserstein Distance: A Brief Overview

Facundo Mémoli

Department of Mathematics, The Ohio State University, Columbus, OH, USA
Axioms 2014, 3(3), 335-341; https://doi.org/10.3390/axioms3030335
Submission received: 1 May 2014 / Revised: 12 August 2014 / Accepted: 22 August 2014 / Published: 2 September 2014

Abstract

We recall the construction of the Gromov–Wasserstein distance and concentrate on quantitative aspects of the definition.

1. Introduction

Modeling datasets as metric spaces seems natural for some applications, and concepts revolving around the Gromov–Hausdorff distance (a notion of distance between compact metric spaces) provide a useful language for expressing properties of data and shape analysis methods. In many situations, however, this is not enough, and one must incorporate other sources of information into the model, with "weights" attached to each point being one of them. This gives rise to the idea of representing data as metric measure spaces, which are metric spaces endowed with a probability measure. In terms of a distance, the Gromov–Hausdorff metric is replaced with the Gromov–Wasserstein metric.

1.1. Notation and Background Concepts

The book by Burago, et al. [1] is a valuable source for many concepts in metric geometry. We refer the reader to that book for any concepts not explicitly defined in these notes.
We let $\mathcal{M}$ denote the collection of all compact metric spaces and by $\mathcal{M}_{\mathrm{iso}}$ the collection of all isometry classes of $\mathcal{M}$. Recall that for a given metric space $(X,d_X)\in\mathcal{M}$, its diameter is defined as $\operatorname{diam} X := \max_{x,x'\in X} d_X(x,x')$. Similarly, the radius of $X$ is defined as $\operatorname{rad} X := \min_{x\in X}\max_{x'\in X} d_X(x,x')$.
For a fixed metric space $(Z,d_Z)$, we let $d_{\mathcal{H}}^{Z}$ denote the Hausdorff distance between (closed) subsets of $Z$.
We will often refer to a metric space $(X,d_X)$ by only $X$, but the notation for the underlying metric will be implicitly understood to be $d_X$. Recall that a map $\varphi: X\to Y$ between metric spaces $(X,d_X)$ and $(Y,d_Y)$ is an isometric embedding if $d_Y(\varphi(x),\varphi(x')) = d_X(x,x')$ for all $x,x'\in X$. The map $\varphi$ is an isometry if it is a surjective isometric embedding.
Recall that given measurable spaces $(X,\Sigma_X)$ and $(Y,\Sigma_Y)$, a measure $\mu$ on $(X,\Sigma_X)$ and a measurable map $f: X\to Y$, the push-forward measure $f_{\#}\mu$ on $(Y,\Sigma_Y)$ acts according to $f_{\#}\mu(B) = \mu(f^{-1}(B))$ for any $B\in\Sigma_Y$.
A metric measure space (mm-space for short) is a triple $(X,d_X,\mu_X)$ where $(X,d_X)$ is a compact metric space and $\mu_X$ is a Borel probability measure with full support: $\operatorname{supp}\mu_X = X$. We denote by $\mathcal{M}_w$ the collection of all mm-spaces. An isomorphism between $X,Y\in\mathcal{M}_w$ is any isometry $\Psi: X\to Y$ such that $\Psi_{\#}\mu_X = \mu_Y$.

2. The Gromov–Hausdorff Distance

One says that a subset $R\subseteq X\times Y$ is a correspondence between sets $X$ and $Y$ whenever $\pi_1(R) = X$ and $\pi_2(R) = Y$, where $\pi_1: X\times Y\to X$ and $\pi_2: X\times Y\to Y$ are the canonical projections. Let $\mathcal{R}(X,Y)$ denote the set of all correspondences between $X$ and $Y$.
The Gromov–Hausdorff (GH) distance between compact metric spaces $(X,d_X)$ and $(Y,d_Y)$ is defined as:
$$d_{\mathcal{GH}}(X,Y) := \frac{1}{2}\,\inf_{R}\ \sup_{(x,y),(x',y')\in R} \big| d_X(x,x') - d_Y(y,y') \big| \qquad (1)$$
where $R$ ranges over $\mathcal{R}(X,Y)$.
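For tiny finite metric spaces, the infimum over correspondences in Equation (1) can be evaluated by brute force. The following sketch (Python with NumPy; the function name and the enumeration strategy are ours, not taken from the paper, and the exponential cost makes it usable only for toy examples) evaluates the definition directly and reproduces the computation of Example 1 below for a two-point space.

    import itertools
    import numpy as np

    def gh_bruteforce(dX, dY):
        """Evaluate Equation (1) by enumerating all correspondences between two tiny spaces.

        dX, dY: square arrays of pairwise distances. Exponential in len(dX) * len(dY)."""
        n, m = len(dX), len(dY)
        pairs = list(itertools.product(range(n), range(m)))
        best = np.inf
        for mask in range(1, 2 ** len(pairs)):
            R = [pairs[k] for k in range(len(pairs)) if (mask >> k) & 1]
            # Keep only subsets that project onto all of X and all of Y.
            if {i for i, _ in R} != set(range(n)) or {j for _, j in R} != set(range(m)):
                continue
            distortion = max(abs(dX[i, ip] - dY[j, jp]) for (i, j) in R for (ip, jp) in R)
            best = min(best, distortion)
        return 0.5 * best

    # Two points at distance 3 versus the one-point space: the GH distance is diam(X)/2 = 1.5.
    dX = np.array([[0.0, 3.0], [3.0, 0.0]])
    print(gh_bruteforce(dX, np.zeros((1, 1))))  # 1.5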
Example 1. 
The GH distance between any compact metric space $X$ and the space with exactly one point is equal to $\tfrac{1}{2}\operatorname{diam} X$.
It turns out that $(\mathcal{M}, d_{\mathcal{GH}})$ is a nice space in that it has many compact subclasses.
Theorem 1. 
([1]) Let $N: [0,+\infty)\to\mathbb{N}$ be a bounded function and $D>0$. Let $\mathcal{F}(N,D)\subseteq\mathcal{M}$ be any family of compact metric spaces such that $\operatorname{diam} X \le D$ for all $X\in\mathcal{F}(N,D)$ and such that, for any $\varepsilon>0$, any $X\in\mathcal{F}(N,D)$ admits an $\varepsilon$-net with at most $N(\varepsilon)$ elements. Then, $\mathcal{F}(N,D)$ is pre-compact in the Gromov–Hausdorff topology.
Example 2. 
An important example of families such as the above is given by those closed $n$-dimensional Riemannian manifolds $(X,g_X)\in\mathcal{M}(n,\kappa,D)$ with the diameter bounded by $D>0$ and the Ricci curvature bounded below by $\kappa$.
Theorem 2 
([2]). The space $(\mathcal{M}_{\mathrm{iso}}, d_{\mathcal{GH}})$ is complete.
It then follows from the two theorems above that classes $\mathcal{F}(N,D)$ as above are totally bounded for the Gromov–Hausdorff distance. This means that such classes are easy to organize in the sense of clustering or databases.
In many practical applications, one would like to take into account “weights” attached to points in a dataset. For example, the two metric spaces with the weights below are isometric, but not isomorphic in the sense that no isometry respects the weights:
[Figure: two mm-spaces, each consisting of two points at distance 1; one carries weights $(\tfrac{1}{2},\tfrac{1}{2})$ and the other weights $(\tfrac{3}{4},\tfrac{1}{4})$.]
The idea is that weights represent how much we trust a given “measurement” in practical applications. This leads to considering a more general collection of datasets and, in turn, an adapted notion of equality and a compatible metric over them. This naturally leads to regarding datasets as mm-spaces and then finding a notion of distance on M w compatible with isomorphism of mm-spaces.

3. A Metric on $\mathcal{M}_w$

Let ( X , d X , μ X ) and ( Y , d Y , μ Y ) be two given mm-spaces. In our path to defining a distance between mm-spaces, we emulate the construction of the Gromov–Hausdorff distance and start by identifying a notion of correspondence between mm-spaces.
A probability measure $\mu$ over $X\times Y$ is called a coupling between $\mu_X$ and $\mu_Y$ if $(\pi_1)_{\#}\mu = \mu_X$ and $(\pi_2)_{\#}\mu = \mu_Y$. We denote by $\mathcal{U}(\mu_X,\mu_Y)$ the collection of all couplings between $\mu_X$ and $\mu_Y$.
Example 3. 
When $Y=\{p\}$, $\mu_Y = \delta_p$, and thus, there is a unique coupling between $X$ and $Y$: $\mathcal{U}(\mu_X,\mu_Y) = \{\mu_X\otimes\delta_p\}$.
Example 4. 
Consider, for example, the spaces with two points each that we depicted above. In that case, $\mu_X$ can be identified with the vector $(\tfrac{1}{2},\tfrac{1}{2})$ and $\mu_Y$ with the vector $(\tfrac{3}{4},\tfrac{1}{4})$. In this case, one sees that the matrix:
$$\begin{pmatrix} \tfrac{1}{4} & \tfrac{1}{2} \\[2pt] \tfrac{1}{4} & 0 \end{pmatrix}$$
induces a valid coupling: its column sums recover $\mu_X$ and its row sums recover $\mu_Y$.
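A quick numerical sanity check of this example (our own snippet, not from the paper; we read the matrix with columns indexed by the points of $X$ and rows by the points of $Y$, as displayed above) verifies that the two marginals of the matrix are exactly the weight vectors above.

    import numpy as np

    # Coupling matrix of Example 4; columns correspond to the points of X, rows to those of Y.
    mu = np.array([[0.25, 0.50],
                   [0.25, 0.00]])
    mu_X = np.array([0.5, 0.5])    # weights of X
    mu_Y = np.array([0.75, 0.25])  # weights of Y
    print(np.allclose(mu.sum(axis=0), mu_X))  # column sums recover mu_X: True
    print(np.allclose(mu.sum(axis=1), mu_Y))  # row sums recover mu_Y:   True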
Now, given $p\ge 1$, consider the function
$$(x,y,x',y') \mapsto \big| d_X(x,x') - d_Y(y,y') \big|^p$$
and pick any $\mu\in\mathcal{U}(\mu_X,\mu_Y)$. One then integrates this function against the measure $\mu\otimes\mu$ and infimizes over the choice of $\mu\in\mathcal{U}(\mu_X,\mu_Y)$ to define the Gromov–Wasserstein distance of order $p$ [3]:
$$d_{\mathcal{GW},p}(X,Y) := \frac{1}{2}\,\inf_{\mu}\left( \iint \big| d_X(x,x') - d_Y(y,y') \big|^p \,\mu(dx\times dy)\,\mu(dx'\times dy') \right)^{1/p} \qquad (2)$$
Remark 1. 
This is an $L^p$ analogue of Equation (1).
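On finite mm-spaces, the integral in Equation (2) becomes a finite sum, so the cost of any fixed coupling is easy to evaluate, and every coupling then yields an upper bound on $d_{\mathcal{GW},p}$. The sketch below (Python with NumPy; the function name is ours, and it only evaluates the objective for a given coupling rather than optimizing over couplings) makes this explicit.

    import numpy as np

    def gw_cost(dX, dY, mu, p):
        """Integral of |d_X(x,x') - d_Y(y,y')|^p against mu x mu, for a fixed coupling mu.

        dX, dY: distance matrices; mu[i, j]: mass coupling point i of X with point j of Y."""
        gap = np.abs(dX[:, None, :, None] - dY[None, :, None, :]) ** p  # gap[i, j, k, l]
        return np.einsum('ijkl,ij,kl->', gap, mu, mu)

    # Two points at distance 1 with weights (1/2, 1/2), coupled with the one-point space.
    dX = np.array([[0.0, 1.0], [1.0, 0.0]])
    mu_X = np.array([0.5, 0.5])
    dY = np.zeros((1, 1))
    mu = np.outer(mu_X, np.array([1.0]))  # the only possible coupling, as in Example 3
    p = 2
    print(0.5 * gw_cost(dX, dY, mu, p) ** (1 / p))  # ~0.3536; exact here since the coupling is unique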
Theorem 3 
([3]). The Gromov–Wasserstein distance of order $p\ge 1$ defines a proper distance on the collection of isomorphism classes of mm-spaces.
By standard compactness arguments, one can prove that the infimum above is always attained [3]. Let $\mathcal{U}_p^{\mathrm{opt}}(X,Y)$ denote the set of all the couplings in $\mathcal{U}(\mu_X,\mu_Y)$ that achieve the minimum. The structure of the former set depends not only on $\mu_X$ and $\mu_Y$, but also on $d_X$, $d_Y$ and $p$.
Example 5. 
Consider the mm-space with exactly one point: $(\{\ast\}, (0), \delta_\ast)$. Then,
$$d_{\mathcal{GW},p}(X,\{\ast\}) = \frac{1}{2}\left( \iint d_X(x,x')^p \,\mu_X(dx)\,\mu_X(dx') \right)^{1/p}$$
and we define $\operatorname{diam}_p X$, the $p$-statistical diameter of $X$, as twice the right-hand side. Notice that $\lim_{p\to\infty}\operatorname{diam}_p X$ is equal to the usual diameter of $X$ (as a metric space).
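For a finite mm-space, the double integral above is a weighted sum over pairs of points, so $\operatorname{diam}_p X$ is directly computable. A small sketch (our own; the function name is an assumption) illustrating that the values increase toward the ordinary diameter as $p$ grows:

    import numpy as np

    def stat_diam(dX, mu_X, p):
        """p-statistical diameter: ( sum_{i,j} d_X(x_i, x_j)^p mu_i mu_j )^(1/p)."""
        return np.einsum('ij,i,j->', dX ** p, mu_X, mu_X) ** (1.0 / p)

    dX = np.array([[0.0, 1.0, 2.0],
                   [1.0, 0.0, 1.5],
                   [2.0, 1.5, 0.0]])
    mu_X = np.array([0.2, 0.3, 0.5])
    for p in (1, 2, 8, 64):
        print(p, stat_diam(dX, mu_X, p))  # increases toward diam(X) = 2.0 as p grows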
Question 1. 
To what extent are we able to replicate the nice properties of $(\mathcal{M}, d_{\mathcal{GH}})$ in the context of $(\mathcal{M}_w, d_{\mathcal{GW},p})$? In particular, it is of interest to investigate whether this new space of datasets is complete and whether one can easily identify rich pre-compact classes.

3.1. Pre-Compactness

Theorem 4 
([3]). For a non-decreasing function $\rho: [0,\infty)\to[0,1]$ such that $\rho(\varepsilon)>0$ for $\varepsilon>0$, and for $D>0$, let $\mathcal{F}_w(\rho,D)\subseteq\mathcal{M}_w$ denote the set of all mm-spaces $X$ such that $\operatorname{diam} X \le D$ and $\inf_x \mu_X\big(B_\varepsilon(x)\big) \ge \rho(\varepsilon)$ for all $\varepsilon>0$. Then, $\mathcal{F}_w(\rho,D)$ is pre-compact for the Gromov–Wasserstein topology, for any $p\ge 1$.
Remark 2. 
Recall Example 2, where closed $n$-dimensional Riemannian manifolds were regarded as metric spaces. One can, all the same, embed closed Riemannian manifolds into $\mathcal{M}_w$ via $(X,g_X)\mapsto(X,d_X,\mu_X)$, where $d_X$ is the geodesic distance induced by the metric tensor $g_X$ and $\mu_X$ stands for the normalized volume measure on $X$. It is well known [4] that for $\varepsilon>0$ small, $\mu_X\big(B_\varepsilon(x)\big) = \frac{c_n}{\operatorname{vol}(X)}\,\varepsilon^n\left(1 - \frac{s_X(x)}{6(n+1)}\,\varepsilon^2 + O(\varepsilon^4)\right)$, where $s_X(x)$ is the scalar curvature of $X$ at $x$, and $\operatorname{vol}(X)$ is the total volume of $X$. Thus, a lower bound on $\mu_X\big(B_\varepsilon(x)\big)$ plays the role of a proxy for an upper bound on curvature.

3.2. Completeness

The space $\mathcal{M}_w$ with any $p$-Gromov–Wasserstein distance is not complete. Indeed, consider the following family of mm-spaces: $\Delta_n\in\mathcal{M}_w$, where $\Delta_n$ consists of $n\in\mathbb{N}$ points at distance one from each other, all with weight $1/n$.
Claim 1. 
For all $n,m\ge 1$, $d_{\mathcal{GW},p}(\Delta_n,\Delta_m) \le \frac{1}{2}\big(n^{-1/p} + m^{-1/p}\big)$.
The claim will follow from the following claim and the triangle inequality for $d_{\mathcal{GW},p}$:
Claim 2. 
For all $n,m\ge 1$, $d_{\mathcal{GW},p}(\Delta_n,\Delta_{n\cdot m}) \le \frac{1}{2}\,n^{-1/p}$.
In order to verify the claim, we denote by $\{x_1,x_2,\dots,x_n\}$ the points of $\Delta_n$ and label the points of $\Delta_{n\cdot m}$ by $\{y_{11},\dots,y_{1m},y_{21},\dots,y_{2m},\dots,y_{n1},\dots,y_{nm}\}$. Consider the following coupling between $\mu_n$ and $\mu_{n\cdot m}$, the reference measures on $\Delta_n$ and $\Delta_{n\cdot m}$:
$$\mu(x_i,y_{kj}) := \frac{1}{n\cdot m}\,\delta_{ik}, \quad \text{for all } i,k\in\{1,\dots,n\} \text{ and } j\in\{1,\dots,m\}$$
It is clear that this defines a valid coupling between $\mu_n$ and $\mu_{n\cdot m}$.
Now, note that
$$J(\mu) := \sum_{i,i'}\sum_{(k,j),(k',j')} \big| d_{\Delta_n}(x_i,x_{i'}) - d_{\Delta_{n\cdot m}}(y_{kj},y_{k'j'}) \big|^p \,\mu(x_i,y_{kj})\,\mu(x_{i'},y_{k'j'}) = \frac{1}{(n\cdot m)^2}\sum_{i,i'}\sum_{j,j'} \big| d_{\Delta_n}(x_i,x_{i'}) - d_{\Delta_{n\cdot m}}(y_{ij},y_{i'j'}) \big|^p = \frac{1}{(n\cdot m)^2}\sum_{i}\sum_{j,j'} \big| 1 - \delta_{jj'} \big|^p = \frac{m-1}{n\cdot m} \le n^{-1}$$
Now, by definition, $d_{\mathcal{GW},p}(\Delta_n,\Delta_{n\cdot m}) \le \frac{1}{2}\,J(\mu)^{1/p}$, so the claim follows.
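The computation above is easy to confirm numerically. The snippet below (ours, not from the paper) builds the coupling $\mu$ just described for small $n$ and $m$ and checks that its cost equals $(m-1)/(n\cdot m)$.

    import numpy as np

    def delta_coupling_cost(n, m, p=1):
        """Cost J(mu) of the block coupling between Delta_n and Delta_{n*m} described above."""
        nm = n * m
        dA = 1.0 - np.eye(n)    # distances in Delta_n: 1 off the diagonal
        dB = 1.0 - np.eye(nm)   # distances in Delta_{n*m}
        mu = np.zeros((n, nm))
        for i in range(n):
            mu[i, i * m:(i + 1) * m] = 1.0 / nm   # x_i is spread evenly over y_{i1}, ..., y_{im}
        gap = np.abs(dA[:, None, :, None] - dB[None, :, None, :]) ** p
        return np.einsum('ijkl,ij,kl->', gap, mu, mu)

    n, m = 3, 4
    print(delta_coupling_cost(n, m), (m - 1) / (n * m))  # both equal 0.25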
Claim 1 indicates that $\{\Delta_n\}_{n\in\mathbb{N}}$ constitutes a Cauchy sequence in $\mathcal{M}_w$. However, a potential limit object for this sequence would have countably infinitely many points at distance one from each other. Such a space is not compact, hence it does not belong to $\mathcal{M}_w$, and thus $d_{\mathcal{GW},p}$ is not a complete metric.

3.3. Other Properties: Geodesics and Alexandrov Curvature

Very recently, Sturm [5] pointed out that $\mathcal{M}_w$ is a geodesic space when endowed with any $d_{\mathcal{GW},p}$, $p\ge 1$. This means that, given any two spaces $X_0, X_1$ in $\mathcal{M}_w$, one can find a curve $[0,1]\ni t\mapsto X_t\in\mathcal{M}_w$ such that $d_{\mathcal{GW},p}(X_t,X_s) = |t-s|\,d_{\mathcal{GW},p}(X_0,X_1)$ for all $s,t\in[0,1]$.
Proposition 1 
([5]). For each $p\ge 1$, the space $(\mathcal{M}_w, d_{\mathcal{GW},p})$ is geodesic. Furthermore, for $p>1$, the following curves on $\mathcal{M}_w$ define geodesics between $(X_0,d_0,\mu_0)$ and $(X_1,d_1,\mu_1)$ in $\mathcal{M}_w$:
$$[0,1]\ni t \mapsto \big( X_0\times X_1,\ d_t,\ \mu \big)$$
where $d_t\big((x_0,x_1),(x_0',x_1')\big) := (1-t)\,d_0(x_0,x_0') + t\,d_1(x_1,x_1')$ for $(x_0,x_1),(x_0',x_1')\in X_0\times X_1$ and $\mu\in\mathcal{U}_p^{\mathrm{opt}}(X_0,X_1)$. Furthermore, for $p>1$, all geodesics are of this form.
Sturm further proved that the completion $\overline{\mathcal{M}_w}$ of the space $\mathcal{M}_w$ with metric $d_{\mathcal{GW},2}$ satisfies:
Theorem 5 
([5]). The metric space $\big(\overline{\mathcal{M}_w},\, d_{\mathcal{GW},2}\big)$ is an Alexandrov space of curvature $\ge 0$.
Amongst the consequences of this property is the fact that one can conceive of gradient flows on the space of all mm-spaces [5].

3.4. The Metric $d_{\mathcal{GW},p}$ in Applications

Applications of the notion of Gromov–Wasserstein distance arise in shape and data analysis. In shape analysis, the main application is shape matching under invariances. Many easily computable lower bounds for the GW distance have been discussed in [3,6]. All of them lead to solving linear programming optimization problems (for which there are polynomial time algorithms) or can be computed via explicit formulas. As an example, consider the following invariant of an mm-space $(X,d_X,\mu_X)$:
$$H_X: [0,\infty)\to[0,1], \quad t\mapsto \mu_X\otimes\mu_X\big(\{(x,x')\,|\, d_X(x,x')\le t\}\big)$$
This invariant simply encodes the distribution of pairwise distances on the dataset X, and it is defined by analogy with the so-called shape distributions that are well known in computer graphics [7]. Then, one has:
Proposition 2 
([3,6]). Let $X,Y\in\mathcal{M}_w$ be any two mm-spaces and $p\ge 1$. Then,
$$d_{\mathcal{GW},p}(X,Y) \ge \frac{1}{2}\int_0^\infty \big| H_X(t) - H_Y(t) \big|\, dt$$
Remark 3. 
This invariant is also related to the work of Boutin and Kemper [8] and Brinkman and Olver [9].
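On finite mm-spaces, $H_X$ is a step function and the bound of Proposition 2 reduces to a one-dimensional integral that can be approximated on a grid. The sketch below (ours; the function names and the grid discretization are assumptions, not the paper's implementation) evaluates it for the two weighted two-point spaces pictured in Section 2, showing that the bound already distinguishes them.

    import numpy as np

    def H(dX, mu_X, t):
        """(mu_X x mu_X) of the set {(x, x') : d_X(x, x') <= t}."""
        return np.einsum('ij,i,j->', (dX <= t).astype(float), mu_X, mu_X)

    def shape_distribution_bound(dX, mu_X, dY, mu_Y, num_t=2000):
        """Grid approximation of (1/2) * int_0^inf |H_X(t) - H_Y(t)| dt."""
        ts = np.linspace(0.0, max(dX.max(), dY.max()), num_t)
        diffs = np.array([abs(H(dX, mu_X, t) - H(dY, mu_Y, t)) for t in ts])
        return 0.5 * np.trapz(diffs, ts)

    # The two-point spaces from Section 2: same metric, weights (1/2, 1/2) versus (3/4, 1/4).
    d = np.array([[0.0, 1.0], [1.0, 0.0]])
    print(shape_distribution_bound(d, np.array([0.5, 0.5]), d, np.array([0.75, 0.25])))  # ~0.0625 > 0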
Other lower bounds that can be computed in time polynomial in the number of points of the underlying mm-spaces have been reported in [3]. As a primary example, the local shape distributions of shapes provide a lower bound that is strictly stronger than the one in the Proposition above. In more detail, consider, for a given mm-space $(X,d_X,\mu_X)$, the invariant:
$$h_X: X\times[0,\infty)\to[0,1], \quad (x,t)\mapsto \mu_X\big(\overline{B_t(x)}\big).$$
Then, for mm-spaces $X$ and $Y$, consider the cost function $c_{X,Y}: X\times Y\to\mathbb{R}_+$ given by:
$$c_{X,Y}(x,y) := \int_0^\infty \big| h_X(x,t) - h_Y(y,t) \big|\, dt.$$
One then has:
Proposition 3 
([3,6]). Let $X,Y\in\mathcal{M}_w$ be any two mm-spaces and $p\ge 1$. Then,
$$d_{\mathcal{GW},p}(X,Y) \ge \frac{1}{2}\,\inf_{\mu} \int_{X\times Y} c_{X,Y}(x,y)\,\mu(dx\times dy),$$
where $\mu$ ranges in $\mathcal{U}(\mu_X,\mu_Y)$.
Remark 4. 
Solving for the infimum above leads to a mass transportation problem for which there exist efficient linear programming techniques.
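A direct way to evaluate the bound of Proposition 3 numerically is to discretize $c_{X,Y}$ on a grid of $t$ values and hand the resulting transportation problem to a generic linear programming solver. The sketch below is ours; it assumes SciPy's linprog and is meant as an illustration rather than an efficient implementation, since specialized optimal transport solvers would be much faster.

    import numpy as np
    from scipy.optimize import linprog

    def local_distribution_cost(dX, mu_X, dY, mu_Y, num_t=500):
        """Grid approximation of c_{X,Y}(x, y) = int_0^inf |h_X(x,t) - h_Y(y,t)| dt."""
        ts = np.linspace(0.0, max(dX.max(), dY.max()), num_t)
        hX = np.array([[(mu_X * (dX[i] <= t)).sum() for t in ts] for i in range(len(dX))])
        hY = np.array([[(mu_Y * (dY[j] <= t)).sum() for t in ts] for j in range(len(dY))])
        return np.trapz(np.abs(hX[:, None, :] - hY[None, :, :]), ts, axis=-1)

    def local_distribution_bound(dX, mu_X, dY, mu_Y):
        """(1/2) inf_mu int c_{X,Y} d(mu), solved as a transportation linear program."""
        c = local_distribution_cost(dX, mu_X, dY, mu_Y)
        n, m = c.shape
        A_eq = np.zeros((n + m, n * m))
        for i in range(n):
            A_eq[i, i * m:(i + 1) * m] = 1.0   # row-sum constraints: first marginal is mu_X
        for j in range(m):
            A_eq[n + j, j::m] = 1.0            # column-sum constraints: second marginal is mu_Y
        b_eq = np.concatenate([mu_X, mu_Y])
        res = linprog(c.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method='highs')
        return 0.5 * res.fun

    # Same pair of weighted two-point spaces as before: this bound gives ~0.125,
    # larger (hence stronger) than the ~0.0625 obtained from Proposition 2.
    d = np.array([[0.0, 1.0], [1.0, 0.0]])
    print(local_distribution_bound(d, np.array([0.5, 0.5]), d, np.array([0.75, 0.25])))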
Remark 5. 
It is possible to define a notion of spectral Gromov–Wasserstein distance that operates at the level of compact Riemannian manifolds without boundary and is based on the comparison of heat kernels. This notion permits inter-relating many pre-existing shape matching methods and suggests some others [12].

4. Discussion and Outlook

The Gromov–Hausdorff distance offers a useful language for expressing different tasks in shape and data analysis. Its origins are in the work of Gromov on synthetic geometry. For finite metric spaces, computing the Gromov–Hausdorff distance leads to solving NP-hard combinatorial optimization problems. A related construction is that of the Gromov–Wasserstein distances, which operate on metric measure spaces [3,10]. In contrast to the Gromov–Hausdorff distance, the computation of Gromov–Wasserstein distances leads to solving quadratic optimization problems on continuous variables. The space of all metric measure spaces endowed with a certain variant of the Gromov–Wasserstein distance [3] enjoys nice theoretical properties [5]. It seems of interest to develop provably correct approximations to these distances when restricted to suitable subclasses of finite metric spaces. Other aspects of the Gromov–Wasserstein distance are discussed in [3,5,10,11,12].

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Burago, D.; Burago, Y.; Ivanov, S. A Course in Metric Geometry; AMS Graduate Studies in Mathematics, Volume 33; American Mathematical Society: Providence, RI, USA, 2001.
  2. Petersen, P. Gromov-Hausdorff convergence of metric spaces. In Differential Geometry: Riemannian Geometry, Proceedings of the Symposium in Pure Mathematics, Los Angeles, CA, USA, 8–18 July 1990.
  3. Mémoli, F. Gromov-Wasserstein distances and the metric approach to object matching. Found. Comput. Math. 2011, 11, 417–487.
  4. Sakai, T. Riemannian Geometry; Translations of Mathematical Monographs, Volume 149; American Mathematical Society: Providence, RI, USA, 1996.
  5. Sturm, K.-T. The space of spaces: Curvature bounds and gradient flows on the space of metric measure spaces. Preprint, 2012.
  6. Mémoli, F. On the use of Gromov-Hausdorff distances for shape comparison. In Proceedings of Point Based Graphics 2007, Prague, Czech Republic, 2–3 September 2007.
  7. Osada, R.; Funkhouser, T.; Chazelle, B.; Dobkin, D. Shape distributions. ACM Trans. Graph. 2002, 21, 807–832.
  8. Boutin, M.; Kemper, G. On reconstructing n-point configurations from the distribution of distances or areas. Adv. Appl. Math. 2004, 32, 709–735.
  9. Brinkman, D.; Olver, P.J. Invariant histograms. Am. Math. Mon. 2012, 119, 4–24.
  10. Sturm, K.-T. On the geometry of metric measure spaces. I. Acta Math. 2006, 196, 65–131.
  11. Mémoli, F. Gromov-Hausdorff distances in Euclidean spaces. In Proceedings of the 2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR Workshops), Anchorage, AK, USA, 23–28 June 2008.
  12. Mémoli, F. A spectral notion of Gromov-Wasserstein distances and related methods. Appl. Comput. Harmon. Anal. 2011, 30, 363–401.
