Special Issue "Lie Group Machine Learning and Lie Group Structure Preserving Integrators"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (2 March 2020).

Special Issue Editors

Prof. Elena Celledoni
Guest Editor
Department of Mathematical Sciences, Norwegian University of Science and Technology, 7491 Trondheim, Norway
Interests: numerical analysis; structure preserving algorithms; differential equations; geometric numerical integration
Prof. François Gay-Balmaz
Guest Editor
CNRS & LMD-IPSL, École Normale Supérieure de Paris, 75005 Paris, France
Interests: geometric methods for the modeling and numerical discretization of partial differential equations arising in fluid dynamics, nonlinear elasticity, and nonequilibrium thermodynamics
Prof. Joël Bensoam
Guest Editor
Ircam, Centre G. Pompidou, CNRS UMR 9912, 75004 Paris, France
Interests: geometric mechanics; Lie group theory; acoustics; numerical sound synthesis; finite and boundary element methods

Special Issue Information

Dear Colleague,

Machine and deep learning are being extended to more abstract spaces such as graphs, differentiable manifolds, and structured data. Recent fruitful exchanges between the geometric science of information and Lie group theory have opened new perspectives for extending machine learning to Lie groups. After the foundation of Lie group theory by Sophus Lie, Felix Klein, and Henri Poincaré, and building on Wilhelm Killing's study of Lie algebras, Élie Cartan completed the classification of simple real Lie algebras and introduced the affine representation of Lie groups and Lie algebras, later applied systematically by Jean-Louis Koszul. In parallel, noncommutative harmonic analysis for non-Abelian groups was addressed through the orbit method (the coadjoint representation of a group), with many contributors (Jacques Dixmier, Alexander Kirillov, etc.). In physics, Valentine Bargmann, Jean-Marie Souriau, and Bertram Kostant brought the basic concepts of symplectic geometry into geometric mechanics, such as the KKS symplectic form on coadjoint orbits and the notion of the momentum map associated with the action of a Lie group. Using these tools, Souriau also developed the theory of Lie group thermodynamics based on coadjoint representations. This set of tools can be revisited in the framework of Lie group machine learning to develop new schemes for processing structured data.

Structure-preserving integrators are numerical algorithms specifically designed to preserve geometric properties of the flow of a differential equation, such as invariants, (multi-)symplecticity, volume preservation, and the configuration manifold. As a consequence, such algorithms have proven to be markedly superior in correctly reproducing the global qualitative behavior of the system. Structure-preserving methods have recently undergone significant development and today constitute a privileged route to numerical algorithms with high reliability and robustness in various areas of computational mathematics. In particular, their capability for long-term computation makes these methods well adapted to the new opportunities and challenges offered by scientific computing. Among the different ways to construct such integrators, the application of variational principles (such as Hamilton's variational principle and its generalizations) has proven especially powerful, both because it is constructive and because of its wide range of applicability.
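As a minimal, self-contained illustration of this qualitative difference (our own sketch, not taken from any paper in this issue), the following Python fragment compares the explicit Euler method with a symplectic variant on the harmonic oscillator. Only the order of the two updates differs, yet the long-time energy behavior changes drastically:

```python
import math

def explicit_euler(q, p, h):
    # Both updates use the old state: not symplectic.
    return q + h * p, p - h * q

def symplectic_euler(q, p, h):
    # The momentum update uses the NEW position: symplectic.
    q_new = q + h * p
    return q_new, p - h * q_new

def energy(q, p):
    # Harmonic-oscillator Hamiltonian H(q, p) = (p^2 + q^2) / 2.
    return 0.5 * (p * p + q * q)

h, steps = 0.1, 1000
qe, pe = 1.0, 0.0   # explicit Euler state
qs, ps = 1.0, 0.0   # symplectic Euler state
H0 = energy(1.0, 0.0)
for _ in range(steps):
    qe, pe = explicit_euler(qe, pe, h)
    qs, ps = symplectic_euler(qs, ps, h)

drift_explicit = abs(energy(qe, pe) - H0)
drift_symplectic = abs(energy(qs, ps) - H0)
# The symplectic method's energy error stays bounded for all time,
# while the explicit method's energy grows without bound.
print(drift_explicit, drift_symplectic)
```

This bounded long-time energy error, as opposed to systematic drift, is precisely the hallmark of structure preservation described above.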

An important situation encountered in a wide range of applications, ranging from multibody dynamics to nonlinear beam dynamics and fluid mechanics, is that of ordinary and partial differential equations on Lie groups. In this case, one can additionally take advantage of the rich geometric structure of the Lie group and its Lie algebra in the construction of the integrators. Integrators that preserve the Lie group structure have been studied from many points of view and extended to a wide range of situations, including forced, controlled, constrained, nonsmooth, stochastic, and multiscale systems, in both the finite- and infinite-dimensional Lie group setting. They also find natural applications in the extension of machine learning and deep learning algorithms to Lie group data.
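A minimal example of such a Lie group integrator (our own illustration, not drawn from the papers below) is the Lie-Euler method on the rotation group SO(3): updating by a group element computed with the matrix exponential keeps the numerical solution exactly on the group, whereas a naive Euler step in the ambient space of matrices does not:

```python
import math

def hat(w):
    # so(3) hat map: 3-vector -> skew-symmetric 3x3 matrix.
    x, y, z = w
    return [[0.0, -z, y], [z, 0.0, -x], [-y, x, 0.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def expm_so3(w):
    # Rodrigues' formula: closed-form matrix exponential of hat(w).
    theta = math.sqrt(sum(c * c for c in w))
    W = hat(w)
    W2 = matmul(W, W)
    a = math.sin(theta) / theta if theta > 1e-12 else 1.0
    b = (1.0 - math.cos(theta)) / theta ** 2 if theta > 1e-12 else 0.5
    return [[(1.0 if i == j else 0.0) + a * W[i][j] + b * W2[i][j]
             for j in range(3)] for i in range(3)]

def orthogonality_defect(R):
    # Frobenius norm of R^T R - I: zero iff R is orthogonal.
    Rt = [list(col) for col in zip(*R)]
    RtR = matmul(Rt, R)
    return math.sqrt(sum((RtR[i][j] - (1.0 if i == j else 0.0)) ** 2
                         for i in range(3) for j in range(3)))

h, steps = 0.05, 2000
omega = (0.3, -0.2, 0.5)          # constant angular velocity (made up)
R_lie = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]
R_euler = [row[:] for row in R_lie]
for _ in range(steps):
    # Lie-Euler: multiply by a group element, so R_lie stays in SO(3).
    R_lie = matmul(R_lie, expm_so3(tuple(h * c for c in omega)))
    # Naive Euler: R <- R + h * R * hat(omega) drifts off the group.
    G = matmul(R_euler, hat(omega))
    R_euler = [[R_euler[i][j] + h * G[i][j] for j in range(3)]
               for i in range(3)]

d_lie = orthogonality_defect(R_lie)       # roundoff level
d_euler = orthogonality_defect(R_euler)   # large: left the group
print(d_lie, d_euler)
```

The group update costs one matrix exponential per step but removes the need for any re-orthogonalization of the solution.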

This Special Issue will collect extended versions of contributions presented at the GSI'19 "Geometric Science of Information" conference (www.gsi2019.org), but it is not limited to those authors and is open to the international communities involved in research on Lie group machine learning and Lie group structure-preserving integrators.

Prof. Frédéric Barbaresco
Prof. Elena Celledoni
Prof. François Gay-Balmaz
Prof. Joël Bensoam
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Lie group machine learning
  • orbit method
  • symplectic geometry
  • geometric integrator
  • symplectic integrator
  • Hamilton’s variational principle

Published Papers (6 papers)

Research

Open Access Article
Lie Group Statistics and Lie Group Machine Learning Based on Souriau Lie Groups Thermodynamics & Koszul-Souriau-Fisher Metric: New Entropy Definition as Generalized Casimir Invariant Function in Coadjoint Representation
Entropy 2020, 22(6), 642; https://doi.org/10.3390/e22060642 - 09 Jun 2020
Cited by 1 | Viewed by 1249
Abstract
In 1969, Jean-Marie Souriau introduced "Lie group thermodynamics" into statistical mechanics in the framework of geometric mechanics. Souriau's model considers the statistical mechanics of dynamical systems on their "space of evolution", associated with a homogeneous symplectic manifold by a Lagrange 2-form, and defines, in the case of non-null cohomology (non-equivariance of the coadjoint action on the moment map, with the appearance of an additional cocycle), a Gibbs density (of maximum entropy) that is covariant under the action of the dynamical groups of physics (e.g., the Galilei group in classical physics). Souriau's Lie group thermodynamics was also addressed, 30 years after Souriau, by R.F. Streater in the framework of quantum physics through information geometry for some Lie algebras, but only in the case of null cohomology. Souriau's method can then be applied to Lie groups to define a covariant maximum-entropy density via Kirillov representation theory. We illustrate this method for homogeneous Siegel domains, and more specifically for the Poincaré unit disk, by considering the coadjoint orbit of the group SU(1,1) and using its Souriau moment map. In this case, the coadjoint action on the moment map is equivariant. For non-null cohomology, we give the case of the Lie group SE(2). Finally, we propose a new geometric definition of entropy, constructed as a generalized Casimir invariant function in the coadjoint representation, and of the Massieu characteristic function, the dual of entropy by Legendre transform, as a generalized Casimir invariant function in the adjoint representation, where the Souriau cocycle measures the lack of equivariance of the moment map.
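The Legendre duality between entropy and the Massieu characteristic function invoked in this abstract can be checked on a toy finite-state Gibbs density (a sketch under our own naming, far simpler than the coadjoint-orbit setting of the paper):

```python
import math

def gibbs(energies, beta):
    # Maximum-entropy (Gibbs) density p_i proportional to exp(-beta * U_i).
    weights = [math.exp(-beta * u) for u in energies]
    Z = sum(weights)
    return [w / Z for w in weights], Z

energies = [0.0, 1.0, 2.0, 5.0]   # made-up energy levels U_i
beta = 0.7                        # inverse temperature
p, Z = gibbs(energies, beta)

mean_U = sum(pi * u for pi, u in zip(p, energies))
entropy = -sum(pi * math.log(pi) for pi in p)
massieu = math.log(Z)   # Massieu-type characteristic function (log-partition)
# Legendre duality: S = Phi + beta * <U> holds identically.
print(entropy, massieu + beta * mean_U)
```

The identity S = Phi + beta * <U> follows directly from ln p_i = -beta * U_i - ln Z; the paper's contribution is to generalize both sides to Casimir invariant functions in the (co)adjoint representation.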

Open Access Article
Multi-Stage Meta-Learning for Few-Shot with Lie Group Network Constraint
Entropy 2020, 22(6), 625; https://doi.org/10.3390/e22060625 - 05 Jun 2020
Viewed by 875
Abstract
Deep learning has achieved many successes in different fields but can encounter overfitting when labeled samples are scarce. To address learning with limited training data, meta-learning has been proposed to capture common knowledge by leveraging a large number of similar few-shot tasks and learning how to adapt a base-learner to a new task for which only a few labeled samples are available. Current meta-learning approaches typically use shallow neural networks (SNNs) to avoid overfitting, thereby discarding much information when adapting to a new task. Moreover, the Euclidean-space gradient descent used in existing meta-learning approaches can lead to inaccurate updates of the meta-learner, which poses a challenge for meta-learning models in extracting features from samples and updating network parameters. In this paper, we propose a novel meta-learning model called Multi-Stage Meta-Learning (MSML) to ease this bottleneck during the adaptation process. The proposed method constrains the network to the Stiefel manifold, so that the meta-learner can perform a more stable gradient descent in a limited number of steps and the adaptation process is accelerated. An experiment on mini-ImageNet demonstrates that the proposed method achieves better accuracy under 5-way 1-shot and 5-way 5-shot conditions.
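The Stiefel-manifold constraint can be illustrated generically (this is a standard QR retraction, our own sketch rather than the MSML method itself): after an unconstrained gradient step, the weight matrix is retracted back to the set of matrices with orthonormal columns:

```python
import math

def qr_retract(X):
    # Retract a full-rank n x p matrix onto the Stiefel manifold St(n, p)
    # (matrices with orthonormal columns) via Gram-Schmidt QR.
    n, p = len(X), len(X[0])
    Q = [[X[i][j] for j in range(p)] for i in range(n)]
    for j in range(p):
        for k in range(j):
            dot = sum(Q[i][k] * Q[i][j] for i in range(n))
            for i in range(n):
                Q[i][j] -= dot * Q[i][k]
        norm = math.sqrt(sum(Q[i][j] ** 2 for i in range(n)))
        for i in range(n):
            Q[i][j] /= norm
    return Q

# One constrained "gradient step": move in the ambient space, then retract.
W = [[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]]     # point on St(3, 2)
G = [[0.2, -0.1], [0.3, 0.4], [-0.5, 0.1]]   # made-up ambient gradient
lr = 0.5
W_new = qr_retract([[W[i][j] - lr * G[i][j] for j in range(2)]
                    for i in range(3)])

# Gram matrix of W_new: should be the 2x2 identity again.
gram = [[sum(W_new[i][a] * W_new[i][b] for i in range(3)) for b in range(2)]
        for a in range(2)]
print(gram)
```

The retraction replaces the exact geodesic update but keeps every iterate exactly on the manifold, which is what makes the constrained descent stable.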

Open Access Article
Rigid Shape Registration Based on Extended Hamiltonian Learning
Entropy 2020, 22(5), 539; https://doi.org/10.3390/e22050539 - 12 May 2020
Viewed by 790
Abstract
Shape registration, finding the correct alignment of two sets of data, plays a significant role in computer vision tasks such as object recognition and image analysis. The iterative closest point (ICP) algorithm is one of the best-known and most widely used algorithms in this area. The main purpose of this paper is to combine ICP with the fast-convergent extended Hamiltonian learning (EHL), yielding the so-called EHL-ICP algorithm, to perform planar and spatial rigid shape registration. By treating the registration error as the potential of the extended Hamiltonian system, rigid shape registration is modelled as an optimization problem on the special Euclidean group SE(n) (n = 2, 3). Our method is robust to initial values and parameters. Compared with some state-of-the-art methods, our approach shows better efficiency and accuracy in simulation experiments.
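For context, the alignment step that ICP solves repeatedly has a closed form on SE(2); the sketch below (our own illustration, not the EHL optimization of the paper) recovers a known planar rigid motion from exact correspondences:

```python
import math

def fit_se2(src, dst):
    # Least-squares rigid alignment in the plane: find (theta, t) with
    # R(theta) @ src_i + t ~ dst_i.  Closed-form 2D Procrustes solution.
    n = len(src)
    cx_s = sum(p[0] for p in src) / n; cy_s = sum(p[1] for p in src) / n
    cx_d = sum(p[0] for p in dst) / n; cy_d = sum(p[1] for p in dst) / n
    sxx = sxy = 0.0
    for (x, y), (u, v) in zip(src, dst):
        # Accumulate cross-covariance terms of the centered point sets.
        x -= cx_s; y -= cy_s; u -= cx_d; v -= cy_d
        sxx += x * u + y * v          # "dot" term
        sxy += x * v - y * u          # "cross" term
    theta = math.atan2(sxy, sxx)
    c, s = math.cos(theta), math.sin(theta)
    tx = cx_d - (c * cx_s - s * cy_s)
    ty = cy_d - (s * cx_s + c * cy_s)
    return theta, (tx, ty)

# Recover a known motion from exact correspondences.
theta_true, t_true = 0.6, (1.5, -0.75)
c, s = math.cos(theta_true), math.sin(theta_true)
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0), (3.0, 1.0)]
dst = [(c * x - s * y + t_true[0], s * x + c * y + t_true[1])
       for x, y in src]
theta, t = fit_se2(src, dst)
print(theta, t)   # recovers theta_true and t_true up to roundoff
```

Real registration problems lack known correspondences, which is exactly where iterative schemes such as ICP, and the paper's EHL-based optimization on SE(n), come in.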

Open Access Feature Paper Article
Lie Group Cohomology and (Multi)Symplectic Integrators: New Geometric Tools for Lie Group Machine Learning Based on Souriau Geometric Statistical Mechanics
Entropy 2020, 22(5), 498; https://doi.org/10.3390/e22050498 - 25 Apr 2020
Cited by 3 | Viewed by 1729
Abstract
In this paper, we describe and exploit a geometric framework for Gibbs probability densities and the associated concepts in statistical mechanics, which unifies several earlier works on the subject, including Souriau's symplectic model of statistical mechanics, its polysymplectic extension, the Koszul model, and approaches developed in quantum information geometry. We emphasize the role of equivariance with respect to Lie group actions and the role of several concepts from geometric mechanics, such as momentum maps, Casimir functions, coadjoint orbits, and Lie-Poisson brackets with cocycles, as unifying structures appearing in various applications of this framework to information geometry and machine learning. For instance, we discuss the expression of the Fisher metric in the presence of equivariance, and we exploit the fact that the entropy of the Souriau model is a Casimir function to apply a geometric model for energy-preserving entropy production. We illustrate this framework with several examples, including multivariate Gaussian probability densities and the Bogoliubov-Kubo-Mori metric as a quantum version of the Fisher metric for quantum information on coadjoint orbits. We exploit this geometric setting and Lie group equivariance to present symplectic and multisymplectic variational Lie group integration schemes for some of the equations associated with the Souriau symplectic and polysymplectic models, such as the Lie-Poisson equation with cocycle.
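The role of a Casimir function can be made concrete on the classical free rigid body, whose Lie-Poisson equation dm/dt = m x Omega admits C(m) = |m|^2 as a Casimir. The sketch below (our own toy scheme without cocycle, not the variational integrators of the paper) updates m by a rotation, so the Casimir is preserved to machine precision:

```python
import math

def casimir(m):
    # C(m) = |m|^2 is a Casimir: conserved by any Lie-Poisson flow on so(3)*.
    return m[0] ** 2 + m[1] ** 2 + m[2] ** 2

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def step(m, inertia, h):
    # Free rigid body: dm/dt = m x Omega with Omega_i = m_i / I_i.
    # Freezing Omega over one step, the exact flow rotates m about Omega,
    # which keeps m on its coadjoint orbit (the sphere |m| = const).
    omega = [m[i] / inertia[i] for i in range(3)]
    speed = math.sqrt(sum(w * w for w in omega))
    if speed < 1e-15:
        return list(m)
    k = [w / speed for w in omega]     # unit rotation axis
    phi = -h * speed                   # rotation angle for this step
    c, s = math.cos(phi), math.sin(phi)
    kxm = cross(k, m)
    kdm = sum(k[i] * m[i] for i in range(3))
    return [m[i] * c + kxm[i] * s + k[i] * kdm * (1.0 - c)
            for i in range(3)]

m, inertia = [1.0, 0.2, -0.3], [1.0, 2.0, 3.0]   # made-up initial data
C0 = casimir(m)
for _ in range(5000):
    m = step(m, inertia, 0.01)
print(abs(casimir(m) - C0))   # preserved up to roundoff
```

This only preserves the Casimir (the coadjoint orbit), not the energy; the variational schemes discussed in the paper address such invariants systematically.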
Open Access Article
Black-Scholes Theory and Diffusion Processes on the Cotangent Bundle of the Affine Group
Entropy 2020, 22(4), 455; https://doi.org/10.3390/e22040455 - 17 Apr 2020
Cited by 1 | Viewed by 1211
Abstract
The Black-Scholes partial differential equation (PDE) from mathematical finance has been analysed extensively, and it is well known that the equation can be reduced to a heat equation on Euclidean space by a logarithmic change of variables. However, this paper proposes an alternative interpretation, reframing the PDE as evolving on a Lie group. The equation can then be transformed into a diffusion process and solved using mean and covariance propagation techniques developed previously in the context of solving Fokker-Planck equations on Lie groups. An extension of the Black-Scholes theory with coupled asset dynamics produces a diffusion equation on the affine group, which is not a unimodular group. In this paper, we show that the cotangent bundle of a Lie group, endowed with a semidirect product group operation constructed here for the case of groups with trivial centers, is always unimodular, and that viewing PDEs as diffusion processes on this unimodular cotangent bundle group allows a direct application of previously developed mean and covariance propagation techniques, thereby offering an alternative means of solving the PDEs. Ultimately, these results, presented here in the context of PDEs in mathematical finance, may be applied to PDEs arising in a variety of different fields and may inform new methods of solution.
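For reference, the classical Euclidean route mentioned in the abstract, reducing the Black-Scholes PDE to the heat equation, yields the familiar closed-form prices; the sketch below implements those standard textbook formulas (not the Lie group method of the paper) and checks put-call parity:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    # Closed-form European call price; the logarithmic change of variables
    # that reduces the Black-Scholes PDE to the heat equation yields this.
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def black_scholes_put(S, K, r, sigma, T):
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return K * math.exp(-r * T) * norm_cdf(-d2) - S * norm_cdf(-d1)

# Made-up market parameters for the sanity check.
S, K, r, sigma, T = 100.0, 95.0, 0.05, 0.2, 1.0
call = black_scholes_call(S, K, r, sigma, T)
put = black_scholes_put(S, K, r, sigma, T)
# Put-call parity: C - P = S - K * exp(-r T) must hold exactly.
print(call - put, S - K * math.exp(-r * T))
```

The Lie group reformulation in the paper targets precisely the coupled-asset cases where no such one-dimensional closed form is available.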

Open Access Article
A Bi-Invariant Statistical Model Parametrized by Mean and Covariance on Rigid Motions
Entropy 2020, 22(4), 432; https://doi.org/10.3390/e22040432 - 10 Apr 2020
Viewed by 799
Abstract
This paper aims to describe a statistical model of wrapped densities for bi-invariant statistics on the group of rigid motions of a Euclidean space. Probability distributions on the group are constructed from distributions on tangent spaces and pushed to the group by the exponential map. We provide an expression for the Jacobian determinant of the exponential map of SE(n), which makes it possible to obtain explicit expressions for the densities on the group. Besides having explicit expressions, the strengths of this statistical model are that the densities are parametrized by their moments and are easy to sample from. Unfortunately, we are not able to provide convergence rates for density estimation. We provide instead a numerical comparison between the moment-matching estimators on SE(2) and R^3, which shows similar behaviors.
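The push-forward construction in this abstract can be sketched for SE(2): draw a tangent vector from a Gaussian, map it to the group with the exponential map, and compose with the mean (our own minimal sampler with made-up parameters; evaluating the paper's density values additionally requires the Jacobian determinant of exp):

```python
import math, random

def exp_se2(theta, v1, v2):
    # Exponential map se(2) -> SE(2): returns (R, t) with R a 2x2 rotation.
    if abs(theta) < 1e-12:
        V = [[1.0, 0.0], [0.0, 1.0]]
    else:
        s, c = math.sin(theta), math.cos(theta)
        V = [[s / theta, -(1.0 - c) / theta],
             [(1.0 - c) / theta, s / theta]]
    R = [[math.cos(theta), -math.sin(theta)],
         [math.sin(theta), math.cos(theta)]]
    t = (V[0][0] * v1 + V[0][1] * v2, V[1][0] * v1 + V[1][1] * v2)
    return R, t

def sample_wrapped(mean, stds, rng):
    # Draw a tangent vector from a diagonal Gaussian and push it to the
    # group through exp; composing with the mean gives a wrapped sample.
    xi = [rng.gauss(0.0, s) for s in stds]
    R, t = exp_se2(*xi)
    mR, mt = mean
    R_out = [[sum(mR[i][k] * R[k][j] for k in range(2)) for j in range(2)]
             for i in range(2)]
    t_out = (mR[0][0] * t[0] + mR[0][1] * t[1] + mt[0],
             mR[1][0] * t[0] + mR[1][1] * t[1] + mt[1])
    return R_out, t_out

rng = random.Random(0)
identity = ([[1.0, 0.0], [0.0, 1.0]], (0.0, 0.0))
R, t = sample_wrapped(identity, (0.3, 0.5, 0.5), rng)
# R is a rotation matrix by construction, so the sample lies in SE(2).
print(R, t)
```

By construction every sample is a valid rigid motion, which is the point of sampling in the Lie algebra and pushing forward rather than sampling matrix entries directly.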
