Article

Mirror Descent and Exponentiated Gradient Algorithms Using Trace-Form Entropies

1 Systems Research Institute of Polish Academy of Sciences, Newelska 6, 01-447 Warsaw, Poland
2 Department of Electrical Engineering, Warsaw University of Technology, Koszykowa 75, 00-662 Warsaw, Poland
3 Department of Electronic and Information Engineering, Tokyo University of Agriculture and Technology, Koganei-shi, Tokyo 184-8588, Japan
4 RIKEN Artificial Intelligence Project (AIP), 1 Chome-4-1, Nihonbashi, Tokyo 103-0027, Japan
5 Sony Computer Science Laboratories, Tokyo 141-0022, Japan
6 Department of Signal Processing and Communications, Universidad de Sevilla, 41092 Seville, Spain
* Authors to whom correspondence should be addressed.
Entropy 2025, 27(12), 1243; https://doi.org/10.3390/e27121243
Submission received: 24 October 2025 / Revised: 27 November 2025 / Accepted: 27 November 2025 / Published: 8 December 2025

Abstract

This paper introduces a broad class of Mirror Descent (MD) and Generalized Exponentiated Gradient (GEG) algorithms derived from trace-form entropies defined via deformed logarithms. Leveraging these generalized entropies yields MD and GEG algorithms with improved convergence behavior, robustness against vanishing and exploding gradients, and inherent adaptability to non-Euclidean geometries through mirror maps. We establish deep connections between these methods and Amari’s natural gradient, revealing a unified geometric foundation for additive, multiplicative, and natural gradient updates. Focusing on the Tsallis, Kaniadakis, Sharma–Taneja–Mittal, and Kaniadakis–Lissia–Scarfone entropy families, we show that each entropy induces a distinct Riemannian metric on the parameter space, leading to GEG algorithms that preserve the natural statistical geometry. The tunable parameters of deformed logarithms enable adaptive geometric selection, providing enhanced robustness and convergence over classical Euclidean optimization. Overall, our framework unifies key first-order MD optimization methods under a single information-geometric perspective based on generalized Bregman divergences, where the choice of entropy determines the underlying metric and dual geometric structure.
Keywords: mirror descent; natural gradient; information geometry; deformed logarithms; generalized exponentiated gradient; Bregman divergences; Riemannian optimization; (q,κ)-algebra
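To make the abstract's central idea concrete, the sketch below illustrates a generalized exponentiated gradient update built from the Tsallis q-deformed logarithm: parameters are mapped to a dual space by ln_q, updated additively, and mapped back by its inverse exp_q, recovering the classical multiplicative EG update as q → 1. This is an illustrative toy example on a simple quadratic objective, not the paper's exact algorithms; the function names, the test problem, and the step size are assumptions made here for demonstration.

```python
import numpy as np

def ln_q(x, q):
    # Tsallis q-deformed logarithm; reduces to the natural log as q -> 1
    if q == 1.0:
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def exp_q(x, q):
    # Inverse of ln_q; the base is clipped so it stays positive
    if q == 1.0:
        return np.exp(x)
    return np.maximum(1.0 + (1.0 - q) * x, 1e-12) ** (1.0 / (1.0 - q))

def geg_step(theta, grad, lr, q):
    # Generalized EG / mirror-descent step: map to the dual space via
    # ln_q, take an additive gradient step, and map back via exp_q
    return exp_q(ln_q(theta, q) - lr * grad, q)

# Toy problem (an assumption for illustration): minimize
# f(theta) = 0.5 * ||theta - target||^2 over positive theta
target = np.array([0.2, 1.0, 3.0])
theta = np.ones(3)
for _ in range(500):
    grad = theta - target          # gradient of the quadratic objective
    theta = geg_step(theta, grad, lr=0.1, q=0.7)
print(np.round(theta, 3))          # iterates should approach the target
```

Because the update acts multiplicatively on the primal variables, the iterates remain strictly positive without projection, which is the practical appeal of EG-type methods; the parameter q tunes the geometry between the Euclidean (additive) and the classical exponentiated-gradient regimes.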

Share and Cite

MDPI and ACS Style

Cichocki, A.; Tanaka, T.; Nielsen, F.; Cruces, S. Mirror Descent and Exponentiated Gradient Algorithms Using Trace-Form Entropies. Entropy 2025, 27, 1243. https://doi.org/10.3390/e27121243


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
