Open Access Article
Adam Algorithm with Step Adaptation
by Vladimir Krutikov 1,2, Elena Tovbis 1,* and Lev Kazakovtsev 1
1 Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, 31, Krasnoyarskii Rabochii Prospekt, 660037 Krasnoyarsk, Russia
2 Department of Applied Mathematics, Kemerovo State University, 6 Krasnaya Street, 650043 Kemerovo, Russia
* Author to whom correspondence should be addressed.
Algorithms 2025, 18(5), 268; https://doi.org/10.3390/a18050268
Submission received: 19 March 2025 / Revised: 15 April 2025 / Accepted: 1 May 2025 / Published: 4 May 2025
Abstract
Adam (Adaptive Moment Estimation) is a well-known algorithm for the first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments. Computational experiments show that as the condition number of the problem grows, and when noise is present in the gradient, Adam becomes prone to looping, which stems from the difficulty of adjusting its step size. In this paper, a step-adaptation algorithm for the Adam method is proposed. The adaptation scheme is based on reproducing the state attained when the descent direction and the new gradient result from one-dimensional descent: under exact one-dimensional descent, the angle between these directions is right. Under inexact descent, an obtuse angle between the descent direction and the new gradient indicates that the step is too large and should be reduced, while an acute angle indicates that the step is too small and should be increased. For the experimental analysis of the new algorithm, test functions of a given condition number with noise on the gradient, as well as learning problems with mini-batch gradient estimates, were used. The computational experiments show that, on stochastic optimization problems, the proposed Adam modification with step adaptation is significantly more efficient than both the standard Adam algorithm and the other step-adaptation methods studied in this work.