Article

Self-Tuning Lam Annealing: Learning Hyperparameters While Problem Solving

Computer Science, Stockton University, 101 Vera King Farris Dr, Galloway, NJ 08205, USA
Academic Editor: Carlos A. Iglesias
Appl. Sci. 2021, 11(21), 9828; https://doi.org/10.3390/app11219828
Received: 15 September 2021 / Revised: 13 October 2021 / Accepted: 19 October 2021 / Published: 21 October 2021
(This article belongs to the Topic Machine and Deep Learning)
The runtime behavior of Simulated Annealing (SA), like that of other metaheuristics, is controlled by hyperparameters. For SA, hyperparameters govern how "temperature" varies over time, and temperature in turn affects SA's decisions on whether to transition to neighboring states. The hyperparameters typically must be tuned ahead of time. However, there are adaptive annealing schedules that use search feedback to evolve the temperature during the search. A classic and generally effective adaptive annealing schedule is the Modified Lam. Although effective, the Modified Lam can be sensitive to the scale of the cost function and is sometimes slow to converge to its target behavior. In this paper, we present a novel variation of the Modified Lam that we call Self-Tuning Lam, which uses early search feedback to auto-adjust its self-adaptive behavior. Using a variety of discrete and continuous optimization problems, we demonstrate the ability of the Self-Tuning Lam to converge nearly instantaneously to its target behavior, independent of both the scale of the cost function and the run length. Our implementation is integrated into Chips-n-Salsa, an open-source Java library for parallel and self-adaptive local search.
Keywords: Self-Tuning; Simulated Annealing; Modified Lam; hyperparameters; Exponential Moving Average; adaptive search; metaheuristics; self-adaptive; optimization; open-source
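The Modified Lam schedule that this paper builds on tracks an idealized acceptance-rate trajectory and continually nudges the temperature toward it, using an exponential moving average of recent accept/reject decisions as feedback. A minimal Python sketch of that feedback loop, assuming the commonly published piecewise target-rate formula; the function names, the EMA weight, and the 0.999 adjustment factor are illustrative assumptions, not the paper's exact implementation:

```python
def modified_lam_target_rate(t, max_evals):
    """Idealized acceptance-rate target of the Modified Lam schedule:
    decays from 1.0 to ~0.44 over the first 15% of the run, holds at
    0.44 until 65%, then decays toward 0 for the remainder."""
    frac = t / max_evals
    if frac <= 0.15:
        return 0.44 + 0.56 * 560 ** (-frac / 0.15)
    if frac <= 0.65:
        return 0.44
    return 0.44 * 440 ** (-(frac - 0.65) / 0.35)

def update_acceptance_rate(rate, accepted, alpha=0.002):
    """Exponential moving average over accept/reject decisions
    (the EMA weight alpha is an illustrative choice)."""
    return (1 - alpha) * rate + alpha * (1.0 if accepted else 0.0)

def adjust_temperature(temp, accept_rate, target_rate, factor=0.999):
    """Cool slightly when accepting more often than the target rate,
    heat slightly when accepting less often (factor is illustrative)."""
    return temp * factor if accept_rate > target_rate else temp / factor
```

The paper's Self-Tuning Lam addresses two weaknesses of this loop: the tracked acceptance rate can take many iterations to lock onto the target trajectory, and the speed of convergence depends on the scale of the cost function.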
Figure 1

MDPI and ACS Style

Cicirello, V.A. Self-Tuning Lam Annealing: Learning Hyperparameters While Problem Solving. Appl. Sci. 2021, 11, 9828. https://doi.org/10.3390/app11219828
