Editorial

Special Issue “Nonsmooth Optimization in Honor of the 60th Birthday of Adil M. Bagirov”: Foreword by Guest Editors

Napsu Karmitsa 1,* and Sona Taheri 2
1 Department of Mathematics and Statistics, University of Turku, FI-20014 Turku, Finland
2 School of Science, RMIT University, Melbourne, VIC 3001, Australia
* Author to whom correspondence should be addressed.
Algorithms 2020, 13(11), 282; https://doi.org/10.3390/a13110282
Submission received: 4 November 2020 / Accepted: 4 November 2020 / Published: 7 November 2020

Abstract

Nonsmooth optimization refers to the general problem of minimizing (or maximizing) functions that have discontinuous gradients. This Special Issue contains six research articles that bring together the most recent techniques and applications in the area of nonsmooth optimization. These include novel techniques utilizing decomposable structures in nonsmooth problems, for instance, the difference-of-convex (DC) structure, as well as important practical problems such as multiple instance learning, the hydrothermal unit-commitment problem, and scheduling the disposal of nuclear waste.

1. Introduction

In this Special Issue, we take the opportunity to acknowledge the outstanding contributions of Professor Adil Bagirov (Figure 1) to nonsmooth optimization (NSO), in both its theoretical foundations and its practical aspects, during his 35-year research career. The issue brings together the most recent techniques and applications in the area of NSO. It contains six excellent research papers by well-known mathematicians. Some of the authors have at some point collaborated with Adil Bagirov, and all of them would like to show their respect for him and his work.
Adil Bagirov received a master’s degree in Applied Mathematics from Baku State University, Azerbaijan, in 1983, and the Candidate of Sciences degree in Mathematical Cybernetics from the Institute of Cybernetics of the Azerbaijan National Academy of Sciences in 1989. He then worked at the Space Research Institute (Baku, Azerbaijan), Baku State University (Baku, Azerbaijan), and the Joint Institute for Nuclear Research (Moscow, Russia) until 1998.
Bagirov has been with Federation University Australia since 1999. He completed his PhD in Optimization under the supervision of Professor Alexander Rubinov at Federation University Australia (formerly the University of Ballarat) in 2002, and he currently holds a full Professor position at this university. Professor Bagirov has contributed exceptionally to NSO and its applications to real-life problems. These contributions include books on NSO [1] and its applications in clustering [2], an edited book on NSO methods [3], and more than 170 journal papers, book chapters, and conference papers in the area of NSO and its applications (see, e.g., [4,5,6,7,8,9,10,11,12]). He has also supervised more than 28 PhD students.
Professor Bagirov has been successful in securing five grants from the Australian Research Council’s Discovery and Linkage schemes to conduct research in nonsmooth and global optimization and their applications. He was awarded the Australian Research Council Postdoctoral Fellowship and the Australian Research Council Research Fellowship. In addition, he is the EUROPT Fellow 2009.
The Guest Editors are grateful to Professor Adil Bagirov, with whom they have had the privilege to do research in the area of NSO and its real-life applications. On behalf of the journal, the Guest Editors wish him all the best in his career and personal life.

2. Nonsmooth Optimization

NSO refers to the general problem of minimizing (or maximizing) functions that have discontinuous gradients. These types of functions arise in many applied fields, for instance, in image denoising, optimal shape design, computational chemistry and physics, water management, cyber security, machine learning, and data mining, including cluster analysis, classification, and regression. In most of these applications the number of decision variables is very large, and NSO formulations allow us to reduce it significantly. Thus, the application of NSO approaches facilitates the design of efficient algorithms for the solution of such problems, the more realistic modeling of various real-world problems, the robust formulation of a system, and even the solving of difficult smooth (continuously differentiable) problems that require reducing the problem’s size or simplifying its structure. These are some of the main reasons for the increased interest in nonsmooth analysis and optimization during the past few years. This Special Issue collects some of the most recent methods in NSO and its applications. These include novel techniques for solving NSO problems utilizing, for instance, the decomposable difference-of-convex (DC) structure of the objective, a nonsmooth Gauss-Newton algorithm, and biased-randomized algorithms, as well as interesting practical problems such as multiple instance learning, the hydrothermal unit-commitment problem, and scheduling the disposal of nuclear waste.
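To make the notion of a discontinuous gradient concrete, the following minimal sketch applies a textbook subgradient method to the convex but nonsmooth function f(x) = |x1| + 2|x2|. It is an illustration only and is not taken from any paper in this issue; the function, step-size rule, and iteration count are arbitrary choices.

import numpy as np

# A plain subgradient method on f(x) = |x1| + 2|x2|, whose gradient is
# discontinuous on the coordinate axes (illustrative example only).

def f(x):
    return abs(x[0]) + 2.0 * abs(x[1])

def subgrad(x):
    # Any element of the subdifferential will do; sign(0) = 0 is a valid choice here.
    return np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])

x = np.array([3.0, -2.0])
for k in range(1, 201):
    x = x - (1.0 / k) * subgrad(x)   # diminishing step sizes give convergence in function value

print(x, f(x))                       # x ends up close to the minimizer (0, 0)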
In the first article, “A Mixed-Integer and Asynchronous Level Decomposition with Application to the Stochastic Hydrothermal Unit-Commitment Problem” by Bruno Colonetti, Erlon Cristian Finardi and Welington de Oliveira [13], the authors develop an efficient algorithm for solving unit-commitment (UC) problems under uncertainty. The efficiency of the algorithm rests on a novel asynchronous level decomposition of the UC problem and on the parallelization of the algorithm.
In the second article, “On a Nonsmooth Gauss-Newton Algorithm for Solving Nonlinear Complementarity Problems” by Marek J. Śmietański [14], the author proposes a new nonsmooth version of the generalized damped Gauss-Newton method for solving nonlinear complementarity problems. In the proposed algorithm, the Bouligand differential plays the role of the derivative. The author presents two versions of the algorithm (usual and inexact), both of which enjoy superlinear and global convergence in the semismooth case.
In the article “Polyhedral DC Decomposition and DCA Optimization of Piecewise Linear Functions” by Andreas Griewank and Andrea Walther [15], the abs-linear representation of piecewise linear functions is extended, yielding their DC decomposition as well as a pair of generalized gradients that can be computed using the reverse mode of algorithmic differentiation. The DC decomposition and the two subgradients are used to drive DCA algorithms in which the (convex) inner problem can be solved in finitely many iterations and the gradients of the concave part can be updated using a reflection technique.
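For readers unfamiliar with DCA, the following sketch shows only the generic iteration on a tiny, made-up polyhedral DC function f = g - h: the concave part -h is linearized at a subgradient of h, and the resulting convex subproblem is solved. This is not the abs-linear machinery of [15]; the functions and bounds below are hypothetical.

from scipy.optimize import minimize_scalar

# Generic DCA on a toy polyhedral DC function f = g - h (illustrative data only):
#   g(x) = max(2x - 1, -x)  and  h(x) = 0.5 * |x - 1|,  both convex and piecewise linear.

def g(x):
    return max(2.0 * x - 1.0, -x)

def h(x):
    return 0.5 * abs(x - 1.0)

def h_subgrad(x):
    return 0.5 if x >= 1.0 else -0.5       # one valid subgradient of h

x = 4.0
for _ in range(20):
    y = h_subgrad(x)                        # linearize the concave part -h at the current point
    # Convex subproblem: minimize g(t) - y*t; in one dimension a bounded search suffices.
    x = minimize_scalar(lambda t: g(t) - y * t, bounds=(-10.0, 10.0), method="bounded").x

print(x, g(x) - h(x))                       # converges to the critical point x = 1/3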
The fourth article, “On the Use of Biased-Randomized Algorithms for Solving Non-Smooth Optimization Problems” by Angel Alejandro Juan, Canan Gunes Corlu, Rafael David Tordecilla, Rocio de la Torre and Albert Ferrer [16], introduces the use of biased-randomized algorithms as an effective methodology to cope with NP-hard and NSO problems arising in many practical applications, in particular those involving so-called soft constraints. Biased-randomized algorithms extend constructive heuristics by introducing a nonuniform randomization pattern into them. Thus, they can be used to explore promising areas of the solution space without the limitations of gradient-based approaches, which assume the existence of a smooth objective. A minimal sketch of this mechanism is given below.
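As a rough illustration of the idea, and not an example from [16], a greedy constructive heuristic for a small knapsack-type problem can be biased-randomized by drawing the next element from the greedy-sorted candidate list with a distribution skewed towards the top. The instance data and the bias parameter beta below are hypothetical.

import math
import random

# Biased-randomized constructive heuristic on a toy knapsack instance (hypothetical data):
# instead of always taking the greedy-best candidate, the next item is drawn with a
# geometric-like bias towards the top of the greedy-sorted list.

def biased_index(n, beta=0.3):
    # Index 0 (the greedy choice) is the most likely pick; larger beta means greedier behavior.
    return int(math.log(1.0 - random.random()) / math.log(1.0 - beta)) % n

items = [("a", 9, 4), ("b", 8, 5), ("c", 6, 2), ("d", 5, 3), ("e", 3, 1)]  # (name, value, weight)
budget = 7

best_value, best_subset = 0, []
for _ in range(200):                               # multi-start: repeat the biased construction
    candidates = sorted(items, key=lambda it: it[1] / it[2], reverse=True)
    chosen, weight, value = [], 0, 0
    while candidates:
        item = candidates.pop(biased_index(len(candidates)))
        if weight + item[2] <= budget:
            chosen.append(item[0])
            weight += item[2]
            value += item[1]
    if value > best_value:
        best_value, best_subset = value, chosen

print(best_value, best_subset)                     # best solution found over the restarts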
In the fifth article, “Planning the Schedule for the Disposal of the Spent Nuclear Fuel with Interactive Multiobjective Optimization” by Outi Montonen, Timo Ranta and Marko M. Mäkelä [17], the very important problem of scheduling nuclear waste disposal is modelled as a multiobjective mixed-integer nonlinear NSO problem with nine objectives to be minimized. A novel method using two-slope parameterized achievement scalarizing functions is introduced for solving this problem, and a case study based on the disposal in Finland is given.
Finally, the article “SVM-Based Multiple Instance Classification via DC Optimization” by Annabella Astorino, Antonio Fuduli, Giovanni Giallombardo and Giovanna Miglionico [18] considers the binary classification variant of the multiple instance learning problem. The problem is formulated as a nonconvex unconstrained NSO problem with a DC objective function, and an appropriate nonsmooth DC algorithm is used to solve it.
The Guest Editors would like to thank all the authors for their contributions to this Special Issue. They would also like to thank the reviewers for their timely and insightful comments on the submitted articles, as well as the editorial staff of the MDPI journal Algorithms for their assistance in managing the review process promptly.

Funding

This work was financially supported by the Academy of Finland, grant #289500.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Bagirov, A.M.; Karmitsa, N.; Mäkelä, M.M. Introduction to Nonsmooth Optimization: Theory, Practice and Software; Springer: Cham, Switzerland, 2014.
  2. Bagirov, A.M.; Karmitsa, N.; Taheri, S. Partitional Clustering via Nonsmooth Optimization: Clustering via Optimization; Springer: Cham, Switzerland, 2020.
  3. Bagirov, A.M.; Gaudioso, M.; Karmitsa, N.; Mäkelä, M.M.; Taheri, S. Numerical Nonsmooth Optimization: State of the Art Algorithms; Springer: Cham, Switzerland, 2020.
  4. Bagirov, A.M. Modified global k-means algorithm for sum-of-squares clustering problems. Pattern Recognit. 2008, 41, 3192–3199.
  5. Bagirov, A.M.; Clausen, C.; Kohler, M. Estimation of a regression function by maxima of minima of linear functions. IEEE Trans. Inf. Theory 2009, 55, 833–845.
  6. Bagirov, A.M.; Karasozen, B.; Sezer, M. Discrete gradient method: Derivative-free method for nonsmooth optimization. J. Optim. Theory Appl. 2008, 137, 317–334.
  7. Bagirov, A.M.; Mahmood, A.; Barton, A. Prediction of monthly rainfall in Victoria, Australia: Clusterwise linear regression approach. Atmos. Res. 2017, 188, 20–29.
  8. Bagirov, A.M.; Rubinov, A.M.; Zhang, J. A multidimensional descent method for global optimization. Optimization 2009, 58, 611–625.
  9. Bagirov, A.M.; Taheri, S.; Karmitsa, N.; Joki, K.; Mäkelä, M.M. Aggregate subgradient method for nonsmooth DC optimization. Optim. Lett. 2020, in press.
  10. Bagirov, A.M.; Taheri, S.; Ugon, J. Nonsmooth DC programming approach to the minimum sum-of-squares clustering problems. Pattern Recognit. 2016, 53, 12–24.
  11. Joki, K.; Bagirov, A.M.; Karmitsa, N.; Mäkelä, M.M.; Taheri, S. Double bundle method for finding Clarke stationary points in nonsmooth DC programming. SIAM J. Optim. 2018, 28, 1892–1919.
  12. Karmitsa, N.; Bagirov, A.M.; Taheri, S. New diagonal bundle method for clustering problems in large data sets. Eur. J. Oper. Res. 2017, 263, 367–379.
  13. Colonetti, B.; Finardi, E.C.; de Oliveira, W. A Mixed-Integer and Asynchronous Level Decomposition with Application to the Stochastic Hydrothermal Unit-Commitment Problem. Algorithms 2020, 13, 235.
  14. Śmietański, M.J. On a Nonsmooth Gauss–Newton Algorithms for Solving Nonlinear Complementarity Problems. Algorithms 2020, 13, 190.
  15. Griewank, A.; Walther, A. Polyhedral DC Decomposition and DCA Optimization of Piecewise Linear Functions. Algorithms 2020, 13, 166.
  16. Juan, A.A.; Corlu, C.G.; Tordecilla, R.D.; de la Torre, R.; Ferrer, A. On the Use of Biased-Randomized Algorithms for Solving Non-Smooth Optimization Problems. Algorithms 2020, 13, 8.
  17. Montonen, O.; Ranta, T.; Mäkelä, M.M. Planning the Schedule for the Disposal of the Spent Nuclear Fuel with Interactive Multiobjective Optimization. Algorithms 2019, 12, 252.
  18. Astorino, A.; Fuduli, A.; Giallombardo, G.; Miglionico, G. SVM-Based Multiple Instance Classification via DC Optimization. Algorithms 2019, 12, 249.
Figure 1. Professor Adil Bagirov.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
