Optimization Theory, Method and Application

A special issue of Mathematics (ISSN 2227-7390).

Deadline for manuscript submissions: 20 November 2024 | Viewed by 4292

Special Issue Editors


Prof. Dr. Chao Zhang
Guest Editor
School of Mathematics and Statistics, Beijing Jiaotong University, Beijing 100044, China
Interests: non-smooth optimization; stochastic programming; data mining

Dr. Yanfang Zhang
Guest Editor
College of Science, Minzu University of China, Beijing 100086, China
Interests: random optimization and its application; random equilibrium; statistical optimization and algorithms

Dr. Yang Zhou
Guest Editor
School of Mathematics and Statistics, Shandong Normal University, Jinan 250061, China
Interests: optimization theory and methods

Prof. Dr. Qiang Ye
Guest Editor
Faculty of Computer Science, Dalhousie University, Halifax, NS B3H 4R2, Canada
Interests: wireless networks; mobile computing; Internet of Things; network security; data analytics

Special Issue Information

Dear Colleagues,

Optimization theory, methods, and applications play an increasingly important role in modern society, addressing challenges that arise in nonsmooth optimization, stochastic programming, integer programming, and related areas. This Special Issue collects recent research on optimality conditions and duality theory, as well as algorithms and applications for difficult optimization problems, including nonsmooth optimization, stochastic programming, variational inequalities, and bilevel programming. Topics of interest include, but are not limited to:

  1. Optimality conditions for various optimization problems;
  2. Duality theory for various optimization problems;
  3. Smoothing algorithms for nonsmooth optimization;
  4. Stochastic approximation algorithms;
  5. Proximal algorithms for nonsmooth optimization;
  6. Cutting plane algorithms;
  7. Applications in economics;
  8. Applications in transportation;
  9. Applications in data mining.

Prof. Dr. Chao Zhang
Dr. Yanfang Zhang
Dr. Yang Zhou
Prof. Dr. Qiang Ye
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • optimality theory
  • duality theory
  • nonsmooth optimization
  • smoothing method
  • stochastic approximation method
  • applications

Published Papers (4 papers)


Research

14 pages, 321 KiB  
Article
First-Order Conditions for Set-Constrained Optimization
by Steven M. Rovnyak, Edwin K. P. Chong and James Rovnyak
Mathematics 2023, 11(20), 4274; https://doi.org/10.3390/math11204274 - 13 Oct 2023
Viewed by 1006
Abstract
A well-known first-order necessary condition for a point to be a local minimizer of a given function is the non-negativity of the dot product of the gradient and a vector in a feasible direction. This paper proposes a series of alternative first-order necessary conditions and corresponding first-order sufficient conditions that seem not to appear in standard texts. The conditions assume a nonzero gradient. The methods use extensions of the notions of gradient, differentiability, and twice differentiability. Examples, including one involving the Karush–Kuhn–Tucker (KKT) theorem, illustrate the scope of the conditions.
(This article belongs to the Special Issue Optimization Theory, Method and Application)
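
As background, the classical condition that the paper generalizes can be stated as follows (a standard textbook formulation, not taken from the article itself):

```latex
% Classical first-order necessary condition for constrained minimization:
% if x* is a local minimizer of f over a set \Omega, then the directional
% derivative along every feasible direction d at x* is non-negative.
\[
  \nabla f(x^{*})^{\mathsf{T}} d \;\ge\; 0
  \qquad \text{for every feasible direction } d \text{ of } \Omega \text{ at } x^{*}.
\]
```

The paper's alternative conditions refine this statement under the additional assumption that the gradient is nonzero.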

20 pages, 1429 KiB  
Article
Sparse Support Tensor Machine with Scaled Kernel Functions
by Shuangyue Wang and Ziyan Luo
Mathematics 2023, 11(13), 2829; https://doi.org/10.3390/math11132829 - 24 Jun 2023
Cited by 1 | Viewed by 1048
Abstract
As one of the supervised tensor learning methods, the support tensor machine (STM) for tensorial data classification is receiving increasing attention in machine learning and related applications, including remote sensing imaging, video processing, and fault diagnosis. Existing STM approaches do not consider data reduction in terms of support tensors. To address this deficiency, we build a novel sparse STM model that controls the number of support tensors in the binary classification of tensorial data. The sparsity is imposed on the dual variables in the feature space, which facilitates nonlinear classification with kernel tricks, such as the widely used Gaussian RBF kernel. To alleviate the local risk associated with the constant width in the tensor Gaussian RBF kernel, we propose a two-stage classification approach: in the second stage, the kernel function is scaled in a data-dependent way, using the information on the support tensors obtained from the first stage. The optimization models in both stages are of the same type, non-convex and discontinuous due to the sparsity constraint. To resolve the computational challenge, a subspace Newton method is tailored to the sparsity-constrained optimization, offering effective computation with local convergence. Numerical experiments on real datasets demonstrate the effectiveness of the proposed two-stage sparse STM approach in terms of classification accuracy, compared with state-of-the-art binary classification approaches.
(This article belongs to the Special Issue Optimization Theory, Method and Application)
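
For concreteness, a tensor Gaussian RBF kernel of the kind used here is an exponential of the squared Frobenius distance between two tensors. The minimal Python sketch below illustrates the idea; the per-sample scaling factors `sx` and `sy` are a generic stand-in for the paper's data-dependent scaling strategy, not the authors' exact formula:

```python
import numpy as np

def tensor_rbf_kernel(X, Y, sigma=1.0):
    """Gaussian RBF kernel between two same-shaped tensors,
    based on the squared Frobenius norm of their difference."""
    d2 = np.sum((X - Y) ** 2)               # squared Frobenius distance
    return np.exp(-d2 / (2.0 * sigma ** 2))

def scaled_tensor_rbf_kernel(X, Y, sigma, sx, sy):
    """Data-dependent scaling: sx and sy are illustrative per-sample
    factors (e.g., derived from first-stage support tensors)."""
    return sx * sy * tensor_rbf_kernel(X, Y, sigma)

# Example: kernel value between two random 3 x 4 x 5 tensors.
rng = np.random.default_rng(0)
X, Y = rng.standard_normal((3, 4, 5)), rng.standard_normal((3, 4, 5))
print(tensor_rbf_kernel(X, Y, sigma=2.0))
```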

24 pages, 543 KiB  
Article
Heavy-Ball-Based Hard Thresholding Pursuit for Sparse Phase Retrieval Problems
by Yingying Li, Jinchuan Zhou, Zhongfeng Sun and Jingyong Tang
Mathematics 2023, 11(12), 2744; https://doi.org/10.3390/math11122744 - 16 Jun 2023
Viewed by 762
Abstract
We introduce a novel iterative algorithm, termed Heavy-Ball-Based Hard Thresholding Pursuit for the sparse phase retrieval problem (SPR-HBHTP), to reconstruct a sparse signal from a small number of magnitude-only measurements. The algorithm combines Hard Thresholding Pursuit for sparse phase retrieval (SPR-HTP) with the classical Heavy-Ball (HB) acceleration method. The robustness and convergence of the proposed algorithm are established with the help of the restricted isometry property. Furthermore, we prove that, provided the measurements are accurate, the algorithm exactly recovers a sparse signal with overwhelming probability in finitely many steps whenever the initialization lies in a neighborhood of the underlying sparse signal. Extensive numerical tests show that SPR-HBHTP has markedly improved recovery performance and runtime compared to existing alternatives such as SPR-HTP, SPARse Truncated Amplitude Flow (SPARTA), and Compressive Phase Retrieval with Alternating Minimization (CoPRAM).
(This article belongs to the Special Issue Optimization Theory, Method and Application)
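
To convey the algorithmic template, the sketch below shows one plausible heavy-ball hard thresholding pursuit loop for amplitude measurements y = |Ax|; the step size `mu`, momentum `beta`, initialization, and debiasing rule are illustrative assumptions, not the authors' exact scheme:

```python
import numpy as np

def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x; zero out the rest."""
    z = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    z[idx] = x[idx]
    return z

def hbhtp_sketch(A, y, s, mu=0.5, beta=0.3, iters=100):
    """Illustrative heavy-ball hard thresholding pursuit for sparse
    phase retrieval with y = |A x| (all parameters are assumptions)."""
    m, n = A.shape
    x_prev = np.zeros(n)
    x = np.random.default_rng(0).standard_normal(n) / np.sqrt(n)
    for _ in range(iters):
        v = x + beta * (x - x_prev)               # heavy-ball extrapolation
        r = A @ v
        grad = A.T @ (r - y * np.sign(r)) / m     # amplitude-flow gradient
        u = hard_threshold(v - mu * grad, s)      # step, then keep s entries
        S = np.nonzero(u)[0]
        x_prev = x
        x = np.zeros(n)
        # Debias: least squares on the detected support, with signs
        # estimated from the current iterate.
        x[S] = np.linalg.lstsq(A[:, S], y * np.sign(A[:, S] @ u[S]),
                               rcond=None)[0]
    return x
```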

6 pages, 247 KiB  
Communication
An Improved Convergence Condition of the MMS Iteration Method for Horizontal LCP of H+-Matrices
by Cuixia Li and Shiliang Wu
Mathematics 2023, 11(8), 1842; https://doi.org/10.3390/math11081842 - 13 Apr 2023
Viewed by 837
Abstract
In this paper, inspired by previous work (Appl. Math. Comput., 369 (2020) 124890), we focus on the convergence condition of the modulus-based matrix splitting (MMS) iteration method for solving the horizontal linear complementarity problem (HLCP) with H+-matrices. We give an improved convergence condition for the MMS iteration method that enlarges its range of applications beyond the result in the cited article.
(This article belongs to the Special Issue Optimization Theory, Method and Application)
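
For readers unfamiliar with the problem class, the HLCP can be written, in one standard formulation, as follows (taking B = I recovers the ordinary LCP):

```latex
% Horizontal linear complementarity problem HLCP(A, B, q):
% given A, B \in \mathbb{R}^{n \times n} and q \in \mathbb{R}^{n},
\[
  \text{find } z, w \in \mathbb{R}^{n} \text{ such that } \quad
  Az = Bw + q, \qquad z \ge 0, \quad w \ge 0, \quad z^{\mathsf{T}} w = 0 .
\]
```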