Current Topics in Optimization, Inequalities and Convex Function Theory

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "E: Applied Mathematics".

Deadline for manuscript submissions: 31 December 2025 | Viewed by 5065

Special Issue Editors


Dr. Sorin Rădulescu
Guest Editor
Department of Mathematics, University of Craiova, 200585 Craiova, Romania
Interests: operations research; nonlinear functional analysis; differential equations; real analysis; numerical analysis; matrix theory

Dr. Ovidiu Bagdasar
Guest Editor
Department of Electronics, Computing and Mathematics, University of Derby, Derby, UK
Interests: optimization; computational mathematics; recurrence sequences and mathematical modelling; mathematical education and mathematical anxiety

Special Issue Information

Dear Colleagues,

Over the past decade, numerous optimization techniques have been proposed to solve a wide range of complex problems, and many new properties of convex functions and new inequalities have been discovered. Convex functions play an important role in many areas of mathematics, as well as in other areas of science, economics, engineering, medicine, industry, and business. They are especially important in the study of optimization problems, since they enjoy a great number of convenient properties. The optimization of convex functions has many practical applications (circuit design, controller design, modeling, etc.).
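To fix notation (a standard reminder, not part of the original call), the defining inequality of convexity and its key consequence for optimization read:

```latex
% f : C -> R on a convex set C is convex if, for all x, y in C and t in [0, 1],
\[
  f\bigl(t x + (1 - t)\, y\bigr) \;\le\; t\, f(x) + (1 - t)\, f(y).
\]
% Key consequence for optimization: every local minimizer of a convex
% function over a convex set is a global minimizer.
```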

We invite you to submit your research in the areas of optimization, inequalities, and convex function theory and their applications to this Special Issue, “Current Topics in Optimization, Inequalities and Convex Function Theory”, in the journal Mathematics.

Dr. Sorin Rădulescu
Dr. Ovidiu Bagdasar
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, you can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • convex functions
  • biconvex functions
  • generalized convexity
  • subharmonic functions
  • absolute monotone functions
  • majorization theory
  • Schur convex functions
  • Jensen inequality
  • Hermite–Hadamard inequality
  • weighted inequalities
  • control theory
  • mathematical programming
  • differential inequalities
  • variational inequalities
  • equilibrium problems
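For orientation, two of the classical inequalities named in the list above, stated for a convex function $f$:

```latex
% Jensen's inequality: for weights w_i >= 0 with w_1 + ... + w_n = 1,
\[
  f\Bigl(\sum_{i=1}^{n} w_i x_i\Bigr) \;\le\; \sum_{i=1}^{n} w_i\, f(x_i).
\]
% Hermite--Hadamard inequality: for f convex on [a, b],
\[
  f\Bigl(\frac{a + b}{2}\Bigr)
  \;\le\; \frac{1}{b - a}\int_a^b f(x)\,dx
  \;\le\; \frac{f(a) + f(b)}{2}.
\]
```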

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (6 papers)


Research

27 pages, 331 KiB  
Article
Some Bounds for the Generalized Spherical Numerical Radius of Operator Pairs with Applications
by Najla Altwaijry, Silvestru Sever Dragomir, Kais Feki and Shigeru Furuichi
Mathematics 2025, 13(7), 1199; https://doi.org/10.3390/math13071199 - 5 Apr 2025
Viewed by 148
Abstract
This paper investigates a generalization of the spherical numerical radius for a pair $(B,C)$ of bounded linear operators on a complex Hilbert space $H$. The generalized spherical numerical radius is defined as
\[
  w_p(B,C) := \sup_{x \in H,\ \|x\| = 1} \Bigl( |\langle Bx, x\rangle|^p + |\langle Cx, x\rangle|^p \Bigr)^{1/p}, \qquad p \ge 1.
\]
We derive lower bounds for $w_p^2(B,C)$ involving combinations of $B$ and $C$, where $p > 1$. Additionally, we establish upper bounds in terms of operator norms. Applications include the cases where $(B,C) = (A, A^*)$, with $A^*$ denoting the adjoint of a bounded linear operator $A$, and $(B,C) = (\Re(A), \Im(A))$, representing the real and imaginary parts of $A$, respectively. We also explore applications to the so-called Davis–Wielandt $p$-radius for $p \ge 1$, which serves as a natural generalization of the classical Davis–Wielandt radius for Hilbert-space operators.
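As a quick illustration (ours, not the paper's), the definition above can be probed numerically for matrix pairs: sampling unit vectors yields lower-bound estimates of $w_p(B,C)$, since the supremum is approached from below. The function name and sampling scheme below are hypothetical.

```python
import numpy as np

def spherical_radius_estimate(B, C, p=2.0, n_samples=100_000, seed=0):
    """Monte Carlo lower-bound estimate of w_p(B, C) for complex matrices:
    sample unit vectors x and take the largest value of
    (|<Bx, x>|^p + |<Cx, x>|^p)^(1/p)."""
    rng = np.random.default_rng(seed)
    n = B.shape[0]
    best = 0.0
    for _ in range(n_samples):
        x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
        x /= np.linalg.norm(x)
        bx = abs(np.vdot(x, B @ x))      # |<Bx, x>|
        cx = abs(np.vdot(x, C @ x))      # |<Cx, x>|
        best = max(best, (bx**p + cx**p) ** (1.0 / p))
    return best

# The pair (B, C) = (A, A*) from the abstract, for a small test matrix A.
A = np.array([[1.0, 2.0j, 0.0], [0.0, -1.0, 1.0], [1.0j, 0.0, 2.0]])
print(spherical_radius_estimate(A, A.conj().T, p=2.0))
```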
17 pages, 270 KiB  
Article
Neural Networks as Positive Linear Operators
by George A. Anastassiou
Mathematics 2025, 13(7), 1112; https://doi.org/10.3390/math13071112 - 28 Mar 2025
Viewed by 255
Abstract
Basic neural network operators are interpreted as positive linear operators, and the related general theory applies to them. These operators are induced by a symmetrized density function derived from the parametrized and deformed hyperbolic tangent activation function. I work in the space of continuous functions from a compact interval of the real line to the reals, and study quantitatively the rate of convergence of these neural network operators to the unit operator. The studied inequalities involve the modulus of continuity of the function under approximation or of its derivative. I produce uniform and $L_p$ ($p \ge 1$) approximation results via these inequalities. The convexity of functions is also taken into consideration.
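As a rough illustration (a generic construction, not the paper's specific operator), a positive linear neural-network operator can be built from a bell function obtained as a difference of shifted tanh activations; the symmetrized, parametrized and deformed density used in the paper differs in detail.

```python
import numpy as np

def tanh_bell(x, lam=1.0):
    """Bell-shaped 'density' built from a (possibly deformed) tanh activation:
    a difference of shifted activations gives a positive, even bump."""
    return 0.25 * (np.tanh(lam * (x + 1)) - np.tanh(lam * (x - 1)))

def nn_operator(f, x, n=50):
    """Positive linear neural-network operator on [0, 1]:
      (A_n f)(x) = sum_k f(k/n) phi(n x - k) / sum_k phi(n x - k).
    The normalization makes A_n positive, linear, and constant-preserving."""
    k = np.arange(-n, 2 * n + 1)             # nodes k/n covering [0, 1]
    phi = tanh_bell(n * np.asarray(x) - k[:, None])
    return (f(k / n)[:, None] * phi).sum(axis=0) / phi.sum(axis=0)

# The sup-norm error shrinks as n grows, at a rate governed by the
# modulus of continuity of f, in the spirit of the paper's estimates.
xs = np.linspace(0, 1, 201)
for n in (10, 100):
    err = np.max(np.abs(nn_operator(np.sin, xs, n=n) - np.sin(xs)))
    print(f"n = {n:4d}  sup-norm error = {err:.2e}")
```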
16 pages, 271 KiB  
Article
Triple Mann Iteration Method for Variational Inclusions, Equilibria, and Common Fixed Points of Finitely Many Quasi-Nonexpansive Mappings on Hadamard Manifolds
by Lu-Chuan Ceng, Yun-Yi Huang, Si-Ying Li and Jen-Chih Yao
Mathematics 2025, 13(3), 444; https://doi.org/10.3390/math13030444 - 28 Jan 2025
Viewed by 664
Abstract
In this paper, we introduce a triple Mann iteration method for approximating an element of the set of common solutions of a system of quasivariational inclusion problems, an equilibrium problem, and a common fixed point problem (CFPP) of finitely many quasi-nonexpansive operators on a Hadamard manifold. Under suitable assumptions, we prove that the sequence constructed by the suggested algorithm converges to an element of the set of common solutions. Finally, making use of the main result, we treat the minimization problem with a CFPP constraint and the saddle point problem with a CFPP constraint on a Hadamard manifold, respectively.
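For orientation, the classical one-step Mann scheme in a Hilbert space is sketched below; on a Hadamard manifold the convex combination is replaced by a point on the geodesic joining $x_n$ and $T(x_n)$. This is a generic sketch, not the authors' triple scheme.

```python
import numpy as np

def mann_iteration(T, x0, alpha, tol=1e-10, max_iter=10_000):
    """Mann iteration x_{n+1} = (1 - a_n) x_n + a_n T(x_n) for a
    quasi-nonexpansive map T; converges (weakly, in general) to a fixed
    point under standard conditions on the sequence a_n."""
    x = np.asarray(x0, dtype=float)
    for n in range(max_iter):
        a = alpha(n)
        x_next = (1.0 - a) * x + a * T(x)
        if np.linalg.norm(x_next - x) < tol:
            return x_next, n
        x = x_next
    return x, max_iter

# Example: T = metric projection onto the unit ball (nonexpansive);
# starting outside, the iterates converge to the nearest fixed point.
T = lambda x: x / max(1.0, np.linalg.norm(x))
x_star, n_iters = mann_iteration(T, x0=[3.0, 4.0], alpha=lambda n: 0.5)
print(x_star, n_iters)   # approximately [0.6, 0.8]
```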
32 pages, 1291 KiB  
Article
A Subgradient Extragradient Framework Incorporating a Relaxation and Dual Inertial Technique for Variational Inequalities
by Habib ur Rehman, Kanokwan Sitthithakerngkiet and Thidaporn Seangwattana
Mathematics 2025, 13(1), 133; https://doi.org/10.3390/math13010133 - 31 Dec 2024
Viewed by 712
Abstract
This paper presents an enhanced algorithm designed to solve variational inequality problems involving a pseudomonotone and Lipschitz continuous operator in real Hilbert spaces. The method integrates a dual inertial extrapolation step, a relaxation step, and the subgradient extragradient technique, resulting in faster convergence than existing inertia-based subgradient extragradient methods. A key feature of the algorithm is its ability to achieve weak convergence without requiring a prior estimate of the operator's Lipschitz constant. Our method encompasses a range of subgradient extragradient techniques with inertial extrapolation steps as particular cases. Moreover, the inertia in our algorithm is more flexible, chosen from the interval $[0,1]$. We establish R-linear convergence under the added hypotheses of strong pseudomonotonicity and Lipschitz continuity. Numerical findings are presented to showcase the algorithm's effectiveness, highlighting its computational efficiency and practical relevance. A notable conclusion is that using double inertial extrapolation steps, as opposed to the single step commonly seen in the literature, provides substantial advantages for variational inequalities.
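To fix ideas, here is a minimal Euclidean sketch of an inertial subgradient extragradient step (a single inertial term and a fixed step size for simplicity; the paper's algorithm adds a second inertial term, a relaxation step, and an adaptive step-size rule). All names and parameter choices below are illustrative.

```python
import numpy as np

def inertial_subgrad_extragradient(F, proj_C, x0, tau=0.1, theta=0.2, iters=500):
    """Sketch of an inertial subgradient extragradient method:
      w_n = x_n + theta (x_n - x_{n-1})          # inertial extrapolation
      y_n = P_C(w_n - tau F(w_n))                # extragradient step
      x_{n+1} = P_{T_n}(w_n - tau F(y_n)),       # cheap half-space projection
    where T_n = {z : <w_n - tau F(w_n) - y_n, z - y_n> <= 0} contains C."""
    x_prev = x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        w = x + theta * (x - x_prev)
        y = proj_C(w - tau * F(w))
        v = w - tau * F(w) - y                   # outward normal of T_n at y
        z = w - tau * F(y)
        step = max(v @ (z - y), 0.0) / max(v @ v, 1e-16)
        x_prev, x = x, z - step * v              # projection of z onto T_n
    return x

# Example: F(x) = A x + b with monotone A, over the nonnegative orthant.
A = np.array([[2.0, 1.0], [-1.0, 2.0]])          # A + A^T is positive definite
b = np.array([-1.0, -1.0])
F = lambda x: A @ x + b
proj_C = lambda x: np.maximum(x, 0.0)
print(inertial_subgrad_extragradient(F, proj_C, x0=np.zeros(2)))  # ~ [0.2, 0.6]
```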
20 pages, 303 KiB  
Article
A New Algorithm for Variational Inequality Problems in CAT(0) Spaces
by Amna Kalsoom, Maliha Rashid, Ovidiu Bagdasar and Zaib Un Nisa
Mathematics 2024, 12(14), 2193; https://doi.org/10.3390/math12142193 - 12 Jul 2024
Viewed by 824
Abstract
Numerous strong and weak convergence results on variational inequality problems are known in the literature. Here, we study a variational inequality problem using the viscosity approximation method in the nonlinear CAT(0) space setting, where novel theorems are established for strong and $\Delta$-convergent sequences.
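For comparison, the Hilbert-space version of the viscosity approximation method is sketched below; in a CAT(0) space the convex combination becomes a geodesic point. This is a generic illustration, not the algorithm of the paper.

```python
import numpy as np

def viscosity_iteration(T, f, x0, iters=2000):
    """Viscosity approximation x_{n+1} = a_n f(x_n) + (1 - a_n) T(x_n),
    with a contraction f, a nonexpansive T, a_n -> 0 and sum a_n = infinity;
    the limit x* satisfies x* = P_{Fix(T)}(f(x*))."""
    x = np.asarray(x0, dtype=float)
    for n in range(1, iters + 1):
        a = 1.0 / (n + 1)                  # a_n -> 0, sum diverges
        x = a * f(x) + (1.0 - a) * T(x)
    return x

# Example: T = projection onto the unit ball; f contracts toward (2, 0).
T = lambda x: x / max(1.0, np.linalg.norm(x))
f = lambda x: 0.5 * x + np.array([1.0, 0.0])
print(viscosity_iteration(T, f, x0=[0.0, 3.0]))   # approaches (1, 0)
```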
15 pages, 268 KiB  
Article
Average Widths and Optimal Recovery of Multivariate Besov Classes in Orlicz Spaces
by Xinxin Li and Garidi Wu
Mathematics 2024, 12(9), 1400; https://doi.org/10.3390/math12091400 - 3 May 2024
Viewed by 823
Abstract
In this paper, we study the average Kolmogorov $\sigma$-widths and the average linear $\sigma$-widths of multivariate isotropic and anisotropic Besov classes in Orlicz spaces and give weak asymptotic estimates of these two widths. We also give the asymptotic behavior of the optimal recovery of isotropic Besov classes in Orlicz spaces.