Abstract
In this paper, using the parametric technique, we study a class of fractional optimization problems involving data uncertainty in the objective functional. We formulate and prove the robust Karush-Kuhn-Tucker necessary optimality conditions and establish their sufficiency under convexity and/or concavity assumptions on the involved functionals. In addition, an illustrative example completes the study.
Keywords:
convexity; concavity; fractional optimization problem; robust necessary optimality conditions; robust sufficient optimality conditions; robust optimal solution
MSC:
26B25; 49J20; 90C32
1. Introduction
Optimization theory is widely used in concrete problems arising in decision theory, game theory, economics, data classification, production inventory, and portfolio selection. Since data in real-life problems are most often obtained by measurement or estimation, errors are bound to occur, and accumulated errors may cause the computed results to contradict the original problem. To overcome this shortcoming, the use of interval analysis, fuzzy numbers, and robust techniques to represent data has become a popular research direction in recent years.
A fractional optimization problem optimizes the ratio of two objective (cost) functionals. Dinkelbach [1] and Jagannathan [2] formulated a parametric technique for studying a fractional optimization problem by converting it into an equivalent non-fractional optimization problem. Over time, many researchers have used this technique to investigate various classes of fractional optimization problems. In this direction, we mention the works of Antczak and Pitea [3], Mititelu [4], Mititelu and Treanţă [5], Treanţă and Mititelu [6], and Antczak [7]. Guo et al. [8,9] proposed the symmetric gH-derivative and its applications to dual interval-valued optimization problems. For other connected ideas on this topic, interested readers are directed to Nahak [10], Patel [11], Kim and Kim [12,13,14], Manesh et al. [15], and references therein.
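The parametric technique mentioned above can be sketched in a simple finite-dimensional setting. The following is an illustrative implementation of Dinkelbach's iterative scheme, not the paper's method for curvilinear integral functionals; the objective ratio, the feasible grid, and all names are hypothetical stand-ins chosen for the example.

```python
def dinkelbach(f, g, xs, tol=1e-8, max_iter=100):
    """Dinkelbach's parametric method: minimize f(x)/g(x) over the
    candidate points xs (with g > 0 on xs) by solving a sequence of
    non-fractional subproblems min_x f(x) - lam * g(x)."""
    lam = 0.0
    x = xs[0]
    for _ in range(max_iter):
        # Solve the parametric subproblem by exhaustive search over xs.
        x = min(xs, key=lambda t: f(t) - lam * g(t))
        value = f(x) - lam * g(x)
        if abs(value) < tol:       # zero optimal value characterizes lam*
            return x, lam
        lam = f(x) / g(x)          # Dinkelbach update of the parameter
    return x, lam

# Hypothetical example: minimize (x^2 + 1)/x on [0.5, 3]; the minimum
# ratio is 2, attained at x = 1.
xs = [0.5 + i * 1e-4 for i in range(25001)]
x_star, lam_star = dinkelbach(lambda x: x * x + 1.0, lambda x: x, xs)
```

The update rule makes the sequence of parameters decrease monotonically to the optimal ratio, which is the essential equivalence exploited by the parametric approach in [1,2].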
Uncertain optimization problems arise from large volumes of data, outdated sources, sample disparity, inadequate information, and other factors that lead to data uncertainty. The robust technique therefore plays an important role in studying optimization problems governed by data uncertainty. Many researchers have studied optimization problems with data uncertainty and formulated new and efficient results (see, for instance, Beck and Ben-Tal [16], Jeyakumar et al. [17], Treanţă [18,19,20], Baranwal et al. [21], Jayswal et al. [22], Preeti et al. [23], and references therein).
In this paper, we state a constrained fractional optimization problem involving data uncertainty in the objective functional generated by a curvilinear-type integral. Namely, by considering the parametric technique, we present the robust Karush-Kuhn-Tucker necessary optimality conditions and establish their sufficiency using convexity and concavity assumptions on the involved functionals. The present paper makes several principal contributions, the most important being: (i) the definition, via the parametric approach, of the notion of a robust optimal solution for curvilinear integral-type functionals; (ii) novel proofs of the main results; and (iii) a framework determined by infinite-dimensional normed spaces of functions and curvilinear integral-type functionals. These elements are original in the field of robust fractional optimization problems.
The paper continues as follows. In Section 2, we state the basic notations and concepts needed to formulate and prove the main results. We introduce the fractional optimization problem involving data uncertainty in the objective functional, the associated non-fractional optimization problem, and the corresponding robust counterparts. In Section 3, under suitable convexity and concavity assumptions, we establish the robust optimality (necessary and sufficient) conditions for the considered problem. Finally, Section 4 provides the conclusions and future research directions associated with this paper.
2. Notations, Assumptions and Problem Formulation
Next, we consider some basic notations and assumptions in order to formulate and prove the new main theorems derived in the present study. In this regard, we start with the standard finite-dimensional Euclidean spaces , and , with , and as arbitrary points of , and , respectively. Let be a hyper-parallelepiped, having the diagonally opposite corners and , and let be a piecewise differentiable curve joining the points and in . Define
as the space of piecewise smooth functions (state variables), and the space of piecewise continuous functions (control variables), respectively, and assume the product space is endowed with the norm induced by the following inner product
Using the above mathematical notations and elements, by denoting , we formulate the following first-order -constrained fractional optimization problem with data uncertainty in the objective functional:
where f and g are uncertainty parameters in the convex compact sets and , respectively, and , are assumed to be continuously differentiable functionals.
Definition 1.
The above functionals
and
are named path-independent if and , for .
Assumption 1.
Assuming that the above functionals
and
are path-independent, the following working hypothesis is assumed:
is a total exact differential, with .
The robust counterpart for , reducing the possible uncertainties in , is given as
where and are defined as in .
The set of all feasible solutions to , which is the same as the set of all feasible solutions to , is defined as
For , we assume that and . Further, by considering the positive real number
in line with Jagannathan [2] and Dinkelbach [1], and following Mititelu and Treanţă [5], we build a non-fractional optimization problem associated with , as
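In generic form, with $F$ and $G$ standing in for the paper's uncertain curvilinear integral functionals and $\bar{x}$ a fixed feasible point (illustrative symbols, not the paper's notation), the Jagannathan-Dinkelbach reduction has the shape:

```latex
% Generic shape of the parametric reduction (illustrative notation).
\[
  \lambda^{*} \;=\; \frac{F(\bar{x})}{G(\bar{x})}, \qquad G(\bar{x}) > 0,
\]
\[
  \text{(FP)}\;\; \min_{x \in \mathcal{F}} \frac{F(x)}{G(x)}
  \quad\longleftrightarrow\quad
  \text{(NP)}\;\; \min_{x \in \mathcal{F}} \; F(x) - \lambda^{*}\, G(x),
\]
```

so that $\bar{x}$ solves the fractional problem if and only if it solves the associated non-fractional problem with optimal value $0$; this is the content made precise for the robust setting in Proposition 1 below.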
The robust counterpart for is given by
Next, for simplicity of presentation, we use the following abbreviations throughout the paper: .
Definition 2.
A point is said to be a robust optimal solution to , if
for all .
Definition 3.
A point is said to be a robust optimal solution to , if
for all .
Remark 1.
We can observe that is the set of feasible solutions to (and, also, for ).
Remark 2.
The robust optimal solutions to (or ) are also robust optimal solutions to (or ).
Next, in order to prove the principal results of this paper, we present the definition of convex and concave curvilinear integral functionals (see, for instance, Treanţă [24]).
Definition 4.
A curvilinear integral functional is said to be convex at if the following inequality
holds, for all .
Definition 5.
A curvilinear integral functional is said to be concave at if the following inequality
holds, for all .
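The first-order characterizations behind Definitions 4 and 5 take the following generic shape for a continuously differentiable functional (illustrative finite-dimensional notation with hypothetical symbols $\Phi$, $\Psi$; the paper's definitions are stated for curvilinear integral functionals):

```latex
% First-order convexity/concavity inequalities (generic, illustrative form).
\[
  \text{convex at } \bar{x}: \quad
  \Phi(x) - \Phi(\bar{x}) \;\geq\; \nabla \Phi(\bar{x})^{\top}(x - \bar{x}),
  \qquad \forall x,
\]
\[
  \text{concave at } \bar{x}: \quad
  \Psi(x) - \Psi(\bar{x}) \;\leq\; \nabla \Psi(\bar{x})^{\top}(x - \bar{x}),
  \qquad \forall x.
\]
```

These inequalities are exactly what the sufficiency proof of Theorem 2 applies, functional by functional, at the candidate point.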
3. Robust Optimality Conditions
In this part of the present study, under suitable hypotheses, we establish the robust necessary and sufficient optimality conditions associated with the fractional optimization problem .
Now, we provide an auxiliary result that will be used to establish the robust sufficient optimality conditions for . More precisely, we present the equivalence between the robust optimal solutions to and .
Proposition 1.
If is a robust optimal solution to , then there exists the positive real number such that is a robust optimal solution to . Moreover, if is a robust optimal solution to and , then is a robust optimal solution to .
Proof.
By reductio ad absurdum, let us assume that is a robust optimal solution to , but not a robust optimal solution to . Consequently, there exists such that
Now, if we consider , we get
which is equivalent to
and this contradicts the assumption that is a robust optimal solution to .
Conversely, let be a robust optimal solution to , with
and suppose that is not a robust optimal solution to . Thus, there exists such that
and, by taking into account the definition of , the above inequality becomes
or, equivalently,
which contradicts the fact that is a robust optimal solution to , and the proof is complete. □
The next theorem formulates the robust necessary conditions of optimality for .
Theorem 1.
Suppose that is a robust optimal solution to the robust fractional optimization problem and , . Then, there exist and the piecewise differentiable functions , satisfying
for , except at points of discontinuity.
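In an illustrative finite-dimensional analogue, robust KKT conditions of this kind take the following shape (the symbols $F$, $G$, $h$, $\lambda^{*}$, $\mu$ are hypothetical stand-ins, not the paper's notation; the paper's actual conditions are Euler-Lagrange-type equations for curvilinear integral functionals):

```latex
% Generic finite-dimensional analogue of robust KKT conditions (illustrative).
\[
  \nabla F(\bar{x}) \;-\; \lambda^{*}\, \nabla G(\bar{x})
  \;+\; \mu^{\top} \nabla h(\bar{x}) \;=\; 0,
  \qquad \text{(stationarity)}
\]
\[
  \mu^{\top} h(\bar{x}) \;=\; 0, \qquad \mu \;\geq\; 0,
  \qquad \text{(complementary slackness)}
\]
```

where $h$ collects the inequality constraints and the worst-case uncertainty parameters are fixed at their maximizing values, mirroring the robust counterpart construction of Section 2.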
Proof.
Let us consider some variations for and , respectively, as follows: and , where are the variational parameters. Therefore, we convert the involved integral functionals into functions depending on , defined as
and
By hypothesis, the pair is assumed to be a robust optimal solution to . Thus, the point becomes an optimal solution to the following robust optimization problem
subject to
Consequently, there exist Lagrange multipliers , , fulfilling the Fritz John conditions
(with denoting the gradient of at ). The first relation formulated in can be rewritten as
or, in an equivalent manner, as follows
where we used the method of integration by parts, the boundary conditions, and the divergence formula.
In the following, taking into account a fundamental lemma of the calculus of variations, we get
or, in an equivalent way,
Finally, the second part formulated in ,
provides us
and this completes the proof. □
Remark 3.
The relations – in Theorem 1 are called robust necessary optimality conditions for the robust fractional optimization problem .
Definition 6.
The feasible solution is said to be a normal robust optimal solution to if (see Theorem 1).
Further, we state a result on the robust sufficient conditions associated with the considered fractional optimization problem.
Theorem 2.
If , , the robust necessary optimality conditions – are fulfilled, and the involved integral functionals , and are convex at , and is concave at , then the pair is a robust optimal solution to .
Proof.
On the contrary, suppose that the pair is not a robust optimal solution to . By Proposition 1, it follows that the pair is not a robust optimal solution to either. Thus, there exists satisfying
and by taking and , we obtain
By hypothesis, the integral functional is convex at , and the integral functional is concave at . Therefore, it follows
and
Now, multiplying inequality by , and subtracting it from inequality , we obtain
and, by using relation , we get
Also, by using the assumptions formulated in the theorem, since the integral functionals and are convex at , we obtain
and
Taking into account the feasibility of for and by using the relations –, we obtain
and
On adding the inequalities –, we obtain
On the other hand, multiplying the robust necessary optimality conditions and by the terms and , respectively, integrating, and adding the resulting equations, we get
which contradicts inequality , and the proof is complete. □
Example 1.
The following application illustrates the theoretical developments formulated in the previous sections of this study. In this regard, we consider only real-valued (that is, ) affine piecewise smooth control and state functions, , and (that is, ) is a square fixed by the diagonally opposite corners and . Let us introduce the following fractional optimization problem with data uncertainty in the objective functional:
We can notice that the robust feasible solution set to is
and, by direct computation (see Theorem 1), we find , which satisfies the robust necessary optimality conditions – with , the uncertainty parameters , and Lagrange multipliers . Further, it can easily be verified that all the conditions of Theorem 2 are satisfied, which ensures that is a robust optimal solution to .
4. Conclusions
In this paper, we have studied a class of fractional variational control problems involving data uncertainty in the objective functional. Under convexity and concavity hypotheses on the involved functionals, we have established the associated robust Karush-Kuhn-Tucker necessary and sufficient optimality conditions. Concretely, we have defined, via the parametric approach, the notion of a robust optimal solution for curvilinear integral-type functionals. Also, we have formulated novel proofs for the main results, within a framework determined by infinite-dimensional normed spaces of functions and curvilinear integral-type functionals. To the best of the author's knowledge, the results presented in this paper are new in the specialized literature. As future research directions, the author mentions the presence of data uncertainty in the constraints, the associated duality theory, and saddle-point optimality criteria.
Funding
This research received no external funding.
Data Availability Statement
Not applicable.
Conflicts of Interest
The author declares no conflict of interest.
References
- Dinkelbach, W. On nonlinear fractional programming. Manag. Sci. 1967, 13, 492–498. [Google Scholar] [CrossRef]
- Jagannathan, R. Duality for nonlinear fractional programs. Z. Oper. Res. 1973, 17, 1–3. [Google Scholar] [CrossRef]
- Antczak, T.; Pitea, A. Parametric approach to multitime multiobjective fractional variational problems under (f, ρ)-convexity. Optim. Control Appl. Methods 2016, 37, 831–847. [Google Scholar] [CrossRef]
- Mititelu, S. Efficiency and duality for multiobjective fractional variational problems with (ρ, b)-quasiinvexity. Yugosl. J. Oper. Res. 2009, 19, 85–99. [Google Scholar] [CrossRef]
- Mititelu, Ş.; Treanţă, S. Efficiency conditions in vector control problems governed by multiple integrals. J. Appl. Math. Comput. 2018, 57, 647–665. [Google Scholar] [CrossRef]
- Treanţă, S.; Mititelu, Ş. Duality with (ρ, b)–quasiinvexity for multidimensional vector fractional control problems. J. Inf. Optim. Sci. 2019, 40, 1429–1445. [Google Scholar] [CrossRef]
- Antczak, T. Parametric approach for approximate efficiency of robust multiobjective fractional programming problems. Math. Methods Appl. Sci. 2021, 44, 11211–11230. [Google Scholar] [CrossRef]
- Guo, Y.; Ye, G.; Liu, W.; Zhao, D.; Treanţă, S. On symmetric gH-derivative applications to dual interval-valued optimization problems. Chaos Solitons Fractals 2022, 158, 112068. [Google Scholar] [CrossRef]
- Guo, Y.; Ye, G.; Liu, W.; Zhao, D.; Treanţă, S. Optimality conditions and duality for a class of generalized convex interval-valued optimization problems. Mathematics 2021, 9, 2979. [Google Scholar] [CrossRef]
- Nahak, C. Duality for multiobjective variational control and multiobjective fractional variational control problems with pseudoinvexity. J. Appl. Math. Stoch. Anal. 2006, 2006, 62631. [Google Scholar] [CrossRef]
- Patel, R.B. Duality for multiobjective fractional variational control problems with (F, ρ)-convexity. Int. J. Stat. Manag. Syst. 2000, 3, 113–134. [Google Scholar] [CrossRef]
- Kim, G.A.; Kim, M.H. On sufficiency and duality for fractional robust optimization problems involving (g,ρ)-invex function. East Asian Math. J. 2016, 32, 635–639. [Google Scholar] [CrossRef]
- Kim, M.H.; Kim, G.A. On optimality and duality for generalized fractional robust optimization problems. East Asian Math. J. 2015, 31, 737–742. [Google Scholar] [CrossRef]
- Kim, M.H.; Kim, G.S. Optimality conditions and duality in fractional robust optimization problems. East Asian Math. J. 2015, 31, 345–349. [Google Scholar] [CrossRef]
- Manesh, S.S.; Saraj, M.; Alizadeh, M.; Momeni, M. On robust weakly ϵ-efficient solutions for multi-objective fractional programming problems under data uncertainty. AIMS Math. 2021, 7, 2331–2347. [Google Scholar] [CrossRef]
- Beck, A.; Ben-Tal, A. Duality in robust optimization: Primal worst equals dual best. Oper. Res. Lett. 2009, 37, 1–6. [Google Scholar] [CrossRef]
- Jeyakumar, V.; Li, G.; Lee, G.M. Robust duality for generalized convex programming problems under data uncertainty. Nonlinear Anal. Theory Methods Appl. 2012, 75, 1362–1373. [Google Scholar] [CrossRef]
- Treanţă, S. Efficiency in uncertain variational control problems. Neural. Comput. Appl. 2021, 33, 5719–5732. [Google Scholar] [CrossRef]
- Treanţă, S. Robust saddle-point criterion in second-order partial differential equation and partial differential inequation constrained control problems. Int. J. Robust Nonlinear Control. 2021, 31, 9282–9293. [Google Scholar] [CrossRef]
- Treanţă, S.; Jiménez, M.A. On generalized KT-pseudoinvex control problems involving multiple integral functionals. Eur. J. Control. 2018, 43, 39–45. [Google Scholar] [CrossRef]
- Baranwal, A.; Jayswal, A.; Kardam, P. Robust duality for the uncertain multitime control optimization problems. Int. J. Robust Nonlinear Control 2022, 32, 5837–5847. [Google Scholar] [CrossRef]
- Jayswal, A.; Preeti; Arana-Jiménez, M. Robust penalty function method for an uncertain multi-time control optimization problems. J. Math. Anal. Appl. 2022, 505, 125453. [Google Scholar] [CrossRef]
- Jayswal, A.; Preeti; Arana-Jiménez, M. An exact l1 penalty function method for a multitime control optimization problem with data uncertainty. Optim. Control Appl. Methods 2020, 41, 1705–1717. [Google Scholar] [CrossRef]
- Treanţă, S. On Controlled Variational Inequalities Involving Convex Functionals. In Optimization of Complex Systems: Theory, Models, Algorithms and Applications; Le Thi, H., Le, H., Pham Dinh, T., Eds.; WCGO 2019; Advances in Intelligent Systems and Computing; Springer: Berlin/Heidelberg, Germany, 2020; Volume 991, pp. 164–174. [Google Scholar]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).