1. Introduction
Optimization theory is widely used in concrete problems coming from decision theory, game theory, economics, data classification, production inventory, and portfolio selection. Since data in real-life problems are obtained, most of the time, by measurement or estimation, errors are bound to occur, and the accumulation of such errors may make the computed results contradict the original problem. To overcome this shortcoming, the use of interval analysis, fuzzy numbers, and robust techniques to represent data has become a popular research direction in recent years.
The fractional optimization problem means to optimize the ratio of two objective (cost) functions (functionals). Dinkelbach [1] and Jagannathan [2] formulated a parametric technique to study a fractional optimization problem by converting it into an equivalent non-fractional optimization problem. Over time, many researchers have used this technique to investigate various classes of fractional optimization problems. In this direction, we mention the works of Antczak and Pitea [3], Mititelu [4], Mititelu and Treanţă [5], Treanţă and Mititelu [6], and Antczak [7]. Guo et al. [8,9] proposed the symmetric gH-derivative and its applications to dual interval-valued optimization problems. For other connected ideas on this topic, interested readers are directed to Nahak [10], Patel [11], Kim and Kim [12,13,14], Manesh et al. [15], and the references therein.
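As background for the parametric technique of Dinkelbach [1] mentioned above, the following sketch shows its classical iterative form on a simple scalar fractional program. The test function and candidate grid are hypothetical, chosen only for illustration; they are not taken from the paper.

```python
# Classical Dinkelbach iteration for minimizing a ratio f/g (g > 0).
# Hypothetical scalar example: minimize (x^2 + 1)/x on a grid over [0.5, 4].

def dinkelbach(f, g, candidates, tol=1e-9, max_iter=100):
    """Minimize f(x)/g(x) over a finite candidate set, assuming g > 0."""
    x = candidates[0]
    lam = f(x) / g(x)
    for _ in range(max_iter):
        # Non-fractional subproblem: minimize f(x) - lam * g(x).
        x = min(candidates, key=lambda c: f(c) - lam * g(c))
        if abs(f(x) - lam * g(x)) < tol:  # zero optimal value <=> lam optimal
            break
        lam = f(x) / g(x)                 # parameter update
    return x, lam

grid = [0.5 + 0.001 * k for k in range(3501)]
x_opt, ratio = dinkelbach(lambda x: x * x + 1.0, lambda x: x, grid)
# The true minimizer of (x^2 + 1)/x on [0.5, 4] is x = 1, with ratio 2.
```

Each subproblem is the non-fractional problem with a fixed parameter, and the parameter is updated to the current ratio; the iteration stops when the subproblem's optimal value vanishes, which characterizes the optimal ratio.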
Uncertain optimization problems arise when we have a large volume of data, old sources, sample disparity, inadequate information, or other factors that lead to data uncertainty. Therefore, the robust technique plays an important role in studying optimization problems governed by data uncertainty. Many researchers have studied different optimization problems involving data uncertainty and have formulated new and efficient results (see, for instance, Beck and Ben-Tal [16], Jeyakumar et al. [17], Treanţă [18,19,20], Baranwal et al. [21], Jayswal et al. [22], Preeti et al. [23], and the references therein).
In this paper, we state a constrained fractional optimization problem involving data uncertainty in the objective functional generated by a curvilinear-type integral. Namely, by considering the parametric technique, we present the robust Karush–Kuhn–Tucker necessary optimality conditions and establish their sufficiency by using convexity and concavity assumptions on the involved functionals. The present paper has several principal contributions, of which we mention the most important: (i) defining, by using the parametric approach, the notion of a robust optimal solution for the case of curvilinear integral-type functionals; (ii) formulating novel proofs for the main results; and (iii) providing a framework determined by infinite-dimensional normed spaces of functions and curvilinear integral-type functionals. These elements are original in the field of robust fractional optimization problems.
The paper continues as follows. In Section 2, we state the basic notations and concepts in order to formulate and prove the main results. We introduce the fractional optimization problem involving data uncertainty in the objective functional, the associated non-fractional optimization problem, and the corresponding robust counterparts. In Section 3, under suitable convexity and concavity assumptions, we establish the robust (necessary and sufficient) optimality conditions for the considered problem. Finally, Section 4 provides the conclusions and future research directions associated with this paper.
2. Notations, Assumptions and Problem Formulation
Next, we consider some basic notations and assumptions in order to formulate and prove the new main theorems derived in the present study. In this regard, we start with the standard finite-dimensional Euclidean spaces $\mathbb{R}^p$, $\mathbb{R}^n$ and $\mathbb{R}^m$, with $t = (t^{\beta}),\ \beta = 1, \dots, p$, $x = (x^{i}),\ i = 1, \dots, n$, and $u = (u^{j}),\ j = 1, \dots, m$, as arbitrary points of $\mathbb{R}^p$, $\mathbb{R}^n$ and $\mathbb{R}^m$, respectively. Let $\Omega \subset \mathbb{R}^p$ be a hyper-parallelepiped, having the diagonally opposite corners $t_0$ and $t_1$, and let $\Gamma \subset \Omega$ be a curve (piecewise differentiable), joining the points $t_0$ and $t_1$ in $\Omega$. Define $X$ as the space of piecewise smooth functions $x \colon \Omega \to \mathbb{R}^n$ (state variables) and $U$ as the space of piecewise continuous functions $u \colon \Omega \to \mathbb{R}^m$ (control variables), respectively, and assume the product space $X \times U$ is endowed with the norm induced by the following inner product:
\[
\langle (x,u), (y,w) \rangle = \int_{\Gamma} \big[ x(t) \cdot y(t) + u(t) \cdot w(t) \big] \, dt^{\beta}.
\]
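For intuition, the inner product above on the product space of state and control functions can be approximated by discretizing the curve. A minimal sketch follows, with hypothetical trajectories parametrized on [0, 1], chosen only so that the integral is easy to check by hand:

```python
# Riemann-sum surrogate of the inner product on the product space of
# state and control functions: the curvilinear integral of x*y + u*w
# along a curve parametrized on [0, 1]. All trajectories below are
# hypothetical model data.

def inner_product(x, u, y, w, dt):
    """Discretized <(x,u),(y,w)>: sum of (x*y + u*w) * dt along the curve."""
    return sum(xi * yi + ui * wi for xi, ui, yi, wi in zip(x, u, y, w)) * dt

dt = 0.001
ts = [k * dt for k in range(1000)]               # parameter grid on [0, 1)
x = [t for t in ts]; u = [1.0] * len(ts)         # first pair (x, u)
y = [t for t in ts]; w = [2.0 * t for t in ts]   # second pair (y, w)

val = inner_product(x, u, y, w, dt)
# Exact value: integral over [0, 1] of (t^2 + 2t) dt = 1/3 + 1 = 4/3.
```

The induced norm is obtained, as usual, by taking the square root of the inner product of a pair with itself.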
Using the above mathematical notations and elements, we formulate the following first-order PDE-constrained fractional optimization problem, with data uncertainty in the objective functional:
\[
\text{(P)} \quad \min_{(x,u)} \ \frac{\displaystyle\int_{\Gamma} f_{\beta}\big(t, x(t), u(t), w\big)\, dt^{\beta}}{\displaystyle\int_{\Gamma} g_{\beta}\big(t, x(t), u(t), v\big)\, dt^{\beta}},
\]
subject to first-order PDE constraints and boundary conditions in $t_0$ and $t_1$, where $w$ and $v$ are uncertainty parameters in the convex compact sets $W$ and $V$, respectively, and $f_{\beta}$ and $g_{\beta}$, $\beta = 1, \dots, p$, are assumed to be continuously differentiable functionals.
Definition 1. The above functionals $\int_{\Gamma} f_{\beta}\, dt^{\beta}$ and $\int_{\Gamma} g_{\beta}\, dt^{\beta}$ are named path-independent if $D_{\eta} f_{\beta} = D_{\beta} f_{\eta}$ and $D_{\eta} g_{\beta} = D_{\beta} g_{\eta}$, for $\beta, \eta = 1, \dots, p$.

Assumption 1. By considering the above functionals $\int_{\Gamma} f_{\beta}\, dt^{\beta}$ and $\int_{\Gamma} g_{\beta}\, dt^{\beta}$ to be path-independent, the following working hypothesis is assumed: $f_{\beta}\, dt^{\beta}$ (and, similarly, $g_{\beta}\, dt^{\beta}$) is a total exact differential.

The robust counterpart for (P), reducing the possible uncertainties in (P), is given as
\[
\text{(RP)} \quad \min_{(x,u)} \ \frac{\displaystyle\max_{w \in W} \int_{\Gamma} f_{\beta}\big(t, x(t), u(t), w\big)\, dt^{\beta}}{\displaystyle\min_{v \in V} \int_{\Gamma} g_{\beta}\big(t, x(t), u(t), v\big)\, dt^{\beta}},
\]
subject to the same constraints, where $W$ and $V$ are defined as in (P).
The set of all feasible solutions to (RP), which is the same as the set of all feasible solutions to (P), is denoted by $\mathcal{F}$, namely the set of all pairs $(x,u) \in X \times U$ satisfying the constraints of (P). For $(x,u) \in \mathcal{F}$, we assume that
\[
\max_{w \in W} \int_{\Gamma} f_{\beta}\big(t, x(t), u(t), w\big)\, dt^{\beta} \geq 0 \quad \text{and} \quad \min_{v \in V} \int_{\Gamma} g_{\beta}\big(t, x(t), u(t), v\big)\, dt^{\beta} > 0.
\]
Further, by considering the positive real number $R$, on the line of Jagannathan [2] and Dinkelbach [1], and following Mititelu and Treanţă [5], we build a non-fractional optimization problem associated with (P), as
\[
\text{(P)}_R \quad \min_{(x,u)} \ \int_{\Gamma} f_{\beta}\big(t, x(t), u(t), w\big)\, dt^{\beta} - R \int_{\Gamma} g_{\beta}\big(t, x(t), u(t), v\big)\, dt^{\beta},
\]
subject to the same constraints. The robust counterpart for (P)$_R$ is given by
\[
\text{(RP)}_R \quad \min_{(x,u)} \ \max_{w \in W} \int_{\Gamma} f_{\beta}\big(t, x(t), u(t), w\big)\, dt^{\beta} - R \min_{v \in V} \int_{\Gamma} g_{\beta}\big(t, x(t), u(t), v\big)\, dt^{\beta},
\]
subject to the same constraints.
Next, for a simple presentation, we will use throughout the paper the abbreviations $\pi(t) = (t, x(t), u(t))$ and $\bar{\pi}(t) = (t, \bar{x}(t), \bar{u}(t))$.

Definition 2. A point $(\bar{x}, \bar{u}) \in \mathcal{F}$ is said to be a robust optimal solution to (RP) if
\[
\frac{\displaystyle\max_{w \in W} \int_{\Gamma} f_{\beta}\big(\bar{\pi}(t), w\big)\, dt^{\beta}}{\displaystyle\min_{v \in V} \int_{\Gamma} g_{\beta}\big(\bar{\pi}(t), v\big)\, dt^{\beta}} \leq \frac{\displaystyle\max_{w \in W} \int_{\Gamma} f_{\beta}\big(\pi(t), w\big)\, dt^{\beta}}{\displaystyle\min_{v \in V} \int_{\Gamma} g_{\beta}\big(\pi(t), v\big)\, dt^{\beta}}
\]
for all $(x,u) \in \mathcal{F}$.

Definition 3. A point $(\bar{x}, \bar{u}) \in \mathcal{F}$ is said to be a robust optimal solution to (RP)$_R$ if
\[
\max_{w \in W} \int_{\Gamma} f_{\beta}\big(\bar{\pi}(t), w\big)\, dt^{\beta} - R \min_{v \in V} \int_{\Gamma} g_{\beta}\big(\bar{\pi}(t), v\big)\, dt^{\beta} \leq \max_{w \in W} \int_{\Gamma} f_{\beta}\big(\pi(t), w\big)\, dt^{\beta} - R \min_{v \in V} \int_{\Gamma} g_{\beta}\big(\pi(t), v\big)\, dt^{\beta}
\]
for all $(x,u) \in \mathcal{F}$.

Remark 1. We can observe that $\mathcal{F}$ is the set of feasible solutions to (P)$_R$ (and, also, for (RP)$_R$).

Remark 2. The robust optimal solutions to (RP) (or (RP)$_R$) are also robust optimal solutions to (P) (or (P)$_R$).
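The worst-case comparison underlying the robust counterpart can be sketched numerically: at a fixed feasible point, the numerator of the robust objective is maximized and the denominator minimized over their compact uncertainty sets. A minimal Python sketch follows, with purely hypothetical reduced functions and discretized interval uncertainty sets (none of these data come from the paper):

```python
# Worst-case (robust) value of a fractional objective at a fixed feasible
# point: the numerator is maximized over W and the denominator minimized
# over V. The functions F, G and the discretized uncertainty intervals
# below are hypothetical.

def robust_ratio(F, G, W, V):
    """max over W of F(w), divided by min over V of G(v); assumes G > 0 on V."""
    return max(F(w) for w in W) / min(G(v) for v in V)

W = [k / 100.0 for k in range(101)]        # uncertainty set W = [0, 1]
V = [1.0 + k / 100.0 for k in range(101)]  # uncertainty set V = [1, 2]

# At a fixed (x, u), suppose the two functionals reduce to F(w) = 2 + w
# and G(v) = v (purely illustrative).
val = robust_ratio(lambda w: 2.0 + w, lambda v: v, W, V)
# Worst case: (2 + 1) / 1 = 3.
```

Compactness of the uncertainty sets guarantees that the maximum and minimum above are attained, which is what makes the robust counterpart well defined.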
Next, in order to prove the principal results of this paper, we present the definition of convex and concave curvilinear integral functionals (see, for instance, Treanţă [24]).

Definition 4. A curvilinear integral functional $F(x,u) = \int_{\Gamma} f_{\beta}\big(t, x(t), u(t)\big)\, dt^{\beta}$ is said to be convex at $(\bar{x}, \bar{u})$ if the following inequality
\[
F(x,u) - F(\bar{x},\bar{u}) \geq \int_{\Gamma} \Big[ \big(x(t) - \bar{x}(t)\big) \frac{\partial f_{\beta}}{\partial x}\big(t, \bar{x}(t), \bar{u}(t)\big) + \big(u(t) - \bar{u}(t)\big) \frac{\partial f_{\beta}}{\partial u}\big(t, \bar{x}(t), \bar{u}(t)\big) \Big]\, dt^{\beta}
\]
holds, for all $(x,u)$.

Definition 5. A curvilinear integral functional $F(x,u) = \int_{\Gamma} f_{\beta}\big(t, x(t), u(t)\big)\, dt^{\beta}$ is said to be concave at $(\bar{x}, \bar{u})$ if the reverse inequality
\[
F(x,u) - F(\bar{x},\bar{u}) \leq \int_{\Gamma} \Big[ \big(x(t) - \bar{x}(t)\big) \frac{\partial f_{\beta}}{\partial x}\big(t, \bar{x}(t), \bar{u}(t)\big) + \big(u(t) - \bar{u}(t)\big) \frac{\partial f_{\beta}}{\partial u}\big(t, \bar{x}(t), \bar{u}(t)\big) \Big]\, dt^{\beta}
\]
holds, for all $(x,u)$.

3. Robust Optimality Conditions
In this part of the present study, under suitable hypotheses, we establish the robust necessary and sufficient optimality conditions associated with the considered fractional optimization problem.

Now, we provide an auxiliary result that will be used to establish the robust sufficient optimality conditions. More precisely, we present the equivalence between the robust optimal solutions to the robust fractional problem and to its associated non-fractional (parametric) counterpart.
Proposition 1. If $(\bar{x}, \bar{u})$ is a robust optimal solution to the robust fractional problem, then there exists a positive real number $R$ (namely, the optimal value of the robust ratio at $(\bar{x}, \bar{u})$) such that $(\bar{x}, \bar{u})$ is a robust optimal solution to the associated non-fractional problem. Moreover, if $(\bar{x}, \bar{u})$ is a robust optimal solution to the non-fractional problem and $R$ equals the robust ratio at $(\bar{x}, \bar{u})$, then $(\bar{x}, \bar{u})$ is a robust optimal solution to the robust fractional problem.
Proof. By reductio ad absurdum, let us assume that $(\bar{x}, \bar{u})$ is a robust optimal solution to the robust fractional problem, but it is not a robust optimal solution to the associated non-fractional problem. For brevity, write
\[
\bar{F}(x,u) = \max_{w \in W} \int_{\Gamma} f_{\beta}\big(t, x(t), u(t), w\big)\, dt^{\beta}, \qquad \bar{G}(x,u) = \min_{v \in V} \int_{\Gamma} g_{\beta}\big(t, x(t), u(t), v\big)\, dt^{\beta}.
\]
In consequence, there exists a feasible pair $(x,u)$ such that
\[
\bar{F}(x,u) - R\,\bar{G}(x,u) < \bar{F}(\bar{x},\bar{u}) - R\,\bar{G}(\bar{x},\bar{u}).
\]
Now, if we consider $R = \bar{F}(\bar{x},\bar{u}) / \bar{G}(\bar{x},\bar{u})$, we get
\[
\bar{F}(x,u) - R\,\bar{G}(x,u) < 0,
\]
which is equivalent with
\[
\frac{\bar{F}(x,u)}{\bar{G}(x,u)} < R = \frac{\bar{F}(\bar{x},\bar{u})}{\bar{G}(\bar{x},\bar{u})},
\]
and this contradicts that $(\bar{x}, \bar{u})$ is a robust optimal solution to the robust fractional problem.

Conversely, let $(\bar{x}, \bar{u})$ be a robust optimal solution to the non-fractional problem, with $R = \bar{F}(\bar{x},\bar{u}) / \bar{G}(\bar{x},\bar{u})$, and suppose that $(\bar{x}, \bar{u})$ is not a robust optimal solution to the robust fractional problem. Thus, there exists a feasible pair $(x,u)$ such that
\[
\frac{\bar{F}(x,u)}{\bar{G}(x,u)} < \frac{\bar{F}(\bar{x},\bar{u})}{\bar{G}(\bar{x},\bar{u})} = R,
\]
and, by taking into account the definition of $R$, the above inequality becomes
\[
\bar{F}(x,u) - R\,\bar{G}(x,u) < 0,
\]
or, equivalently,
\[
\bar{F}(x,u) - R\,\bar{G}(x,u) < \bar{F}(\bar{x},\bar{u}) - R\,\bar{G}(\bar{x},\bar{u}),
\]
which contradicts that $(\bar{x}, \bar{u})$ is a robust optimal solution to the non-fractional problem, and the proof is complete. □
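The equivalence established in Proposition 1 can be checked on a toy discretized problem: with the parameter set to the optimal ratio, the minimizer of the ratio and the minimizer of the parametric (non-fractional) objective coincide. The functions and candidate grid below are hypothetical, for illustration only:

```python
# Toy check of the equivalence in Proposition 1 on a finite feasible set:
# with R set to the optimal ratio, the minimizer of F/G and the minimizer
# of the parametric objective F - R*G coincide. F, G and the candidate
# grid are hypothetical.

candidates = [0.5 + 0.01 * k for k in range(351)]   # grid on [0.5, 4]
F = lambda x: x * x + 1.0
G = lambda x: x

x_star = min(candidates, key=lambda x: F(x) / G(x))       # fractional problem
R = F(x_star) / G(x_star)                                 # optimal ratio
x_param = min(candidates, key=lambda x: F(x) - R * G(x))  # parametric problem
# x_star and x_param pick out the same grid point.
```

This mirrors the proof: with this choice of R, the parametric objective vanishes at the ratio minimizer and is nonnegative at every other feasible point.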
The next theorem formulates the robust necessary conditions of optimality for the considered robust fractional optimization problem.
Theorem 1. Consider is a robust optimal solution for the robust fractional optimization problem and , . Then, there exist and the piecewise differentiable functions , satisfyingfor , except at points of discontinuity. Proof. Let us consider some variations for
and
, respectively, as follows:
and
, where
are the variational parameters. Therefore, we convert the involved integral functionals into functions depending on
, defined as
and
By hypothesis, the pair
is assumed to be a robust optimal solution to
. Thus, the point
becomes an optimal solution to the following robust optimization problem
subject to
In consequence, there exist the Lagrange multipliers
,
, fulfilling the Fritz John conditions
(see
as the gradient of
at
). The first relation formulated in
is rewritten as
or, in an equivalent manner, as follows
where we used the method of integration by parts, the boundary conditions, and the divergence formula.
In the following, taking into account one of the fundamental lemmas from the theory of calculus of variations, we get
or, in an equivalent way,
Finally, the second part formulated in
,
provides us
and this completes the proof. □
Remark 3. The relations – in Theorem 1 are called robust necessary optimality conditions for the robust fractional optimization problem .
Definition 6. The feasible solution is said to be a normal robust optimal solution to if (see Theorem 1).
Further, we state a result on the robust sufficient conditions associated with the considered fractional optimization problem.
Theorem 2. If , , the robust necessary optimality conditions – are fulfilled, and the involved integral functionals , and are convex at , and is concave at , then the pair is a robust optimal solution to .
Proof. By contrary, let us suppose the pair
is not a robust optimal solution to
. By considering Proposition 1, it results that the pair
is not a robust optimal solution to
, as well. Thus, there exists
satisfying
and by taking
and
, we obtain
By hypothesis, the integral functional
is convex at
, and the integral functional
is concave at
. Therefore, it follows
and
Now, on multiplying the inequality
with
, and subtracting it from the inequality
, it results
and, by using relation
, we get
Also, by using the assumptions formulated in the theorem, since the integral functionals
and
are convex at
, we obtain
and
Taking into account the feasibility of
for
and by using the relations
–
, we obtain
and
On adding the inequalities
–
, we obtain
On the other hand, after multiplying the robust necessary optimality conditions
and
with the terms
and
, and integrating them, by adding the resulting equations, we get
that contradicts the inequality
and the proof is complete. □
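The convexity hypotheses used in the sufficiency argument above rest on the first-order (gradient) inequality of Definition 4. It can be verified numerically for a model convex integrand, with the curvilinear integral replaced by a Riemann sum along a discretized curve; the integrand and trajectories below are hypothetical model data:

```python
# Numerical check of the first-order (gradient) convexity inequality of
# Definition 4, with the curvilinear integral replaced by a Riemann sum
# along a discretized curve. All data below are hypothetical.

def functional(traj, integrand, dt):
    """Riemann-sum surrogate of the integral of the integrand along the curve."""
    return sum(integrand(x) for x in traj) * dt

dt = 0.01
ts = [k * dt for k in range(100)]        # parameter grid on [0, 1)
x_bar = [t for t in ts]                  # reference trajectory
x = [t * t + 0.5 for t in ts]            # comparison trajectory

f = lambda x: x * x                      # convex integrand
df = lambda x: 2.0 * x                   # its derivative

lhs = functional(x, f, dt) - functional(x_bar, f, dt)
rhs = sum((a - b) * df(b) for a, b in zip(x, x_bar)) * dt
# Convexity of the integrand gives lhs >= rhs, i.e., the functional lies
# above its linearization at the reference trajectory (Definition 4).
```

For a concave integrand, the same computation produces the reverse inequality, which is exactly the assumption placed on the denominator functional in Theorem 2.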
Example 1. The following application illustrates the theoretical developments formulated in the previous sections of this study. In this regard, we consider that we are interested only in real-valued (that is, $n = m = 1$) affine piecewise smooth control and state functions, and that the domain (with $p = 2$) is a square fixed by the diagonally opposite corners $t_0$ and $t_1$. We introduce a fractional optimization problem with data uncertainty in the objective functional and determine its robust feasible solution set. By direct computation (see Theorem 1), we find a feasible point which, together with suitable uncertainty parameters and Lagrange multipliers, satisfies the robust necessary optimality conditions. Further, it can also be easily verified that all the conditions of Theorem 2 are satisfied, which ensures that this point is a robust optimal solution to the considered problem.

4. Conclusions
In this paper, we have studied a class of fractional variational control problems involving data uncertainty in the objective functional. Under convexity and concavity hypotheses on the involved functionals, we have established the associated robust Karush–Kuhn–Tucker necessary and sufficient optimality conditions. Concretely, by using the parametric approach, we have defined the notion of a robust optimal solution for the case of curvilinear integral-type functionals. Also, we have formulated novel proofs for the main results, and we have provided a framework determined by infinite-dimensional normed spaces of functions and curvilinear integral-type functionals. To the best of the author's knowledge, the results presented in this paper are new in the specialized literature. In addition, as future research directions of this paper, the author mentions the presence of data uncertainty in the constraints, the associated duality theory, and saddle-point optimality criteria.