Optimality Conditions and Dualities for Robust Efficient Solutions of Uncertain Set-Valued Optimization with Set-Order Relations

Abstract: In this paper, we introduce a second-order strong subdifferential of set-valued maps and discuss some of its properties, such as convexity and a sum rule. Using the new subdifferential and its properties, we establish a necessary and sufficient optimality condition for set-based robust efficient solutions of the uncertain set-valued optimization problem. We also introduce a Wolfe type dual problem of the uncertain set-valued optimization problem. Finally, we establish the robust weak duality theorem and the robust strong duality theorem between the uncertain set-valued optimization problem and its robust dual problem. Several main results extend the corresponding ones in the literature.


Introduction
Robust optimization is an important deterministic technique for studying optimization problems with data uncertainty; it protects against such uncertainty and has grown significantly, see [1][2][3][4][5][6]. Optimization theory mainly includes multi-objective optimization and focuses on finding global optimal solutions or global efficient solutions. However, in real-world situations where solutions are very susceptible to perturbations of the variables, we may not always be able to identify the global optimal solutions. To reduce the sensitivity to variable perturbations under these conditions, we seek robust solutions.
The set-valued optimization problem
(SOP)  min H(z) = {H_1(z), H_2(z), . . . , H_q(z)}  s.t.  z ∈ M, B_j(z) ⊆ R_−, j = 1, . . . , l,
has been widely studied, where M is a closed and convex subset of a real topological linear space X, and H_k : M → 2^R, k = 1, . . . , q, and B_j : M → 2^R, j = 1, . . . , l, are given set-valued maps. Set-valued optimization is a thriving research field with numerous applications, for example in risk management [7,8], statistics [9], and others. Hamel and Heyde [7] defined set-valued (convex) measures of risk and their acceptance sets, and gave dual representation theorems. Hamel et al. [8] defined set-valued risk measures on L^p_d with 0 ≤ p ≤ ∞ for conical market models, and gave primal and dual representation results. Hamel and Kostner [9] discussed relationships to families of univariate quantile functions and to depth functions, and introduced a corresponding Value at Risk for multivariate random variables, as well as stochastic orders, by the set-valued approach. By means of multipliers and limiting subdifferentials of the related functions, Chuong [30] established necessary/sufficient optimality conditions for robust (weakly) Pareto solutions of a robust multiobjective optimization problem involving nonsmooth/nonconvex real-valued functions; he also addressed a dual (robust) multiobjective problem to the primal one and explored weak/strong duality. By virtue of the subdifferential in [31], Sun et al. [32] obtained an optimality condition and established Wolfe type robust duality between the uncertain optimization problem and its uncertain dual problem under conditions of continuity and cone-convex-concavity.
To the best of our knowledge, there are few solution concepts for the uncertain set-valued optimization problem via set-order relations. Moreover, there is very little literature on optimality conditions and duality theorems for set-based robust efficient solutions of uncertain set-valued optimization problems in terms of the second-order strong subdifferential of a set-valued map. Lately, Som and Vetrivel [33] introduced robustness for set-valued optimization to generalize some existing concepts of robustness for scalar and vector-valued optimization, following the set approach for solutions to set-valued optimization problems.
To weaken the conditions of continuity and cone-convex-concavity [15,32], inspired by the subdifferential [20,22] and set-order relations [34], we introduce a new second-order strong subdifferential of set-valued maps and define the set-based robust efficient solution of an uncertain set-valued optimization problem. Using the second-order strong subdifferential of set-valued maps, we put forward a Wolfe type dual problem and investigate the robust weak duality and robust strong duality of the set-based robust efficient solutions for uncertain set-valued optimization problems. This paper is organized as follows. In Section 2 we review the relevant concepts and introduce a new second-order strong subdifferential of a set-valued map. In Section 3 we derive some crucial properties of the new subdifferential. In Section 4, by means of the second-order strong subdifferential of set-valued maps, we obtain a necessary and sufficient condition for the set-based robust efficient solutions of the uncertain set-valued optimization problem. The robust weak duality and robust strong duality of the uncertain set-valued optimization problem are established in Section 5. Section 6 is a short conclusion of the paper.

Preliminaries and Definitions
Throughout the paper, let X and Y be two real topological linear spaces with topological dual spaces X* and Y*, respectively. 0_X and 0_Y denote the origins of X and Y, respectively. Let K ⊆ Y be a solid closed convex pointed cone. The dual cone of K is defined by K* = {y* ∈ Y* : ⟨y*, y⟩ ≥ 0, ∀y ∈ K}.
Let N be the set of natural numbers and n, m, l ∈ N. Let D ⊆ Y be a nonempty subset. clD and intD denote the closure and interior of D, respectively. Let T(Y) denote the family of all nonempty subsets of Y.

Definition 1 ([34]). Let E, S ∈ T(Y) be arbitrarily chosen sets.
(i) The lower set less order relation is defined by
E ≼ˡ_K S ⇔ E + K ⊇ S ⇔ ∀s ∈ S, ∃e ∈ E : e ≤_K s,
E ≺ˡ_K S ⇔ E + intK ⊇ S ⇔ ∀s ∈ S, ∃e ∈ E : e ≺_K s.
(ii) The upper set less order relation is defined by
E ≼ᵘ_K S ⇔ S − K ⊇ E ⇔ ∀e ∈ E, ∃s ∈ S : e ≤_K s,
E ≺ᵘ_K S ⇔ S − intK ⊇ E ⇔ ∀e ∈ E, ∃s ∈ S : e ≺_K s.
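With K = R_+ ⊆ R and finite sets, these order relations reduce to simple elementwise checks. A minimal sketch (an illustration only, with sets represented as finite lists of reals):

```python
def lower_set_less(E, S):
    """E ≼ˡ S with K = R_+: every s in S is dominated from below by some e in E."""
    return all(any(e <= s for e in E) for s in S)

def upper_set_less(E, S):
    """E ≼ᵘ S with K = R_+: every e in E is dominated from above by some s in S."""
    return all(any(e <= s for s in S) for e in E)

print(lower_set_less([1.0, 2.0], [1.5, 3.0]))  # True: 1.0 ≤ 1.5 and 1.0 ≤ 3.0
print(lower_set_less([2.0], [1.0]))            # False: no element of E is ≤ 1.0
print(upper_set_less([1.0, 2.0], [2.5]))       # True: both elements are ≤ 2.5
```

Note that the two relations generally differ: the lower relation compares sets through their "best" (small) elements, the upper relation through their "worst" (large) elements.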

Definition 2 ([35]). Let E, S ∈ T(Y) be arbitrarily chosen sets. Then the certainly less order relation is defined by
E ≼ᶜ_K S ⇔ (E = S) or (E ≠ S, ∀e ∈ E, ∀s ∈ S : e ≤_K s).
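Again with K = R_+ and finite sets, the certainly less order relation admits a direct check (a toy sketch for illustration):

```python
def certainly_less(E, S):
    """E ≼ᶜ S with K = R_+: E equals S, or every element of E lies
    below every element of S."""
    return E == S or all(e <= s for e in E for s in S)

print(certainly_less([1.0, 2.0], [3.0, 4.0]))  # True: all pairs satisfy e ≤ s
print(certainly_less([1.0, 3.0], [2.0, 4.0]))  # False: 3.0 > 2.0
```

This relation is much stronger than the lower or upper set less relations: a single pair (e, s) with e above s destroys it.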

Definition 3 ([31]). Let M be a nonempty subset of X. M is said to be convex if for any x, z ∈ M and for all β ∈ [0, 1], βx + (1 − β)z ∈ M.
Inspired by the Borwein-strong subdifferential in [22,23], we put forward a new notion of second-order strong subdifferential for a set-valued map.
This set is said to be the second-order strong subdifferential of H at (x_1, y_1). If ∂²_s H(x_1, y_1) ≠ ∅, then H is said to be second-order strong subdifferentiable at (x_1, y_1).
The following example illustrates Definition 7.
Without the condition H(x_1) − y_1 ⊆ R_+, Definition 7 is not complete; the following example shows the case.
Then it follows from Definition 7 that no such ξ exists, i.e., ∂²_s H(x_1, y_1) = ∅. Therefore, the condition H(x_1) − y_1 ⊆ R_+ is necessary in Definition 7.

Remark 2.
Let H : M → 2^R be a set-valued map. Obviously, if the second-order strong subdifferential exists, then 0 ∈ ∂²_s H(x_1, y_1). However, 0 ∈ ∂H(x_1, y_1) may not necessarily hold. We now give an example to illustrate this.

Properties of a Second-Order Strong Subdifferential of Set-Valued Maps
In this section, we present some properties of a second-order strong subdifferential of set-valued maps. Firstly, we introduce the following lemma.
Then the set ∂²_s H(x_1, y_1) is convex. Proof. Take ξ_1, ξ_2 ∈ ∂²_s H(x_1, y_1) and β ∈ [0, 1]. By Lemma 1, it follows from (1) and (2) that βξ_1 + (1 − β)ξ_2 ∈ ∂²_s H(x_1, y_1). This proof is complete.
Let H be second-order strong subdifferentiable at (x_1, y_1). Then H has a global minimum at (x_1, y_1) if and only if 0_{X*} ∈ ∂²_s H(x_1, y_1).

Remark 3.
Let H and Q : M → 2^R be set-valued maps. If H and Q are strong subdifferentiable at (x_1, y_1) and (x_1, y_2), respectively, then the sum rule of Theorem 4 holds. However, the constant √2 cannot be omitted in Theorem 4.
We consider the following examples to illustrate Theorem 4 and Remark 3.

The Optimality Condition for the Uncertain Set-Valued Optimization Problem
Problem (SOP) has been studied extensively without taking data uncertainty into account. However, in most real-world applications, optimization problems involve uncertainty. To define an uncertain set-valued optimization problem (USOP), we assume that the uncertainties in the objective functions are given as scenarios from a known uncertainty set U = {u_1, u_2, . . . , u_m} ⊆ R^m, where u_i is an uncertain parameter, i = 1, . . . , m.
The following uncertain set-valued optimization problem (USOP) can be used to describe problem (SOP) when there is data uncertainty in both the objectives and the constraints, where H_k : M × R^m → 2^R, k = 1, . . . , q, and B_j : M × R^l → 2^R, j = 1, . . . , l, are given set-valued maps, and the uncertain parameter v_j belongs to a compact and convex uncertainty set V_j ⊆ R^l. In this paper, we investigate problem (USOP) using a robust approach. As is well known, there is no proper method to solve problem (USOP) directly, so it is necessary to replace problem (USOP) by a deterministic version, that is, the robust counterpart of problem (USOP). In this way, various concepts of robustness have been proposed on the basis of different robust counterparts to describe the preferences of decision makers.
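In the spirit of (SOP), the uncertain problem can be sketched as follows (the exact grouping of the objectives is an assumption based on the maps and uncertainty sets above, not the paper's displayed formulation):

```latex
\text{(USOP)}\quad
\min_{z}\; H(z,u) = \{H_1(z,u),\, \dots,\, H_q(z,u)\}
\quad \text{s.t.}\quad z \in M,\;\;
B_j(z,v_j) \subseteq \mathbb{R}_-,\;\; j = 1,\dots,l,
```

with $u \in U$ and $v_j \in V_j$, $j = 1,\dots,l$, the uncertain parameters.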
The most celebrated and well-studied robustness concept is worst-case robustness (also known as min-max robustness or strict robustness in the literature). The idea is to minimize the worst possible objective function value, searching for a solution that is good enough in the worst case. Meanwhile, the constraints should be satisfied for every parameter v_j ∈ V_j, j = 1, . . . , l. Worst-case robustness is a conservative concept and reflects the pessimistic attitude of a decision maker. The robust (worst-case) counterpart of problem (USOP), denoted (URSOP), is as follows:
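For finite scenario sets, the worst-case evaluation and robust feasibility check just described can be sketched in a few lines (a toy model: sets are finite lists of reals, and H, B, U, V below are hypothetical stand-ins, not the paper's data):

```python
def robust_feasible(z, constraints, scenarios):
    """z is robust feasible iff B_j(z, v) ⊆ R_- for every constraint map B_j
    and every scenario v (worst-case feasibility)."""
    return all(b <= 0
               for B in constraints
               for v in scenarios
               for b in B(z, v))

def worst_case_value(z, H, U):
    """Largest value H can take at z over all scenarios u in U
    (the quantity minimized by the worst-case robust counterpart)."""
    return max(h for u in U for h in H(z, u))

# Hypothetical data: one set-valued objective and one constraint map.
H = lambda z, u: [z * z + u]          # H(z, u) as a finite set
B = lambda z, v: [z - 1 + v]          # constraint map B(z, v)
U = [0.0, 1.0]                        # objective scenarios
V = [0.0, 0.5]                        # constraint scenarios

print(robust_feasible(0.5, [B], V))   # True: 0.5 - 1 + v <= 0 for both v in V
print(worst_case_value(0.5, H, U))    # 1.25 = 0.25 + 1.0
```

The pessimism of the concept is visible in the code: a single bad scenario makes z infeasible, and the objective is judged only by its worst scenario.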

Definition 8. The robust feasible set of problem (USOP) is defined by
We assume that A ≠ ∅. Obviously, the set of all robust feasible solutions to problem (USOP) is the same as the set of all feasible solutions to problem (URSOP).

Definition 9. z̄ ∈ A is said to be an ≼ˡ_{R_+}-robust efficient solution to problem (USOP) if z̄ is an ≼ˡ_{R_+}-efficient solution to problem (URSOP), i.e., for all z ∈ A such that

In this section, we establish a necessary and sufficient optimality condition for the ≼ˡ_{R_+}-robust efficient solution to problem (USOP).
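On finite data, a set-based efficiency check of this flavor can be sketched as follows (an illustration only: each candidate's worst-case image is given as a finite list of reals, and treating a candidate as efficient when no other candidate strictly improves on it under the lower set less order is our assumption about Definition 9):

```python
def lower_set_less(E, S):
    """E ≼ˡ S with K = R_+: every s in S is dominated from below by some e in E."""
    return all(any(e <= s for e in E) for s in S)

def l_robust_efficient(candidates):
    """candidates: dict mapping a candidate's name to its worst-case image set.
    Keep the candidates that no other candidate strictly improves upon."""
    efficient = []
    for a, Sa in candidates.items():
        strictly_improved = any(
            lower_set_less(Sb, Sa) and not lower_set_less(Sa, Sb)
            for b, Sb in candidates.items() if b != a)
        if not strictly_improved:
            efficient.append(a)
    return efficient

candidates = {"z1": [1.0, 2.0], "z2": [0.0, 2.0], "z3": [1.0, 3.0]}
print(l_robust_efficient(candidates))  # ['z2']
```

Here z2 strictly improves on both z1 and z3 under the lower set less order, so it is the only candidate retained.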

Theorem 5.
Let H_k : M × R^m → 2^R, k = 1, . . . , q, and B_j : M × R^l → 2^R, j = 1, . . . , l, be set-valued maps, z̄ ∈ M, ȳ ∈ ⋂_{u_i∈U} H_k(z̄, u_i) and ȳ_j ∈ ⋂_{v_j∈V_j} B_j(z̄, v_j). Assume that the following conditions hold: (iii) for any i, j and k, H_k(z̄, u_i) − ȳ ⊆ R_+ and B_j(z̄, v_j) − ȳ_j ⊆ R_+; (iv) for any j and k, H_k and B_j are second-order strong subdifferentiable at (z̄, ȳ) and (z̄, ȳ_j), respectively.
Then z̄ is an ≼ˡ_{R_+}-robust efficient solution to problem (USOP) if and only if for any i, j and k, there exist ŭ_i ∈ U, v̆_j ∈ V_j and μ̆_j ∈ R_+ such that 0 ∈ ∂²_s H_k(·, ŭ_i)(z̄, ȳ) and 0 ∈ ∑_{j=1}^{l} μ̆_j ∂²_s B_j(·, v̆_j)(z̄, ȳ_j). Proof. (⇒) Let z̄ be an ≼ˡ_{R_+}-robust efficient solution to problem (USOP). Then z̄ ∈ A. Hence, for all v_j ∈ V_j, we have B_j(z̄, v_j) ⊆ R_−. Thus, take v̆_j ∈ V_j such that Moreover, for any j, there exists μ̆_j ∈ R_+ such that In fact, there are two cases to illustrate (5) as follows: Since U is a finite set and H_k is bounded, there exists ŭ_i ∈ U such that According to the definition of the second-order strong subdifferential, one obtains 0 ∈ ∂²_s H_k(·, ŭ_i)(z̄, ȳ) and 0 ∈ ∑_{j=1}^{l} μ̆_j ∂²_s B_j(·, v̆_j)(z̄, ȳ_j).
(⇐) Assume that for any i, j and k, there exist z̄ ∈ A, ŭ_i ∈ U, v̆_j ∈ V_j and μ̆_j ∈ R_+ such that 0 ∈ ∂²_s H_k(·, ŭ_i)(z̄, ȳ) and 0 ∈ ∑_{j=1}^{l} μ̆_j ∂²_s B_j(·, v̆_j)(z̄, ȳ_j). By Theorem 3 and Corollary 1, we get Obviously, ȳ ∈ H_k(z̄, ŭ_i) and ȳ_j ∈ B_j(z̄, v̆_j). Then by Definition 7, we get Since (μ̆_j B_j)(z̄, v̆_j) = {0} for any j, we calculate that ∑_{j=1}^{l} (μ̆_j² B_j)(z̄, v̆_j) = {0}, i.e., for the preceding element ȳ_j ∈ B_j(z̄, v̆_j), we have ∑_{j=1}^{l} μ̆_j² ȳ_j = 0. Together with ∑_{j=1}^{l} (μ̆_j² B_j)(z, v̆_j) ⊆ R_− for all z ∈ A, i.e., ∑_{j=1}^{l} μ̆_j² y_j ∈ R_− for all z ∈ A and y_j ∈ B_j(z, v̆_j), it follows from (7) that y − ȳ ∈ R_+ for all z ∈ A and y ∈ H_k(z, ŭ_i), i.e., H_k(z̄, ŭ_i) ≼ˡ_{R_+} H_k(z, ŭ_i) for all z ∈ A. Moreover, by the transitivity of the ≼ˡ_{R_+} set-order relation, it follows from (6) and H_k(z, ŭ_i) ≼ˡ_{R_+} max_{u_i∈U} H_k(z, u_i) that z̄ is an ≼ˡ_{R_+}-robust efficient solution to problem (USOP). This proof is complete.

Remark 4.
(i) We extend the uncertain scalar optimization problem in [32] (Theorem 3.1) to the uncertain set-valued optimization problem (USOP) in Theorem 5.

Data Availability Statement: Not applicable.

Conflicts of Interest:
The author declares no conflict of interest.