Novel Three-Way Decisions Models with Multi-Granulation Rough Intuitionistic Fuzzy Sets

Abstract: The existing construction methods of granularity importance degree consider only the direct influence of a single granularity on decision-making; they ignore the joint impact of the other granularities when granularity selection is carried out. In this regard, we make the following improvements. First, we define a more reasonable method for calculating granularity importance degree among multiple granularities to address the above problem, and we give a granularity reduction algorithm based on it. In addition, this paper combines the reduction sets of optimistic and pessimistic multi-granulation rough sets with intuitionistic fuzzy sets, respectively, and their related properties are shown synchronously. On this basis, to further reduce the redundant objects in each granularity of the reduction sets, four novel kinds of three-way decisions models with multi-granulation rough intuitionistic fuzzy sets are developed. Moreover, a series of concrete examples demonstrates that these joint models not only remove the redundant objects inside each granularity of the reduction sets, but also generate more suitable granularity selection results using the designed comprehensive score function and comprehensive accuracy function of granularities.


Introduction
Pawlak [1,2] proposed rough sets theory in 1982 as a method of dealing with inaccuracy and uncertainty, and it has since been developed into a variety of theories [3][4][5][6]. For example, the multi-granulation rough sets (MRS) model, proposed by Qian et al. [9], is one of the important developments [7,8]; MRS can also be regarded as a mathematical framework for granular computing. Within MRS, granularity reduction is a vital research topic. Considering the test cost of granularity structure selection in data mining and machine learning, Yang et al. constructed two reduction algorithms for cost-sensitive multi-granulation decision-making systems based on the definition of approximate quality [10]. By introducing the concept of distribution reduction [11] and taking the quality of approximate distribution as the measure in the multi-granulation decision rough sets model, Sang et al. proposed an α-lower approximate distribution reduction algorithm based on multi-granulation decision rough sets; however, the interactions among multiple granularities were not considered [12]. To overcome the problem of updating reductions when large-scale data vary dynamically, Jing et al. developed an incremental attribute reduction approach based on knowledge granularity with a multi-granulation view [13]. Other multi-granulation reduction methods have since been put forward one after another [14][15][16][17].
The notion of intuitionistic fuzzy sets (IFS), proposed by Atanassov [18,19], was initially developed in the framework of fuzzy sets [20,21]. In the previous literature, how to obtain reasonable membership and non-membership functions is a key issue. To deal with fuzzy information better, many experts and scholars have extended the IFS model. Huang et al. combined IFS with MRS to obtain intuitionistic fuzzy MRS [22]. On the basis of fuzzy rough sets, Liu et al. constructed covering-based multi-granulation fuzzy rough sets [23]. Moreover, a multi-granulation rough intuitionistic fuzzy cut sets model was structured by Xue et al. [24]. To reduce classification errors and the limitation of ordering by a single theory, they further combined IFS with graded rough sets theory based on dominance relation and extended them to a multi-granulation perspective [25]. Under optimistic multi-granulation intuitionistic fuzzy rough sets, Wang et al. proposed a novel method to solve multiple criteria group decision-making problems [26]. However, the above studies rarely address the optimal granularity selection problem in intuitionistic fuzzy environments. The measure of similarity between intuitionistic fuzzy sets is also a hot research area, and some similarity measures for IFS are summarized in references [27][28][29]; however, these metric formulas cannot measure the importance degree of multiple granularities in the same IFS.
To further explain the semantics of decision-theoretic rough sets (DTRS), Yao proposed the three-way decisions theory [30,31], which greatly pushed forward the development of rough sets. As a risk decision-making method, the key strategy of three-way decisions is to divide the domain into acceptance, rejection, and non-commitment regions. Up to now, researchers have accumulated a vast literature on its theory and applications. For instance, to widen the applicability of the three-way decisions model in uncertain environments, Zhai et al. extended three-way decisions models to tolerance rough fuzzy sets and rough fuzzy sets, respectively [32,33]. To accommodate the situation where the objects or attributes in a multi-scale decision table are sequentially updated, Hao et al. used sequential three-way decisions to investigate the optimal scale selection problem [34]. Subsequently, Luo et al. applied three-way decisions theory to incomplete multi-scale information systems [35]. With respect to multiple attribute decision-making, Zhang et al. studied the inclusion relations of neutrosophic sets in reference [36]. To improve the classification accuracy of three-way decisions, Zhang et al. proposed a novel three-way decisions model with DTRS by considering new risk measurement functions through utility theory [37]. Yang et al. combined three-way decisions theory with IFS to obtain novel three-way decision rules [38]. At the same time, Liu et al. explored intuitionistic fuzzy three-way decision theory based on intuitionistic fuzzy decision systems [39]. Nevertheless, Yang et al. [38] and Liu et al. [39] only considered the case of a single granularity, and did not analyze the decision-making situation of multiple granularities in an intuitionistic fuzzy environment.
DTRS and three-way decisions theory are both used to deal with decision-making problems, so DTRS is also enlightening for the study of three-way decisions theory. Liang et al. introduced an extended version applicable to multi-period scenarios using intuitionistic fuzzy decision-theoretic rough sets [40]. Furthermore, they introduced the intuitionistic fuzzy point operator into DTRS [41]. Three-way decisions have also been applied in multiple attribute group decision-making [42], the supplier selection problem [43], clustering analysis [44], cognitive computing [45], and so on. However, these works have not applied three-way decisions theory to the optimal granularity selection problem. To solve this problem, we expand the three-way decisions models.
The main contributions of this paper include four points: (1) New granularity importance degree calculating methods among multiple granularities (i.e., sig^{Δ,in}_1(A_i, A′, D) and sig^{Δ,out}_1(A_i, A′, D)) are given respectively, which can generate more discriminative granularities.
(2) The optimistic-optimistic multi-granulation rough intuitionistic fuzzy sets (OOMRIFS) model, the optimistic-pessimistic multi-granulation rough intuitionistic fuzzy sets (OIMRIFS) model, the pessimistic-optimistic multi-granulation rough intuitionistic fuzzy sets (IOMRIFS) model, and the pessimistic-pessimistic multi-granulation rough intuitionistic fuzzy sets (IIMRIFS) model are constructed by combining intuitionistic fuzzy sets with the reductions of the optimistic and pessimistic multi-granulation rough sets. These four models can reduce the subjective errors caused by a single intuitionistic fuzzy set.
(3) We put forward four kinds of three-way decisions models based on the proposed four multi-granulation rough intuitionistic fuzzy sets (MRIFS), which can further reduce the redundant objects in each granularity of reduction sets.
(4) Comprehensive score function and comprehensive accuracy function based on MRIFS are constructed. Based on this, we can obtain the optimal granularity selection results.
The rest of this paper is organized as follows. In Section 2, some basic concepts of MRS, IFS, and three-way decisions are briefly reviewed. In Section 3, we propose two new granularity importance degree calculating methods and a granularity reduction algorithm (Algorithm 1), together with a comparative example. Four novel MRIFS models are constructed in Section 4, and the properties of the four models are verified by Example 2. Section 5 proposes novel three-way decisions models based on the above four new MRIFS, and builds the comprehensive score function and comprehensive accuracy function based on MRIFS; through Algorithm 2, we make the optimal granularity selection. In Section 6, we use Example 3 to illustrate the three-way decisions models based on the new MRIFS. Section 7 concludes this paper.

Preliminaries
The basic notions of MRS, IFS, and three-way decisions theory are briefly reviewed in this section. Throughout the paper, we denote U as a nonempty object set, i.e., the universe of discourse and A = {A 1 , A 2 , · · · , A m } is an attribute set.

Definition 1 ([9]). Suppose IS = <U, A, V, f> is a consistent information system, where A = {A_1, A_2, ..., A_m} is an attribute set, R_{A_i} is the equivalence relation generated by A_i, and [x]_{A_i} is the equivalence class of x under R_{A_i}. For ∀X ⊆ U, the lower and upper approximations of the optimistic multi-granulation rough sets (OMRS) of X are defined by the following two formulas:

L^O_A(X) = {x ∈ U : [x]_{A_1} ⊆ X ∨ [x]_{A_2} ⊆ X ∨ ··· ∨ [x]_{A_m} ⊆ X},
L̄^O_A(X) = ∼L^O_A(∼X),

where ∼X denotes the complement of X. The pair (L^O_A(X), L̄^O_A(X)) is referred to as an optimistic multi-granulation rough set of X.

Definition 2 ([9]). Let IS = <U, A, V, f> be an information system, where A = {A_1, A_2, ..., A_m} is an attribute set, R_{A_i} is the equivalence relation generated by A_i, and [x]_{A_i} is the equivalence class of R_{A_i}. For ∀X ⊆ U, the lower and upper approximations of the pessimistic multi-granulation rough sets (IMRS) of X with respect to A are defined as follows:

L^I_A(X) = {x ∈ U : [x]_{A_1} ⊆ X ∧ [x]_{A_2} ⊆ X ∧ ··· ∧ [x]_{A_m} ⊆ X},
L̄^I_A(X) = ∼L^I_A(∼X).

The pair (L^I_A(X), L̄^I_A(X)) is referred to as a pessimistic multi-granulation rough set of X.
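As a concrete illustration of Definitions 1 and 2, the two lower approximations can be sketched in Python; the universe, partitions, and target set below are invented toy data, not from the paper:

```python
# Toy illustration of the optimistic vs. pessimistic multi-granulation
# lower approximations (Definitions 1 and 2); all data are invented.
def equivalence_class(partition, x):
    """Return the block of the partition that contains x."""
    for block in partition:
        if x in block:
            return block
    raise ValueError(f"{x} is not covered by the partition")

def optimistic_lower(partitions, X, U):
    """Optimistic: [x]_{A_i} ⊆ X must hold for AT LEAST ONE granularity."""
    return {x for x in U if any(equivalence_class(p, x) <= X for p in partitions)}

def pessimistic_lower(partitions, X, U):
    """Pessimistic: [x]_{A_i} ⊆ X must hold for EVERY granularity."""
    return {x for x in U if all(equivalence_class(p, x) <= X for p in partitions)}

U = {1, 2, 3, 4}
A1 = [{1, 2}, {3}, {4}]   # partition induced by granularity A_1
A2 = [{1}, {2, 3}, {4}]   # partition induced by granularity A_2
X = {1, 2, 4}

print(optimistic_lower([A1, A2], X, U))  # {1, 2, 4}
print(pessimistic_lower([A1, A2], X, U))  # {1, 4}
```

Object 2 is accepted optimistically (its A_1 class {1, 2} lies inside X) but rejected pessimistically (its A_2 class {2, 3} does not), which is exactly the ∨ versus ∧ distinction between the two definitions.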
Definition 3 ([18,19]). Let U be a finite non-empty universe set; then an IFS E in U is denoted by E = {<x, μ_E(x), ν_E(x)> | x ∈ U}, where μ_E(x) ∈ [0, 1] and ν_E(x) ∈ [0, 1] are the membership and non-membership degrees of x in E, with 0 ≤ μ_E(x) + ν_E(x) ≤ 1, and π_E(x) = 1 − μ_E(x) − ν_E(x) is the hesitancy degree of x. For two IFS E_1 and E_2, the basic operations of E_1 and E_2 are given as follows:

E_1 ⊆ E_2 iff μ_{E_1}(x) ≤ μ_{E_2}(x) and ν_{E_1}(x) ≥ ν_{E_2}(x), ∀x ∈ U;
E_1 ∩ E_2 = {<x, min(μ_{E_1}(x), μ_{E_2}(x)), max(ν_{E_1}(x), ν_{E_2}(x))> | x ∈ U};
E_1 ∪ E_2 = {<x, max(μ_{E_1}(x), μ_{E_2}(x)), min(ν_{E_1}(x), ν_{E_2}(x))> | x ∈ U};
∼E_1 = {<x, ν_{E_1}(x), μ_{E_1}(x)> | x ∈ U}.

Definition 4 ([30,31]). Let U = {x_1, x_2, ..., x_n} be a universe of discourse, and let ξ = {ω_P, ω_N, ω_B} represent the decisions of dividing an object x into the acceptance region POS(X), the rejection region NEG(X), and the boundary region BND(X), respectively. The cost functions λ_PP, λ_BP and λ_NP represent the three decision-making costs for ∀x ∈ X, and the cost functions λ_PN, λ_BN and λ_NN represent the three decision-making costs for ∀x ∉ X, as shown in Table 1.

Table 1. Cost matrix of decision actions.

Action          x ∈ X    x ∉ X
ω_P (accept)    λ_PP     λ_PN
ω_B (defer)     λ_BP     λ_BN
ω_N (reject)    λ_NP     λ_NN

The expected losses of taking each decision action for an object x are:

R(ω_P | [x]) = λ_PP Pr(X | [x]) + λ_PN Pr(∼X | [x]),
R(ω_B | [x]) = λ_BP Pr(X | [x]) + λ_BN Pr(∼X | [x]),
R(ω_N | [x]) = λ_NP Pr(X | [x]) + λ_NN Pr(∼X | [x]).
According to the minimum-risk principle of the Bayesian decision procedure, three-way decisions rules can be obtained as follows:

(P): If Pr(X | [x]) ≥ α, then x ∈ POS(X);
(N): If Pr(X | [x]) ≤ β, then x ∈ NEG(X);
(B): If β < Pr(X | [x]) < α, then x ∈ BND(X).

Here α, β and γ represent, respectively:

α = (λ_PN − λ_BN) / ((λ_PN − λ_BN) + (λ_BP − λ_PP)),
β = (λ_BN − λ_NN) / ((λ_BN − λ_NN) + (λ_NP − λ_BP)),
γ = (λ_PN − λ_NN) / ((λ_PN − λ_NN) + (λ_NP − λ_PP)).
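The thresholds α, β, γ and the rules (P)-(B) above can be sketched as follows; the cost values are illustrative assumptions, chosen to satisfy the usual ordering λ_PP ≤ λ_BP < λ_NP and λ_NN ≤ λ_BN < λ_PN:

```python
# Sketch of the Bayesian three-way thresholds and decision rule.
def thresholds(l_pp, l_bp, l_np, l_pn, l_bn, l_nn):
    """Compute (α, β, γ) from the six decision costs."""
    alpha = (l_pn - l_bn) / ((l_pn - l_bn) + (l_bp - l_pp))
    beta  = (l_bn - l_nn) / ((l_bn - l_nn) + (l_np - l_bp))
    gamma = (l_pn - l_nn) / ((l_pn - l_nn) + (l_np - l_pp))
    return alpha, beta, gamma

def decide(pr, alpha, beta):
    """Rules (P), (N), (B): classify by the conditional probability Pr(X|[x])."""
    if pr >= alpha:
        return "POS"
    if pr <= beta:
        return "NEG"
    return "BND"

# Illustrative (invented) costs: λ_PP=0, λ_BP=2, λ_NP=6, λ_PN=8, λ_BN=3, λ_NN=0.
a, b, g = thresholds(0, 2, 6, 8, 3, 0)
print(round(a, 3), round(b, 3), round(g, 3))  # 0.714 0.429 0.571
```

With reasonable costs we get β < γ < α, so the three rules partition [0, 1] into the three regions.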

Granularity Reduction Algorithm Derives from Granularity Importance Degree
Definition 5 ([10,12]). Let DIS = (U, C ∪ D, V, f) be a decision information system, where A = {A_1, A_2, ..., A_m} are m sub-attributes of the condition attributes C, and U/D = {X_1, X_2, ..., X_s} is the partition induced by the decision attributes D. Then the approximation quality of U/D with respect to the granularity set A is defined as:

γ^Δ_A(D) = |∪_{j=1}^{s} L^Δ_A(X_j)| / |U|,

where |X| denotes the cardinal number of set X, L^Δ_A(X_j) is the multi-granulation lower approximation of X_j, and Δ ∈ {O, I} represents the two cases of optimistic and pessimistic multi-granulation rough sets, the same as in the following.
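A minimal sketch of Definition 5, taking the optimistic case (Δ = O) and reusing the subset test over equivalence classes; the partitions and decision classes are invented toy data:

```python
# Sketch of Definition 5: approximation quality γ_A(D) as the fraction of
# objects captured by some lower approximation. All data are invented.
def equivalence_class(partition, x):
    for block in partition:
        if x in block:
            return block

def optimistic_lower(partitions, X, U):
    return {x for x in U if any(equivalence_class(p, x) <= X for p in partitions)}

def approximation_quality(partitions, decision_classes, U):
    """γ_A(D) = |∪_j lower(X_j)| / |U| for the optimistic case."""
    covered = set()
    for Xj in decision_classes:
        covered |= optimistic_lower(partitions, Xj, U)
    return len(covered) / len(U)

U = {1, 2, 3, 4}
A1 = [{1, 2}, {3}, {4}]
A2 = [{1}, {2, 3}, {4}]
print(approximation_quality([A1, A2], [{1, 2}, {3, 4}], U))  # 1.0
print(approximation_quality([A1, A2], [{1, 3}, {2, 4}], U))  # 0.75
```

In the second call, object 2 falls into no lower approximation (its classes straddle both decision classes), so the quality drops below 1.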
Definition 7 ([10,12]). Let DIS = (U, C ∪ D, V, f) be a decision information system, A = {A_1, A_2, ..., A_m} are m sub-attributes of C, and A′ ⊆ A. For ∀A_i ∈ A′, on the granularity set A′, the internal importance degree of A_i for D can be defined as follows:

sig^{Δ,in}(A_i, A′, D) = γ^Δ_{A′}(D) − γ^Δ_{A′−{A_i}}(D).

Definition 8 ([10,12]). Let DIS = (U, C ∪ D, V, f) be a decision information system, A = {A_1, A_2, ..., A_m} are m sub-attributes of C, and A′ ⊆ A. For ∀A_i ∈ A − A′, on the granularity set A′, the external importance degree of A_i for D can be defined as follows:

sig^{Δ,out}(A_i, A′, D) = γ^Δ_{A′∪{A_i}}(D) − γ^Δ_{A′}(D).

Theorem 1. Let DIS = (U, C ∪ D, V, f) be a decision information system, A = {A_1, A_2, ..., A_m} are m sub-attributes of C, and A′ ⊆ A.
(1) For ∀A_i ∈ A′, on the basis of the attribute subset family A′, the granularity importance degree of A_i in A′ with respect to D can be expressed as follows:

sig^{Δ,in}(A_i, A′ − {A_k}, D) = γ^Δ_{A′−{A_k}}(D) − γ^Δ_{A′−{A_k}−{A_i}}(D),

where 1 ≤ k ≤ m, k ≠ i, the same as in the following.
(2) For ∀A_i ∈ A − A′, on the basis of the attribute subset family A′, the granularity importance degree of A_i in A − A′ with respect to D satisfies:

sig^{Δ,out}(A_i, A′ − {A_k}, D) = γ^Δ_{(A′−{A_k})∪{A_i}}(D) − γ^Δ_{A′−{A_k}}(D).
Proof. (1) It follows directly from Definition 7. (2) It follows directly from Definition 8. □

In Definitions 7 and 8, only the direct effect of a single granularity on the whole granularity set is considered, without the indirect effect of the remaining granularities on decision-making. The following Definitions 9 and 10 synthetically analyze the interdependence between multiple granularities and present two new methods for calculating granularity importance degree.
Definition 9. Let DIS = (U, C ∪ D, V, f) be a decision information system and A = {A_1, A_2, ..., A_m}. For ∀A_i ∈ A, on the attribute subset family A, the new internal importance degree of A_i relative to D is defined as follows:

sig^{Δ,in}_1(A_i, A, D) = sig^{Δ,in}(A_i, A, D) + (1/(m−1)) ∑_{k=1, k≠i}^{m} |sig^{Δ,in}(A_k, A − {A_i}, D) − sig^{Δ,in}(A_k, A, D)|.

Here sig^{Δ,in}(A_i, A, D) and |sig^{Δ,in}(A_k, A − {A_i}, D) − sig^{Δ,in}(A_k, A, D)| respectively indicate the direct and indirect effects of granularity A_i on decision-making. When |sig^{Δ,in}(A_k, A − {A_i}, D) − sig^{Δ,in}(A_k, A, D)| > 0, the granularity importance degree of A_k is changed by the addition of A_i to the attribute subset A − {A_i}, so this change should be credited to A_i; since there are m sub-attributes, the factor 1/(m−1) averages the contribution over the remaining granularities. When the term equals 0, there is no interaction between granularity A_i and the other granularities, which means sig^{Δ,in}_1(A_i, A, D) = sig^{Δ,in}(A_i, A, D).

Definition 10. For ∀A_i ∈ A − A′, the new external importance degree of A_i relative to D is defined as follows:

sig^{Δ,out}_1(A_i, A′, D) = sig^{Δ,out}(A_i, A′, D) + (1/(m−1)) ∑_{k=1, k≠i}^{m} |sig^{Δ,out}(A_k, A′ ∪ {A_i}, D) − sig^{Δ,out}(A_k, A′, D)|.

Similarly, the new external importance degree calculation formula has an analogous interpretation.
Theorem 2. The improved internal importance degree can be rewritten as:

sig^{Δ,in}_1(A_i, A, D) = γ^Δ_A(D) − γ^Δ_{A−{A_i}}(D) + (1/(m−1)) ∑_{k=1, k≠i}^{m} |γ^Δ_{A−{A_i}}(D) − γ^Δ_{A−{A_i}−{A_k}}(D) − γ^Δ_A(D) + γ^Δ_{A−{A_k}}(D)|.

Theorem 3. The improved external importance degree can be expressed as follows:

sig^{Δ,out}_1(A_i, A′, D) = γ^Δ_{A′∪{A_i}}(D) − γ^Δ_{A′}(D) + (1/(m−1)) ∑_{k=1, k≠i}^{m} |γ^Δ_{A′∪{A_i,A_k}}(D) − γ^Δ_{A′∪{A_i}}(D) − γ^Δ_{A′∪{A_k}}(D) + γ^Δ_{A′}(D)|.
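The direct term of Definition 7 and the improved measure of Definition 9 can be sketched as below. The quality table is invented toy data, and the improved formula follows one reading of Definition 9 (direct effect plus the averaged absolute indirect effect), so treat it as an assumption rather than the paper's exact computation:

```python
# Sketch of sig^{in} (Definition 7) and its improved version (Definition 9),
# written over an approximation-quality oracle quality(S) = γ_S(D).
def sig_in(quality, Ai, A):
    """Direct effect: γ_A(D) − γ_{A−{A_i}}(D)."""
    return quality(A) - quality(A - {Ai})

def sig_in_improved(quality, Ai, A):
    """Direct effect plus the averaged indirect effect of A_i on every A_k."""
    m = len(A)
    indirect = sum(abs(sig_in(quality, Ak, A - {Ai}) - sig_in(quality, Ak, A))
                   for Ak in A if Ak != Ai)
    return sig_in(quality, Ai, A) + indirect / (m - 1)

# Invented quality table over subsets of three granularities {a, b, c}:
table = {
    frozenset("abc"): 1.0, frozenset("ab"): 0.9, frozenset("ac"): 0.8,
    frozenset("bc"): 0.7, frozenset("a"): 0.6, frozenset("b"): 0.5,
    frozenset("c"): 0.4, frozenset(): 0.0,
}
quality = lambda S: table[frozenset(S)]
A = frozenset("abc")
print(round(sig_in(quality, "a", A), 3))           # direct effect only
print(round(sig_in_improved(quality, "a", A), 3))  # direct + indirect
```

Here the improved value exceeds the plain one because removing "a" changes how much "b" and "c" each contribute, and that interaction is credited back to "a".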

Theorems 2 and 3 show that when sig^{Δ,in}_1(A_i, A, D) = sig^{Δ,in}(A_i, A, D) (or sig^{Δ,out}_1(A_i, A′, D) = sig^{Δ,out}(A_i, A′, D)), granularity A_i has no interaction with the remaining granularities. Moreover, each granularity importance degree is calculated on the basis of removing A_k from A′, which makes it more convenient to choose the required granularities.
According to [10,12], we can obtain the optimistic and pessimistic multi-granulation lower approximations L^O and L^I. The granularity reduction algorithm based on the improved granularity importance degree is derived from Theorems 2 and 3, as shown in Algorithm 1.

Algorithm 1. Granularity reduction algorithm derived from granularity importance degree

Input: DIS = (U, C ∪ D, V, f) and the granularity set A = {A_1, A_2, ..., A_m}; Output: a granularity reduction set A^Δ_i.
1: compute the approximation quality γ^Δ_A(D);
2: for each A_i ∈ A, compute the improved internal importance degree sig^{Δ,in}_1(A_i, A, D) by Theorem 2;
3: let A^Δ_i = {A_i ∈ A : sig^{Δ,in}_1(A_i, A, D) > 0};
4: while γ^Δ_{A^Δ_i}(D) ≠ γ^Δ_A(D) do
5:    for each A_j ∈ A − A^Δ_i, compute the improved external importance degree sig^{Δ,out}_1(A_j, A^Δ_i, D) by Theorem 3;
6:    add the granularity with the maximum external importance degree to A^Δ_i;
7: end
8: return granularity reduction set A^Δ_i;

Therefore, we can obtain two reductions (Δ = O and Δ = I) by utilizing Algorithm 1.

Example 1. We calculate the granularity importance of the 10 on-line investment schemes given in Reference [12]. After comparing and analyzing the obtained granularity importance degrees, the reduction results of the 5 evaluation sites can be obtained through Algorithm 1; the detailed calculation steps are as follows.
According to [12]:
(1) Reduction set of OMRS. First, the internal importance degree of OMRS can be calculated by Theorem 2, as shown in Table 2.

Table 2. Internal importance degree of optimistic multi-granulation rough sets (OMRS).

Then, according to Algorithm 1, the granularities with internal importance degree greater than 0 constitute the initial granularity set. As shown in Table 2, when the new method is used to calculate the internal importance degree, more discriminative granularities can be generated, which makes it more convenient to screen out the required granularities. In the literature [12], the approximate quality of granularity A_2 in the reduction set differs from that of the whole granularity set, so it is necessary to calculate the external importance degree again. When calculating the internal and external importance degrees, References [10,12] only considered the direct influence of a single granularity on decision-making, so the influence of granularity A_2 on the overall decision-making cannot be fully reflected.
(2) Reduction set of IMRS. Similarly, by using Theorem 2, we can get the internal importance degree of each site under IMRS, as shown in Table 3.

Table 3. Internal importance degree of pessimistic multi-granulation rough sets (IMRS).

According to Algorithm 1, sites 2, 4, and 5, whose internal importance degrees are greater than 0, are added to the granularity reduction set as the initial granularity set, and then its approximate quality can be calculated as follows:

Namely, the reduction set of IMRS is
In this paper, when calculating the internal and external importance degree of each granularity, the influence of removing the other granularities on decision-making is also considered. According to Theorem 2, after calculating the internal importance degrees of OMRS and IMRS, if the approximate quality of each granularity in the reduction sets is the same as that of the overall granularity set, it is unnecessary to calculate the external importance degree again, which reduces the amount of computation.
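The reduction loop that Algorithm 1 describes can be sketched as follows, assuming an approximation-quality oracle γ over granularity subsets. For brevity the sketch uses the plain importance degrees of Definitions 7 and 8 rather than the improved ones, and the quality table is invented:

```python
# Hedged sketch of the granularity reduction loop (Algorithm 1-style):
# keep granularities with positive internal importance, then grow the
# candidate set by external importance until the quality matches γ_A(D).
def granularity_reduction(quality, A):
    sig_in = lambda Ai, S: quality(S) - quality(S - {Ai})
    sig_out = lambda Ai, S: quality(S | {Ai}) - quality(S)
    reduct = {Ai for Ai in A if sig_in(Ai, frozenset(A)) > 0}
    while quality(frozenset(reduct)) < quality(frozenset(A)):
        # add the outside granularity with the largest external importance
        best = max(A - reduct, key=lambda Ai: sig_out(Ai, frozenset(reduct)))
        reduct.add(best)
    return reduct

# Invented quality table: granularity "c" is redundant given {a, b}.
table = {
    frozenset("abc"): 1.0, frozenset("ab"): 1.0, frozenset("ac"): 0.8,
    frozenset("bc"): 0.7, frozenset("a"): 0.6, frozenset("b"): 0.5,
    frozenset("c"): 0.4, frozenset(): 0.0,
}
quality = lambda S: table[frozenset(S)]
print(granularity_reduction(quality, set("abc")))  # {'a', 'b'}
```

Since γ_{ab}(D) already equals γ_{abc}(D), the while-loop never fires and "c" is dropped, mirroring the observation above that no external-importance pass is needed when the qualities coincide.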

Novel Multi-Granulation Rough Intuitionistic Fuzzy Sets Models
In Example 1, two reduction sets are obtained under IMRS, so a novel method is needed to obtain a more accurate granularity selection result from the granularity reductions.
In order to obtain the optimal site selection result, we combine the optimistic and pessimistic multi-granulation reduction sets based on Algorithm 1 with IFS, respectively, and construct the following four new MRIFS models.

Definition 11 ([22,25]). Suppose IS = (U, A, V, f) is an information system, A = {A_1, A_2, ..., A_m}, and E is an IFS on U. Then the lower and upper approximations of the optimistic MRIFS of E are respectively defined by:

L^O_A(E) = {<x, μ^O(x), ν^O(x)> | x ∈ U}, with μ^O(x) = ∨_{i=1}^{m} ∧_{y∈[x]_{A_i}} μ_E(y) and ν^O(x) = ∧_{i=1}^{m} ∨_{y∈[x]_{A_i}} ν_E(y);
L̄^O_A(E) = {<x, μ̄^O(x), ν̄^O(x)> | x ∈ U}, with μ̄^O(x) = ∧_{i=1}^{m} ∨_{y∈[x]_{A_i}} μ_E(y) and ν̄^O(x) = ∨_{i=1}^{m} ∧_{y∈[x]_{A_i}} ν_E(y),

where R_{A_i} is an equivalence relation of x in A, [x]_{A_i} is the equivalence class of R_{A_i}, and ∨ is a disjunction operation.

Definition 12 ([22,25]). Suppose IS = <U, A, V, f> is an information system, A = {A_1, A_2, ..., A_m}, and E is an IFS on U. Then the lower and upper approximations of the pessimistic MRIFS of E can be described as follows:

L^I_A(E) = {<x, μ^I(x), ν^I(x)> | x ∈ U}, with μ^I(x) = ∧_{i=1}^{m} ∧_{y∈[x]_{A_i}} μ_E(y) and ν^I(x) = ∨_{i=1}^{m} ∨_{y∈[x]_{A_i}} ν_E(y);
L̄^I_A(E) = {<x, μ̄^I(x), ν̄^I(x)> | x ∈ U}, with μ̄^I(x) = ∨_{i=1}^{m} ∨_{y∈[x]_{A_i}} μ_E(y) and ν̄^I(x) = ∧_{i=1}^{m} ∧_{y∈[x]_{A_i}} ν_E(y),

where [x]_{A_i} is the equivalence class of x about the equivalence relation R_{A_i}, and ∧ is a conjunction operation.
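A hedged sketch of the lower approximations of Definitions 11 and 12: the optimistic model takes the best (∨) across granularities of the worst-case membership over each equivalence class, while the pessimistic model takes the worst across granularities, and dually for non-membership. The partitions and membership values below are invented:

```python
# Toy sketch of the optimistic vs. pessimistic MRIFS lower approximations;
# mu/nu hold the membership and non-membership degrees of an IFS E.
def eq_class(partition, x):
    return next(b for b in partition if x in b)

def mrifs_optimistic_lower(partitions, mu, nu, x):
    """(μ, ν) of x in the optimistic lower approximation: ∨ of class-minima."""
    m = max(min(mu[y] for y in eq_class(p, x)) for p in partitions)
    n = min(max(nu[y] for y in eq_class(p, x)) for p in partitions)
    return m, n

def mrifs_pessimistic_lower(partitions, mu, nu, x):
    """Pessimistic counterpart: ∧ across granularities instead of ∨."""
    m = min(min(mu[y] for y in eq_class(p, x)) for p in partitions)
    n = max(max(nu[y] for y in eq_class(p, x)) for p in partitions)
    return m, n

A1 = [{1, 2}, {3, 4}]
A2 = [{1}, {2, 3}, {4}]
mu = {1: 0.7, 2: 0.5, 3: 0.9, 4: 0.8}
nu = {1: 0.2, 2: 0.4, 3: 0.05, 4: 0.1}
print(mrifs_optimistic_lower([A1, A2], mu, nu, 3))   # (0.8, 0.1)
print(mrifs_pessimistic_lower([A1, A2], mu, nu, 3))  # (0.5, 0.4)
```

For object 3, the optimistic model trusts its favorable A_1 class {3, 4}, whereas the pessimistic model is dragged down by the mixed A_2 class {2, 3}, which is exactly why the pessimistic pair is more conservative.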
Definition 13. Let E be an IFS on U, and let A^O_i be an optimistic multi-granulation attribute reduction set obtained by Algorithm 1. If E is characterized by the pair of optimistic lower and upper approximations built on A^O_i as in Definition 11, then E can be called an OOMRIFS.
Definition 14. Suppose A^O_i is an optimistic multi-granulation attribute reduction set. Then the lower and upper approximations of the pessimistic MRIFS of E under the optimistic multi-granulation environment are defined on A^O_i as in Definition 12, and the resulting pair characterizes E as an OIMRIFS. According to Definitions 13 and 14, the following theorem can be obtained.
Theorem 4. Let IS = <U, A, V, f> be an information system, A^O_i = {A_1, A_2, ..., A_r} ⊆ A, A = {A_1, A_2, ..., A_m}, and let E_1, E_2 be IFS on U. Comparing Definitions 13 and 14, the following properties are obtained.
Proof. It is easy to prove by Definitions 13 and 14. □
Definition 15. Let IS = <U, A, V, f> be an information system, and let E be an IFS on U. Suppose A^I_i is a pessimistic multi-granulation attribute reduction set. Then the pessimistic-optimistic lower and upper approximations of E with respect to the equivalence relation R_{A_i^I} are defined on A^I_i as in Definition 11, and the resulting pair characterizes E as an IOMRIFS.

Definition 16.
Let IS = <U, A, V, f> be an information system, and let E be an IFS on U. Suppose A^I_i is a pessimistic multi-granulation attribute reduction set. Then the pessimistic lower and upper approximations of E under IMRS are defined on A^I_i as in Definition 12, where R_{A_i^I} is the equivalence relation of x with respect to the attribute reduction set A^I_i under IMRS, and [x]_{A_i^I} is the equivalence class of R_{A_i^I}; the resulting pair characterizes E as an IIMRIFS.
According to Definitions 15 and 16, the following theorem can be captured.

Theorem 5.
Let IS = <U, A, V, f> be an information system, A^I_i = {A_1, A_2, ..., A_r} ⊆ A, A = {A_1, A_2, ..., A_m}, and let E_1, E_2 be IFS on U. Then the IOMRIFS and IIMRIFS models have the following properties: Proof. It can be derived directly from Definitions 15 and 16. □
The characteristics of the proposed four models are further verified by Example 2 below.

Example 2. (Continued with Example 1). From Example 1, we know that the 5 sites are evaluated by 10 investment schemes, respectively. Suppose they have the corresponding IFS with respect to the 10 schemes.
Regarding Figure 1, we can see that the rules of Theorem 4 are satisfied. By constructing the OOMRIFS and OIMRIFS models, we can reduce the subjective scoring errors of experts under intuitionistic fuzzy conditions.
From (3) and (4), we can obtain Figure 2. Note that μ_5 = μ_{IO}(x_j) and ν_5 = ν_{IO}(x_j) represent the lower approximation of IOMRIFS; μ_6 = μ_{IO}(x_j) and ν_6 = ν_{IO}(x_j) represent the upper approximation of IOMRIFS; μ_7 = μ_{II}(x_j) and ν_7 = ν_{II}(x_j) represent the lower approximation of IIMRIFS; μ_8 = μ_{II}(x_j) and ν_8 = ν_{II}(x_j) represent the upper approximation of IIMRIFS. As shown in Figure 2, the rules of Theorem 5 are satisfied.
Through Example 2, we can obtain four relatively more objective MRIFS models, which help to reduce subjective errors.

Three-Way Decisions Models Based on MRIFS and Optimal Granularity Selection
In order to obtain the optimal granularity selection results in the optimistic and pessimistic multi-granulation cases, it is necessary to further distinguish the importance degree of each granularity in the reduction sets. We respectively combine the four MRIFS models above with three-way decisions theory to get four new three-way decisions models. By extracting the rules, the redundant objects in the reduction sets are removed, and the decision error is further reduced. Then the optimal granularity selection results in the two cases are obtained by constructing the comprehensive score function and comprehensive accuracy function of each granularity of the reduction sets.

Three-Way Decisions Model Based on OOMRIFS
Suppose A^O_i is the reduction set under OMRS. According to reference [46], the expected loss functions R_{OO}(ω_* | [x]_{A^O_i}) (* = P, B, N) of an object x can be obtained, and the minimum-risk decision rules derived from the Bayesian decision process can be re-expressed concisely through the membership and hesitancy degrees. Therefore, the three-way decisions rules based on OOMRIFS are as follows:

(P1): If μ_{OO}(x) ≥ (1 − π_{OO}(x)) · α, then x ∈ POS(X);
(N1): If μ_{OO}(x) ≤ (1 − π_{OO}(x)) · β, then x ∈ NEG(X);
(B1): If (1 − π_{OO}(x)) · β < μ_{OO}(x) < (1 − π_{OO}(x)) · α, then x ∈ BND(X).

Three-Way Decisions Model Based on OIMRIFS
Suppose A^O_i is the reduction set under OMRS. According to reference [46], the expected loss functions R_{OI}(ω_* | [x]_{A^O_i}) (* = P, B, N) of an object x are presented analogously. Therefore, the three-way decisions rules based on OIMRIFS are as follows:

(P2): If μ_{OI}(x) ≥ (1 − π_{OI}(x)) · α, then x ∈ POS(X);
(N2): If μ_{OI}(x) ≤ (1 − π_{OI}(x)) · β, then x ∈ NEG(X);
(B2): If (1 − π_{OI}(x)) · β < μ_{OI}(x) < (1 − π_{OI}(x)) · α, then x ∈ BND(X).

Three-Way Decisions Model Based on IOMRIFS
Suppose A^I_i is the reduction set under IMRS. According to reference [46], the expected loss functions R_{IO}(ω_* | [x]_{A^I_i}) (* = P, B, N) of an object x are obtained analogously. Therefore, the three-way decisions rules based on IOMRIFS are as follows:

(P3): If μ_{IO}(x) ≥ (1 − π_{IO}(x)) · α, then x ∈ POS(X);
(N3): If μ_{IO}(x) ≤ (1 − π_{IO}(x)) · β, then x ∈ NEG(X);
(B3): If (1 − π_{IO}(x)) · β < μ_{IO}(x) < (1 − π_{IO}(x)) · α, then x ∈ BND(X).

Three-Way Decisions Model Based on IIMRIFS
Suppose A^I_i is the reduction set under IMRS. As in Section 5.1, the expected loss functions R_{II}(ω_* | [x]_{A^I_i}) (* = P, B, N) of an object x are obtained. Therefore, the three-way decisions rules based on IIMRIFS are captured as follows:

(P4): If μ_{II}(x) ≥ (1 − π_{II}(x)) · α, then x ∈ POS(X);
(N4): If μ_{II}(x) ≤ (1 − π_{II}(x)) · β, then x ∈ NEG(X);
(B4): If (1 − π_{II}(x)) · β < μ_{II}(x) < (1 − π_{II}(x)) · α, then x ∈ BND(X).

By constructing the above three-way decisions models, the redundant objects in the reduction sets can be removed, which is beneficial to the optimal granularity selection.
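The rule pattern (P4)-(B4), and likewise the rules of the preceding subsections, can be sketched as a single classifier. Note that 1 − π(x) = μ(x) + ν(x); the α and β values below are illustrative, not from the paper:

```python
# Sketch of the intuitionistic fuzzy three-way rules (P4)-(B4):
# compare μ(x) against (1 − π(x))·α and (1 − π(x))·β,
# where π(x) = 1 − μ(x) − ν(x) is the hesitancy degree.
def if_three_way(mu, nu, alpha, beta):
    pi = 1 - mu - nu
    if mu >= (1 - pi) * alpha:
        return "POS"   # rule (P4): accept
    if mu <= (1 - pi) * beta:
        return "NEG"   # rule (N4): reject
    return "BND"       # rule (B4): defer

# Illustrative thresholds α = 0.7, β = 0.4:
print(if_three_way(0.6, 0.2, 0.7, 0.4))  # POS
print(if_three_way(0.3, 0.3, 0.7, 0.4))  # BND
print(if_three_way(0.1, 0.6, 0.7, 0.4))  # NEG
```

Because the thresholds are scaled by 1 − π(x), an object with high hesitancy needs a smaller absolute membership to be accepted, which is the intuitionistic refinement over the plain rules of Section 2.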

Comprehensive Measuring Methods of Granularity
Definition 17 ([40]). Let E(f_1) = (μ_E(f_1), ν_E(f_1)), f_1 ∈ U, be an intuitionistic fuzzy number; then the score function of E(f_1) is calculated as S(E(f_1)) = μ_E(f_1) − ν_E(f_1), and the accuracy function of E(f_1) is defined as H(E(f_1)) = μ_E(f_1) + ν_E(f_1).

Definition 18. Let DIS = (U, C ∪ D) be a decision information system, where A = {A_1, A_2, ..., A_m} are m sub-attributes of C. Suppose E is an IFS on the universe U = {x_1, x_2, ..., x_n} defined by μ_{A_i}(x_j) and ν_{A_i}(x_j), where μ_{A_i}(x_j) and ν_{A_i}(x_j) are its membership and non-membership functions, respectively; |[x_j]_{A_i}| is the number of elements in the equivalence class of x_j on granularity A_i, and U/D = {X_1, X_2, ..., X_s} is the partition induced by the decision attributes D. Then the comprehensive score function CSF_{A_i}(E) of granularity A_i is built from the score functions of the lower and upper approximation values of E on A_i, and the comprehensive accuracy function CAF_{A_i}(E) is built from the corresponding accuracy functions. According to references [27,39], we can deduce the rules of Definition 19.

Definition 19. Let A_1 and A_2 be two granularities; then:
(i) If CSF_{A_1}(E) > CSF_{A_2}(E), then A_2 is smaller than A_1, expressed as A_1 > A_2;
(ii) If CSF_{A_1}(E) < CSF_{A_2}(E), then A_1 is smaller than A_2, expressed as A_1 < A_2;
(iii) If CSF_{A_1}(E) = CSF_{A_2}(E), then the comprehensive accuracy functions are compared: if CAF_{A_1}(E) > CAF_{A_2}(E), then A_1 > A_2; if CAF_{A_1}(E) < CAF_{A_2}(E), then A_1 < A_2.
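Definition 17's score and accuracy functions, together with the score-first, accuracy-second comparison used above, can be sketched as (the sample numbers are invented):

```python
# Sketch of the score/accuracy functions for an intuitionistic fuzzy
# number (μ, ν) and the two-stage comparison of Definition 19.
def score(mu, nu):
    """Score function: S = μ − ν."""
    return mu - nu

def accuracy(mu, nu):
    """Accuracy function: H = μ + ν; breaks ties in score."""
    return mu + nu

def rank(e1, e2):
    """Return 1 if e1 ranks higher, -1 if lower, 0 if fully tied."""
    s1, s2 = score(*e1), score(*e2)
    if s1 != s2:
        return 1 if s1 > s2 else -1
    h1, h2 = accuracy(*e1), accuracy(*e2)
    return 0 if h1 == h2 else (1 if h1 > h2 else -1)

# Equal scores (0.5 each), so accuracy decides in favor of (0.75, 0.25):
print(rank((0.75, 0.25), (0.5, 0.0)))  # 1
```

The same two-stage comparison is what resolves the tie between granularities A_2 and A_3 in Example 3, where the comprehensive scores coincide and the comprehensive accuracies decide.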

Optimal Granularity Selection Algorithm to Derive Three-Way Decisions from MRIFS
Suppose the reduction sets of OMRS and IMRS are A^O_i and A^I_i, respectively. In this section, we take the reduction set under OMRS as an example to obtain the optimal granularity selection result from A^O_i.

Algorithm 2. Optimal granularity selection algorithm to derive three-way decisions from MRIFS

Example Analysis 3 (Continued with Example 2)
In Example 1, only site 1 can be ignored under both the optimistic and pessimistic multi-granulation conditions, so it can be determined that site 1 does not need to be evaluated, while sites 2 and 3 need further investigation under the optimistic multi-granulation environment. At the same time, with respect to the pessimistic multi-granulation environment, comprehensive consideration shows that site 3 can be excluded from assessment, while sites 2, 4 and 5 need further investigation.
According to Example 1, the reduction set of OMRS is {A_2, A_3}; but in the case of IMRS there are two reduction sets, which are contradictory. Therefore, the two reduction sets should be reconsidered simultaneously, and the joint reduction set under IMRS is {A_2, A_4, A_5}.
The corresponding granularity structures of sites 2, 3, 4 and 5 are divided accordingly. According to reference [11], we can get:
(1) Optimal site selection based on OOMRIFS. According to Example 2, we can get the values of the evaluation functions μ_{OO}(x_j), (1 − π_{OO}(x_j)) · α and (1 − π_{OO}(x_j)) · β for the lower and upper approximations of OOMRIFS, as shown in Table 4. We can get the decision results of the lower and upper approximations of OOMRIFS by the three-way decisions of Section 5.1. In light of the three-way decisions rules based on OOMRIFS, after discarding the objects in the rejection domain, we fuse the objects in the delay domain with those in the acceptance domain for the optimal granularity selection; the new granularities A_2 and A_3 are obtained accordingly. From the above results, in OOMRIFS we cannot obtain the selection result for sites 2 and 3 according to the comprehensive score functions of granularities A_2 and A_3 alone. Therefore, we further calculate the comprehensive accuracies; the comprehensive accuracy of granularity A_3 is higher, so site 3 is selected as the selection result.
In reality, we are more inclined to select the optimal granularity under more stringent requirements. According to (1) and (2), we find that granularity A_3 is the better choice when the requirements are stricter in the four cases of OMRS. Therefore, we choose site 3 as the optimal evaluation site.
(3) Optimal site selection based on IOMRIFS. Similar to (1), we can obtain the values of the evaluation functions μ_{IO}(x_j), (1 − π_{IO}(x_j)) · α and (1 − π_{IO}(x_j)) · β for the lower and upper approximations of IOMRIFS, as described in Table 6.

Table 6. The values of evaluation functions for IOMRIFS.

We can get the decision results of the lower and upper approximations of IOMRIFS by the three-way decisions in Section 5.3. In summary, the comprehensive score function of granularity A_2 is higher than that of granularity A_3 in IOMRIFS, so we choose site 2 as the result of granularity selection.
(4) Optimal site selection based on IIMRIFS. Similarly, the values of the evaluation functions μ_{II}(x_j), (1 − π_{II}(x_j)) · α and (1 − π_{II}(x_j)) · β for the lower and upper approximations of IIMRIFS can be obtained. We can get the decision results of the lower and upper approximations of IIMRIFS by the three-way decisions in Section 5.4. In IIMRIFS, the values of the comprehensive score and comprehensive accuracy of granularity A_4 are higher than those of A_2 and A_5, so site 4 is chosen as the evaluation site.
Considering (3) and (4) synthetically, we find that the results of granularity selection in IOMRIFS and IIMRIFS are inconsistent, so we further compute the comprehensive accuracies of IIMRIFS. From the above calculation results, the comprehensive score and comprehensive accuracy of granularity A_4 are higher than those of A_2 and A_5 in the pessimistic multi-granulation case when the requirements are stricter. Therefore, site 4 is eventually chosen as the optimal evaluation site.

Conclusions
In this paper, we propose two new methods for calculating granularity importance degree among multiple granularities, and a granularity reduction algorithm is further developed. Subsequently, we design four novel MRIFS models based on the reduction sets under optimistic and pessimistic MRS, i.e., OOMRIFS, OIMRIFS, IOMRIFS, and IIMRIFS, and demonstrate their relevant properties. In addition, four three-way decisions models with the novel MRIFS are constructed for the issue of redundant objects inside the reduction sets. Finally, we design the comprehensive score function and the comprehensive accuracy function for the optimal granularity selection. Meanwhile, the validity of the proposed models is verified by algorithms and examples. The work of this paper expands the application scope of MRIFS and three-way decisions theory, which can solve issues such as spam e-mail filtering, risk decision, and investment decisions. A question worth considering is how to extend the methods of this article to the big data environment. Moreover, how to combine fuzzy methods based on triangular or trapezoidal fuzzy numbers with the methods proposed in this paper is also a research problem. These issues will be investigated in our future work.