S-Score Table-Based Parameter-Reduction Approach for Fuzzy Soft Sets

Abstract: A fuzzy soft set is a mathematical tool used to deal with vagueness and uncertainty. Parameter reduction is an important issue when applying a fuzzy soft set to handle decision making. However, existing methods neglect newly added parameters and have higher computational complexities. In this paper, we propose a new S-Score table-based parameter-reduction approach for fuzzy soft sets. Compared with two existing methods of parameter reduction for a fuzzy soft set, our method takes newly added parameters into account, which brings about greater flexibility and is beneficial to the extension of fuzzy soft sets and the combination of multiple fuzzy soft sets. Additionally, our method accesses fewer elements from the dataset, which results in lower computation than the two existing approaches. The experimental results from two applications show the availability and feasibility of our approach.


Introduction
Molodtsov [1,2] proposed soft set theory as a novel concept in 1999 to handle vagueness and uncertainty. Mathematical tools combining the soft set model with other mathematical models have since been developing rapidly, such as the fuzzy soft set [3], the intuitionistic fuzzy soft set [4], the interval-valued fuzzy soft set [5], the interval-valued intuitionistic fuzzy soft set [6,7], the belief interval-valued soft set [8], the confidence soft set [9], the linguistic value soft set [10], separable fuzzy soft sets [11], dual hesitant fuzzy soft sets [12], the Z-soft fuzzy rough set [13], the fuzzy parameterized fuzzy soft set [14][15][16][17], interval-valued q-rung orthopair fuzzy soft sets [18], the interval-valued multi-polar fuzzy soft set [19], the soft rough set [20], etc. The fuzzy soft set is one of the most important branches of soft sets. Maji et al. [21] were the first to combine a fuzzy set with a soft set and put forward the idea of a fuzzy soft set. This concept has been further developed in [22]. There are many practical and valuable applications based on fuzzy soft sets. Sadiq et al. [23] proposed an approach for ranking the functional requirements of software using fuzzy soft set theory. A novel time-varying weight determination method based on a fuzzy soft set was given in [24]. A combination of the association rule method and the fuzzy soft set model was proposed in [25]. It is worth mentioning that the concept of a fuzzy soft set has been widely used in the field of decision making. Maji and Roy [26] proposed a target-recognition method based on a fuzzy soft set for imprecise multi-observer data, which was improved in [27]. The author of [28] discussed the fuzzy soft aggregation operator, which supports the creation of more effective decision-making approaches. Using the horizontal soft set in [29], an adjustable decision method based on the fuzzy soft set was proposed.
Authors of [30] described the concept of fuzzy soft matrices and their related operations, which allowed them to propose a new decision-making method. The authors of [31] showed a process of information fusion that provides a more reliable resultant fuzzy soft set from an input dataset. Tang et al. [32] proposed the gray relational analysis method based on a fuzzy soft set in decision making. Deng et al. [33] proposed an object-parameter method to predict missing data in incomplete fuzzy soft sets. Uncertainty handling is one of the most important and difficult tasks in medical decision-making. The authors of [34] improved the decision algorithm based on a fuzzy soft set using a fuzzy measure and D-S evidence theory, which is often applied to medical diagnoses. The authors of [35] proposed a chest X-ray-enhanced diagnosis method for pneumonia malformations based on a fuzzy soft set and D-S evidence theory. Chen et al. [36] proposed a group decision-making algorithm based on an extended fuzzy soft set in order to identify cognitive differences among different decision makers.
Nevertheless, there are often redundant parameters in the actual decision-making process. A parameter-reduction set is the smallest subset of parameters that exhibits the same reduction results or descriptions as the original parameter set. Parameter reduction is one of the important research issues in applications of these tools for dealing with uncertainty [37,38]. Kong et al. [39] first proposed the normal parameter reduction of fuzzy soft set theory. Ma et al. [40] proposed an efficient distance-based parameter-reduction algorithm for this model. In [41], the reduction of the parameters of fuzzy soft sets was studied from a new perspective based on scoring criteria, and this method was improved in [42]. However, these two existing algorithms do not consider newly added parameters and have a higher computational cost, which leads to low extendibility.
To address these issues, we propose an S-Score table-based parameter-reduction method for fuzzy soft sets. Our contributions are as follows: (1) A new S-Score table-based parameter-reduction method for fuzzy soft sets is presented. (2) The proposed approach has relatively lower computation in comparison with the two existing algorithms in [41,42]. (3) The proposed approach considers newly added parameters; due to this consideration of the added parameters, it has much better flexibility and is beneficial to the extension of fuzzy soft sets and the combination of multiple fuzzy soft sets. (4) The experimental results on two real-life applications show the availability and feasibility of our approach.
The rest of this paper is organized as follows. Section 2 reviews the basic concepts of soft set theory and fuzzy soft set theory and discusses the two parameter-reduction methods for fuzzy soft sets proposed in [41,42]. In Section 3, our parameter-reduction algorithm for fuzzy soft sets based on an S-Score table is proposed. In Section 4, the newly proposed algorithm is compared with the two existing methods in two real-life applications. Finally, Section 5 concludes the paper.

Related Work
In this section, we briefly recall the basic ideas and notions of soft sets, fuzzy soft sets, and related parameter-reduction methods for fuzzy soft sets.

Basic Notions
First, we recall the basic definition of fuzzy sets initially developed by Zadeh [43] in 1965.

Definition 1. ([43]).
A fuzzy set F in the universe U is defined by a membership function µ_F : U → [0, 1], where µ_F(x) indicates the degree of membership of x ∈ U in F. The family of all fuzzy sets on U is denoted by F(U).
Molodtsov [1] defined soft sets in the following way. Let U be an initial universe of objects and E be the set of parameters in relation to objects in U.
Parameters are often regarded as attributes, characteristics, or properties of objects. Let P(U) denote the power set of U and A ⊆ E.

Definition 2.
A pair (F, A) is called a soft set over U, where F is a mapping given by F : A → P(U).
Maji et al. [21] initiated the study on hybrid structures involving both fuzzy sets and soft sets. They introduced the notion of fuzzy soft sets, which can be seen as the fuzzy generalization of a classical soft set. Maji et al. [21] proposed the concept of a fuzzy soft set as follows.
Definition 3. (See [21]). Let U be an initial universe of objects, E be a set of parameters in relation to objects in U, and ξ(U) be the set of all fuzzy subsets of U. A pair (F, E) is called a fuzzy soft set over U, where F is a mapping given by F : E → ξ(U).
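In a program, a fuzzy soft set over a finite universe can be represented simply as a mapping from each parameter to a fuzzy subset of U, i.e., a list of membership degrees in [0, 1]. The following minimal sketch uses hypothetical objects and membership values purely for illustration:

```python
# Hypothetical fuzzy soft set (F, E): each parameter e maps to the fuzzy
# subset F(e) of U, stored as one membership degree per object.
U = ["p1", "p2", "p3"]
F = {
    "e1": [0.7, 0.4, 0.9],  # F(e1): memberships of p1, p2, p3
    "e2": [0.2, 0.8, 0.5],
}

# For parameter e1, find the object with the highest membership degree:
best = max(range(len(U)), key=lambda i: F["e1"][i])
print(U[best])  # p3
```

This dictionary-of-columns layout is also convenient for parameter reduction, since dropping a parameter is simply removing a key.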

Existing Parameter-Reduction Methods for Fuzzy Soft Sets
Parameter reduction is an important process of decision-making applications for fuzzy soft sets. Here, we mainly recall two existing methods of parameter reduction.
Kong et al. [41] combined the score decision criterion with the standard parameter reduction method of a soft set and developed a score decision criterion parameter-reduction algorithm based on a fuzzy soft set, which is abbreviated as the S-normal reduction algorithm (SNR).
However, this method has a high computational cost. To simplify the calculation, an improved parameter-reduction method based on the score decision criteria of a fuzzy soft set (ISNR) was presented in [42].
However, the two existing methods, Algorithm 1 and Algorithm 2 given below, do not consider newly added parameters and have a higher computational cost, which leads to low extendibility when multiple datasets are combined. As a result, we propose a new parameter-reduction method that takes the added parameters into account. Algorithm 1: S-normal reduction algorithm [41] (SNR).
Step 1: Input a fuzzy soft set (F, A); Step 2: Compute the comparison table W_{E−T} and check the matrix W_E − W_{E−T} for each subset T ⊂ E. If this matrix is symmetric, T is put into the redundant set R; Step 3: Check whether T is the largest non-essential subset of E; in that case, E − T is the S-normal parameter-reduction result.
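Step 2 of SNR can be sketched as follows. The comparison table counts, for each pair of objects, on how many parameters one object's membership dominates the other's; this count is additive over parameters, so W_E − W_{E−T} equals the comparison table of T alone, and T is redundant exactly when this difference is symmetric. The data below are hypothetical:

```python
def comparison_table(F, params, n):
    """W[i][j] = number of parameters in `params` on which object i's
    membership degree is >= object j's membership degree."""
    W = [[0] * n for _ in range(n)]
    for e in params:
        col = F[e]
        for i in range(n):
            for j in range(n):
                if col[i] >= col[j]:
                    W[i][j] += 1
    return W

def is_symmetric(M):
    n = len(M)
    return all(M[i][j] == M[j][i] for i in range(n) for j in range(n))

# Hypothetical data: e2 and e3 rank the objects in opposite orders, so their
# joint comparison table is symmetric and T = {e2, e3} is redundant.
F = {"e1": [0.6, 0.3, 0.9], "e2": [0.4, 0.7, 0.2], "e3": [0.5, 0.1, 0.9]}
T = {"e2", "e3"}
W_E = comparison_table(F, F.keys(), 3)
W_rest = comparison_table(F, [e for e in F if e not in T], 3)
D = [[W_E[i][j] - W_rest[i][j] for j in range(3)] for i in range(3)]
print(is_symmetric(D))  # True
```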

Algorithm 2: ISNR [42].
Step 1: Input a fuzzy soft set (F, A); Step 2: Compute the comparison matrix W_T (T ⊂ E). If this matrix is symmetric, T can be reduced; Step 3: E − T is the parameter-reduction result of the fuzzy soft set.

Our Proposed Method
By introducing a new concept called an S-Score table in this section, we provide a new approach that overcomes the limitations of the SNR and ISNR methods.
Let U = {α1, α2, ..., αn} be the universe and A = {e1, e2, ..., em} be the attribute set. µ_F(el)(αi) is the membership value of object αi for parameter el. R(αi)(el) is the number of objects αj whose membership value is less than or equal to that of αi, and T(αi)(el) is the number of objects αj whose membership value is greater than or equal to that of αi.
The S-Score value of object αi on el is denoted by S(αi)(el) and defined by

S(αi)(el) = R(αi)(el) − T(αi)(el).    (1)

The overall S-Score of object αi is denoted by Si and defined as

Si = Σ_{l=1}^{m} S(αi)(el).    (2)

The S-Score table is a table in which the rows are labeled by the objects α1, α2, ..., αn and the columns are labeled by the attributes e1, e2, ..., em. The entry corresponding to attribute el and object αi is S(αi)(el).
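Equations (1) and (2) can be computed directly from the membership columns. The sketch below uses a small hypothetical fuzzy soft set, not data from the paper's tables:

```python
def s_score(col, i):
    """Equation (1): S(a_i)(e_l) = R(a_i)(e_l) - T(a_i)(e_l) for one
    membership column `col` of parameter e_l."""
    R = sum(1 for v in col if col[i] >= v)  # objects a_i dominates or ties
    T = sum(1 for v in col if col[i] <= v)  # objects dominating or tying a_i
    return R - T

def s_score_table(F, n):
    """Build the S-Score table and the overall scores S_i (Equation (2))."""
    table = {e: [s_score(col, i) for i in range(n)] for e, col in F.items()}
    overall = [sum(table[e][i] for e in F) for i in range(n)]
    return table, overall

F = {"e1": [0.6, 0.3, 0.9], "e2": [0.4, 0.7, 0.2], "e3": [0.5, 0.1, 0.9]}
table, S = s_score_table(F, 3)
print(S)  # [0, -2, 2]
```

Note that each column of the S-Score table always sums to zero across objects, since every dominated pair contributes +1 to one object and −1 to the other.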
To illustrate our method in Algorithm 3, we give the following example.

Algorithm 3:
The proposed reduction algorithm based on an S-Score table.
Step 1: Input a fuzzy soft set (F, A); Step 2: Compute the S-Score value S(αi)(el) of each object αi for each attribute el and create the S-Score table for (F, A); Step 3: Compute S_T for each subset T ⊂ A; if S_T = 0 for all objects, then T is called a non-essential set in A.
Step 4: Check whether the non-essential subset T is the largest non-essential subset. If so, A − T is the final parameter-reduction result.
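The four steps above can be sketched as a brute-force search for the largest parameter subset whose combined S-Scores vanish for every object. The membership values are hypothetical, and the paper's tabular bookkeeping is omitted:

```python
from itertools import combinations

def s_col(col, i):
    # Equation (1) for one membership column
    return sum(v <= col[i] for v in col) - sum(v >= col[i] for v in col)

def reduce_parameters(F, n):
    """Sketch of Algorithm 3: find the largest T with S_T = 0 for every
    object, then return A - T as the reduction result."""
    params = list(F)
    for size in range(len(params) - 1, 0, -1):  # try largest subsets first
        for T in combinations(params, size):
            if all(sum(s_col(F[e], i) for e in T) == 0 for i in range(n)):
                return [e for e in params if e not in T]
    return params  # no non-essential subset exists

F = {"e1": [0.6, 0.3, 0.9], "e2": [0.4, 0.7, 0.2], "e3": [0.8, 0.2, 0.5]}
print(reduce_parameters(F, 3))  # ['e3']: T = {e1, e2} is non-essential
```

Enumerating subsets from largest to smallest guarantees that the first match is a largest non-essential subset, at exponential cost in the number of parameters; the paper's examples only check one special subset.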
According to our Algorithm 3, the following steps are given: Step 1: Input a fuzzy soft set (S, P), as shown in Table 1; Step 2: Compute the S-Score value S(pi)(el) of each object pi for attribute el using Equation (1). Similarly, we calculate the remaining S-Score values, as shown in Table 2.
Step 3: Compute S_T for each subset T ⊂ E; if S_T = 0 for all objects, then T is called a non-essential set in E.
In this process, we find that for T = {e2, e3}, S_T = 0 for all objects, as illustrated in Table 3.
Step 4: We find that the reduced subset T = {e2, e3} is the largest non-essential subset. Hence, the remaining parameter subset E − T = {e1, e4, e5, e6, e7} is the final reduction result.
We also apply SNR and ISNR to Example 4.1. As a result, the parameter-reduction results are also {e1, e4, e5, e6, e7}. That is, the three methods provide equivalent reduction results. To verify this point, we give the following theorem. Theorem 1. Suppose that (F, E) is a fuzzy soft set on U, U = {P1, P2, ..., Pn}, and E = {e1, e2, ..., em}. For any Pi ∈ U, let S′i and PR′i be its score and priority rank computed by the SNR algorithm, S″i and PR″i its score and priority rank computed by the ISNR algorithm, and Si and PRi its overall S-Score and priority rank computed by our algorithm. Then S′i = S″i = Si and PR′i = PR″i = PRi.
Proof. Comparing the definition of the SNR score with Equations (1) and (2), we can obtain S′i = Si; in the same way, we can obtain S″i = Si. To sum up, S′i = S″i = Si. We use Equations (1) and (2) to create the S-Score table of the fuzzy soft set (F, E), which is shown in Table 4, and obtain the rank of the objects PRi according to S^E_i. According to our method, we find a subset T ⊂ E that satisfies S^T_1 = S^T_2 = ... = S^T_n = 0. Because S^T_1 = S^T_2 = ... = S^T_n = 0, it is clear that the rank of the objects based on S^E_i is the same as the rank based on S^{E−T}_i. That is, the object priority remains unchanged after the redundant parameter set is reduced, so PR′i = PR″i = PRi. This completes the proof.
From the above theorem, we can conclude that the three reduction algorithms provide equivalent reduction results.
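The claim that dropping a subset with zero combined S-Score leaves both the overall scores and the object ranking unchanged can also be checked numerically. The data below are hypothetical:

```python
def overall_scores(F, params, n):
    """Sum Equation (1) over the given parameters for each object."""
    out = []
    for i in range(n):
        s = 0
        for e in params:
            col = F[e]
            s += sum(v <= col[i] for v in col) - sum(v >= col[i] for v in col)
        out.append(s)
    return out

F = {"e1": [0.6, 0.3, 0.9], "e2": [0.4, 0.7, 0.2],
     "e3": [0.5, 0.1, 0.9], "e4": [0.8, 0.2, 0.5]}
T = {"e1", "e2"}  # for this data, S_T = 0 for every object
full = overall_scores(F, F.keys(), 3)
reduced = overall_scores(F, [e for e in F if e not in T], 3)
print(full == reduced)  # True: scores, and hence the ranking, are unchanged
```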

Comparison Results among Three Methods
In this section, we first compare the proposed algorithm with the two existing methods, SNR (Algorithm 1) and ISNR (Algorithm 2), on two real-life applications.
We then summarize the comparison among the three methods in terms of consideration of newly added parameters, flexibility, and computational complexity.

Case 1: Personal Postgraduate Enrollment for the Supervisors
After the postgraduate entrance examination, a supervisor who works at Northwest Normal University plans to recruit one graduate student majoring in computer science. He receives five emails with resumes; that is, there are five candidates. The supervisor examines the five resumes and summarizes seven appraisal items, namely "reputation of the university at which the candidate studied for their bachelor's degree", "international ranking of their computer science major", "GPA", "English reading ability", "English writing ability", "academic performance", and "internship experience", to evaluate the five candidates. We apply a fuzzy soft set to describe the performance of the five candidates with respect to the seven aspects. Suppose that U = {p1, p2, p3, p4, p5} is the set of five candidates and E = {e1, e2, e3, e4, e5, e6, e7} is the set of seven appraisal items. Table 5 presents the data records of the personal postgraduate enrollment system used by the supervisor as a fuzzy soft set (F, E). According to the SNR algorithm, we first calculate the comparison and score tables of the fuzzy soft set (F, E), which are shown in Table 6. We can see that the scores of the objects are −2, −6, 4, 4, and 0, respectively. The priority ranking is p3 = p4 > p5 > p1 > p2. We check the matrix W_E − W_{E−T} for the subset T = {e1, e2, e5, e7} ⊂ E and find that this matrix is symmetric. As a result, the final S-normal reduction result is {e3, e4, e6}.
In this algorithm, we first calculate the comparison table of the fuzzy soft set; the number of elements accessed is 25 × 7 = 175. Next, we check the matrix W_E − W_{E−T} for the subset T = {e1, e2, e5, e7} ⊂ E; in this step, the number of elements accessed is 2 × 25 × 5 + 6 × 25 = 400. From the above steps, we can conclude that the total number of elements accessed by SNR is 575.

ISNR
First, the special subset T′ = {e1, e2, e5, e7} is found; then, we calculate the comparison table and the score table of the fuzzy soft set (F, T′), which are shown in Table 7. The number of elements accessed in this step is 100. The difference is then calculated from the score table, and the number of elements accessed is 50. From the above steps, we can see that the total number of elements accessed by ISNR is 150.

The Newly Proposed Algorithm

According to our proposed algorithm, the following steps are given: the special subset T′ = {e1, e2, e5, e7} is found; the number of elements accessed when comparing the differences between the membership degrees of different objects using Equation (1) in this step is 80. Additionally, the number of elements accessed using Equation (2), shown in Table 8, is 20. From the above steps, the total number of elements accessed is 100.
Compared with the SNR algorithm, the reduction in the total number of elements accessed in this process is about 83% (from 575 to 100), while compared with ISNR it is about 33% (from 150 to 100).

Three Methods on the Extended Dataset
Suppose that (F, E) is the original fuzzy soft set and that a new attribute set E′ = {e′1, e′2, ..., e′r} is to be added to E. If parameter reduction is performed using SNR, one has to merge the two parameter sets into one and compute a new comparison table for the new fuzzy soft set (H, E ∪ E′). Here, after face-to-face interviews with the five candidates, the supervisor considers adding two new attributes, "expression ability" and "interest in research", to evaluate the applicants, as shown in Table 9. However, for the newly added parameters e′1 and e′2, the three methods have different reduction processes and numbers of elements accessed. Table 9. Fuzzy soft set (F, E′).

SNR
According to SNR, the following steps are given: Step 1: Combine Tables 5 and 9 into a new fuzzy soft set, as shown in Table 10. Step 2: Table 11 presents the comparison table of the fuzzy soft set (F, E ∪ E′); the number of elements accessed in this step is 250. Step 3: Calculate the score table from the comparison table in Step 2; the number of elements accessed in this step is 2 × 25 = 50.
As can be seen from Table 11, after adding the new attributes, the object score list and prioritization are consistent with the results on the original dataset, so the newly added attribute set is not a necessary set and can be reduced. On the extended dataset, the total number of elements accessed by SNR is 300.

ISNR
According to ISNR, the following steps are given: Step 1: Calculate the comparison and score tables of the new attribute set, as shown in Table 12; the number of elements accessed in this step is 50; Step 2: Calculate the difference based on the score table in Step 1; the number of elements accessed is 50; Step 3: As can be seen from Table 12, the newly added attribute set can be reduced because all of the object scores are 0. Therefore, the total number of elements accessed is 100.

Our Proposed Algorithm
According to our proposed algorithm, the following steps are given: Step 1: Compute the S-Score table for the two newly added attributes, as shown in Table 13; Step 2: For the newly added parameters e′1 and e′2, we obtain S_i = 0 for all objects, so the newly added attributes can be reduced. Using this method, we create an S-Score table for the two newly added attributes only, and the number of elements accessed in this step is 40. Then, we obtain S_i for the two added parameters, and the number of elements accessed is 10. Finally, the total number of elements accessed from the extended dataset is 40 + 10 = 50. Table 14 shows the comparative results of the three reduction algorithms. On both the original dataset and the extended dataset, the three methods produce the same reduction results, so the three reduction algorithms are equivalent. Compared with SNR, the reduction in the total number of elements accessed after adding new parameters is up to 83% (from 300 to 50), while compared with ISNR it is up to 50% (from 100 to 50). The proposed approach considers the newly added parameters; as a result, it has much higher flexibility and is beneficial to the extension of fuzzy soft sets and the combination of multiple fuzzy soft sets.
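The extension step above can be sketched as follows: only the S-Score columns of the newly added attributes are computed, and if their combined score is zero for every object, they can be dropped without recomputing anything for the original parameters. The membership values below are hypothetical, not the paper's Table 9 data:

```python
def s_col(col, i):
    # Equation (1) for one membership column
    return sum(v <= col[i] for v in col) - sum(v >= col[i] for v in col)

# Two hypothetical new attributes over three objects; they rank the objects
# in opposite orders, so their combined S-Scores cancel out.
new_attrs = {"expression ability": [0.5, 0.7, 0.3],
             "interest in research": [0.6, 0.2, 0.8]}
n = 3
combined = [sum(s_col(col, i) for col in new_attrs.values())
            for i in range(n)]
print(combined)  # [0, 0, 0]: the new attributes are redundant
```

This is what gives the method its extendibility: the original S-Score table is never touched when new parameters arrive.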

Case 2: Evaluation for Academic Papers
Researchers usually use many measurement indicators to evaluate academic papers. A beginning researcher wants to read an excellent scientific journal paper on the research topic of "data mining", so he collects five academic papers from Baidu Scholar. He cares about the performance of these academic papers in seven aspects: "downloads", "cited frequency", "number of results", "H-index", "number of cited", "reading volume", and "impact factor". Here, we employ a fuzzy soft-set model to describe the five academic papers. U is the collection of five academic papers, with U = {p1, p2, p3, p4, p5} = {"Design and Application of Teaching Quality Monitoring and Evaluation System Based on Data Mining", "Study on Data Mining for Combat Simulation", "Research on Intrusion Detection of Data Mining Model Based on Improved Apriori Algorithm", "Construction of Cloud Service Platform for Network Big Data Mining", "Overview of Data Mining"} [42], and E = {e1, e2, e3, e4, e5, e6, e7} = {"downloads", "cited frequency", "number of results", "H-index", "number of cited", "reading volume", "impact factor"} as the set of parameters. All of the data are normalized into values in the unit interval [0, 1]. The specific data are shown in Table 15 below. However, because some measurements tend to be similar, there are some redundant data in the evaluation. We use the three methods to obtain the parameter-reduction results. By comparing the three algorithms, it can be found that: (1) the three algorithms obtain the same reduction result, {e2, e3, e4, e5, e7}; (2) in this case, the numbers of elements accessed by SNR, ISNR, and our method are 350, 100, and 60, respectively; (3) in terms of the number of element accesses, the newly proposed reduction algorithm improves on SNR and ISNR by 83% and 40%, respectively.
The comparison results of the three reduction algorithms on the evaluation of academic papers are shown in Table 16 below.

Computational Complexity
Assume a fuzzy soft set (F, E) over an initial universe U, with the objects and parameters arranged as n rows and m columns, respectively, and assume that there are non-essential parameter sets in the original dataset E. The three reduction algorithms below access elements only for a special subset of the original dataset, whose number of columns is denoted by m′. The number of elements accessed by the three reduction algorithms from the original dataset is analyzed as follows:

SNR
If a special subset T′ exists in the original parameter set E, with m′ denoting the number of columns of T′, then, using big-O notation, the computational complexity of SNR is O(n²m).

ISNR
For ISNR, if the special subset T′ has m′ columns, then, using big-O notation, the computational complexity of ISNR is O(n²m′).

The Newly Proposed Algorithm
For the reduction algorithm of this study, only the comparison table of the special subset T′ is calculated. The special subset T′ has m′ columns, so the number of elements accessed in this step is n²m′; we then need to sum up the score table, which takes nm′ operations. In summary, the total number of elements accessed by the reduction algorithm in this study is n²m′ + nm′. Using big-O notation, the computational complexity of the proposed algorithm is O(n²m′).
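The dominant comparison-table costs can be written out explicitly. Under our reading, SNR builds the full comparison table (n² element accesses per column over all m columns), whereas the subset-based methods only touch the m′ columns of the special subset; plugging in the Case 1 dimensions (n = 5, m = 7, m′ = 4) reproduces the 25 × 7 = 175 and 100 comparison-table accesses counted earlier:

```python
def snr_table_accesses(n, m):
    # full comparison table: n^2 element accesses per parameter column
    return n * n * m

def subset_table_accesses(n, m_reduced):
    # only the m' columns of the special subset T' are touched
    return n * n * m_reduced

n, m, m_reduced = 5, 7, 4  # Case 1 dimensions
print(snr_table_accesses(n, m))             # 175
print(subset_table_accesses(n, m_reduced))  # 100
```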
Finally, we summarize the comparison results among the three methods as shown in Table 17.

Conclusions
In this paper, we proposed a new parameter-reduction method for fuzzy soft sets. As can be seen from the above two datasets, our proposed method has the same reduction results as the two existing parameter-reduction approaches for fuzzy soft sets, SNR and ISNR. However, it is clear that our method outperforms SNR and ISNR in terms of the number of elements accessed. When we have to add new parameters, our method takes the added parameters into account. Hence, our method has better flexibility and extendibility than SNR and ISNR, and it can be applied to the extension of fuzzy soft sets and the combination of multiple evaluation systems. However, the proposed approach has limitations regarding computation when the number of attributes is very large. In future work, we will extend this parameter-reduction method to other mathematical models, such as the interval-valued fuzzy soft set and the soft rough set.