Article

S-Score Table-Based Parameter-Reduction Approach for Fuzzy Soft Sets

College of Computer Science and Engineering, Northwest Normal University, Lanzhou 730070, China
*
Author to whom correspondence should be addressed.
Symmetry 2022, 14(8), 1719; https://doi.org/10.3390/sym14081719
Submission received: 3 July 2022 / Revised: 6 August 2022 / Accepted: 9 August 2022 / Published: 17 August 2022

Abstract

A fuzzy soft set is a mathematical tool used to deal with vagueness and uncertainty. Parameter reduction is an important issue when applying fuzzy soft sets to decision making. However, existing methods neglect newly added parameters and have high computational cost. In this paper, we propose a new S-Score table-based parameter-reduction approach for fuzzy soft sets. Compared with two existing parameter-reduction methods for fuzzy soft sets, our method takes newly added parameters into account, which brings greater flexibility and benefits the extension of fuzzy soft sets and the combination of multiple fuzzy soft sets. Additionally, our method accesses fewer elements of the dataset, which results in lower computational cost than the two existing approaches. The experimental results from two applications show the applicability and feasibility of our approach.

1. Introduction

Molodtsov [1,2] proposed soft set theory as a novel concept in 1999 to handle vagueness and uncertainty. Mathematical tools combining the soft set model with other mathematical models have since developed rapidly, such as the fuzzy soft set [3], the intuitionistic fuzzy soft set [4], the interval-valued fuzzy soft set [5], the interval-valued intuitionistic fuzzy soft set [6,7], the belief interval-valued soft set [8], the confidence soft set [9], the linguistic value soft set [10], separable fuzzy soft sets [11], dual hesitant fuzzy soft sets [12], the Z-soft fuzzy rough set [13], the fuzzy parameterized fuzzy soft set [14,15,16,17], interval-valued q-rung orthopair fuzzy soft sets [18], the interval-valued multi-polar fuzzy soft set [19], the soft rough set [20], etc. The fuzzy soft set is one of the most important branches of soft sets. Maji et al. [21] were the first to combine a fuzzy set with a soft set and put forward the idea of a fuzzy soft set, a concept further developed in [22]. There are many practical and valuable applications based on fuzzy soft sets. Sadiq et al. [23] proposed an approach for ranking the functional requirements of software using fuzzy soft set theory. A novel time-varying weight determination method based on a fuzzy soft set was given in [24]. A combination of the association rule method and the fuzzy soft set model was proposed in [25]. It is worth mentioning that the concept of a fuzzy soft set has been widely used in the field of decision making. Maji and Roy [26] proposed a target-recognition method based on a fuzzy soft set for imprecise multi-observer data, which was improved in [27]. The authors of [28] discussed the fuzzy soft aggregation operator, which supports the creation of more effective decision-making approaches. Using the horizontal soft set, the authors of [29] proposed an adjustable decision method based on the fuzzy soft set.
Authors of [30] described the concept of fuzzy soft matrices and their related operations, which allowed them to propose a new decision-making method. The authors of [31] showed a process of information fusion that provides a more reliable resultant fuzzy soft set from an input dataset. Tang et al. [32] proposed the gray relational analysis method based on a fuzzy soft set in decision making. Deng et al. [33] proposed an object-parameter method to predict missing data in incomplete fuzzy soft sets. Uncertainty handling is one of the most important and difficult tasks in medical decision-making. The authors of [34] improved the decision algorithm based on a fuzzy soft set using a fuzzy measure and D-S evidence theory, which is often applied to medical diagnoses. The authors of [35] proposed a chest X-ray-enhanced diagnosis method for pneumonia malformations based on a fuzzy soft set and D-S evidence theory. Chen et al. [36] proposed a group decision-making algorithm based on an extended fuzzy soft set in order to identify cognitive differences among different decision makers.
Nevertheless, there are some redundant parameters in the actual decision-making process. A parameter-reduction set is the smallest subset of parameters that exhibits the same reduction results or descriptions as the original parameter set. Parameter reduction is one of the important research issues in applications of these tools for dealing with uncertainty [37,38]. Kong et al. [39] first proposed the normal parameter reduction of fuzzy soft set theory. Ma et al. [40] proposed an efficient distance-based parameter-reduction algorithm for this model. In [41], the reduction of the parameters of fuzzy soft sets was studied from a new perspective based on scoring criteria, and this approach was improved in [42]. However, these two existing algorithms do not consider newly added parameters and have high computational cost, which leads to low extendibility.
To address these issues, we propose an S-Score table-based parameter-reduction method for fuzzy soft sets. Our contributions are as follows:
(1)
A new S-Score table-based parameter-reduction method for fuzzy soft sets is presented.
(2)
The proposed approach has a relatively lower computational cost in comparison with the two existing algorithms in [41,42].
(3)
The proposed approach considers newly added parameters. Due to this, it has much better flexibility and is beneficial to the extension of fuzzy soft sets and the combination of multiple fuzzy soft sets.
(4)
The experimental results on two real-life applications show the applicability and feasibility of our approach.
The rest of this paper is organized as follows. Section 2 reviews the basic concepts of soft set theory and fuzzy soft set theory and discusses the two parameter-reduction methods for fuzzy soft sets proposed in [41,42]. In Section 3, our S-Score table-based parameter-reduction algorithm for fuzzy soft sets is proposed. In Section 4, the newly proposed algorithm is compared with the two existing methods in two real-life applications. Finally, Section 5 concludes the paper.

2. Related Work

In this section, we briefly recall the basic notions of soft sets and fuzzy soft sets, as well as the related parameter-reduction methods for fuzzy soft sets.

2.1. Basic Notions

First, we recall the basic definition of fuzzy sets initially developed by Zadeh [43] in 1965.
Definition 1. 
([43]). A fuzzy set F in the universe U is defined as F = {(x, μ_F(x)) | x ∈ U}, where μ_F : U → [0, 1] is called the membership function of F and μ_F(x) indicates the membership degree of x in F. The family of all fuzzy sets on U is denoted by F(U).
Molodtsov [1] defined soft sets in the following way. Let U be an initial universe of objects and E be the set of parameters in relation to objects in U .
Parameters are often regarded as attributes, characteristics, or properties of objects. Let P ( U ) denote the power set of U and A E .
Definition 2.
A pair  ( F , A ) is called a soft set over  U , where  F is a mapping given by  F : A P ( U ) .
Maji et al. [21] initiated the study on hybrid structures involving both fuzzy sets and soft sets. They introduced the notion of fuzzy soft sets, which can be seen as the fuzzy generalization of a classical soft set. Maji et al. [21] proposed the concept of a fuzzy soft set as follows.
Definition 3. 
(See [21]). Let U be an initial universe of objects, E be a set of parameters in relation to objects in U, and ξ(U) be the set of all fuzzy subsets of U. A pair (F̃, E) is called a fuzzy soft set over U, where F̃ is a mapping given by F̃ : E → ξ(U).

2.2. Existing Parameter-Reduction Methods for Fuzzy Soft Sets

Parameter reduction is an important process of decision-making applications for fuzzy soft sets. Here, we mainly recall two existing methods of parameter reduction.
Kong et al. [41] combined the score decision criterion with the standard parameter reduction method of a soft set and developed a score decision criterion parameter-reduction algorithm based on a fuzzy soft set, which is abbreviated as the S-normal reduction algorithm (SNR).
However, this method has a high computational cost. To simplify the calculation, an improved score-decision-criterion parameter-reduction method for fuzzy soft sets (ISNR) was presented in [42].
However, the two existing algorithms, Algorithm 1 and Algorithm 2, do not consider newly added parameters and have high computational cost, which leads to low extendibility when multiple datasets are combined. As a result, we propose a new parameter-reduction method that takes the added parameters into account.
Algorithm 1: S-normal reduction algorithm [41] (SNR).
Step 1: Input a fuzzy soft set ( F , A ) ;
Step 2: Compute the comparison tables W_E and W_{E−T} and check the matrix W_E − W_{E−T} for any subset T ⊆ E. If this matrix is symmetric, T is put into the redundant set R;
Step 3: Check whether T is the largest non-essential subset of E; in that case, E − T is the S-normal parameter-reduction result.
Algorithm 2: ISNR [42].
Step 1: Input a fuzzy soft set ( F , A ) ;
Step 2: Compute the comparison matrix W_T for any subset T ⊆ E. If this matrix is symmetric, T can be reduced;
Step 3: E − T is the parameter-reduction result of the fuzzy soft set.
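The comparison-table computation that both SNR and ISNR rely on can be sketched as follows. This is a minimal illustration based on the descriptions above; the function names are ours, not from [41,42].

```python
# Sketch of the comparison table and score computation shared by SNR and
# ISNR (Algorithms 1 and 2). Function names are illustrative.
def comparison_table(F):
    """F[i][l] is the membership value of object i for parameter l.
    Entry c[i][j] counts the parameters on which object i scores at
    least as high as object j."""
    n = len(F)
    return [[sum(F[i][l] >= F[j][l] for l in range(len(F[i])))
             for j in range(n)] for i in range(n)]

def scores(c):
    """Score of object i: row sum minus column sum of the comparison table."""
    n = len(c)
    return [sum(c[i]) - sum(c[j][i] for j in range(n)) for i in range(n)]
```

For two objects with memberships [0.5, 0.9] and [0.3, 0.8], the comparison table is [[2, 2], [0, 2]] and the scores are [2, -2], so the first object ranks higher.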

3. Our Proposed Method

By introducing a new concept called the S-Score table, this section provides a new approach that overcomes the limitations of the SNR and ISNR methods.
Let U = {α_1, α_2, …, α_n} be the universe and A = {e_1, e_2, …, e_m} be the attribute set. μ_{F(e_l)}(α_i) is the membership value of object α_i for parameter e_l. R(α_i)(e_l) is the number of objects α_j whose membership value for e_l is equal to or less than that of α_i, and T(α_i)(e_l) is the number of objects α_j whose membership value for e_l is equal to or greater than that of α_i.
The S-Score value of object α_i on parameter e_l is denoted by S(α_i)(e_l) and defined by
S(α_i)(e_l) = R(α_i)(e_l) − T(α_i)(e_l).     (1)
The overall S-Score of object α_i is denoted by S_i and defined as
S_i = Σ_{l=1}^{m} S(α_i)(e_l).     (2)
The S-Score table is a table in which the rows are labeled by the attributes e_1, e_2, …, e_m and the columns are labeled by the objects α_1, α_2, …, α_n. The entry corresponding to attribute e_l and object α_i is S(α_i)(e_l).
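Equations (1) and (2) can be sketched in code as follows; this is an illustrative implementation under the assumption (consistent with the worked example below) that an object is not compared with itself, and the names are ours.

```python
# S-Score table following Equations (1) and (2); names are illustrative.
def s_score_table(F):
    """F[i][l]: membership value of object i for parameter l.
    S[i][l] = R(i, l) - T(i, l), where R counts the other objects whose
    membership on parameter l does not exceed that of object i, and T
    counts those whose membership is not below it."""
    n, m = len(F), len(F[0])
    S = [[0] * m for _ in range(n)]
    for l in range(m):
        for i in range(n):
            R = sum(F[i][l] >= F[j][l] for j in range(n) if j != i)
            T = sum(F[i][l] <= F[j][l] for j in range(n) if j != i)
            S[i][l] = R - T
    return S

def overall_s_scores(S):
    """Overall S-Score of each object: row sum of the S-Score table (Eq. (2))."""
    return [sum(row) for row in S]
```

For three objects with memberships 0.5, 0.3, and 0.8 on a single parameter, the S-Scores are 0, -2, and 2, which reproduces the expected ranking by membership value.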
To illustrate our method in Algorithm 3, we give the following example.
Algorithm 3: The proposed reduction algorithm based on S-Score table.
Step 1: Input a fuzzy soft set ( F , A ) ;
Step 2: Compute the S-Score value S(α_i)(e_l) of α_i for attribute e_l and create the S-Score table for (F, A);
Step 3: Compute S_T for any subset T ⊆ A; if S_T = 0 for all objects, then T is called a non-essential set in A;
Step 4: Check whether the non-essential subset T is the largest non-essential subset. If so, A − T is the final parameter-reduction result.
Example 1.
There are six objects U = { P 1 , P 2 , P 3 , P 4 , P 5 , P 6   } , and E = { e 1 , e 2 , e 3 , e 4 , e 5 , e 6 , e 7   } is a collection of parameters. The fuzzy soft set ( S , P ) is shown in Table 1.
According to our Algorithm 3, the following steps are given:
Step 1: Input a fuzzy soft set ( S , P ) as shown in Table 1;
Step 2: Compute the S-Score value S(p_i)(e_l) of each object p_i for attribute e_l using Equation (1);
Hence, we can obtain the following:
S(p_1)(e_1) = R(p_1)(e_1) − T(p_1)(e_1) = 2 − 4 = −2; S(p_1)(e_2) = R(p_1)(e_2) − T(p_1)(e_2) = 0 − 5 = −5; S(p_1)(e_3) = R(p_1)(e_3) − T(p_1)(e_3) = 5 − 0 = 5; S(p_1)(e_4) = R(p_1)(e_4) − T(p_1)(e_4) = 2 − 3 = −1; S(p_1)(e_5) = R(p_1)(e_5) − T(p_1)(e_5) = 1 − 5 = −4; S(p_1)(e_6) = R(p_1)(e_6) − T(p_1)(e_6) = 3 − 2 = 1; S(p_1)(e_7) = R(p_1)(e_7) − T(p_1)(e_7) = 3 − 2 = 1
Similarly, we calculate the remaining S-Score values, as shown in Table 2.
Step 3: Compute S_T for any subset T ⊆ E; if S_T = 0 for all objects, then T is called a non-essential set in E.
In this process, we find that for T = {e_2, e_3}, S_T = 0 for all objects, which is illustrated in Table 3.
Step 4: We find that the reduced subset T = {e_2, e_3} is the largest non-essential subset, so the remaining parameter subset E − T = {e_1, e_4, e_5, e_6, e_7} is the final reduction result.
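The search in Steps 3 and 4 can be sketched as a brute-force scan over parameter subsets, from the largest candidate downward. This is illustrative code under our own naming, not the authors' implementation; a practical version would prune the subset search.

```python
from itertools import combinations

# Brute-force sketch of Algorithm 3: build the S-Score table, then find the
# largest parameter subset T whose per-object S-Score sums are all zero;
# A - T (the retained parameters) is the reduction result.
def s_score_reduction(F):
    """F[i][l]: membership value of object i for parameter l.
    Returns the indices of the retained parameters."""
    n, m = len(F), len(F[0])
    # S-Score table per Equations (1) and (2), comparing distinct objects.
    S = [[sum(F[i][l] >= F[j][l] for j in range(n) if j != i)
          - sum(F[i][l] <= F[j][l] for j in range(n) if j != i)
          for l in range(m)] for i in range(n)]
    # Largest T with S_T = 0 for every object is the non-essential set.
    for size in range(m - 1, 0, -1):
        for T in combinations(range(m), size):
            if all(sum(S[i][l] for l in T) == 0 for i in range(n)):
                return [l for l in range(m) if l not in T]
    return list(range(m))  # no non-essential subset found
```

For instance, with three objects and three parameters where the first two parameter columns mirror each other (their S-Scores cancel for every object), only the third parameter is retained.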
We also apply SNR and ISNR to Example 1. Both produce the parameter-reduction result {e_1, e_4, e_5, e_6, e_7}; that is, the three methods provide equivalent reduction results. To verify this point, we give the following theorem.
Theorem 1.
Suppose that (F, E) is a fuzzy soft set on U, U = {P_1, P_2, …, P_n}, and E = {e_1, e_2, …, e_m}. For any P_i ∈ U, let S_i and PR_i be its score and priority rank computed by the SNR algorithm, S′_i and PR′_i those computed by the ISNR algorithm, and S″_i and PR″_i its overall S-Score and priority rank computed by our algorithm. Then S_i = S′_i = S″_i and PR_i = PR′_i = PR″_i.
Proof .
Since Σ_{j=1}^{n} c_ij = Σ_{l=1}^{m} R(p_i)(e_l) and Σ_{j=1}^{n} c_ji = Σ_{l=1}^{m} T(p_i)(e_l), we can obtain the following:
S_i = r_i − t_i = Σ_{j=1}^{n} c_ij − Σ_{j=1}^{n} c_ji = Σ_{l=1}^{m} R(p_i)(e_l) − Σ_{l=1}^{m} T(p_i)(e_l) = Σ_{l=1}^{m} (R(p_i)(e_l) − T(p_i)(e_l)) = Σ_{l=1}^{m} S(p_i)(e_l) = S″_i.
In the same way, we can obtain S_i = S′_i. To sum up, S_i = S′_i = S″_i.
We use Equations (1) and (2) to create the S-Score table of the fuzzy soft set (F, E), which is shown in Table 4, and obtain the rank PR″_i of the objects according to S″_i.
According to our method, we find the subset T ⊆ E that satisfies S_{T,1} = S_{T,2} = … = S_{T,n} = 0. Because these sums are all zero, the rank of the objects based on S_{E,i} is the same as the rank based on S_{E−T,i}. That is, the object priority remains unchanged after the redundant parameter set is reduced, so PR_i = PR′_i = PR″_i. This completes the proof. □
From the above theorem, we can conclude that the three reduction algorithms provide equivalent reduction results.
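The score equality in Theorem 1 can be spot-checked numerically: the SNR score r_i − t_i derived from the comparison table coincides with the overall S-Score for every object. The sketch below uses random membership values and our own variable names.

```python
import random

# Numerical spot-check of Theorem 1 on random data: the SNR score equals
# the overall S-Score for every object.
random.seed(0)
n, m = 6, 7
F = [[round(random.random(), 2) for _ in range(m)] for _ in range(n)]

# SNR score: comparison table c[i][j] = number of parameters on which
# object i scores at least as high as object j; score = row sum - column sum.
c = [[sum(F[i][l] >= F[j][l] for l in range(m)) for j in range(n)]
     for i in range(n)]
snr = [sum(c[i]) - sum(c[j][i] for j in range(n)) for i in range(n)]

# Overall S-Score: for each parameter, R - T summed over parameters.
s = [sum(sum(F[i][l] >= F[j][l] for j in range(n))
         - sum(F[i][l] <= F[j][l] for j in range(n)) for l in range(m))
     for i in range(n)]

assert snr == s  # holds for any data, as Theorem 1 shows
```

The equality holds term by term because the indicator F[j][l] ≥ F[i][l] in the column sum is exactly the indicator F[i][l] ≤ F[j][l] counted by T, so the check passes for any input, not only this seed.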

4. Comparison Results among Three Methods

In this section, first, we compare the proposed algorithm with the two existing methods—SNR (Algorithm 1) and ISNR (Algorithm 2)—on two real-life applications. We then summarize the comparison among the three methods from aspects such as consideration of the added parameters, flexibility, and computational complexity.

4.1. Case 1: Personal Postgraduate Enrollment for the Supervisors

After the postgraduate entrance examination, a supervisor who works at Northwest Normal University plans to recruit one graduate student majoring in computer science. He receives five emails with resumes; that is, there are five candidates. The supervisor examines the five resumes and summarizes seven appraisal items, namely, “reputation of the university at which the candidate completed their bachelor’s degree”, “international ranking of their computer science major”, “GPA”, “English reading ability”, “English writing ability”, “academic performance”, and “internship experience”, to evaluate the five candidates. We apply a fuzzy soft set to display the performance of the five candidates with respect to the seven aspects. Suppose that U = {p_1, p_2, p_3, p_4, p_5} is the set of five candidates and E = {e_1, e_2, e_3, e_4, e_5, e_6, e_7} is the set of seven appraisal items. Table 5 presents the data records of the personal postgraduate enrollment system used by the supervisor as a fuzzy soft set (F, E).

4.1.1. Three Methods on the Original Dataset

SNR

According to the SNR algorithm, first, we calculate the comparison table and score table of the fuzzy soft set (F, E), which are shown in Table 6. We can see that the scores of the objects are −2, −6, 4, 4, and 0, respectively. The priority ranking is p_3 = p_4 > p_5 > p_1 > p_2.
We check the matrix W_E − W_{E−T} for the subset T = {e_1, e_2, e_5, e_7} ⊆ E and find that this matrix is symmetric. As a result, the final S-normal reduction result is {e_3, e_4, e_6}.
In this algorithm, we first calculate the comparison table of the fuzzy soft set; the number of elements accessed is 25 × 7 = 175. Next, we check the matrix W_E − W_{E−T} for the subset T = {e_1, e_2, e_5, e_7} ⊆ E; in this step, the number of elements accessed is 2 × 25 × 5 + 6 × 25 = 400. From the above steps, we can conclude that the total number of elements accessed by SNR is 575.

ISNR

First, the special subset T = {e_1, e_2, e_5, e_7} is found; then, we calculate the comparison table and the score table of the fuzzy soft set (F, T), which are shown in Table 7. The number of elements accessed for this step is 100. The difference is then calculated from the score table, and the number of elements accessed is 50. From the above steps, we can see that the total number of elements accessed by ISNR is 150.

The New Proposed Algorithm

According to our proposed algorithm, a special subset T = {e_1, e_2, e_5, e_7} is found first. The number of elements accessed when comparing the membership degrees of different objects using Equation (1) is 80. Additionally, the number of elements accessed to obtain Table 8 using Equation (2) is 20. From the above steps, the total number of elements accessed is 100.
Compared with the SNR algorithm, it is clear that the improvement in the total number of elements accessed is up to 42% in this process, while the improvement in the total number of elements accessed is 21% compared with ISNR.

4.1.2. Three Methods on the Extended Dataset

Suppose that (F, E) is an original fuzzy soft set and that a new attribute set E′ = {e′_1, e′_2, …, e′_r} is added to E. If parameter reduction is performed using SNR, one has to merge the two parameter sets and compute a new comparison table for the new fuzzy soft set (H, E ∪ E′). Here, after face-to-face interviews with the five candidates, the supervisor considers adding two new attributes, “expression ability” and “interest in research”, to evaluate the applicants, as shown in Table 9. However, for the newly added parameters {e′_1, e′_2}, the three methods have different reduction processes and numbers of elements accessed.

SNR

According to SNR, the following steps are given:
Step 1: Combine Table 5 and Table 9 into a new fuzzy soft set, as shown in Table 10.
Step 2: Table 11 presents the comparison table of the fuzzy soft set (H, E ∪ E′); the number of elements accessed for this step is 250.
Step 3: Calculate the score table according to the comparison table in Step 2; the number of elements accessed for this step is 2 × 25 = 50.
As can be seen from Table 11, after adding the new attributes, the object scores and priority ranking are consistent with the results on the original dataset, so the newly added attribute set is not necessary and can be reduced. On the extended dataset, the total number of elements accessed by SNR is 300.

ISNR

According to ISNR, the following steps are given:
Step 1: Calculate the comparison and score tables of the new attribute sets, as shown in Table 12, and the number of elements accessed in this step is 50;
Step 2: Calculate the difference based on the score table in Step 1, and the number of elements accessed is 50;
Step 3: As can be seen from Table 12, the newly added attribute set can be reduced because all of the object scores are 0. Therefore, the total number of elements accessed is 100.

Our Proposed Algorithm

According to our proposed algorithm, the following steps are given:
Step 1: Compute the S-Score table for the two newly added attributes, as shown in Table 13;
Step 2: For the newly added parameters {e′_1, e′_2}, we obtain S_i = 0 for all objects, so the newly added attributes can be reduced.
Using this method, we create an S-Score table for the two newly added attributes; the number of elements accessed in this step is 40. Then, we obtain S_i for the two added parameters; the number of elements accessed is 10. Finally, the total number of elements accessed on the extended dataset is 40 + 10 = 50.
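The incremental step above can be sketched as follows: only the columns of the newly added attributes are scored, so the S-Score table of the original dataset is left untouched. The function name and interface are illustrative.

```python
# Sketch of the incremental check for newly added parameters: score only
# the new columns; the added set can be dropped if its per-object S-Score
# sums are all zero. Names are illustrative.
def new_params_reducible(F_new):
    """F_new[i][l]: membership value of object i on the l-th added parameter.
    Returns True if the whole added parameter set is non-essential."""
    n, m = len(F_new), len(F_new[0])
    totals = []
    for i in range(n):
        t = 0
        for l in range(m):
            t += sum(F_new[i][l] >= F_new[j][l] for j in range(n) if j != i)
            t -= sum(F_new[i][l] <= F_new[j][l] for j in range(n) if j != i)
        totals.append(t)
    return all(t == 0 for t in totals)
```

For example, two added columns whose S-Scores cancel for every object (such as mirrored membership values) are reducible, while a single column that discriminates among the objects is not.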
Table 14 shows the comparative results of the three reduction algorithms. On both the original dataset and the extended dataset, the three methods produce the same reduction results, so the three reduction algorithms are equivalent.
Compared with SNR, the improvement in the total number of elements accessed is up to 83% in this decision process after adding the new parameters, while the improvement compared with ISNR is up to 50%. The proposed approach considers the newly added parameters; as a result, it has much higher flexibility and is beneficial to the extension of fuzzy soft sets and the combination of multiple fuzzy soft sets.

4.2. Case 2: Evaluation for Academic Papers

Researchers usually use many measurement indicators to evaluate academic papers. A beginning researcher wants to read an excellent scientific journal paper on the research topic of “data mining”, so he collects five academic papers from Baidu Scholar. He cares about the performance of these papers in seven aspects: “downloads”, “cited frequency”, “number of results”, “H-index”, “number of citations”, “reading volume”, and “impact factor”. Here, we employ a fuzzy soft set model to describe the five academic papers. U is the collection of five academic papers, and U = {p_1, p_2, p_3, p_4, p_5} = {“Design and Application of Teaching Quality Monitoring and Evaluation System Based on Data Mining”, “Study on Data Mining for Combat Simulation”, “Research on Intrusion Detection of Data Mining Model Based on Improved Apriori Algorithm”, “Construction of Cloud Service Platform for Network Big Data Mining”, “Overview of Data Mining”} [42]. E = {e_1, e_2, e_3, e_4, e_5, e_6, e_7} = {downloads, cited frequency, number of results, H-index, number of citations, reading volume, impact factor} is the set of parameters. All of the data are normalized into the unit interval [0, 1]. The specific data are shown in Table 15 below. However, because some measurements tend to be similar, there are redundant data in the evaluation. We use the three methods to obtain the parameter-reduction results.
By comparing the above three algorithms, we find the following:
(1)
The three algorithms can obtain the same reduction results as { e 2 , e 3 , e 4 , e 5 , e 7 } ;
(2)
In this case, the numbers of elements accessed by SNR, ISNR, and our method are 350, 100, and 60, respectively;
(3)
In terms of the number of elements accessed, the newly proposed reduction algorithm improves on SNR and ISNR by 83% and 40%, respectively.
The comparison results of the three reduction algorithms on the evaluation of academic papers are shown in Table 16 below.

4.3. Computational Complexity

Assume a fuzzy soft set (F, E) over an initial universe U, with the objects and parameters occupying n rows and m columns, respectively, and suppose the original parameter set E contains non-essential parameter subsets. All three reduction algorithms access elements only for a special subset of the original dataset, whose number of columns is denoted by m′. The number of elements accessed by the three reduction algorithms on the original dataset is analyzed as follows:

4.3.1. SNR

If a special subset T exists in the original parameter set E and m′ denotes the number of columns of T, then, using big-O notation, the computational complexity of SNR is O(n²m′).

4.3.2. ISNR

For ISNR, if the special subset T has m′ columns, then, using big-O notation, the computational complexity of ISNR is O(n²m′).

4.3.3. The Newly Proposed Algorithm

For the reduction algorithm of this study, only the S-Score table of the special subset T is calculated. If T has m′ columns, the number of elements accessed for this step is n²m′; we then sum up the score table, which requires nm′ further operations. In summary, the total number of elements accessed by the reduction algorithm in this study is n²m′ + nm′. Using big-O notation, the computational complexity of the proposed algorithm is O(n²m′).
Finally, we summarize the comparison results among the three methods as shown in Table 17.

5. Conclusions

In this paper, we proposed a new parameter-reduction method for fuzzy soft sets. As can be learned from the above two datasets, our proposed method has the same reduction results as the two existing parameter-reduction approaches for fuzzy soft sets, SNR and ISNR. However, it is clear that our method outperforms SNR and ISNR in terms of the number of elements accessed. When new parameters have to be added, our method takes them into account. Hence, our method has better flexibility and extendibility compared with SNR and ISNR. Our method can be applied to the extension of fuzzy soft sets and the combination of multiple evaluation systems. However, the proposed approach has limitations regarding computational cost when the number of attributes is very large. In future work, we will extend this parameter-reduction method to other mathematical models, such as the interval-valued fuzzy soft set, the soft rough set, etc.

Author Contributions

Conceptualization, H.Q. and X.M.; methodology, W.W.; software, C.G.; validation, X.M. and Y.W.; investigation, C.G.; data curation, C.G.; writing—original draft preparation, C.G. and H.Q.; writing—review and editing, X.M. and Y.W.; supervision, H.Q.; project administration, H.Q.; funding acquisition, H.Q. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Science Foundation of China, grant number 62162055, and by the National Science Foundation of Gansu province, grant number 21JR7RA115.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Molodtsov, D. Soft set theory-first results. Comput. Math. Appl. 1999, 37, 19–31. [Google Scholar] [CrossRef]
  2. Han, B.; Li, Y.; Geng, S. 0–1 Linear programming methods for optimal normal and pseudo parameter reductions of soft sets. Appl. Soft Comput. 2017, 54, 467–484. [Google Scholar] [CrossRef]
  3. Yang, Y.; Tan, X.; Meng, C. The multi-fuzzy soft set and its application in decision making. Appl. Math. Model. 2013, 37, 4915–4923. [Google Scholar] [CrossRef]
  4. Agarwal, M.; Biswas, K.K.; Hanmandlu, M. Generalized intuitionistic fuzzy soft sets with applications in decision-making. Appl. Soft Comput. 2013, 13, 3552–3566. [Google Scholar] [CrossRef]
  5. Ma, X.; Qin, H.; Sulaiman, N.; Herawan, T.; Abawajy, J. The parameter reduction of the interval-valued fuzzy soft sets and its related algorithms. IEEE Trans. Fuzzy Syst. 2014, 22, 57–71. [Google Scholar] [CrossRef]
  6. Jiang, Y.; Tang, Y.; Chen, Q.; Liu, H.; Tang, J. Interval-valued intuitionistic fuzzy soft sets and their properties. Comput. Math. Appl. 2010, 60, 906–918. [Google Scholar] [CrossRef]
  7. Ma, X.; Qin, H.J.; Abawajy, J. Interval-valued intuitionistic fuzzy soft sets based decision making and parameter reduction. IEEE Trans. Fuzzy Syst. 2022, 30, 357–369. [Google Scholar] [CrossRef]
  8. Vijayabalaji, S.; Ramesh, A. Belief interval-valued soft set. Expert Syst. Appl. 2019, 119, 262–271. [Google Scholar] [CrossRef]
  9. Aggarwal, M. Confidence soft sets and applications in supplier selection. Comput. Ind. Eng. 2019, 127, 614–624. [Google Scholar] [CrossRef]
  10. Sun, B.; Ma, W.; Li, X. Linguistic value soft set-based approach to multiple criteria group decision-making. Appl. Soft Comput. 2017, 58, 285–296. [Google Scholar] [CrossRef]
  11. Alcantud, J.C.R.; Mathew, T.J. Separable fuzzy soft sets and decision making with positive and negative attributes. Appl. Soft Comput. 2017, 59, 586–595. [Google Scholar] [CrossRef]
  12. Arora, R.; Garg, H. A robust correlation coefficient measure of dual hesitant fuzzy soft sets and their application in decision making. Eng. Appl. Artif. Intell. 2018, 72, 80–92. [Google Scholar] [CrossRef]
  13. Zhan, J.; Irfan Ali, M.; Mehmood, N. On a novel uncertain soft set model: Z-soft fuzzy rough set model and corresponding decision making methods. Appl. Soft Comput. 2017, 56, 446–457. [Google Scholar] [CrossRef]
  14. Memiş, S.; Enginoğlu, S.; Erkan, U. Numerical Data Classification via Distance-Based Similarity Measures of Fuzzy Parameterized Fuzzy Soft Matrices. IEEE Access 2021, 9, 88583–88601. [Google Scholar] [CrossRef]
  15. Memiş, S.; Enginoğlu, S.; Erkan, U. A Classification Method in Machine Learning Based on Soft Decision-Making via Fuzzy Parameterized Fuzzy Soft Matrices. Soft Comput. 2021, 26, 1165–1180. [Google Scholar] [CrossRef]
  16. Memiş, S.; Enginoğlu, S.; Erkan, U. Fuzzy Parameterized Fuzzy Soft k-Nearest Neighbor Classifier. Neurocomputing 2022, 500, 351–378. [Google Scholar] [CrossRef]
17. Memiş, S.; Enginoğlu, S.; Erkan, U. A New Classification Method Using Soft Decision-Making Based on an Aggregation Operator of Fuzzy Parameterized Fuzzy Soft Matrices. Turk. J. Electr. Eng. Comput. Sci. 2022, 30, 871–890.
18. Ali, G.; Afzal, M.; Asif, M.; Shazad, A. Attribute reduction approaches under interval-valued q-rung orthopair fuzzy soft framework. Appl. Intell. 2022, 52, 8975–9000.
19. Akram, M.; Ali, G.; Alcantud, J.C.R. Parameter reduction analysis under interval-valued m-polar fuzzy soft information. Artif. Intell. Rev. 2021, 54, 5541–5582.
20. Zhan, J.; Liu, Q.; Herawan, T. A novel soft rough set: Soft rough hemirings and corresponding multicriteria group decision making. Appl. Soft Comput. 2017, 54, 392–402.
21. Maji, P.K.; Biswas, R.; Roy, A.R. Fuzzy soft sets. J. Fuzzy Math. 2001, 9, 589–602.
22. Majumdar, P.; Samanta, S.K. Generalized fuzzy soft sets. Comput. Math. Appl. 2010, 59, 1425–1432.
23. Sadiq, M.; Devi, V. Fuzzy-soft set approach for ranking the functional requirements of software. Expert Syst. Appl. 2022, 193, 116452.
24. Li, H.; Xiong, S. Time-varying weight coefficients determination based on fuzzy soft set in combined prediction model for travel time. Expert Syst. Appl. 2021, 189, 115198.
25. Rohidin, D.; Samsudin, N.A.; Deris, M.M. Association rules of fuzzy soft set based classification for text classification problem. J. King Saud Univ.-Comput. Inf. Sci. 2022, 34, 801–812.
26. Maji, P.K.; Roy, A.R. A fuzzy soft set theoretic approach to decision making problems. J. Comput. Appl. Math. 2007, 203, 412–418.
27. Kong, Z.; Gao, L.; Wang, L. Comment on "A fuzzy soft set theoretic approach to decision making problems". J. Comput. Appl. Math. 2009, 223, 540–542.
28. Çağman, N.; Enginoğlu, S.; Çıtak, F. Fuzzy Soft Set Theory and Its Applications. Iran. J. Fuzzy Syst. 2011, 8, 137–147.
29. Feng, F.; Jun, Y.B.; Liu, X.; Li, L. An adjustable approach to fuzzy soft set based decision making. J. Comput. Appl. Math. 2010, 234, 10–20.
30. Çağman, N.; Enginoğlu, S. Fuzzy Soft Matrix Theory and Its Application in Decision Making. Iran. J. Fuzzy Syst. 2012, 9, 109–119.
31. Alcantud, J.C.R. A novel algorithm for fuzzy soft set based decision making from multiobserver input parameter data set. Inf. Fusion 2016, 29, 142–148.
32. Tang, H. A novel fuzzy soft set approach in decision making based on grey relational analysis and Dempster–Shafer theory of evidence. Appl. Soft Comput. 2015, 31, 317–325.
33. Deng, T.; Wang, X. An object-parameter approach to predicting unknown data in incomplete fuzzy soft sets. Appl. Math. Model. 2013, 37, 4139–4146.
34. Wang, J.; Hu, Y.; Xiao, F.; Deng, X.; Deng, Y. A novel method to use fuzzy soft sets in decision making based on ambiguity measure and Dempster–Shafer theory of evidence: An application in medical diagnosis. Artif. Intell. Med. 2016, 69, 1–11.
35. Biswas, B.; Ghosh, S.K.; Bhattacharyya, S.; Platos, J.; Snasel, V.; Chakrabarti, A. Chest X-ray enhancement to interpret pneumonia malformation based on fuzzy soft set and Dempster–Shafer theory of evidence. Appl. Soft Comput. 2020, 86, 105889.
36. Chen, W.; Zou, Y. Group decision making under generalized fuzzy soft sets and limited cognition of decision makers. Eng. Appl. Artif. Intell. 2020, 87, 103344.
37. Akram, M.; Ali, G.; Alcantud, J.C.R. Attributes reduction algorithms for m-polar fuzzy relation decision systems. Int. J. Approx. Reason. 2022, 140, 232–254.
38. Akram, M.; Ali, G.; Alcantud, J.C.R.; Fatimah, F. Parameter reductions in N-soft sets and their applications in decision-making. Expert Syst. 2021, 38, e12601.
39. Kong, Z.; Gao, L.; Wang, L.; Li, S. The normal parameter reduction of soft sets and its algorithm. Comput. Math. Appl. 2008, 56, 3029–3037.
40. Ma, X.; Qin, H. A distance-based parameter reduction algorithm of fuzzy soft sets. IEEE Access 2018, 6, 10530–10539.
41. Kong, Z.; Ai, J.; Wang, L.; Li, P.; Ma, L.; Lu, F. New normal parameter reduction method in fuzzy soft set theory. IEEE Access 2019, 7, 2986–2998.
42. Ma, X.; Fei, Q.; Qin, H.; Zhou, X.; Li, H. New Improved Normal Parameter Reduction Method for Fuzzy Soft Set. IEEE Access 2019, 7, 154912–154921.
43. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353.
Table 1. Fuzzy soft set ( S , P ).

U    e1    e2    e3    e4    e5    e6    e7
P1   0.3   0.1   0.8   0.3   0.2   0.4   0.7
P2   0.3   0.3   0.6   0.2   0.5   0.2   0.3
P3   0.4   0.2   0.7   0.8   0.2   0.9   0.9
P4   0.7   0.4   0.5   0.5   0.4   0.3   0.8
P5   0.2   0.5   0.4   0.4   0.7   0.1   0.2
P6   0.6   0.7   0.3   0.1   0.6   0.5   0.4
Table 2. The S-Score table of ( S , P ).

U    e1    e2    e3    e4    e5    e6    e7
P1   −2    −5     5    −1    −4     1     1
P2   −2    −1     1    −3     1    −3    −3
P3    1    −3     3     5    −2     5     5
P4    5     1    −1     3    −1    −1     3
P5   −5     3    −3     1     5    −5    −5
P6    3     5    −5    −5     3     3    −1
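A closed form consistent with Tables 1 and 2 (an assumption inferred from the values, not a quotation of the paper's formal definition) is that the S-score of object ai on parameter ej equals the number of membership values strictly below F(ej)(ai) in that column minus the number strictly above it. A minimal sketch:

```python
# Assumed S-score rule, inferred from Tables 1-2: for each value in a
# parameter column, score = (# values strictly below) - (# values strictly above).
def s_scores(column):
    """S-scores for one parameter column of membership values."""
    return [sum(v > w for w in column) - sum(v < w for w in column)
            for v in column]

# Column e1 of Table 1 (objects P1..P6)
print(s_scores([0.3, 0.3, 0.4, 0.7, 0.2, 0.6]))  # [-2, -2, 1, 5, -5, 3]
```

Note that ties share the same score (P1 and P2 both receive −2 on e1) and every column's scores sum to zero, matching column e1 of Table 2.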
Table 3. The reduced parameters.

U    e2    e3    ST
P1   −5     5     0
P2   −1     1     0
P3   −3     3     0
P4    1    −1     0
P5    3    −3     0
P6    5    −5     0
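Table 3 illustrates the reduction test: a parameter subset is dispensable when every object's S-scores over the subset sum to zero (the ST column), since deleting the subset then leaves all total scores, and hence the final ranking, unchanged. A sketch of this check, using the e2 and e3 columns of Table 2:

```python
# Dispensability test suggested by Table 3: a candidate parameter subset can
# be removed when each object's S-scores over the subset cancel out.
e2 = [-5, -1, -3, 1, 3, 5]   # column e2 of Table 2
e3 = [5, 1, 3, -1, -3, -5]   # column e3 of Table 2

def is_dispensable(*columns):
    """True if every object's scores over the given columns sum to zero."""
    return all(sum(row) == 0 for row in zip(*columns))

print(is_dispensable(e2, e3))  # True, matching the all-zero ST column
```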
Table 4. S-Score table of fuzzy soft set ( F , E ).

U        E1               E2               …   E(m−1)               Em               SEi
P1       S(a1)(e1)        S(a1)(e2)        …   S(a1)(e(m−1))        S(a1)(em)        SE1
P2       S(a2)(e1)        S(a2)(e2)        …   S(a2)(e(m−1))        S(a2)(em)        SE2
⋮
P(n−1)   S(a(n−1))(e1)    S(a(n−1))(e2)    …   S(a(n−1))(e(m−1))    S(a(n−1))(em)    SE(n−1)
Pn       S(an)(e1)        S(an)(e2)        …   S(an)(e(m−1))        S(an)(em)        SEn
Table 5. Fuzzy soft set ( F , E ).

U    e1     e2     e3     e4     e5     e6     e7
P1   0.18   0.82   0.45   0.45   0.15   0.55   0.85
P2   0.54   0.75   0.70   0.25   0.45   0.35   0.55
P3   0.62   0.50   0.30   0.75   0.15   0.85   0.85
P4   0.85   0.45   0.78   0.60   0.35   0.45   0.80
P5   0.89   0.32   0.89   0.50   0.80   0.25   0.25
P6   0.18   0.82   0.45   0.45   0.15   0.55   0.85
Table 6. The comparison and score table of ( F , E ).

U    P1   P2   P3   P4   P5   Row-Sum (ri)   Column-Sum (ti)   Score (si)
P1    7    4    4    3    3   21             23                −2
P2    3    7    3    2    3   18             24                −6
P3    5    4    7    4    4   24             20                 4
P4    4    5    3    7    4   23             19                 4
P5    4    4    3    3    7   21             21                 0
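The comparison-and-score values in Table 6 follow the familiar fuzzy-soft-set decision procedure: entry cij counts the parameters on which object Pi's membership value is at least Pj's, and the score is si = ri − ti (an assumption inferred from the table's values). A sketch that reproduces Table 6 from the membership values of Table 5 (P6 duplicates P1 and is omitted, as in the table):

```python
# Comparison table: c[i][j] = number of parameters where object i's
# membership >= object j's; score = row sum minus column sum.
rows = [
    [0.18, 0.82, 0.45, 0.45, 0.15, 0.55, 0.85],  # P1
    [0.54, 0.75, 0.70, 0.25, 0.45, 0.35, 0.55],  # P2
    [0.62, 0.50, 0.30, 0.75, 0.15, 0.85, 0.85],  # P3
    [0.85, 0.45, 0.78, 0.60, 0.35, 0.45, 0.80],  # P4
    [0.89, 0.32, 0.89, 0.50, 0.80, 0.25, 0.25],  # P5
]
n = len(rows)
c = [[sum(a >= b for a, b in zip(rows[i], rows[j])) for j in range(n)]
     for i in range(n)]
r = [sum(row) for row in c]            # row sums r_i
t = [sum(col) for col in zip(*c)]      # column sums t_i
scores = [ri - ti for ri, ti in zip(r, t)]
print(scores)  # scores s_i of Table 6: [-2, -6, 4, 4, 0]
```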
Table 7. The comparison and score table of ( F , T ).

U    P1   P2   P3   P4   P5   Row-Sum (ri)   Column-Sum (ti)   Score (si)
P1    4    2    1    2    2   11             11                0
P2    2    4    2    2    2   12             12                0
P3    1    2    4    2    2   11             11                0
P4    2    2    2    4    2   12             12                0
P5    2    2    2    2    4   12             12                0
Table 8. The S-Score table of ( F , T ).

U    e1    e2    e5    e7    Sk
P1   −4     4    −3     3     0
P2   −2     2     2    −2     0
P3    0     0    −3     3     0
P4    2    −2     0     0     0
P5    4    −4     4    −4     0
Table 9. S-Score table of the newly added parameters ( F , E′ ).

U    e1′   e2′
P1   −4     4
P2   −2     2
P3    0     0
P4    2    −2
P5    4    −4
Table 10. Fuzzy soft set ( F , E ∪ E′ ).

U    e1     e2     e3     e4     e5     e6     e7     e1′    e2′
P1   0.18   0.82   0.45   0.45   0.15   0.55   0.85   0.70   0.20
P2   0.54   0.75   0.70   0.25   0.45   0.35   0.55   0.35   0.80
P3   0.62   0.50   0.30   0.75   0.15   0.85   0.85   0.80   0.15
P4   0.85   0.45   0.78   0.60   0.35   0.45   0.80   0.60   0.50
P5   0.89   0.32   0.89   0.50   0.80   0.25   0.25   0.40   0.60
Table 11. The comparison and score table of ( F , E ∪ E′ ).

U    P1   P2   P3   P4   P5   Row-Sum (ri)   Column-Sum (ti)   Score (si)
P1    9    5    5    4    4   32             34                −2
P2    5    9    5    4    5   29             35                −6
P3    7    6    9    6    6   35             31                 4
P4    6    7    5    9    6   34             30                 4
P5    6    6    5    5    9   32             32                 0
Table 12. The comparison and score table of ( F , E′ ).

U    P1   P2   P3   P4   P5   Row-Sum (ri)   Column-Sum (ti)   Score (si)
P1    2    1    1    1    1   6              6                 0
P2    1    2    1    1    1   6              6                 0
P3    1    1    2    1    1   6              6                 0
P4    1    1    1    2    1   6              6                 0
P5    1    1    1    1    2   6              6                 0
Table 13. The S-Score table of ( F , E′ ).

U    e1′   e2′   Si
P1    2    −2     0
P2   −4     4     0
P3    4    −4     0
P4    0     0     0
P5   −2     2     0
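Table 13 applies the same rank-style scoring to the newly added parameters e1′ and e2′ of Table 10; because each object's two scores cancel, the pair {e1′, e2′} is itself jointly dispensable. A sketch under the same assumed scoring rule as before:

```python
# S-scores of the newly added parameter columns of Table 10, using the
# assumed rule: score = (# values strictly below) - (# values strictly above).
def s_scores(column):
    return [sum(v > w for w in column) - sum(v < w for w in column)
            for v in column]

e1p = [0.70, 0.35, 0.80, 0.60, 0.40]  # column e1' of Table 10 (P1..P5)
e2p = [0.20, 0.80, 0.15, 0.50, 0.60]  # column e2' of Table 10

s1, s2 = s_scores(e1p), s_scores(e2p)
print(s1)  # [2, -4, 4, 0, -2] -- column e1' of Table 13
print(s2)  # [-2, 4, -4, 0, 2] -- column e2' of Table 13
print(all(a + b == 0 for a, b in zip(s1, s2)))  # True: Si = 0 for every object
```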
Table 14. Comparison results among the three algorithms for case 1.

Comparison                                            SNR                          ISNR                         Our Algorithm                Improvement vs. SNR/ISNR
Reduction result                                      {e1, e2, e5, e7, e1′, e2′}   {e1, e2, e5, e7, e1′, e2′}   {e1, e2, e5, e7, e1′, e2′}   The same
Flexibility and extendibility                         Weak                         Weak                         Strong                       Stronger
Considering the added parameters                      No                           No                           Yes                          -
Total number of element accesses (original dataset)   575                          150                          100                          82.6%/33.3%
Total number of element accesses (extended dataset)   300                          100                          50                           83.3%/50%
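The improvement percentages in the last column of Table 14 are simply the relative savings in element accesses, (baseline − ours) / baseline. For instance:

```python
# Relative saving in element accesses, as reported in Table 14.
def saving(baseline, ours):
    return round(100 * (baseline - ours) / baseline, 1)

print(saving(575, 100), saving(150, 100))  # original dataset: 82.6 33.3
print(saving(300, 50), saving(100, 50))    # extended dataset: 83.3 50.0
```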
Table 15. Fuzzy soft set of Evaluation System for Academic Papers [42].

U    e1     e2     e3     e4     e5     e6     e7
P1   0.05   0.09   0.12   0.16   0.05   0.95   0.44
P2   0.17   0.95   0.95   0.95   0.23   0.75   0.95
P3   0.22   0.40   0.11   0.61   0.23   0.55   0.05
P4   0.72   0.43   0.08   0.50   0.05   0.15   0.18
P5   0.95   0.05   0.05   0.05   0.95   0.05   0.63
Table 16. Comparison result for case 2.

Algorithm Comparison          SNR                    ISNR                   Our Algorithm
Parameter reduction results   {e2, e3, e4, e5, e7}   {e2, e3, e4, e5, e7}   {e2, e3, e4, e5, e7}
Number of element accesses    350                    100                    60
Table 17. Summary of comparison results.

Comparison                         SNR        ISNR       Our Algorithm
Reduction result                   The same   The same   The same
Flexibility and extendibility      Weak       Weak       Strong
Considering the added parameters   No         No         Yes
Computational complexity           O(n²m)     O(n²m)     O(n²m)
Qin, H.; Gu, C.; Ma, X.; Wei, W.; Wang, Y. S-Score Table-Based Parameter-Reduction Approach for Fuzzy Soft Sets. Symmetry 2022, 14, 1719. https://doi.org/10.3390/sym14081719
