Abstract
Parameter reduction is a very important technique in many fields, including pattern recognition. Many reduction techniques have been reported for fuzzy soft sets to solve decision-making problems. However, almost no attention has been paid to the parameter reduction of bipolar fuzzy soft sets, in which positive and negative membership degrees play a symmetric role. This methodology is of great importance in many decision-making situations. In this paper, we provide a novel theoretical approach to solve decision-making problems based on bipolar fuzzy soft sets and study four types of parameter reductions of such sets. Parameter reduction algorithms are developed and illustrated through examples. The experimental results show that our proposed parameter reduction techniques delete irrelevant parameters while keeping the definite decision-making choices unchanged. Moreover, the reduction algorithms are compared regarding the degree of ease of computing the reduction, applicability, exact degree of reduction, applied situation, and multiple use of the parameter reduction. Finally, a real application is developed to illustrate the validity of our proposed reduction algorithms.
1. Introduction
Many decision-making problems in the physical, applied, social, and life sciences involve datasets containing uncertain and vague information. Fuzzy set theory [1] and rough set theory [2] are classical mathematical tools to characterize uncertain and inaccurate data, but as indicated in [3,4], each of these theories lacks theoretical parametric tools. Molodtsov [3] initiated the concept of the soft set as a new mathematical tool for handling vague and uncertain information, and efficiently applied soft set theory in multiple directions, for example, operations research, game theory, the Riemann integral, and probability. Currently, research on soft set theory is proceeding rapidly and has achieved many fruitful results. Some fundamental algebraic properties of soft sets were proposed by Maji et al. [5]. Feng et al. [6] proposed new hybrid models by combining fuzzy sets, rough sets, and soft sets. Roy and Maji [7] presented a technique to solve decision-making problems based on fuzzy soft sets. Xiao et al. [8] developed a forecasting method based on the fuzzy soft set model.
Since then, much research on parameter reduction has been carried out, and several results have been derived [9,10,11,12,13,14]. Research on soft set-based decision-making relies heavily on the concept of parameter reduction. The reduction of parameters in soft set theory is designed to remove redundant parameters while preserving the original decision choices. Maji et al. [15] first solved soft set decision-making problems using rough set-based reduction [16]. To improve the decision-making approach in [15], Chen et al. [17] and Kong et al. [18] proposed the parameterization reduction and the normal parameter reduction of soft sets, respectively. Ma et al. [19] proposed a new efficient algorithm of normal parameter reduction to improve [15,16,17]. Roy and Maji [7] proposed a new method for dealing with decision-making problems with fuzzy soft sets; it makes a decision by means of a comparison table derived from a fuzzy soft set in terms of parameters. Kong et al. [20] indicated that the Roy and Maji technique [7] was inaccurate and proposed a modified approach to solve this issue; they described the effectiveness of the Roy and Maji technique [7] and demonstrated its limitations. Ma et al. [21] proposed extensive parameter reduction methods for interval-valued fuzzy soft sets. Decision-making research for the reduction of fuzzy soft sets has received considerable attention. Using the idea of the level soft set, Feng et al. [22] introduced the parameter reduction of fuzzy soft sets and proposed an adjustable approach to decision-making based on fuzzy soft sets. Moreover, Feng et al. [23] presented another approach to decision-making based on interval-valued fuzzy soft sets. Jiang et al. [24] proposed a reduction method of intuitionistic fuzzy soft sets for decision-making using the level soft sets of intuitionistic fuzzy sets. The theory of fuzzy systems has rich applications in different areas, including engineering [25,26,27].

Zhang [28,29] first proposed the idea of bipolar fuzzy sets (Yin Yang bipolar fuzzy sets) as an extension of fuzzy sets. In a bipolar fuzzy set, the membership degree range is enlarged from the interval [0, 1] to [−1, 1]. The idea behind this description is related to the existence of bipolar information. For example, profit and loss, feedback and feed-forward, and competition and cooperation are usually two aspects of decision-making. In Chinese medicine, Yin and Yang are the two sides: Yin is the negative side of a system, and Yang is the positive side of a system. Bipolar fuzzy set theory has many applications in different fields, including pattern recognition and machine learning. Saleem et al. [30] presented a new hybrid model, namely bipolar fuzzy soft sets, by combining bipolar fuzzy sets with soft sets.

Motivated by these observations, in this paper, we present four ways to reduce parameters in bipolar fuzzy soft sets by developing another bipolar fuzzy soft set theoretical approach to solve decision-making problems. In particular, we solve the decision-making problem in [30] by our proposed decision-making algorithm based on bipolar fuzzy soft sets. We propose an algorithm for each reduction technique. Furthermore, we compare these reduction methods and discuss their pros and cons in detail. We also present a real-life application to show the validity of our proposed reduction algorithms. For other terminologies not mentioned in the paper, the readers are referred to [31,32,33,34,35,36,37,38,39].
The rest of this paper is structured as follows. Section 2 introduces the basic definitions and develops a new technique for decision-making based on bipolar fuzzy soft sets. Section 3 defines four kinds of parameter reductions of bipolar fuzzy soft sets and presents their reduction algorithms, which are illustrated by corresponding examples. A comparison among the reduction algorithms is presented in Section 4. Section 5 is devoted to solving a real-life decision-making application. Finally, the conclusions of this paper are provided in Section 6. Throughout this paper, the notations given in Table 1 will be used.
Table 1.
Notations.
2. Another Bipolar Fuzzy Soft Set Approach to Decision-Making Problems
Saleem et al. [30] presented an efficient approach to solve practical decision-making problems based on bipolar fuzzy soft sets. In this section, we first review the definitions of bipolar fuzzy sets and bipolar fuzzy soft sets, and then, we introduce a novel approach based on bipolar fuzzy soft sets, which can effectively solve decision-making problems, followed by an algorithm. Moreover, we use our proposed algorithm to solve the decision-making application presented by Saleem et al. [30] and observe that the optimal decisions obtained by both methods are the same.
Definition 1.
[29,40] Let O be a nonempty universe of objects. A bipolar fuzzy set B in O is defined as:

B = {(o, μ_B⁺(o), μ_B⁻(o)) : o ∈ O},

where μ_B⁺ : O → [0, 1] and μ_B⁻ : O → [−1, 0] are mappings. The positive membership degree μ_B⁺(o) denotes the satisfaction degree of an object o for the property corresponding to the bipolar fuzzy set B, and the negative membership degree μ_B⁻(o) denotes the satisfaction degree of the object o for some implicit counter-property corresponding to B.
Definition 2.
[30] Let O be a nonempty universe of objects and R a universe of parameters related to objects in O. A pair (G, R) is called a BFSS over the universe O, where G is a mapping from R into the collection of all bipolar fuzzy sets in O. It is defined as follows: for each r ∈ R,

G(r) = {(o, μ_{G(r)}⁺(o), μ_{G(r)}⁻(o)) : o ∈ O}.
Assume that O is a universe of objects and R is a universe of parameters related to the objects in O. Then, a BFSS (G, R) can also be presented in tabular form, as shown in Table 2.
Table 2.
Tabular representation of the BFSS .
Definition 3.
Let O be a universe of objects and R a universal set of parameters associated with the objects in O. For a BFSS (G, R), every object has a positive and a negative membership degree with respect to each parameter. We define the score of the positive membership degrees (2) and the score of the negative membership degrees (3) for each object.
Definition 4.
Let O be a universe of objects and R a universal set of parameters associated with the objects in O. For a BFSS (G, R), the score of the membership degrees of an object is obtained from its scores of positive and negative membership degrees given in Definition 3.
Definition 5.
Let O be a universe of objects and R a universal set of parameters associated with the objects in O. For a BFSS (G, R), the final score of each object is defined by aggregating its scores of membership degrees over all parameters in R.
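Taken together, Definitions 3–5 assign every object a single final score. The displayed formulas below are only one plausible reading, stated as a working assumption in the style of comparison scores for fuzzy soft sets; the symbols μ⁺ᵢⱼ, μ⁻ᵢⱼ, c⁺ᵢⱼ, c⁻ᵢⱼ, sᵢⱼ, and Sᵢ are our own illustrative notation, not necessarily the authors' exact definitions.

```latex
% Assumed reading of Definitions 3--5 (illustrative notation, not the authors' own):
% objects o_1,...,o_n, parameters r_1,...,r_m, and \mu^{+}_{ij}, \mu^{-}_{ij} the
% positive and negative membership degrees of o_i under r_j.
\begin{align*}
  c^{+}_{ij} &= \sum_{k=1}^{n}\bigl(\mu^{+}_{ij}-\mu^{+}_{kj}\bigr), &
  c^{-}_{ij} &= \sum_{k=1}^{n}\bigl(\mu^{-}_{ij}-\mu^{-}_{kj}\bigr),\\
  s_{ij}     &= c^{+}_{ij}+c^{-}_{ij}, &
  S_{i}      &= \sum_{j=1}^{m} s_{ij}.
\end{align*}
% Under this reading, the decision object is an o_i attaining the maximum final score S_i.
```

Under this reading, the score tables in the examples below hold the values sᵢⱼ, and the final score tables hold the values Sᵢ.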
We present a new decision-making technique based on BFSSs as follows:
Example 1.
Reconsider Example 8 in [30]. Let O be the set of four cars and R a collection of parameters. Then, a BFSS is given by Table 3. We now apply Algorithm 1 to it.
Table 3.
Tabular representation of BFSS .
By using (2) and (3), the scores of the positive and negative membership degrees are given by Table 4 and Table 5, respectively.

Table 4.
Score of the positive membership degrees of .
Table 5.
Score of the negative membership degrees of .
Now, by using Definition 4, the tabular arrangement for the score of the membership degrees of BFSS is given by Table 6.
Table 6.
Score of the membership degrees of .
By Definition 5, the final score of each car is given by Table 7. By way of illustration,
Table 7.
Final score of the membership degrees of .
Clearly, is the maximum score for the object . Thus, is the decision object, which coincides with the decision obtained in [30].
Algorithm 1. Selection of an object based on BFSSs.
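As a minimal sketch of this selection procedure, the snippet below computes final scores from a BFSS table and returns the object with the maximum score. It relies on the comparison-style score formulas assumed after Definition 5, and the data layout and names (bfss, final_scores, decision) are illustrative, not part of the original algorithm.

```python
# Minimal sketch of the BFSS selection procedure under assumed score formulas.
# bfss maps each parameter to {object: (positive degree, negative degree)}.

def final_scores(bfss):
    objects = sorted({o for column in bfss.values() for o in column})
    scores = {o: 0.0 for o in objects}
    for column in bfss.values():                      # one parameter at a time
        for o in objects:
            mu_p, mu_n = column[o]
            # assumed comparison scores of o against every object under this parameter
            c_pos = sum(mu_p - column[k][0] for k in objects)
            c_neg = sum(mu_n - column[k][1] for k in objects)
            scores[o] += c_pos + c_neg                # add this parameter's score
    return scores

def decision(bfss):
    scores = final_scores(bfss)
    return max(scores, key=scores.get), scores        # object with the maximum final score

# Toy usage with made-up membership degrees:
bfss = {
    "r1": {"o1": (0.7, -0.2), "o2": (0.4, -0.5), "o3": (0.6, -0.1)},
    "r2": {"o1": (0.3, -0.6), "o2": (0.8, -0.1), "o3": (0.5, -0.4)},
}
print(decision(bfss))
```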
Example 2.
Let O be a collection of five objects under consideration and R a collection of parameters related to the objects in O. Then, a BFSS is given by Table 8.
Table 8.
Tabular representation of BFSS .
By using (2) and (3), the scores of the positive and negative membership degrees are given by Table 9 and Table 10, respectively.
Table 9.
Score of the positive membership degrees of .
Table 10.
Score of the negative membership degrees of .
Now, by using Definition 4, the score of membership degrees for and of BFSS is given by Table 11.
Table 11.
Score of the membership degrees of .
By Definition 5, the final score of each object is given by Table 12.
Table 12.
Final score of the membership degrees of .
Clearly, is the maximum score for the object , which coincides with the decision object obtained using the decision-making algorithm in [30].
From the above analysis, it can easily be seen that our proposed decision-making approach based on BFSSs is efficient and reliable. However, from a practical perspective, the parameter set may contain redundant parameters for decision-making. To overcome this issue, the parameter reduction of BFSSs is proposed. Parameter reduction is a technique in which the set of parameters is reduced to a minimal subset that yields the same decision as the whole set.
3. Four Types of Parameter Reductions of BFSSs
1. OCB-PR:
We first define OCB-PR and then provide an algorithmic approach to obtain it, which is illustrated via an example.
Definition 6.
Let O be a universe of objects and R a universal set of parameters associated with the objects in O. For a BFSS (G, R), consider the family of objects in O that attain the maximum value of the final score. For each subset B of R, if deleting B from R leaves this family of optimal objects unchanged, then B is said to be dispensable in R; otherwise, B is called indispensable in R. The parameter set R is called independent if every B is indispensable in R; otherwise, R is dependent. A subset P of R is said to be an OCB-PR of R if the following axioms hold.
- 1. P is independent (that is, P is the smallest subset of R that keeps the optimal decision object invariant).
- 2. The family of optimal decision objects determined by P coincides with that determined by R.
Based on Definition 6, we propose an OCB-PR algorithm that deletes redundant parameters while keeping the optimal decision object unchanged.
Example 3.
Let O be the set of four cars and R a collection of parameters, and reconsider the BFSS of Example 8 in [30]. We now apply Algorithm 2 to it.
Table 13.
OCB-PR.
From Table 13, it can be easily observed that is the optimal decision object after reduction. Clearly, the obtained subset is a minimal one that keeps the optimal decision object unchanged.
Algorithm 2. OCB-PR.
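Since Definition 6 does not prescribe a search strategy, the following brute-force sketch, exponential in the number of parameters and intended only to make the condition concrete, looks for the smallest parameter subsets that leave the optimal object unchanged. It works on a score table s[o][r] holding the (assumed) per-object, per-parameter scores of Definition 4; all names are illustrative.

```python
from itertools import combinations

# Brute-force sketch of OCB-PR on a score table s[o][r]; illustrative only.

def optimal_objects(s, params):
    totals = {o: sum(row[r] for r in params) for o, row in s.items()}
    best = max(totals.values())
    return {o for o, total in totals.items() if total == best}

def ocb_pr(s):
    params = list(next(iter(s.values())))
    target = optimal_objects(s, params)              # optimal object(s) under all of R
    for size in range(1, len(params) + 1):           # try the smallest subsets first
        hits = [set(subset) for subset in combinations(params, size)
                if optimal_objects(s, subset) == target]
        if hits:
            return hits                              # every minimal OCB-PR of that size
    return [set(params)]

# Toy usage with made-up scores:
s = {"o1": {"r1": 2.0, "r2": -1.0, "r3": 0.5},
     "o2": {"r1": 1.0, "r2":  1.5, "r3": 0.0},
     "o3": {"r1": 0.0, "r2": -0.5, "r3": 1.0}}
print(ocb_pr(s))   # -> [{'r2'}] here, since r2 alone already selects o2
```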
2. IRDCB-PR:
There are several real situations in which our main task is to compute the rank of optimal and suboptimal choices. The suboptimal choices are not considered by the OCB-PR method because OCB-PR only studies the optimal choice. To overcome this drawback, we define IRDCB-PR and present an algorithmic approach that keeps the rank of optimal and suboptimal choices unchanged after deleting the irrelevant parameters.
Definition 7.
Let O be a universe of objects, R a universe of parameters, and consider a subset of R. For a BFSS (G, R), an indiscernibility relation is given by:
where objects related by this relation receive equal score values. For an arbitrary BFSS over O, the decision partition is given by:
where each subclass collects mutually indiscernible objects; that is, there are z subclasses, and the objects are ranked with respect to their score values.
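A small sketch may make the decision partition concrete: assuming the subclasses collect objects with equal final scores and are ordered from the highest score to the lowest, the partition can be computed as follows (the function and variable names are illustrative).

```python
from collections import defaultdict

# Sketch of the decision partition of Definition 7: group objects by equal
# final score and order the groups from the highest score to the lowest.

def decision_partition(final_scores):
    groups = defaultdict(list)
    for obj, score in final_scores.items():
        groups[score].append(obj)
    # the first subclass holds the optimal choice(s), the next the suboptimal ones, ...
    return [sorted(groups[score]) for score in sorted(groups, reverse=True)]

# Toy usage with made-up final scores:
print(decision_partition({"o1": 3.0, "o2": -1.0, "o3": 3.0, "o4": 0.5}))
# -> [['o1', 'o3'], ['o4'], ['o2']]
```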
Definition 8.
Let O be a universe of objects and R a universal set of parameters associated with the objects in O, and let (G, R) be a BFSS. For each subset B of R, if deleting B from R leaves the decision partition unchanged, then B is said to be dispensable in R; otherwise, B is referred to as indispensable in R. The parameter set R is said to be independent if every B is indispensable in R; otherwise, R is dependent. A subset P of R is said to be an IRDCB-PR of R if the following axioms hold.
- 1. P is independent (that is, P is the minimal subset of R that keeps the rank of optimal and suboptimal decision choices unchanged).
- 2. The decision partition (and hence the rank of decision choices) induced by P coincides with that induced by R.
Based on Definition 8, we propose an IRDCB-PR algorithm (see Algorithm 3) that deletes irrelevant parameters while keeping the rank of optimal and suboptimal decision choice objects unchanged.
Algorithm 3. IRDCB-PR.
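For small parameter sets, the IRDCB-PR condition can again be checked by brute force. The sketch below works on a score table s[o][r] and searches for the smallest parameter subsets whose induced ranking of objects coincides with the ranking induced by the full parameter set; the names are illustrative, and the search strategy is only one possible realization of Algorithm 3.

```python
from itertools import combinations
from collections import defaultdict

# Brute-force sketch of IRDCB-PR on a score table s[o][r]; illustrative only.

def ranking(s, params):
    totals = {o: sum(row[r] for r in params) for o, row in s.items()}
    groups = defaultdict(list)
    for o, total in totals.items():
        groups[total].append(o)
    # ordered list of subclasses, from the highest total score to the lowest
    return [tuple(sorted(groups[total])) for total in sorted(groups, reverse=True)]

def irdcb_pr(s):
    params = list(next(iter(s.values())))
    target = ranking(s, params)                       # rank of choices under all of R
    for size in range(1, len(params) + 1):            # try the smallest subsets first
        hits = [set(subset) for subset in combinations(params, size)
                if ranking(s, subset) == target]
        if hits:
            return hits                               # minimal rank-preserving subsets
    return [set(params)]
```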
Example 4.
Let O be a universal set of four objects and R a set of parameters related to the objects in O. Then, a BFSS is given by Table 14.
Table 14.
Tabular arrangement of BFSS .
By using (2) and (3), the scores of the positive and negative membership degrees are given by Table 15 and Table 16, respectively.
Table 15.
Score of the positive membership degrees of .
Table 16.
Score of the negative membership degrees of .
Now, by using Definition 4, the scores of the membership degrees are given in tabular form by Table 17.
Table 17.
Score of the membership degrees of .
By Definition 5, the final score of each object is given by Table 18.
Table 18.
Final score of the membership degrees of .
Clearly, is the maximum score for the object . Thus, is the optimal decision object, which coincides with the decision obtained using the algorithm in [30]. From Table 16, it can readily be computed:
Using Algorithm 3, we can proceed further by examining the subsets of R. Thus, for , we have with . Note that after reduction, the rank and partition of objects are not changed. Hence, (not all) is the IRDCB-PR of BFSS given by Table 19.
Table 19.
IRDCB-PR.
Clearly, the obtained subset is minimal and keeps the rank of decision choices unchanged.
3. N-PR:
Parameter reduction techniques such as OCB-PR and IRDCB-PR are not always workable in practical applications. Therefore, we provide the normal parameter reduction (N-PR) of BFSSs, which addresses the issues of added parameters and suboptimal choices. We present a definition of N-PR and provide an algorithmic method to obtain it, both illustrated via an example.
Definition 9.
Let O be a universe of objects and R a universal set of parameters associated with the objects in O. For a BFSS (G, R), a subset B of R is called dispensable if it satisfies the following expression.
Otherwise, B is indispensable. A subset P of R is called an N-PR of R if the following axioms hold.
- 1. P is indispensable.
- 2. The deleted set R − P satisfies the dispensability condition above (so that deleting R − P leaves the score differences between the objects unchanged).
Based on Definition 9, we propose the N-PR algorithm (Algorithm 4) as follows.
Algorithm 4. N-PR.
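A brute-force sketch of this procedure on a score table s[o][r] is given below, under the usual normal-parameter-reduction reading: a set of parameters is dispensable when every object receives the same total score over that set, so deleting it leaves all score differences between objects, and hence the full ranking, unchanged. The names and the tolerance parameter are illustrative.

```python
from itertools import combinations

# Brute-force sketch of N-PR on a score table s[o][r]; illustrative only.

def n_pr(s, tol=1e-9):
    params = list(next(iter(s.values())))
    for k in range(len(params) - 1, 0, -1):            # try to delete as many parameters as possible
        kept_sets = []
        for deleted in combinations(params, k):
            sums = [sum(s[o][r] for r in deleted) for o in s]
            if max(sums) - min(sums) <= tol:           # equal sums: 'deleted' is dispensable
                kept_sets.append(set(params) - set(deleted))
        if kept_sets:
            return kept_sets                           # the smallest surviving parameter sets P
    return [set(params)]

# Toy usage: r1 and r3 together give every object the same total, so they can be deleted.
s = {"o1": {"r1": 1.0, "r2": 2.0, "r3": -1.0},
     "o2": {"r1": 0.5, "r2": 3.0, "r3": -0.5},
     "o3": {"r1": 2.0, "r2": 0.0, "r3": -2.0}}
print(n_pr(s))   # -> [{'r2'}]
```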
Example 5.
Let O be a set of six objects and R a set of parameters. Then, a BFSS is defined by Table 20.
Table 20.
Tabular representation of BFSS .
By using (2) and (3), the scores of the positive and negative membership degrees are given by Table 21 and Table 22, respectively.
Table 21.
Score of the positive membership degrees of .
Table 22.
Score of the negative membership degrees of .
Now, by using Definition 4, the scores of the membership degrees are given in tabular form by Table 23.
Table 23.
Score of the membership degrees of .
By Definition 5, the final score of each object is given by Table 24.
Table 24.
Final score of the membership degrees of .
Clearly, is the maximum score for the object . Thus, is the optimum decision object. From Table 24, it can be easily observed that , satisfying:
Thus, (not all) is the N-PR of BFSS given by Table 25.
Table 25.
N-PR.
Clearly, the N-PR method maintains the rank of decision choices and also keeps the differences between the decision choice objects unchanged. Thus, if new parameters are added to the parameter set, there is no need to compute a new reduction again. The issue of added parameters is discussed through examples in Section 4.
4. AN-PR:
N-PR is an outstanding technique for the reduction of parameters. However, it is very difficult to compute an N-PR, given that a BFSS provides bipolar information to describe membership degrees. To ease this computation, we propose a new reduction method, namely AN-PR, which is a compromise between IRDCB-PR and N-PR.
Definition 10.
Let O be a universe of objects and R a universal set of parameters associated with the objects in O. For a BFSS (G, R) and an arbitrary error value α, if there exists a subset B of R such that:
holds inside the range of α, then B is dispensable; otherwise, B is indispensable. A subset P of R is called an AN-PR of the BFSS (G, R) when the following three axioms hold.
- 1. P is indispensable.
- 2. The score sums of the deleted parameters over the objects agree inside the range of α.
- 3. The rank of decision choices induced by P coincides with that induced by R.
We are ready to propose the AN-PR algorithm (Algorithm 5 below):
Algorithm 5. AN-PR.
As mentioned earlier, AN-PR is a compromise between IRDCB-PR and N-PR. Note that if there is no limitation from α (that is, α is not bounded), AN-PR reduces to IRDCB-PR, and when α = 0, AN-PR becomes N-PR. It can easily be observed that the reduction set obtained by AN-PR relies on the outcomes of IRDCB-PR and on the provided range α, because the AN-PR algorithm builds on IRDCB-PR. In other words, the reduction sets of AN-PR are computed from the reduction sets of IRDCB-PR. If the difference between the highest and lowest sums of scores of the deleted parameters is lower than α, the reduction set is accepted as a parameter reduction through AN-PR; otherwise, it is not. Note that the AN-PR method preserves the rank of decision choices.
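Following this description, a brute-force sketch on a score table s[o][r] accepts a candidate deleted set when the per-object score sums over that set stay within α of each other and the ranking of the objects over the remaining parameters is unchanged (a non-strict comparison is used here so that α = 0 recovers the N-PR condition). All names are illustrative.

```python
from itertools import combinations
from collections import defaultdict

# Brute-force sketch of AN-PR on a score table s[o][r]; illustrative only.

def ranking(s, params):
    totals = {o: sum(row[r] for r in params) for o, row in s.items()}
    groups = defaultdict(list)
    for o, total in totals.items():
        groups[total].append(o)
    return [tuple(sorted(groups[total])) for total in sorted(groups, reverse=True)]

def an_pr(s, alpha):
    params = list(next(iter(s.values())))
    target = ranking(s, params)                        # rank of choices under all of R
    for k in range(len(params) - 1, 0, -1):            # try to delete as many parameters as possible
        kept_sets = []
        for deleted in combinations(params, k):
            kept = [r for r in params if r not in deleted]
            sums = [sum(s[o][r] for r in deleted) for o in s]
            # deleted sums must agree within alpha, and the ranking must survive
            if max(sums) - min(sums) <= alpha and ranking(s, kept) == target:
                kept_sets.append(set(kept))
        if kept_sets:
            return kept_sets
    return [set(params)]
```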
Example 6.
Let O be a universal set of six objects and R a set of parameters. Then, a BFSS is defined by Table 26.
Table 26.
Tabular representation of BFSS .
By using (2) and (3), the scores of the positive and negative membership degrees are given by Table 27 and Table 28, respectively.
Table 27.
Score of the positive membership degrees of .
Table 28.
Score of the negative membership degrees of .
Now, by using Definition 4, the scores of the membership degrees are given in tabular form by Table 29.
Table 29.
Score of the membership degrees of .
By Definition 5, the final score of each object is given by Table 30.
Table 30.
Final score of the membership degrees of .
Clearly, is the maximum score for the object . Thus, is an optimal decision object. Given an error value α and using Table 30, we can easily compute that , satisfying:
Table 31.
AN-PR.
4. Comparison
This section compares our proposed parameter reduction algorithms regarding the ease of computing the reduction (EDCR), applicability, exact degree of reduction, reduction results, multiple use of the reduction set, and applied situation.
1. Comparison of EDCR and applicability:
Assume that a coefficient q represents the ratio of correctly-computed parameter reduction in different datasets. In other words, q represents the applicability of our proposed reduction techniques in practical applications and will be interpreted as EDCR. OCB-PR only preserves the optimal decision object. Therefore, a parameter reduction is easy to compute with OCB-PR. For example, is the OCB-PR in Example 3; and are the OCB-PR in Example 4; and are the OCB-PR in Example 5; is the OCB-PR in Example 6. Hence, .
IRDCB-PR is designed to delete the irrelevant parameters by preserving the partitioning and rank of objects. Obviously, parameter reduction using IRDCB-PR is more difficult than OCB-PR. For instance, is the IRDCB-PR in Example 4, and and are the IRDCB-PR in Example 5. We can observe that there is no IRDCB-PR in Examples 3 and 6. Thus, .
N-PR maintains both the invariable rank of and the unchangeable differences between decision choices. Obtaining a parameter reduction with the N-PR algorithm is the most difficult among the proposed reduction methods. We can see that and are the N-PRs in Example 5. Unfortunately, there is no N-PR in Examples 3, 4, and 6. Thus,
AN-PR is a compromise between IRDCB-PR and N-PR. Without a bound on α, AN-PR is IRDCB-PR, and when α = 0, AN-PR is N-PR. Thus, the EDCR of AN-PR depends on α. Therefore,
2. Comparison of the exact degree of reduction and reduction results:
The exact degree of parameter reduction considers the precision of parameter reduction and its impact on the post-reduction decision object. OCB-PR only keeps the optimal decision object unchanged after reduction (that is, the rank of decision choices may be changed after reduction). Therefore, the exact degree of reduction is lower. IRDCB-PR reduces redundant parameters by preserving the partitioning and rank of objects. Therefore, the exact degree of reduction is higher as compared to OCB-PR. N-PR preserves both rank and unchangeable differences between decision choices. Therefore, its exact degree of reduction is highest.
3. Comparison of the multiple use of the parameter reduction and applied situation:
The multiple use of parameter reduction means that the reduction sets can be reused when the expert demands suboptimal choices and when he/she adds some new parameters.
(i) Comparison of the multiple use of the parameter reduction and applied situation of OCB-PR:
OCB-PR usually has a wider range of applications. As we know, it only provides the optimal option. After selecting the best choice, if the data of the optimal object are deleted from the dataset, then, for the next decision, we need to make a new reduction again, which wastes much time on the parameter reduction. Furthermore, the added parameter set has not been considered. If new parameters are added to the parameter set, a new reduction is required. We explain these issues by the following example.
Example 7.
From Example 2, clearly, is the best option in Table 12. An OCB-PR of is , which is given by Table 32. When the object is deleted from Table 12, the suboptimal choice object is . From Table 32, it can be easily observed that the suboptimal choice is . It is clear that the suboptimal choice has changed.
Table 32.
OCB-PR in Example 2.
Let be the set of added parameters for the BFSS in Example 2, given by:
For the added parameters, the scores of the membership degrees of the BFSS are given by Table 33.
Table 33.
Added parameters’ scores.
By combining Table 12 and Table 33, we can observe that is the optimal decision object from Table 34, while by combining Table 32 and Table 33, is the best option from Table 35. Clearly, these two optimal options are different. Thus, OCB-PR has a lower degree of the multiple use of parameter reduction.
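A tiny self-contained sketch with made-up numbers (the tables and names below are illustrative, not the data of Example 7) shows the same effect: a subset that preserves the optimal object over the original parameters can point to a different optimum once new parameter scores are appended.

```python
# Made-up score tables illustrating the added-parameter issue of OCB-PR.

full = {"o1": {"r1": 3.0, "r2": 0.0}, "o2": {"r1": 1.0, "r2": 1.5}}
kept = {o: {"r1": row["r1"]} for o, row in full.items()}     # an OCB-PR-style subset {r1}
added = {"o1": {"r3": 0.0}, "o2": {"r3": 1.0}}                # scores of a newly added parameter

def best(table):
    return max(table, key=lambda o: sum(table[o].values()))

print(best(full), best(kept))        # o1 o1 : the subset preserves the optimum
for o in full:
    full[o].update(added[o])
    kept[o].update(added[o])
print(best(full), best(kept))        # o2 o1 : the optima now disagree
```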
(ii) Comparison of multi-use of parameter reduction and applied situation of IRDCB-PR:
IRDCB-PR maintains the rank of suboptimal decision choices. However, the issue of added parameters is not solved by the IRDCB-PR method. We give the following example to explain this idea.
Example 8.
Let be the set of added parameters for the BFSS in Example 4, given by:
For the added parameters, the scores of the membership degrees are given by Table 36.
Table 36.
Added parameters’ scores.
Combine Table 18 and Table 19 with Table 36. From Table 37, we see that . Similarly, using Table 38, we get . Clearly, the ranks of choice objects in Table 37 and Table 38 are different. From Table 18, we observe that . IRDCB-PR of the BFSS is . We can compute that . Thus, IRDCB-PR preserves the partition and rank of the objects after parameter reduction. From the above analysis, we observe that the issue of suboptimal choice can be solved by the IRDCB-PR method, while the issue of added parameters cannot be solved by the IRDCB-PR technique.
(iii) Comparison of the multiple use of parameter reduction and applied situation of N-PR:
The problems of the suboptimal choice and rank of decision choice objects can be solved using N-PR. The following example addresses this issue.
Example 9.
Let be the set of added parameters for the BFSS in Example 5, which are given by:
For the added parameters, the scores of the membership degrees are given by Table 39.
Table 39.
Added parameters’ scores.
Combine Table 24 (final score table of BFSS ) and Table 25 (N-PR for the BFSS ) with Table 39 (added parameters’ score table). From Table 40 and Table 41, we can easily compute that and , respectively. Hence, the ranks of decision choices are the same. Thus, N-PR has the highest degree of the multiple use of reduction sets.
Table 40.
Final score of the membership degrees of with added parameters.
Table 41.
N-PR.
(iv) Comparison of the multiple use of parameter reduction and applied situation of AN-PR
No doubt, N-PR is a suitable approach for parameter reduction, but it is very hard to compute the N-PR because BFSS provides bipolar information to describe membership degrees. To reduce this computational difficulty, AN-PR is given as a compromise between IRDCB-PR and N-PR.
5. Application
To demonstrate our proposed techniques, we applied them to a practical application.
Let O be a set of twelve investment avenues, where:
‘’ represents “Bank Deposits”,
‘’ represents “Insurance”,
‘’ represents “Foreign or Overseas Mutual Fund”,
‘’ represents “Bonds Offered by the Government and Corporates”,
‘’ represents “Equity Mutual Funds”,
‘’ represents “Precious Objects”,
‘’ represents “Postal Savings”,
‘’ represents “Shares and Stocks”,
‘’ represents “Employee Provident Fund”,
‘’ represents “Company Deposits”,
‘’ represents “Real Estate”,
‘’ represents “Money Market Instruments”,
and let R be a collection of parameters associated with the objects in O (the parameters are factors influencing the investment decision), where:
‘’ denotes “Safety of Funds”,
‘’ denotes “Liquidity of Funds”,
‘’ denotes “State Policy”,
‘’ denotes “Maximum Profit in Minimum Period”,
‘’ denotes “Stable Return”,
‘’ denotes “Easy Accessibility”,
‘’ denotes “Tax Concession”,
‘’ denotes “Minimum Risk of Possession”,
‘’ denotes “Political Climate”,
‘’ denotes “Level of Income”.
An investor Z wants to invest in the most suitable of the above-mentioned investment avenues. The information relating the investment avenues to the influencing factors is given in the form of a BFSS, which is presented in Table 44.
Table 44.
Tabular representation of BFSS .
By using (2) and (3), the scores of the positive and negative membership degrees are given by Table 45 and Table 46, respectively.
Table 45.
Score of the positive membership degrees of .
Table 46.
Score of the negative membership degrees of .
Now, by using Definition 4, the scores of the membership degrees are given in tabular form by Table 47.
Table 47.
Score of the membership degrees of .
By Definition 5, the final score of each object is given by Table 48.
Table 48.
Final score of the membership degrees of .
Clearly, is the maximum score for the object . Thus, the investment avenue, namely real estate, is the best choice for the investor Z. Our proposed reduction algorithms were executed on the investment avenue dataset. The parameter reduction sets were readily computed by OCB-PR, and a minimal reduction (not all are listed) kept the optimal decision invariant. Regrettably, we obtained no parameter reduction through IRDCB-PR, AN-PR, or N-PR. This means that OCB-PR can be applied in more real-life decision-making situations than IRDCB-PR, AN-PR, and N-PR.
6. Conclusions
Parameter reduction is one of the main issues in soft set theory and its hybrid models, including fuzzy soft set theory. Parameter reduction preserves the decision by removing irrelevant parameters. In this paper, a novel approach for decision-making based on BFSSs was introduced, and some decision-making problems were solved by this newly-proposed approach to prove its validity, including a decision-making problem presented in [30]. It was also observed that the decisions obtained by this novel approach coincide with those obtained in [30]. Using this concept, four novel definitions of parameter reductions of BFSSs, namely, OCB-PR, IRDCB-PR, N-PR, and AN-PR, were presented and illustrated through examples. Due to the existence of bipolar information in many real-world problems, the newly-proposed decision-making method based on BFSSs and the parameter reductions of BFSSs are very efficient approaches to solve such problems when compared to some existing methods, including fuzzy soft sets [32] and their parameter reduction [33]. An algorithm for each parameter reduction approach was developed. Moreover, our proposed reduction methods were compared from theoretical and experimental points of view, as displayed in Table 49. Finally, an application was studied to show the feasibility of our proposed reduction algorithms. In the future, we expect to extend our research work to (1) the parameter reduction of Pythagorean fuzzy soft sets, (2) the parameter reduction of Pythagorean fuzzy bipolar soft sets, and (3) the parameter reduction of m-polar fuzzy soft sets.
Table 49.
Comparison table.
Author Contributions
G.A., M.A., A.N.A.K., and J.C.R.A. conceived of the presented concept. G.A. and M.A. developed the theory and performed the computations. A.N.A.K. and J.C.R.A. verified the analytical methods.
Funding
This research received no external funding.
Acknowledgments
The authors are grateful to the Editor of the Journal and the anonymous referees for their valuable comments.
Conflicts of Interest
The authors declare that they have no conflict of interest.
References
- Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353.
- Pawlak, Z. Rough sets. Int. J. Comput. Inf. Sci. 1982, 11, 341–356.
- Molodtsov, D. Soft set theory: First results. Comput. Math. Appl. 1999, 37, 19–31.
- Molodtsov, D. The Theory of Soft Sets; URSS Publishers: Moscow, Russia, 2004. (In Russian)
- Maji, P.K.; Biswas, R.; Roy, A.R. Soft set theory. Comput. Math. Appl. 2003, 45, 555–562.
- Feng, F.; Li, C.; Davvaz, B.; Ali, M.I. Soft sets combined with fuzzy sets and rough sets: A tentative approach. In Soft Computing—A Fusion of Foundations, Methodologies and Applications; Springer: Berlin/Heidelberg, Germany, 2009; pp. 899–911.
- Roy, A.R.; Maji, P.K. A fuzzy soft set theoretic approach to decision-making problems. J. Comput. Appl. Math. 2007, 203, 412–418.
- Xiao, Z.; Gong, K.; Zou, Y. A combined forecasting approach based on fuzzy soft sets. J. Comput. Appl. Math. 2009, 228, 326–333.
- Ali, M.I. Another view on reduction of parameters in soft sets. Appl. Soft Comput. 2012, 12, 1814–1821.
- Danjuma, S.; Ismail, M.A.; Herawan, T. An alternative approach to normal parameter reduction algorithm for soft set theory. IEEE Access 2017, 5, 4732–4746.
- Danjuma, S.; Herawan, T.; Ismail, M.A.; Chiroma, H.; Abubakar, A.I.; Zeki, A.M. A review on soft set-based parameter reduction and decision-making. IEEE Access 2017, 5, 4671–4689.
- Deng, T.; Wang, X. Parameter significance and reductions of soft sets. Int. J. Comput. Math. 2012, 89, 1979–1995.
- Kong, Z.; Jia, W.; Zhang, G.; Wang, L. Normal parameter reduction in soft set based on particle swarm optimization algorithm. Appl. Math. Model. 2015, 39, 4808–4820.
- Zhan, J.; Alcantud, J.C.R. A survey of parameter reduction of soft sets and corresponding algorithms. Artif. Intell. Rev. 2017.
- Maji, P.K.; Roy, A.R. An application of soft sets in a decision-making problem. Comput. Math. Appl. 2002, 44, 1077–1083.
- Pawlak, Z.; Skowron, A. Rudiments of rough sets. Inf. Sci. 2007, 177, 3–27.
- Chen, D.; Tsang, E.C.C.; Yeung, D.S.; Wang, X. The parameterization reduction of soft sets and its applications. Comput. Math. Appl. 2005, 49, 757–763.
- Kong, Z.; Gao, L.; Wang, L.; Li, S. The normal parameter reduction of soft sets and its algorithm. Comput. Math. Appl. 2008, 56, 3029–3037.
- Ma, X.; Sulaiman, N.; Qin, H.; Herawan, T.; Zain, J.M. A new efficient normal parameter reduction algorithm of soft sets. Comput. Math. Appl. 2011, 62, 588–598.
- Kong, Z.; Gao, L.Q.; Wang, L.F. Comment on "A fuzzy soft set theoretic approach to decision making problems". J. Comput. Appl. Math. 2009, 223, 540–542.
- Ma, X.; Qin, H.; Sulaiman, N.; Herawan, T.; Abawajy, J. The parameter reduction of the interval-valued fuzzy soft sets and its related algorithms. IEEE Trans. Fuzzy Syst. 2014, 22, 57–71.
- Feng, F.; Jun, Y.B.; Liu, X.; Li, L. An adjustable approach to fuzzy soft set based decision-making. J. Comput. Appl. Math. 2010, 234, 10–20.
- Feng, F.; Li, Y.; Fotea, V.L. Application of level soft sets in decision-making based on interval-valued fuzzy soft sets. Comput. Math. Appl. 2010, 60, 1756–1767.
- Jiang, Y.; Tang, Y.; Chen, Q. An adjustable approach to intuitionistic fuzzy soft sets based decision-making. Appl. Math. Model. 2011, 35, 824–836.
- Giorleo, G.; Minutolo, F.M.C.; Sergi, V. Fuzzy logic modeling and control of steel rod quenching after hot rolling. J. Mater. Eng. Perform. 1997, 6, 599–604.
- Kahraman, C.; Gulbay, M.; Kabak, O. Applications of fuzzy sets in industrial engineering: A topical classification. In Fuzzy Applications in Industrial Engineering; Studies in Fuzziness and Soft Computing; Kahraman, C., Ed.; Springer: Berlin/Heidelberg, Germany, 2006; Volume 201.
- Ross, T.J. Fuzzy Logic with Engineering Applications; John Wiley & Sons, Ltd.: Hoboken, NJ, USA, 2010.
- Zhang, W.R. YinYang bipolar fuzzy sets. In Proceedings of the IEEE International Conference on Fuzzy Systems and the IEEE World Congress on Computational Intelligence (FUZZ-IEEE '98), Anchorage, AK, USA, 4–9 May 1998; Volume 1, pp. 835–840.
- Zhang, W.R. Bipolar fuzzy sets and relations: A computational framework for cognitive modeling and multiagent decision analysis. In Proceedings of the First International Joint Conference of the North American Fuzzy Information Processing Society Biannual Conference and the Industrial Fuzzy Control and Intelligent Systems Conference, San Antonio, TX, USA, 18–21 December 1994; pp. 305–309.
- Saleem, A.; Aslam, M.; Ullah, K. Bipolar fuzzy soft sets and its applications in decision-making problem. J. Intell. Fuzzy Syst. 2014, 27, 729–742.
- Luqman, A.; Akram, M.; Koam, A.N.A. Granulation of hypernetwork models under the q-rung picture fuzzy environment. Mathematics 2019, 7, 496.
- Maji, P.K.; Biswas, R.; Roy, A.R. Fuzzy soft sets. J. Fuzzy Math. 2001, 9, 589–602.
- Zhang, Z. The parameter reduction of fuzzy soft sets based on soft fuzzy rough sets. Adv. Fuzzy Syst. 2013, 2013, 12.
- Adeel, A.; Akram, M.; Ahmad, I.; Nazar, K. Novel m-polar fuzzy linguistic ELECTRE-I method for group decision-making. Symmetry 2019, 11, 471.
- Adeel, A.; Akram, M.; Koam, A.N. Multi-criteria decision-making under mHF ELECTRE-I and HmF ELECTRE-I. Energies 2019, 12, 1661.
- Adeel, A.; Akram, M.; Koam, A.N. Group decision-making based on m-polar fuzzy linguistic TOPSIS method. Symmetry 2019, 11, 735.
- Akram, M.; Ali, G.; Alshehri, N.O. A new multi-attribute decision-making method based on m-polar fuzzy soft rough sets. Symmetry 2017, 9, 271.
- Akram, M.; Adeel, A.; Alcantud, J.C.R. Multi-criteria group decision-making using an m-polar hesitant fuzzy TOPSIS approach. Symmetry 2019, 11, 795.
- Akram, M.; Habib, A.; Koam, A.N.A. A novel description on edge-regular q-rung picture fuzzy graphs with application. Symmetry 2019, 11, 489.
- Lee, K.M. Bipolar-valued fuzzy sets and their basic operations. In Proceedings of the International Conference on Intelligent Technologies, Bangkok, Thailand, 16–21 January 2000; pp. 307–317.
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).