A New Approach for Normal Parameter Reduction Using σ-Algebraic Soft Sets and Its Application in Multi-Attribute Decision Making

Abstract: The soft set is one of the key mathematical tools for describing uncertainty and has many applications in real-world decision-making problems. However, these decision-making problems often involve unimportant and redundant parameters, which make the decision-making process more complex and challenging. Parameter reduction is a useful approach for eliminating such irrelevant and redundant parameters from soft set-based decision-making problems without changing their decision abilities. Among the various reduction methods for soft sets, normal parameter reduction (NPR) can reduce decision-making problems without changing the decision order of alternatives. This paper develops a new algorithm for NPR using the concept of σ-algebraic soft sets. Previously, the same concept was used to introduce the idea of intersectional reduced soft sets (IRSSs). However, this study clarifies that the IRSS method does not maintain the decision order of alternatives. Thus, we need an approach that not only keeps the decision order invariant but also makes the reduction process simpler and more convenient. For this reason, we propose a new algorithm for NPR using σ-algebraic soft sets that not only overcomes the existing problems of the IRSS method but also reduces the computational complexity of the NPR process. We also compare the proposed algorithm with one of the existing NPR algorithms in terms of computational complexity. It is evident from the experimental results that the proposed algorithm greatly reduces the computational complexity and workload compared with the existing algorithm. At the end of the paper, an application of the proposed algorithm is explored through a real-world decision-making problem.


Introduction
As one of the basic activities of human society, decision making exists in almost all aspects of today's life. It is usually defined as a mental process that involves judging multiple options or alternatives in order to select one, so as to best fulfill the aims or goals of the decision makers. As the real world is complex and changeable, the relationships between things are mostly random, imprecise and fuzzy, which is the main source of uncertainty in our daily decision making. A number of mathematical theories have been developed to handle such uncertainty. However, most earlier reduction methods preserve only the optimal choice, so the remaining ranking may become invalid when the optimal choice is deleted. Similarly, in some cases, we do not have sufficient parameters to fully characterize the alternatives in decision-making problems, so we need to add more parameters to the existing parameter sets. However, in this case, we may also need a new reduction, as the new parameters may change the decision order of the decision-making problems. To overcome these two drawbacks, Kong et al. [25] introduced the concept of normal parameter reduction (NPR) of (fuzzy) soft sets and presented a heuristic algorithm for it. NPR can reduce the number of parameters without changing the entire ranking (or decision) order of the decision alternatives. However, the algorithm of NPR, as proposed in [25], was based on the parameter importance degree, which is hard to compute and involves a great amount of computation. Therefore, Ma et al. [26] proposed the new efficient normal parameter reduction algorithm (NENPR) for soft sets to reduce the computational complexity of the NPR process. Renukadevi and Sangeetha [27] discussed some interesting characterizations of NPR of soft sets. Kong et al. [28] used the particle swarm optimization algorithm to provide a proper mathematical representation of the problem of NPR of soft sets. Danjuma et al. [29] considered the case of repeated columns in soft set reduction and proposed the alternative normal parameter reduction (ANPR) algorithm for soft sets.
Ma and Qin [30] introduced soft set-based parameter value reduction which keeps the entire decision ability of decision alternative with a high success rate of finding reduction and low amount of computation. Khan and Zhu [31] developed another improved algorithm for NPR of soft sets. Akram et al. [32] proposed four different algorithms for parameter reduction of N-soft set and discussed their application in decision making. For more study about soft set reduction and its applications, we refer to [33][34][35].
Kandemir [36] introduced the concept of σ-algebraic soft sets by taking the cardinality of sets as a measure on all subsets of the universal set. Furthermore, he defined two different relations (i.e., preferability and indiscernibility relations) on the parameter set, which further led to the idea of intersectional reduced soft sets (IRSSs). However, in this study, we show that the IRSS method is unable to maintain the entire decision order of alternatives. The main contributions of the study are summarized below:
• We present some useful examples to show that the IRSS method does not keep the decision order invariant.
• We propose a new algorithm for NPR using σ-algebraic soft sets that not only overcomes the existing problems of the IRSS method, but also makes the reduction process simpler and more convenient.
• We provide a comparative study to show that the proposed algorithm has lower computational complexity and workload than the previous algorithm of Kong et al. [25].
• We present an application of the proposed algorithm to a real-life decision-making problem.
The rest of the paper is organized as follows. Section 2 recalls some basic definitions and results related to soft set theory. In Section 3, we discuss the basic idea of NPR of soft sets and present its initial algorithm proposed by Kong et al. [25]. Section 4 highlights some setbacks of Kandemir's approach to soft set reduction. In Section 5, we first derive some useful results and then develop a new algorithm for NPR of soft sets. In Section 6, we compare our new algorithm with Kong et al.'s algorithm in terms of computational complexity. Section 7 provides an application of the proposed algorithm to a real-world decision-making problem. Finally, Section 8 presents the conclusion of the paper.

Preliminaries
This section briefly reviews some basic definitions and results related to soft set theory. Let U denote a finite universe of objects, E represent the set of parameters which can describe the properties of objects in U, and P(U) denote the power set of U.

Definition 1 ([6]).
A pair (F, E) is called a soft set over U, where F is a mapping given by F : E → P(U).
The following example clarifies the concept of soft sets.

Example 1. Suppose that U = {u_1, u_2, u_3, u_4} is the set of four houses under consideration, and E = {e_1, e_2, e_3, e_4, e_5} is the set of parameters, where each e_i for 1 ≤ i ≤ 5 stands for beautiful, new, cheap, reliable and well-furnished, respectively. A soft set (F, E) can then be defined to describe "the attractiveness of the houses".

Maji et al. [7] represented a soft set by a binary table to store it in computer memory. The choice value of each u_i ∈ U is defined by f_E(u_i) = Σ_j u_ij, where u_ij are the entries in the table of the soft set (F, E) for 1 ≤ i ≤ n and 1 ≤ j ≤ m. For example, the tabular representation of the soft set (F, E) defined in Example 1 is given by Table 1, where the last column shows the choice values of all u_i ∈ U. From Table 1, it is clear that u_4 has the maximum choice value. Therefore, by the choice value criterion, the optimal choice object is u_4, and it can be selected as the best house among the four.
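The choice-value computation above can be sketched in a few lines of Python. The binary entries below are illustrative stand-ins, since Table 1 itself is not reproduced in the text.

```python
# Illustrative binary table of a soft set (F, E): rows are objects u_i,
# columns are parameters e_1..e_5. Entries are hypothetical.
table = {
    "u1": [1, 0, 0, 0, 1],
    "u2": [0, 1, 0, 1, 0],
    "u3": [1, 0, 0, 0, 0],
    "u4": [0, 1, 0, 1, 1],
}

# Choice value f_E(u_i) is simply the row sum of the table.
choice = {u: sum(row) for u, row in table.items()}
best = max(choice, key=choice.get)
print(choice)  # {'u1': 2, 'u2': 2, 'u3': 1, 'u4': 3}
print(best)    # 'u4' has the maximum choice value
```

With these (hypothetical) entries, u4 comes out as the optimal choice, mirroring the selection made in Example 1.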
As seen in the last example, soft sets can be applied to decision-making problems in uncertain environments. However, these decision-making problems sometimes involve parameters that play no part in the decision-making process. For example, if we consider e_3 in Table 1, we see that it has no role in the decision-making process; that is, E − {e_3} provides the same decision ability (order) as the entire set of parameters. Therefore, it is necessary to remove such useless parameters from E to minimize the workload and processing time of the decision-making process. Different researchers have made successful contributions towards soft set reduction. Normal parameter reduction is one of them, and it is described in the next section.

Normal Parameter Reduction of Soft Sets
Normal parameter reduction is a good approach to soft set reduction which was introduced by Kong et al. [25]. It eliminates unnecessary parameters from E without changing the decision order of decision alternatives.
For simplicity, suppose that C_E = {U_1, U_2, . . . , U_s} and C_{E−e_i} = {U'_1, U'_2, . . . , U'_t} are the decision partition and the decision partition after deleting e_i, respectively. Then, for each parameter e_i, the parameter importance degree r_{e_i} is defined by r_{e_i} = (1/|U|)(α_{1,e_i} + α_{2,e_i} + . . . + α_{s,e_i}), where α_{k,e_i} measures the change in the k-th decision class after deleting e_i (see [25] for the precise definition), and |·| denotes the cardinality of a set.
Based on Theorem 1, an algorithm for NPR of soft sets was proposed by Kong et al. [25] which is labeled by Algorithm 1. The following example will illustrate Algorithm 1.
Algorithm 1 The NPR algorithm of Kong et al. [25]
Step 1. Input the soft set (F, E) and its parameter set E;
Step 2. Calculate r_{e_j} for all e_j ∈ E, where 1 ≤ j ≤ m;
Step 3. Find A ⊂ E such that Σ_{e_j ∈ A} r_{e_j} is a nonnegative integer, and put A into the feasible parameter reduction set (FPRS);
Step 4. If the condition f_A(u_1) = f_A(u_2) = . . . = f_A(u_n) is satisfied for a subset A in the FPRS, then A is saved; otherwise, it is deleted;
Step 5. Calculate E − A as the optimal NPR of (F, E), where A has the maximum cardinality in the FPRS.
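Steps 3–5 of Algorithm 1 can be sketched as follows, taking the importance degrees of Example 2 as given (with r_{e_1} = 1/4, which is implied by the integer sums used there) and an illustrative binary table in place of Table 1. Exact rational arithmetic avoids floating-point trouble when testing integrality.

```python
from fractions import Fraction
from itertools import combinations

# Importance degrees from Example 2 (r_e1 = 1/4 is implied by the integer
# sums used there); exact fractions make the integrality test reliable.
r = {"e1": Fraction(1, 4), "e2": Fraction(1, 2), "e3": Fraction(0),
     "e4": Fraction(3, 4), "e5": Fraction(1, 2)}

# Illustrative binary table standing in for Table 1 (actual entries unknown).
table = {"u1": [1, 0, 0, 0, 1], "u2": [0, 1, 0, 1, 0],
         "u3": [1, 0, 0, 0, 0], "u4": [0, 1, 0, 1, 1]}
params = ["e1", "e2", "e3", "e4", "e5"]

def f(A, u):
    # Choice value of object u restricted to the parameter subset A.
    return sum(table[u][params.index(e)] for e in A)

# Steps 3-4: keep subsets whose degree sum is a nonnegative integer and
# whose restricted choice values are equal for all objects.
fprs = []
for k in range(1, len(params) + 1):
    for A in combinations(params, k):
        if sum(r[e] for e in A).denominator == 1:      # nonnegative integer
            if len({f(A, u) for u in table}) == 1:     # f_A equal on all u_i
                fprs.append(set(A))

# Step 5: delete the largest saved subset A from E.
A = max(fprs, key=len)
npr = sorted(set(params) - A)
print(npr)  # ['e2', 'e5'], matching the reduction found in Example 2
```

With these inputs, the saved subsets are {e3}, {e1, e4} and {e1, e3, e4}, and removing the largest one leaves {e2, e5}, in agreement with Example 2.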
Example 2. If we consider the soft set (F, E) as given by Table 1, then according to Algorithm 1: Step 1. Take (F, E) and its parameter set E as an input.
Step 2. Compute the choice values for all u_i ∈ U and obtain the decision partition C_E, as given by (1). Similarly, obtain the deleted-decision partitions C_{E−e_j} for all e_j ∈ E. Then, compute the importance degrees for all e_j ∈ E by Definition 4; this gives r_{e_1} = 1/4, and similarly r_{e_2} = 1/2, r_{e_3} = 0, r_{e_4} = 3/4, r_{e_5} = 1/2.
Step 3. Find A ⊂ E such that ∑ e j ∈A r e j is a nonnegative integer and put A into FPRS. In this way we obtain the subsets, such as {e 3 }, {e 1 , e 4 }, {e 1 , e 3 , e 4 }, {e 2 , e 3 , e 5 } and so on.
Step 4. If the condition f_A(u_1) = f_A(u_2) = . . . = f_A(u_4) is satisfied for a subset A, then it is saved; otherwise, it is deleted from the FPRS. In this way, we obtain only three subsets that satisfy the given condition, namely A_1 = {e_3}, A_2 = {e_1, e_4} and A_3 = {e_1, e_3, e_4}.

Step 5. Finally, select A_3 = {e_1, e_3, e_4}, which has the maximum cardinality in the FPRS. Thus, B = E − A_3 = {e_2, e_5} is the optimal NPR of (F, E), as given by Table 2.

Table 2. NPR of (F, E) in Example 2.

U/B | e_2 | e_5 | f_B(·)

We can verify that NPR solves the problems of suboptimal choices and updated parameter sets. For the former, we consider the decision partition of (F, E) as given by (1) and the decision partition of the reduced soft set (F, B) as given by (2). By comparing (1) with (2), we observe that the optimal choice and all the levels of suboptimal choices are invariant after the NPR. This shows that NPR not only maintains the optimal choice but also keeps the entire ranking order of the decision alternatives invariant.
We next discuss the problem of updated parameter sets. We assume that the character of the objects (houses) in U cannot be completely embodied by the given parameter set E. Suppose that we add some new parameters Ê = {ê_1, ê_2, ê_3} to the existing parameter set E, where each ê_i stands for good color, in a hilly area and near the road, respectively. The updated soft set (F, E ∪ Ê) is represented by Table 3, from which the decision partition for (F, E ∪ Ê) is given by (3). Similarly, if we add the new parameter set Ê to the reduced soft set (F, B), then we obtain another soft set (F, B ∪ Ê), which is represented by Table 4; its decision partition is given by (4). It is clear from (3) and (4) that, after adding the new parameters, we obtain the same decision partition for (F, E) and its NPR (F, B). This implies that one can use (F, B) instead of (F, E), as the new parameters have the same effect on both soft sets. This shows that NPR supports the case of updated parameter sets.

Table 3. Tabular form of (F, E ∪ Ê).

Table 4. Tabular form of (F, B ∪ Ê).
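The invariance under updated parameter sets can also be checked computationally. The tables below are hypothetical stand-ins for Tables 1, 3 and 4: the columns kept for B = {e_2, e_5} and the added columns for Ê are illustrative only.

```python
# The NPR B differs from E by the same constant on every object's choice
# value, so adding any new columns E_hat affects both soft sets identically.
# All entries below are illustrative.
full = {"u1": [1, 0, 0, 0, 1], "u2": [0, 1, 0, 1, 0],
        "u3": [1, 0, 0, 0, 0], "u4": [0, 1, 0, 1, 1]}
reduced = {u: [row[1], row[4]] for u, row in full.items()}  # keep e2, e5
new_cols = {"u1": [1, 0, 1], "u2": [0, 1, 1],
            "u3": [1, 1, 0], "u4": [1, 0, 1]}               # hypothetical E_hat

def decision_partition(t):
    # Classes of objects grouped by choice value, highest class first.
    vals = {u: sum(r) for u, r in t.items()}
    return [sorted(u for u in t if vals[u] == v)
            for v in sorted(set(vals.values()), reverse=True)]

upd_full = {u: full[u] + new_cols[u] for u in full}
upd_reduced = {u: reduced[u] + new_cols[u] for u in full}

# Both updated soft sets yield the same decision partition.
same = decision_partition(upd_full) == decision_partition(upd_reduced)
print(same)                          # True
print(decision_partition(upd_full))  # [['u4'], ['u1', 'u2'], ['u3']]
```

The design point: since the removed parameters contribute the same amount to every object, any future columns are appended on top of a constant offset, so the two rankings can never diverge.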

Intersectional Reduced Soft Set and Its Limitations
In this section, we analyze the method of the intersectional reduced soft set proposed by Kandamir [36]. We also provide an example to show that the intersectional reduced soft set method does not overcome the problems of suboptimal choices and updated parameter sets. We start the section with some basic definitions from measure theory.
Definition 6 ([37]). A function ξ : L → R ∪ {∞} is called a measure if the following axioms are satisfied: (i) ξ(∅) = 0; (ii) ξ(A) ≥ 0 for every A ∈ L; (iii) ξ(⋃_{i=1}^{∞} A_i) = Σ_{i=1}^{∞} ξ(A_i) for every countable collection {A_i} of pairwise disjoint sets in L.

Definition 7 ([37]). The triplet (U, L, ξ) is called a measure space, where L is a σ-algebra on U and ξ is a measure on L. The elements of L are called measurable sets.
The concept of σ-algebraic soft sets is defined as follows.

Definition 8 ([36]). Let (F, E) be a soft set over U and L be a σ-algebra on U. Then (F, E) is called a σ-algebraic soft set over the measurable universe (U, L, ξ) if F(e) ∈ L for every e ∈ E.

The relation ∼ is an equivalence relation on E, and the indiscernibility class of e ∈ E is denoted by [e]. According to Kandemir [36], if the cardinality of sets is taken as a measure on P(U), then every soft set (F, E) over U can be regarded as a σ-algebraic soft set over the measurable universe (U, P(U), ξ). In this way, we can reduce a soft set (F, E) to an IRSS (F*, [E]) by using the relation ∼. The following example illustrates the idea of IRSSs.

Example 3 (Example 3.38 in [36]). We consider the same soft set (F, E) over U as defined in Example 1, whose tabular representation is given by Table 1. If we take the cardinality of sets as a measure on P(U), then clearly (F, E) is a σ-algebraic soft set over the measurable universe (U, P(U), ξ). Furthermore, we find that e_2 ∼ e_5, and by Definition 10, the IRSS of (F, E) is (F*, [E]), whose tabular representation is given by Table 5.

Limitations of the IRSS Method
By analyzing Example 3, we observe that the IRSS method does not overcome the problems of suboptimal choices and updated parameter sets. To verify this, we consider the decision partition for (F*, [E]) as given by (5). If we compare (5) with (1), then we see that the optimal choice for the soft set (F, E) and its IRSS (F*, [E]) is the same, but their suboptimal choices are different from each other. This shows that the IRSS method does not overcome the problem of suboptimal choices.
Next, we show that the IRSS method does not solve the problem of updated parameter sets. For this, we add the new parameter set Ê to [E] and obtain another soft set (F, [E] ∪ Ê), which is represented by Table 6. From Table 6, the decision partition for the soft set (F, [E] ∪ Ê) is given by (6). By comparing (6) with (3), we observe that the decision partitions for the soft sets (F, E ∪ Ê) and (F, [E] ∪ Ê) are different from each other. This shows that the IRSS method does not overcome the problem of updated parameter sets.

From the above discussion, we conclude that although the IRSS method is a simple approach to soft set reduction, it does not overcome the problems of suboptimal choices and updated parameter sets; that is, it does not maintain the entire ranking or decision order of alternatives after the reduction process. On the other hand, NPR can solve the above-mentioned problems, but estimating the parameter importance degree in Algorithm 1 is a complex process, which makes the algorithm difficult to understand and requires a great amount of computation. Therefore, we need a new approach to soft set reduction that resolves the drawbacks of both approaches. The next section presents our new approach to NPR, which not only resolves the problems of decision order but also reduces the processing time of NPR.

An Approach towards Normal Parameter Reduction Using σ-Algebraic Soft Sets
It was mentioned in the last section that every soft set can be regarded as a σ-algebraic soft set (see Example 3). Therefore, from now onwards, every soft set will be considered as a σ-algebraic soft set over the measurable universe (U, P(U), ξ), where U is the initial universe, P(U) is the power set of U, and ξ is the cardinality of sets defined as a measure on P(U).

Definition 11.
For any soft set (F, E) over the measurable universe (U, P(U), ξ), the impact of a parameter e ∈ E is defined by γ_e = ξ(F(e))/ξ(U) = |F(e)|/|U|. For any nonempty subset A ⊂ E, we have γ_A = Σ_{e∈A} γ_e.
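Definition 11 amounts to a column average of the binary table. A minimal sketch, again with illustrative entries in place of Table 1:

```python
# Impact of a parameter: gamma_e = xi(F(e)) / xi(U) = |F(e)| / |U|, with the
# cardinality measure xi on P(U). Table entries are illustrative.
table = {"u1": [1, 0, 0, 0, 1], "u2": [0, 1, 0, 1, 0],
         "u3": [1, 0, 0, 0, 0], "u4": [0, 1, 0, 1, 1]}
params = ["e1", "e2", "e3", "e4", "e5"]
n = len(table)

# |F(e)| is the number of 1s in the column of e; divide by |U|.
gamma = {e: sum(table[u][j] for u in table) / n
         for j, e in enumerate(params)}
print(gamma)  # {'e1': 0.5, 'e2': 0.5, 'e3': 0.0, 'e4': 0.5, 'e5': 0.5}

# The impact of a subset A is just the sum of the individual impacts.
gamma_A = gamma["e1"] + gamma["e4"]
print(gamma_A)  # 1.0
```

Note that a null parameter (an all-zero column, like e3 here) gets impact 0, and a universal parameter (an all-one column) would get impact 1, matching Definition 12 below.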

Definition 12.
For any soft set (F, E) over the measurable universe (U, P(U), ξ), a parameter e ∈ E is called a universal parameter denoted by e U , if γ e = 1. Similarly, e ∈ E is called a null parameter denoted by e φ , if γ e = 0.
It follows that e ∈ E is a universal parameter if and only if F(e) = U, and a null parameter if and only if F(e) = ∅.

Proof. The proof is straightforward by using Definition 11: γ_e = |F(e)|/|U| = 1 if and only if |F(e)| = |U|, that is, F(e) = U. The second part can be proved in a similar way.
Theorem 2. For a soft set (F, E) over the measurable universe (U, P(U), ξ), if A = {ē_1, ē_2, . . . , ē_p} ⊂ E is such that E − A is the NPR of E, then γ_A is a nonnegative integer and γ_A = f_A(u_i) for every u_i ∈ U.
Proof. Suppose that E − A is the NPR of E. Then, by Definition 3, f_A(u_1) = f_A(u_2) = . . . = f_A(u_n), and obviously f_A(u_i) is a nonnegative integer. Moreover, counting the table entries of A first by columns and then by rows gives Σ_{ē∈A} |F(ē)| = Σ_{i=1}^{n} f_A(u_i) = n · f_A(u_i). Hence, γ_A = Σ_{ē∈A} γ_ē = (1/|U|) Σ_{ē∈A} |F(ē)| = f_A(u_i). This completes the proof.
Based on the result of Theorem 2, we propose a new algorithm for NPR of soft sets as labeled by Algorithm 2. To illustrate the idea of Algorithm 2, we present the following example.

Algorithm 2 The proposed algorithm
Step 1. Input (F, E) and its parameter set E;
Step 2. Compute γ_{e_j} for all e_j ∈ E, where 1 ≤ j ≤ m;
Step 3. Identify the universal parameters e_U and null parameters e_φ in E and put them into the reduced parameter set Z;
Step 4. Find A ⊂ Ē = E − Z such that γ_A is a nonnegative integer, and put A into the FPRS;
Step 5. If the condition f_A(u_1) = f_A(u_2) = . . . = f_A(u_n) is satisfied for a subset A in the FPRS, then A is saved; otherwise, delete A from the FPRS;
Step 6. Calculate E − (A ∪ Z) as the optimal NPR of (F, E), where A has the maximum cardinality in the FPRS.

Example 4.
Once again, we consider the same soft set (F, E) over U as given by Table 1. According to Algorithm 2: Step 1. Take (F, E) and its parameter set E as an input.
Step 2. Using Definition 11, we compute γ_{e_j} for all e_j ∈ E; the values are listed in the last row of Table 7.
Step 3. From Table 7, γ e 3 = 0, so it can be put into the reduced parameter set Z.
Step 4. Find A ⊂Ē = E − Z such that γ A is a nonnegative integer and put it into FPRS. In this way, we obtain only two subsets, such as A 1 = {e 1 , e 4 } and A 2 = {e 2 , e 5 }.
Step 5. After filtering the FPRS, we find that only A_1 = {e_1, e_4} satisfies the condition f_{A_1}(u_1) = . . . = f_{A_1}(u_4), so A_2 is deleted from the FPRS.

Step 6. Finally, B = E − (A_1 ∪ Z) = {e_2, e_5} is the required NPR of (F, E), which is the same as that obtained by Algorithm 1 in Example 2.

Table 7. Tabular form of (F, E) with the impact of each parameter.

It is evident from the last example that the proposed algorithm greatly reduces the computational complexity of Algorithm 1 by computing the impacts of parameters rather than the parameter importance degrees. This shows that the proposed algorithm not only overcomes the existing problems of the IRSS method (already verified in Example 2), but also minimizes the workload of the NPR process.

Comparative Analysis
In this section, we compare the proposed algorithm with Algorithm 1 in terms of computational complexity. We also provide some experimental results to show that the proposed algorithm is more efficient than Algorithm 1 in capturing the NPR of soft sets.

Computational Complexity
We compare the computational complexity of both algorithms from the following three aspects.

Estimating the parameter importance degrees and impact of parameters:
It is clear from Algorithms 1 and 2 that both algorithms follow the same steps to reach the NPR of soft sets. However, Algorithm 1 uses the parameter importance degrees, while Algorithm 2 uses the impacts of parameters to calculate the FPRS. To estimate the parameter importance degrees, Algorithm 1 first needs to obtain the decision partition C_E and all deleted-decision partitions C_{E−e_j} for e_j ∈ E; in this process, the total number of accessed elements is m²n + mn + n. Then, estimating α_{k,e_j} and r_{e_j} for each e_j ∈ E requires access to 2n elements, and since there are m parameters in E, this step accesses 2mn elements in total. That is, for computing all parameter importance degrees, Algorithm 1 needs to access m²n + mn + n + 2mn = m²n + 3mn + n elements. On the other hand, to estimate the impacts of parameters, Algorithm 2 first computes ξ(F(e_j)) for each e_j ∈ E, and then obtains γ_{e_j} for all e_j ∈ E. The number of accessed elements in this whole process is mn + n, which is far less than m²n + 3mn + n.
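The two access-count formulas can be checked directly. Taking m = 20 parameters and n = 8 objects reproduces the figures reported in the experimental results (3688 vs. 168 accessed entries, and 2^20 − 1 = 1,048,575 vs. 2^17 − 1 = 131,071 tested sums for z = 3); the choice of m and n here is inferred from those reported numbers.

```python
# Access-count formulas from the complexity analysis above.
def cost_alg1(m, n):
    # m^2*n + mn + n for the partitions, plus 2mn for all r_ej.
    return m * m * n + 3 * m * n + n

def cost_alg2(m, n):
    # mn + n accesses for all parameter impacts, as derived above.
    return m * n + n

# m = 20 parameters, n = 8 objects (inferred from the reported figures).
print(cost_alg1(20, 8))  # 3688
print(cost_alg2(20, 8))  # 168

# Subset sums tested when building the FPRS:
print(2**20 - 1)  # 1048575 for Algorithm 1
print(2**17 - 1)  # 131071 for Algorithm 2 with z = 3
```

The quadratic-in-m term of Algorithm 1 is what dominates as the parameter set grows; Algorithm 2 stays linear in both m and n at this stage.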

Estimating the FPRS:
To compute the FPRS, Algorithm 1 needs to test the sums of all possible combinations of parameter importance degrees, from combination-1 to combination-m; that is, the number of accessed parameter importance degrees is C¹_m + C²_m + . . . + C^m_m = 2^m − 1. On the other hand, Algorithm 2 first puts the universal and null parameters into the reduced parameter set Z. Suppose that the number of such parameters is z. Then, Algorithm 2 tests the sums of all possible combinations of the parameter impacts from combination-1 to combination-ḿ, where ḿ = m − z; that is, the number of accessed parameter impacts is C¹_ḿ + C²_ḿ + . . . + C^ḿ_ḿ = 2^ḿ − 1. This shows that as the value of z increases, the number of accessed entries for Algorithm 2 decreases.
3. Filtering the FPRS: Suppose that there are k FPRS subsets for Algorithm 2 and z is the total number of universal and null parameters. Then, the total number of FPRS subsets for Algorithm 1 equals k(2^z) + z (this can be verified from Table 8). The difference between the FPRS counts of the two algorithms is therefore k(2^z − 1) + z. Thus, once again, a large value of z causes a large difference between the FPRS counts of Algorithms 1 and 2.

Experimental Results and Discussion
Here, we consider some experimental results to compare the computational complexity of Algorithm 1 with that of Algorithm 2. We apply both algorithms to the same soft set (F, E), whose tabular representation is given by Table 9. The results obtained from both algorithms are summarized in Table 8. According to Table 8, the optimal NPR of E obtained from both algorithms is the same, and it is given by Table 10. However, Algorithm 1 accesses 3688 entries to estimate the parameter importance degrees, while Algorithm 2 accesses just 168 entries to estimate all parameter impacts. Similarly, Algorithm 1 accesses 1,048,575 parameter importance degrees to compute the FPRS, while Algorithm 2 accesses only 131,071 parameter impacts for the same purpose. Furthermore, Algorithm 1 checks 122,879 FPRS subsets for the dispensability condition f_A(u_1) = f_A(u_2) = . . . = f_A(u_n), while Algorithm 2 checks only 16,383. Moreover, Algorithm 2 requires only addition and division operations, and thus fewer operations than Algorithm 1. This shows that Algorithm 2 reduces the computational complexity at every stage of the NPR process and provides better results than Algorithm 1.

Table 9. Tabular form of the soft set (F, E).

Application in Multi-Attribute Decision Making
In this section, we present an application of the proposed algorithm to a multi-attribute decision-making problem. We consider the scholarship selection problem of the Kano state scholarship board (KSB), Nigeria. The KSB works under the ministry of education of Kano state and awards scholarship positions to indigenes of the state whose parents are of Kano state origin and who obtain admission into tertiary institutions in Nigeria (or, in some cases, overseas). The board is responsible for:
• Awarding the scholarship and improving the welfare of the state-sponsored students for foreign training;
• Formulating and reviewing the policies governing the awarding of scholarships;
• Providing guidance and counseling for students;
• Contacting government establishments, institutes of learning and foreign universities;
• Applying the selection criteria to all applicants;
• Providing a formal recommendation of suitably qualified applicants for overseas training to the governor of the state through the commissioner of education.
Here, we take the dataset of 35 students sponsored with a foreign scholarship by the KSB (available in [29]). Each student is evaluated with respect to 16 decision attributes (or parameters). Let U = {u_1, u_2, . . . , u_35} denote the set of all students and E = {e_1, e_2, . . . , e_16} represent the set of parameters, where each e_i for 1 ≤ i ≤ 16 stands for English proficiency, mathematics, physics, chemistry, biology, agricultural sciences, Hausa language, Islamic studies, having attended public school, being above 17 years, having leadership potential, having ambassadorial potential, being an indigene of the state, being healthy, scoring a 2.1 in their undergraduate education and having completed NYSC, respectively. The views of the selection board are described by the soft set (F, E), whose tabular representation is given by Table 11. It is clear from Table 11 that the students {u_8, u_11, u_12, u_15, u_22, u_23, u_24, u_25, u_27, u_31, u_32} have the highest choice values, so they can be recommended by the KSB as the best candidates for the scholarship awards, while the students with suboptimal choice values, such as {u_4, u_16, u_21}, can be considered as the second-best choices if the total number of scholarships exceeds the number of first-priority students.

Table 11. Tabular form of (F, E) in the scholarship award problem.

Now, our goal is to find those parameters in E which do not take any part in the decision-making process, and to eliminate them without changing the decision order of the alternatives (students). In other words, we have to find those parameters in E which are jointly sufficient and individually necessary for the decision order of the students. For this, we apply the proposed algorithm to the given soft set (F, E). Initially, we compute γ_{e_j} for all e_j ∈ E, which are listed in the last row of Table 11. From Table 11, we see that γ_{e_1} = γ_{e_2} = γ_{e_4} = γ_{e_16} = 1.
Thus, these parameters can be put into the reduced parameter set Z. Next, we search for those A ⊂ E − Z for which γ_A is a nonnegative integer. As a result, we obtain subsets such as {e_3, e_6, e_7, e_15}, {e_5, e_7, e_10, e_14}, {e_3, e_6, e_8, e_9, e_14}, {e_5, e_6, e_7, e_9, e_10, e_14}, and so on, which are put into the FPRS. After filtering the FPRS, we observe that A = {e_3, e_6, e_8, e_9, e_14} is the maximum-cardinality subset of E − Z that satisfies the condition f_A(u_1) = f_A(u_2) = . . . = f_A(u_35) = 3. Therefore, by the proposed algorithm, R = E − (A ∪ Z) = {e_5, e_7, e_10, e_11, e_12, e_13, e_15} is the optimal NPR of (F, E), as given by Table 12. It is clear from Table 12 that the optimal choices and all the levels of suboptimal choices of the reduced soft set (F, R) are the same as those of (F, E). Thus, instead of taking the whole parameter set E, the selection board can consider only the seven parameters in (F, R) to decide whether a student is suitable for a scholarship award. This shows that the proposed algorithm helps to minimize the workload and processing time in decision-making problems.

Conclusions and Future Work
Parameter reduction is a key step in soft set-based decision-making problems, which eliminates unnecessary and redundant information without changing the decision ability of the decision-making problem. To date, various methods have been developed for soft set reduction; however, the problems of suboptimal choices and updated parameter sets are only addressed by Kong et al. [25]. They introduced the concept of normal parameter reduction (NPR), which can reduce any soft set-based decision-making system without changing the decision order of the decision alternatives. In this paper, we developed a new algorithm for NPR using the concept of σ-algebraic soft sets. Kandemir [36] also used the concept of σ-algebraic soft sets for soft set reduction, but his method fails to maintain the entire decision order of the decision alternatives. Thus, it was desirable to modify that approach and develop a method which does not suffer from the above-mentioned problems. For this reason, we applied the concept of σ-algebraic soft sets to NPR and proposed a new algorithm that not only overcomes the existing problems of Kandemir's method, but also reduces the computational complexity of the NPR process. We compared the proposed algorithm with Kong et al.'s algorithm in terms of computational complexity and provided some experimental results. It is evident from the experimental results that the proposed algorithm greatly reduces the computational complexity and workload of NPR compared with Kong et al.'s algorithm. At the end of the paper, we presented an application of the proposed algorithm to a real-life decision-making problem.
Soft set-based decision making is a hot topic for researchers, but still, very limited literature can be found regarding soft set reduction. Thus, additional attention from the researchers is required to develop new reduction methods for soft sets. Some specific future research directions can be suggested as follows.

• More general and efficient approaches for soft set-based decision making are being presented day by day, and thus, we need to develop new reduction methods for these new decision criteria.
• We need to study parameter reduction for some useful extended models of soft sets, such as picture fuzzy soft sets, probabilistic soft sets, neutrosophic soft sets and so on.
• At present, very limited applications of soft set reduction can be found in the literature. Therefore, applications of soft set reduction require more attention and should be explored further.