1. Introduction and Preliminaries
Attaining and maintaining a healthy body depends on sound nutrition and diet. People in good physical and mental health are active, resilient, full of vitality, and generally of a pleasant disposition. There are six primary categories of nutrients: carbohydrates (Cs), fats, minerals (Ms), proteins (Ps), vitamins (Vs), and water, the last being among the most important. Each nutrient is responsible for one or more general functions. Carbohydrates and fats are sources of heat, energy, and power, while P, M, and V are responsible for building and promoting growth, renewing body tissues, and regulating body processes. The recommended daily dietary allowances are organized into fundamental food groups, which are represented in
Table 1 and the food pyramid [
1]. These classifications make the recommended daily dietary allowances easier to apply.
Several contemporary theories have been proposed to address the challenges associated with imprecise data, including probability theory, fuzzy sets, intuitionistic fuzzy sets, and rough sets, among others.
Molodtsov [
2] showed that each of the above theories has some built-in limitations, in particular the lack of a parametrization tool. He also presented soft set theory, equipped with parametrization tools, which can deal with a wide range of uncertainties. The fuzzification of soft set theory has seen significant contributions from researchers in the last few years. After that, Maji et al. [
3] extended the soft set theory of Molodtsov and introduced
$FSS$s in decision-making problems. The first real-world use of soft sets in problem solving came from Maji et al. [
4,
5]. They presented and developed the $FSS$, a notion that combines fuzzy and soft sets and is more widely applicable. Furthermore, Chaudhuri et al. [
6] deployed a few applications of
$FSS$s with the help of the method in [
4,
5] and compared them with the probability distribution. Also, Çağman et al. [
7] developed a case study of a decision-making approach using the fuzzy parametrized
$FSS$s aggregation operator. In 2011, Neog et al. [
8] used fuzzy soft matrices, the fuzzy soft complement, and fuzzy matrix operations to solve a decision-making problem.
The objective of this study is to utilize
$FSS$s in a multi-observer, multi-criteria decision-making problem as a means of enhancing the approach proposed in [
1,
9,
10]. This paper presents an overview of the fundamental findings regarding soft sets and
$FSS$s stated in [
11,
12,
13,
14]. In recent years, there have been many applications for soft sets, general topology, and their related topics with applications [
15,
16,
17]. Moreover, Nasef et al. [
18,
19] presented some applications of soft sets in decision-making problems.
To solve the reduction issue, Kong et al. [
20] defined and developed the heuristic technique for normal parameter reductions (NPRs) in
$FSS$. The NPR soft set algorithm, as proposed in [
20], was complex to grasp, required numerous computations, and depended on the notion of dispensability. To lessen its computational complexity, this approach was further investigated by several authors; see [
21,
22,
23,
24]. A proximity normal parameter reduction (PNPR) of the
$FSS$ was proposed by Kong et al. in [
25]. Using three-way decision criteria, Khameneh and Kilicman [
26] presented an adaptable method for parameterizing
$FSS$. In order to address the issue of
$FSS$ parameter reduction based on the score criteria, Kong et al. [
27] developed a new NPR method. A distance-based parameter reduction (DBPR) approach for
$FSS$ was introduced by Ma and Qin [
28]. Its use in decision-making problems was discussed, though a difficulty arises because similarity and reduction are distinct notions and cannot depend on each other. For more information about parametrization reduction in
$FSS$, see [
29,
30,
31,
32,
33,
34]. Most reduction methods depend on a single function.
Decision making in the presence of imprecise data is particularly important when addressing real-life problems. In this paper, a multi-observer, multi-criteria decision-making problem is addressed by employing the notion of $FSS$s, which come equipped with parametrization tools. We also give some applications using soft sets and $FSS$s. The results show that the approach to decision-making problems with imprecise data via $FSS$s is more accurate than other approaches.
Let ${U}_{1}$ be an initial universe set, S a set of parameters (attributes) with respect to ${U}_{1}$, and $P\left({U}_{1}\right)$ the power set of ${U}_{1}$.
Definition 1 ([
2]).
A pair $(\mathfrak{F},S)$ is called a soft set over U if and only if $\mathfrak{F}$ is a mapping from S into the set of all subsets of U; in other words, a soft set over U is a function from a set of parameters to $P\left(U\right)$. Note that a soft set is not a set in the usual sense but a parameterized family of subsets of U. Definition 2 ([
35]).
For a set ${A}_{1}\subseteq {X}_{1}$, its indicator function ${\mu}_{{A}_{1}}$ is defined as ${\mu}_{{A}_{1}}\left({x}_{1}\right)=1$ if ${x}_{1}\in {A}_{1}$ and ${\mu}_{{A}_{1}}\left({x}_{1}\right)=0$ otherwise. A fuzzy set $\mathfrak{F}$ is described by its membership function ${\mu}_{\mathfrak{F}}:{X}_{1}\to \left[0,1\right]$. For every ${x}_{1}\in {X}_{1}$, this function associates a real number ${\mu}_{\mathfrak{F}}\left({x}_{1}\right)$ interpreted as the degree of belonging of ${x}_{1}$ to the fuzzy set $\mathfrak{F}$, written as $\mathfrak{F}=\{({x}_{1},{\mu}_{\mathfrak{F}}\left({x}_{1}\right)):{x}_{1}\in {X}_{1}\}$.
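The difference between a crisp indicator function and a fuzzy membership function can be illustrated with a small Python sketch (the set and the membership grades below are hypothetical examples, not data from the paper):

```python
# Crisp set: the indicator function of A1 maps every point to exactly 0 or 1.
A1 = {2, 4, 6}

def indicator(x):
    return 1 if x in A1 else 0

# Fuzzy set: the membership function maps every point to a grade in [0, 1].
# The grades below are illustrative values, not taken from the paper.
membership = {1: 0.1, 2: 0.7, 3: 0.4, 4: 1.0}

assert indicator(2) == 1 and indicator(3) == 0
assert all(0.0 <= g <= 1.0 for g in membership.values())
```

An indicator function is thus the special case of a membership function whose range is restricted to $\{0,1\}$.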
Definition 3 ([
3]).
Let $\tilde{P}\left({U}_{1}\right)$ denote the set of all fuzzy subsets of ${U}_{1}$. A pair $(\tilde{F},S)$ is called an $FSS$ over ${U}_{1}$ if $\tilde{F}$ is a function $\tilde{F}:S\to \tilde{P}\left({U}_{1}\right)$, where S is a set of parameters. The definitions of the null $FSS$ and of the intersection and union operations for $FSS$s [3] parallel those defined for crisp soft sets (soft sets) [2].
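Definitions 1–3 can be made concrete in Python: a soft set assigns to each parameter a crisp subset of the universe, whereas an $FSS$ assigns a fuzzy subset, i.e., a map from objects to membership grades. The universe and parameter names below are illustrative assumptions:

```python
U = {"u1", "u2", "u3"}

# Soft set (F, S): each parameter names a crisp subset of U (Definition 1).
soft = {
    "cheap":  {"u1", "u3"},
    "modern": {"u2"},
}

# Fuzzy soft set (F~, S): each parameter names a fuzzy subset of U,
# i.e., a dict of membership grades in [0, 1] (Definition 3).
fuzzy_soft = {
    "cheap":  {"u1": 0.9, "u2": 0.2, "u3": 0.7},
    "modern": {"u1": 0.1, "u2": 0.8, "u3": 0.4},
}

# A crisp soft set is the special case of an FSS with {0, 1}-valued grades.
assert "u1" in soft["cheap"]
assert fuzzy_soft["modern"]["u2"] == 0.8
```

This parameter-to-subset view is what the decision-making examples below exploit: each parameter acts as a column of a (possibly fuzzy) membership table over the universe.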
2. Soft Sets through Pawlak Rough Sets
Molodtsov [
2] explored various applications of soft set theory across multiple domains, including the smoothness of functions, game theory, operations research, probability, etc. In this section, we demonstrate how the rough-set technique can be utilized to apply soft set theory to a decision-making problem [
36,
37]. The set-valued information system under consideration is outlined in the following, where
$U=\{{\mathfrak{z}}_{1},{\mathfrak{z}}_{2},{\mathfrak{z}}_{3},{\mathfrak{z}}_{4},{\mathfrak{z}}_{5},{\mathfrak{z}}_{6}\}$ is a set of six students.
$\mathfrak{V}=$$\{{\mathfrak{v}}_{1}=$ Food containing preservatives,
${\mathfrak{v}}_{2}=$ C,
${\mathfrak{v}}_{3}=$ P,
${\mathfrak{v}}_{4}=$ V,
${\mathfrak{v}}_{5}=$ Fat,
${\mathfrak{v}}_{6}=$ M,
${\mathfrak{v}}_{7}=$ Junk food,
${\mathfrak{v}}_{8}=$ Ice cream $\}$ is a set of parameters describing the nutrients and food types in the students' diet.
Consider the soft set $(\mathfrak{F},\mathfrak{V})$, which describes the eating habits of the students, given by $(\mathfrak{F},\mathfrak{V})=$ Students eat food containing preservatives $=\varphi $; Students eat food containing C $=\{{\mathfrak{z}}_{1},{\mathfrak{z}}_{2},{\mathfrak{z}}_{3},{\mathfrak{z}}_{4},{\mathfrak{z}}_{5},{\mathfrak{z}}_{6}\}$; Students eat food containing P $=\{{\mathfrak{z}}_{1},{\mathfrak{z}}_{2},{\mathfrak{z}}_{3},{\mathfrak{z}}_{4},{\mathfrak{z}}_{6}\}$; Students eat food containing V $=\{{\mathfrak{z}}_{1},{\mathfrak{z}}_{2},{\mathfrak{z}}_{3},{\mathfrak{z}}_{4},{\mathfrak{z}}_{5},{\mathfrak{z}}_{6}\}$; Students eat food containing Fat $=\{{\mathfrak{z}}_{1},{\mathfrak{z}}_{3},{\mathfrak{z}}_{6}\}$; Students eat food containing M $=\{{\mathfrak{z}}_{1},{\mathfrak{z}}_{2},{\mathfrak{z}}_{6}\}$; Students eat Junk food $=\{{\mathfrak{z}}_{2},{\mathfrak{z}}_{4},{\mathfrak{z}}_{5}\}$; and Students eat Ice cream $=\{{\mathfrak{z}}_{1},{\mathfrak{z}}_{3},{\mathfrak{z}}_{6}\}$.
Assume that Mr. X wishes to procure food items based on specific parameters aligned with his personal preferences among $\{{\mathfrak{v}}_{1},{\mathfrak{v}}_{2},{\mathfrak{v}}_{3},{\mathfrak{v}}_{4},{\mathfrak{v}}_{5},{\mathfrak{v}}_{6},{\mathfrak{v}}_{7},{\mathfrak{v}}_{8}\}$, namely the subset $P=\{{\mathfrak{v}}_{2}=C,{\mathfrak{v}}_{3}=P,{\mathfrak{v}}_{4}=V,{\mathfrak{v}}_{5}=Fat,{\mathfrak{v}}_{6}=M\}$ of the set $\mathfrak{V}$. This implies that he must choose from the set of available food items, denoted U, the item(s) that satisfy all or the largest number of the parameters specified by the soft set. The objective is to identify the food item that aligns with the selection criteria established by Mr. X.
First, let us build a tabular representation of the problem so we can better understand it. Take into consideration the soft set
$(\mathfrak{F},\mathfrak{P})$, where $\mathfrak{P}$ is the set of decision parameters of Mr. X (see
Table 2 for further information). In this case,
$(\mathfrak{F},$$\mathfrak{P})$ can be considered a soft subset of
$(\mathfrak{F},$$\mathfrak{V})$.
Assume that a hypothetical customer, Mr. Y, intends to make a food purchase based on a predefined set of choice parameters
$Q\subset \mathfrak{P}$. So
$(\mathfrak{F},Q)$ is a soft subset of
$(\mathfrak{F},\mathfrak{P})$, called the reduct soft set of the soft set
$(\mathfrak{F},\mathfrak{P})$. The choice value of an object
${\mathfrak{z}}_{i}\in U$ is
${\mathfrak{p}}_{i}$, where
${\mathfrak{p}}_{i}=\Sigma {\mathfrak{z}}_{ij}$ such that
${\mathfrak{z}}_{ij}$ are the entries in the table of the reduct soft set, as shown in
Table 3.
So, Mr. Y can choose the food of students
$\{{\mathfrak{z}}_{1},{\mathfrak{z}}_{6}\}$. The theory of the weighted soft set, or W-soft set, was first presented by Lin [
36]. The weighted choice value of an object
${\mathfrak{z}}_{i}\in U$ is
${W}_{{\mathfrak{p}}_{i}}$, where
${W}_{{\mathfrak{p}}_{i}}=\Sigma {d}_{ij}$ such that
${d}_{ij}={w}_{j}\times {\mathfrak{z}}_{ij}$, with ${w}_{j}$ the weight of the j-th parameter. The following Algorithm 1 carries out the selection.
Algorithm 1 Decision making for food system. 
Step 1: Input the soft set $(\mathfrak{F},\mathfrak{V})$.
Step 2: Input the set $\mathfrak{P}$ of choice parameters of Mr. X, with $\mathfrak{P}\subseteq \mathfrak{V}$.
Step 3: Find the reduct soft sets of $(\mathfrak{F},\mathfrak{P})$.
Step 4: Choose one reduct soft set $(\mathfrak{F},Q)$.
Step 5: Build the weighted table of the soft set $(\mathfrak{F},Q)$ according to the weights decided by Mr. Y.
Step 6: Compute k for which ${W}_{{\mathfrak{p}}_{k}}=max\phantom{\rule{3.33333pt}{0ex}}{W}_{{\mathfrak{p}}_{i}}$.
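Algorithm 1 can be sketched in Python on the soft set $(\mathfrak{F},\mathfrak{V})$ of the example, restricted to the parameter set $P=\{C, P, V, Fat, M\}$. The weights below are hypothetical stand-ins for Table 4; in this instance any choice of positive weights yields the same winners, because ${\mathfrak{z}}_{1}$ and ${\mathfrak{z}}_{6}$ satisfy every parameter in P:

```python
# Soft set (F, V) from the example, restricted to P = {C, P, V, Fat, M}.
F = {
    "C":   {"z1", "z2", "z3", "z4", "z5", "z6"},
    "P":   {"z1", "z2", "z3", "z4", "z6"},
    "V":   {"z1", "z2", "z3", "z4", "z5", "z6"},
    "Fat": {"z1", "z3", "z6"},
    "M":   {"z1", "z2", "z6"},
}
U = {"z1", "z2", "z3", "z4", "z5", "z6"}

# Hypothetical positive weights standing in for Table 4 (not given here).
w = {"C": 0.8, "P": 0.9, "V": 0.7, "Fat": 0.5, "M": 0.6}

# Weighted choice value W_p(z): sum of w_j over parameters j satisfied by z.
W = {z: sum(w[p] for p in F if z in F[p]) for z in U}

best = max(W.values())
winners = {z for z, v in W.items() if v == best}
# winners == {"z1", "z6"}: both satisfy all five chosen parameters.
```

With unit weights this reduces to the unweighted choice value ${\mathfrak{p}}_{i}=\Sigma {\mathfrak{z}}_{ij}$, which likewise selects $\{{\mathfrak{z}}_{1},{\mathfrak{z}}_{6}\}$.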

The optimal choice object is “
${\mathfrak{z}}_{k}$”. If there are multiple values of
k, Mr. Y may choose any one of them. We now attempt to resolve the initial problem by employing the modified algorithm. Assume that the weights for the parameters are determined by Mr. Y as presented in
Table 4.
Using these weights, the reduct soft set can be tabulated as in
Table 5.
Therefore, $max\phantom{\rule{3.33333pt}{0ex}}{W}_{{\mathfrak{p}}_{i}}=\{{W}_{{\mathfrak{p}}_{1}},{W}_{{\mathfrak{p}}_{6}}\}$. The result is that Mr. Y chooses the food of students ${\mathfrak{z}}_{1}$ and ${\mathfrak{z}}_{6}$ among the available foods. He seeks counsel from five counseling agencies ${\mathfrak{v}}_{2},{\mathfrak{v}}_{3},{\mathfrak{v}}_{4},{\mathfrak{v}}_{5},{\mathfrak{v}}_{6}$, which provide information about the food of the students ${\mathfrak{z}}_{1}$, ${\mathfrak{z}}_{2}$, ${\mathfrak{z}}_{3}$, ${\mathfrak{z}}_{4}$, ${\mathfrak{z}}_{5}$ and ${\mathfrak{z}}_{6}$ with respect to the parameters C, P, V, Fat and M.
3. Best Nutrition in Terms of Soft Sets
To address the issues raised above, the following Algorithm 2 is proposed:
Algorithm 2 Decision making for proposed problem. 
Step 1: Input the performance evaluations of the same foods for the different students as tables.
Step 2: Compute the average of the corresponding entries across the tables.
Step 3: Multiply the weightage of the selection criteria of the director (or Mr. Y) by the corresponding entries to obtain the comprehensive decision table.
Step 4: Compute the comparison table.
Step 5: Compute the column sums and row sums of the comparison table.
Step 6: Obtain the score of every student. The student with the maximum score is recommended as the choice with the best food, and hence good nutrition.

Suppose Mr. Y is interested in choosing food for students from among the set of students $U=\{{\mathfrak{z}}_{1},{\mathfrak{z}}_{2},{\mathfrak{z}}_{3},{\mathfrak{z}}_{4},{\mathfrak{z}}_{5}\}$ on the basis of the set $\mathfrak{P}=\{{\mathfrak{p}}_{1}$ (C), ${\mathfrak{p}}_{2}$ (M), ${\mathfrak{p}}_{3}$ (P), ${\mathfrak{p}}_{4}$ (Fat), ${\mathfrak{p}}_{5}$ (Junk food), ${\mathfrak{p}}_{6}$ (V)$\}$ of selection criteria, called the parameters, and assume that Mr. Y wants to choose food for the pupils based on his personal preference weightage of the selection criteria. Now, to obtain the performance evaluation data, we construct the $FSS$s $({\mathfrak{F}}_{1},\mathfrak{P})$, $({\mathfrak{F}}_{2},\mathfrak{P})$, $({\mathfrak{F}}_{3},\mathfrak{P})$, and $({\mathfrak{F}}_{4},\mathfrak{P})$ over U, where ${\mathfrak{F}}_{1}$, ${\mathfrak{F}}_{2}$, ${\mathfrak{F}}_{3}$, and ${\mathfrak{F}}_{4}$ are mappings from $\mathfrak{P}$ into ${I}^{U}$ given by four sets of performance evaluation data.
Suppose ${\mathfrak{F}}_{1}\left({\mathfrak{p}}_{1}\right)=$$\{({\mathfrak{z}}_{1},0.50),({\mathfrak{z}}_{2},0.60),$$({\mathfrak{z}}_{3},0.50),({\mathfrak{z}}_{4},0.80),({\mathfrak{z}}_{5},0.90)\}$,
${\mathfrak{F}}_{1}\left({\mathfrak{p}}_{2}\right)=$$\{({\mathfrak{z}}_{1},0.80),({\mathfrak{z}}_{2},0.70),$$({\mathfrak{z}}_{3},0.60),({\mathfrak{z}}_{4},0.40),({\mathfrak{z}}_{5},0.30)\}$,
${\mathfrak{F}}_{1}\left({\mathfrak{p}}_{3}\right)=$$\{({\mathfrak{z}}_{1},0.10),({\mathfrak{z}}_{2},0.20),({\mathfrak{z}}_{3},0.30),({\mathfrak{z}}_{4},0.60),({\mathfrak{z}}_{5},0.80)\}$,
${\mathfrak{F}}_{1}\left({\mathfrak{p}}_{4}\right)=$$\{({\mathfrak{z}}_{1},0.30),({\mathfrak{z}}_{2},0.40),({\mathfrak{z}}_{3},0.50),({\mathfrak{z}}_{4},0.40),({\mathfrak{z}}_{5},0.80)\}$,
${\mathfrak{F}}_{1}\left({\mathfrak{p}}_{5}\right)=$$\{({\mathfrak{z}}_{1},0.90),({\mathfrak{z}}_{2},0.80),({\mathfrak{z}}_{3},0.70),({\mathfrak{z}}_{4},0.60),({\mathfrak{z}}_{5},0.20)\}$,
${\mathfrak{F}}_{1}\left({\mathfrak{p}}_{6}\right)=$$\{({\mathfrak{z}}_{1},0.10),({\mathfrak{z}}_{2},0.20),({\mathfrak{z}}_{3},0.30),({\mathfrak{z}}_{4},0.50),({\mathfrak{z}}_{5},0.80)\}$,
${\mathfrak{F}}_{2}\left({\mathfrak{p}}_{1}\right)=$$\{({\mathfrak{z}}_{1},0.52),({\mathfrak{z}}_{2},0.59),({\mathfrak{z}}_{3},0.60),({\mathfrak{z}}_{4},0.85),({\mathfrak{z}}_{5},0.91)\}$,
${\mathfrak{F}}_{2}\left({\mathfrak{p}}_{2}\right)=$$\{({\mathfrak{z}}_{1},0.79),({\mathfrak{z}}_{2},0.75),({\mathfrak{z}}_{3},0.65),({\mathfrak{z}}_{4},0.43),({\mathfrak{z}}_{5},0.25)\}$,
${\mathfrak{F}}_{2}\left({\mathfrak{p}}_{3}\right)=$$\{({\mathfrak{z}}_{1},0.15),({\mathfrak{z}}_{2},0.22),({\mathfrak{z}}_{3},0.40),({\mathfrak{z}}_{4},0.70),({\mathfrak{z}}_{5},0.90)\}$,
${\mathfrak{F}}_{2}\left({\mathfrak{p}}_{4}\right)=$$\{({\mathfrak{z}}_{1},0.25),({\mathfrak{z}}_{2},0.35),({\mathfrak{z}}_{3},0.45),({\mathfrak{z}}_{4},0.50),({\mathfrak{z}}_{5},0.75)\}$,
${\mathfrak{F}}_{2}\left({\mathfrak{p}}_{5}\right)=$$\{({\mathfrak{z}}_{1},0.87),({\mathfrak{z}}_{2},0.88),({\mathfrak{z}}_{3},0.75),({\mathfrak{z}}_{4},0.65),({\mathfrak{z}}_{5},0.30)\}$,
${\mathfrak{F}}_{2}\left({\mathfrak{p}}_{6}\right)=$$\{({\mathfrak{z}}_{1},0.13),({\mathfrak{z}}_{2},0.22),({\mathfrak{z}}_{3},0.35),({\mathfrak{z}}_{4},0.49),({\mathfrak{z}}_{5},0.85)\}$,
${\mathfrak{F}}_{3}\left({\mathfrak{p}}_{1}\right)=$$\{({\mathfrak{z}}_{1},0.55),({\mathfrak{z}}_{2},0.63),({\mathfrak{z}}_{3},0.54),({\mathfrak{z}}_{4},0.75),({\mathfrak{z}}_{5},0.91)\}$,
${\mathfrak{F}}_{3}\left({\mathfrak{p}}_{2}\right)=$$\{({\mathfrak{z}}_{1},0.88),({\mathfrak{z}}_{2},0.86),({\mathfrak{z}}_{3},0.70),({\mathfrak{z}}_{4},0.50),({\mathfrak{z}}_{5},0.40)\}$,
${\mathfrak{F}}_{3}\left({\mathfrak{p}}_{3}\right)=$$\{({\mathfrak{z}}_{1},0.20),({\mathfrak{z}}_{2},0.30),({\mathfrak{z}}_{3},0.50),({\mathfrak{z}}_{4},0.70),({\mathfrak{z}}_{5},0.90)\}$,
${\mathfrak{F}}_{3}\left({\mathfrak{p}}_{4}\right)=$$\{({\mathfrak{z}}_{1},0.29),({\mathfrak{z}}_{2},0.33),({\mathfrak{z}}_{3},0.48),({\mathfrak{z}}_{4},0.52),({\mathfrak{z}}_{5},0.85)\}$,
${\mathfrak{F}}_{3}\left({\mathfrak{p}}_{5}\right)=$$\{({\mathfrak{z}}_{1},0.85),({\mathfrak{z}}_{2},0.84),({\mathfrak{z}}_{3},0.78),({\mathfrak{z}}_{4},0.65),({\mathfrak{z}}_{5},0.23)\}$,
${\mathfrak{F}}_{3}\left({\mathfrak{p}}_{6}\right)=$$\{({\mathfrak{z}}_{1},0.12),({\mathfrak{z}}_{2},0.25),({\mathfrak{z}}_{3},0.30),({\mathfrak{z}}_{4},0.57),({\mathfrak{z}}_{5},0.85)\}$,
${\mathfrak{F}}_{4}\left({\mathfrak{p}}_{1}\right)=$$\{({\mathfrak{z}}_{1},0.58),({\mathfrak{z}}_{2},0.67),({\mathfrak{z}}_{3},0.56),({\mathfrak{z}}_{4},0.86),({\mathfrak{z}}_{5},0.95)\}$,
${\mathfrak{F}}_{4}\left({\mathfrak{p}}_{2}\right)=$$\{({\mathfrak{z}}_{1},0.89),({\mathfrak{z}}_{2},0.87),({\mathfrak{z}}_{3},0.69),({\mathfrak{z}}_{4},0.45),({\mathfrak{z}}_{5},0.36)\}$,
${\mathfrak{F}}_{4}\left({\mathfrak{p}}_{3}\right)=$$\{({\mathfrak{z}}_{1},0.19),({\mathfrak{z}}_{2},0.27),({\mathfrak{z}}_{3},0.34),({\mathfrak{z}}_{4},0.57),({\mathfrak{z}}_{5},0.89)\}$,
${\mathfrak{F}}_{4}\left({\mathfrak{p}}_{4}\right)=$$\{({\mathfrak{z}}_{1},0.32),({\mathfrak{z}}_{2},0.35),({\mathfrak{z}}_{3},0.40),({\mathfrak{z}}_{4},0.45),({\mathfrak{z}}_{5},0.70)\}$,
${\mathfrak{F}}_{4}\left({\mathfrak{p}}_{5}\right)=$$\{({\mathfrak{z}}_{1},0.85),({\mathfrak{z}}_{2},0.87),({\mathfrak{z}}_{3},0.73),({\mathfrak{z}}_{4},0.61),({\mathfrak{z}}_{5},0.23)\}$,
${\mathfrak{F}}_{4}\left({\mathfrak{p}}_{6}\right)=$$\{({\mathfrak{z}}_{1},0.18),({\mathfrak{z}}_{2},0.24),({\mathfrak{z}}_{3},0.37),({\mathfrak{z}}_{4},0.56),({\mathfrak{z}}_{5},0.78)\}$.
The aforementioned $FSS$s $({\mathfrak{F}}_{1},\mathfrak{P})$, $({\mathfrak{F}}_{2},\mathfrak{P})$, $({\mathfrak{F}}_{3},\mathfrak{P})$, and $({\mathfrak{F}}_{4},\mathfrak{P})$ are represented in Table 6, Table 7, Table 8 and Table 9.
We obtain the performance evaluation shown in
Table 10 by averaging the aforementioned four
$FSS$s.
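Step 2 of Algorithm 2 (averaging) can be checked with a short NumPy sketch. The arrays transcribe the four evaluation tables above, with rows ${\mathfrak{p}}_{1}$–${\mathfrak{p}}_{6}$ and columns ${\mathfrak{z}}_{1}$–${\mathfrak{z}}_{5}$:

```python
import numpy as np

# Rows: parameters p1..p6; columns: students z1..z5 (values from F1..F4).
F1 = [[0.50, 0.60, 0.50, 0.80, 0.90],
      [0.80, 0.70, 0.60, 0.40, 0.30],
      [0.10, 0.20, 0.30, 0.60, 0.80],
      [0.30, 0.40, 0.50, 0.40, 0.80],
      [0.90, 0.80, 0.70, 0.60, 0.20],
      [0.10, 0.20, 0.30, 0.50, 0.80]]
F2 = [[0.52, 0.59, 0.60, 0.85, 0.91],
      [0.79, 0.75, 0.65, 0.43, 0.25],
      [0.15, 0.22, 0.40, 0.70, 0.90],
      [0.25, 0.35, 0.45, 0.50, 0.75],
      [0.87, 0.88, 0.75, 0.65, 0.30],
      [0.13, 0.22, 0.35, 0.49, 0.85]]
F3 = [[0.55, 0.63, 0.54, 0.75, 0.91],
      [0.88, 0.86, 0.70, 0.50, 0.40],
      [0.20, 0.30, 0.50, 0.70, 0.90],
      [0.29, 0.33, 0.48, 0.52, 0.85],
      [0.85, 0.84, 0.78, 0.65, 0.23],
      [0.12, 0.25, 0.30, 0.57, 0.85]]
F4 = [[0.58, 0.67, 0.56, 0.86, 0.95],
      [0.89, 0.87, 0.69, 0.45, 0.36],
      [0.19, 0.27, 0.34, 0.57, 0.89],
      [0.32, 0.35, 0.40, 0.45, 0.70],
      [0.85, 0.87, 0.73, 0.61, 0.23],
      [0.18, 0.24, 0.37, 0.56, 0.78]]

# Entrywise mean of the four FSSs: the averaged performance table (Table 10).
avg = np.mean([F1, F2, F3, F4], axis=0)
# e.g., avg[0, 4] is the mean of p1 for z5: (0.90+0.91+0.91+0.95)/4 = 0.9175
```

Each entry of the averaged table is simply the arithmetic mean of the four observers' membership grades for the same (parameter, student) pair.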
Assume that Mr. Y determines the preference weights for the various pupils as shown in
Table 11 such that
$\sum _{i=1}^{5}{w}_{i}=1$.
The comprehensive decision table can be derived by performing a rowwise multiplication of
Table 10 with the weightage
Table 11, followed by transposing the resulting matrix as depicted in
Table 12.
We now construct the comparison table for the students, which helps Mr. Y select the best student. The comparison table is square, with equal numbers of rows and columns, both labeled with the names of the students as
${\mathfrak{z}}_{1}$,
${\mathfrak{z}}_{2}$,
${\mathfrak{z}}_{3}$,
${\mathfrak{z}}_{4}$ and
${\mathfrak{z}}_{5}$, and the entries are
${\mathfrak{z}}_{ij}$ with
$i,j=1,2,3,4,5$ given by
${\mathfrak{z}}_{ij}=$ the number of selection criteria for which the membership value of
${\mathfrak{z}}_{i}$ is greater than or equal to the membership value of
${\mathfrak{z}}_{j}$. The comparison table is given in
Table 13.
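The comparison-and-score step can be sketched as follows, using the averaged performance table (the entrywise means of $({\mathfrak{F}}_{1},\mathfrak{P})$–$({\mathfrak{F}}_{4},\mathfrak{P})$ computed above) and the usual convention that entry $(i,j)$ counts the criteria where ${\mathfrak{z}}_{i}$ dominates ${\mathfrak{z}}_{j}$. Table 11's weights are not reproduced here, so uniform weights are assumed for illustration; under that assumption the top of the ranking (${\mathfrak{z}}_{5}$, then ${\mathfrak{z}}_{4}$) agrees with the paper's conclusion, while the middle ranks depend on Mr. Y's actual weights.

```python
import numpy as np

# Averaged performance table (rows p1..p6, columns z1..z5): the entrywise
# mean of the four evaluation tables given earlier in this section.
avg = np.array([
    [0.5375, 0.6225, 0.5500, 0.8150, 0.9175],
    [0.8400, 0.7950, 0.6600, 0.4450, 0.3275],
    [0.1600, 0.2475, 0.3850, 0.6425, 0.8725],
    [0.2900, 0.3575, 0.4575, 0.4675, 0.7750],
    [0.8675, 0.8475, 0.7400, 0.6275, 0.2400],
    [0.1325, 0.2275, 0.3300, 0.5300, 0.8200],
])

# Uniform student weights (an assumption; Table 11 is not reproduced here).
w = np.full(5, 0.2)
weighted = avg * w  # scale each student's column by its weight

# Comparison table: c[i, j] = number of criteria for which z_i's membership
# value is greater than or equal to z_j's.
c = (weighted[:, :, None] >= weighted[:, None, :]).sum(axis=0)

# Score of each student: row sum minus column sum of the comparison table.
score = c.sum(axis=1) - c.sum(axis=0)
ranking = np.argsort(-score)  # indices 0..4 for z1..z5, best first
# ranking[0] == 4 (z5) and ranking[1] == 3 (z4)
```

Because uniform weights rescale every column identically, they leave the pairwise comparisons unchanged; non-uniform weights, as in the paper, can reorder the middle of the ranking.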
The column sums and row sums of the comparison table, together with the score of each student, are presented in the following
Table 14.
From the score table in Table 14, the following ordering is suggested:
${\mathfrak{z}}_{5}>{\mathfrak{z}}_{4}>{\mathfrak{z}}_{1}>{\mathfrak{z}}_{3}>{\mathfrak{z}}_{2}$.
In this case, the highest score is achieved by ${\mathfrak{z}}_{5}$, the student with the best values of the parameters ${\mathfrak{p}}_{1}$, ${\mathfrak{p}}_{2}$, ${\mathfrak{p}}_{3}$, ${\mathfrak{p}}_{4}$, ${\mathfrak{p}}_{5}$ and ${\mathfrak{p}}_{6}$. Hence, Mr. Y can choose the student with the best food ratios with respect to the selection criteria ${\mathfrak{p}}_{1}$ (C), ${\mathfrak{p}}_{2}$ (M), ${\mathfrak{p}}_{3}$ (P), ${\mathfrak{p}}_{4}$ (Fat), ${\mathfrak{p}}_{5}$ (Junk food), and ${\mathfrak{p}}_{6}$ (V) of the $FSS$s $({\mathfrak{F}}_{1},\mathfrak{P})$, $({\mathfrak{F}}_{2},\mathfrak{P})$, $({\mathfrak{F}}_{3},\mathfrak{P})$, and $({\mathfrak{F}}_{4},\mathfrak{P})$ over U.