Article

Advances in Near Soft Sets and Their Applications in Similarity-Based Decision Making

1 Department of Mathematics, Faculty of Arts and Sciences, Iğdır University, Iğdır 76000, Türkiye
2 Department of Electrical and Computer Engineering, University of Manitoba, Winnipeg, MB R3T 5V6, Canada
3 Department of Computer Engineering, Iğdır University, Iğdır 76000, Türkiye
4 Department of Software Engineering, Faculty of Engineering and Natural Sciences, Istanbul Topkapı University, Istanbul 34087, Türkiye
* Author to whom correspondence should be addressed.
Symmetry 2026, 18(4), 611; https://doi.org/10.3390/sym18040611
Submission received: 19 February 2026 / Revised: 20 March 2026 / Accepted: 25 March 2026 / Published: 4 April 2026
(This article belongs to the Section Mathematics)

Abstract

In this study, a generalized and advanced form of the near soft set theory (NST) framework is proposed for information aggregation (IA) processes. The primary motivation of the study is to address the lack of similarity-based uncertainty modeling in the literature by integrating the parametric structure of soft sets with the similarity-oriented structure of nearness approximation spaces. Within this framework, the AND-product and OR-product operations are introduced as the main methodological tools, and their algebraic structures are analyzed in detail. It is mathematically demonstrated that these operations satisfy fundamental properties such as idempotency, absorption, distributivity, and the De Morgan identities. The principal original contribution of the study is the development of a novel Uni–Int-based decision-making mechanism that enables the systematic distinction between strong and acceptable alternatives. In addition, the boundary frequency indicator ($b_r$), introduced here for the first time in the literature, is proposed to quantitatively evaluate the reliability of objects under perceptual uncertainty. The applicability of the proposed model is demonstrated through a real-estate selection problem, and a sensitivity analysis is conducted to reveal the determining effect of the nearness parameter $r$ on decision granularity. The findings indicate that the proposed NST framework provides a more flexible, more discriminative, and structurally robust decision-support model than classical approaches, particularly for similarity-based IA problems.

1. Introduction

The need to model and effectively manage uncertainty inherent in many interdisciplinary problems has led to the development of various theoretical frameworks, such as Probability Theory, Fuzzy Sets [1], and rough sets [2]. These approaches have made significant contributions to the literature by addressing different aspects of uncertainty and have been widely applied, particularly in data analysis and decision-making processes. In this context, soft set theory, proposed by Molodtsov [3], provides a complementary framework by allowing objects to be represented through freely chosen parameters. Although soft set theory offers considerable flexibility in handling uncertainty due to its parametric structure, it does not directly take into account attribute-based perceptual similarities among objects.
On the other hand, the descriptive near set approach developed by Peters [4,5] provides a strong mathematical foundation for modeling perceptual indiscernibility by analyzing similarities among objects through their attribute values. Nevertheless, this approach does not incorporate the flexible representational capability provided by parametric structures. In this respect, the present study proposes a near soft set (NSS) framework that integrates the parametric expressive power of soft sets with the perceptual similarity-based approach of near sets. Through the integration of the concept of Nearness Approximation Spaces (NASs), this hybrid structure enables uncertainty to be modeled simultaneously through parameterization and through descriptive nearness relations among objects, thereby providing a more comprehensive and flexible analytical framework for complex decision-making problems.
In order to better understand the theoretical foundations of this integrated approach, the development and application areas of soft set theory and near set theory are considered, respectively. In this direction, the operational capabilities of soft set theory were first applied to decision-making problems by Maji et al. [6]; however, the mathematical inconsistencies in the parameter reduction methods involved in that study were later resolved by Chen et al. [7,8] using the principles of rough set theory. In the subsequent process, soft set theory was integrated with hybrid structures and reached a broad range of applications. In the literature, the recognition of soft information patterns [9], the analysis of symptom co-occurrence in disease diagnosis [10], granulation strategies [11], and real-life applications such as bank interview processes [12] are among the pioneering examples of this development.
This developmental process of decision-making mechanisms gained momentum with the Uni-Int function proposed by Çağman and Enginoğlu [13,14]; this approach was later extended to intm-intn schemes by Feng et al. [15] and to soft discernibility matrices by Feng and Zhou [16]. In recent years, soft set theory has become an indispensable analytical tool in complex uncertainty scenarios, and various approaches such as N-soft topologies [17], max-min average methods in medical diagnosis [18], Bonferroni operators [19], and soft expert structures integrating expert opinions [20,21,22,23] have contributed to this development.
In the process of theoretical maturation, the algebraic foundations of soft set operations were first established by Maji et al. [24]; however, it was observed that the universal validity of classical operations remains limited in certain cases, which has led to the development of restricted and extended variants [25,26]. The theoretical framework was enriched by De Morgan laws [27] and MV-algebras [28], while inconsistencies related to symmetric difference and distributive laws were eliminated through critical revisions [29,30,31]. Studies on soft subsethood and equality relations corrected earlier formulation errors [32,33], paving the way for the development of more comprehensive notions such as soft J-equality [34] and relaxed operations [35,36,37]. The theoretical development matured through the formulation of restricted and extended intersection, union, and difference operations that explicitly address discrete parameter cases and are supported by proofs based on function equality [38,39,40].
At the core of soft set-based decision-making processes lie the AND and OR product operations, which enable the aggregation of multi-parameter information. Feng and Li [41] related soft subset forms to these product operations and revealed rich algebraic structures such as semigroups and semirings. This framework was further developed through comprehensive analyses of the AND-product by Sezgin et al. [42] and of the OR-product by Orbay et al. [43]; under M-equality, the family of soft sets was shown to form commutative semirings with identity elements. Today, this operational structure has reached a strong theoretical and practical reference framework through the definition of soft difference-product [44] and enhanced AND-operators. Among the recent developments is the soft union-star product proposed by Durak and Sezgin [45], which is a group-theoretic operation defined on parameter spaces having an internal group structure. This structure preserves fundamental properties such as closure, associativity, commutativity, and idempotency, while also maintaining compatibility with generalized soft subset and equality notions.
On the other hand, near set theory, introduced by J.F. Peters in 2002 as a generalization of rough set theory, has provided an original framework, particularly for the mathematical modeling of perceptual similarities among objects [5,46]. The conceptual foundations of the theory are rooted in image analysis and pattern recognition problems and are based on the notion of perceptual nearness developed together with Pawlak. The philosophical background of this approach was shaped by the perceptual perspective presented in the study entitled “How Near”, written in 2002 and published in 2007 [47]. Peters [5] showed that objects possessing similar features can be regarded as perceptually near to one another and that, accordingly, the universe can be reconstructed on the basis of available information. This approach was further advanced by Peters and Wasilewski [48] within the context of information science, where it was demonstrated that nearness relations play a central role in modeling perceptual information systems and object classification. From an application-oriented perspective, Peters [49] developed the notion of tolerance near sets, thereby introducing a quantitative approach grounded in human perception to image matching and classification problems; moreover, the nearness measure proposed by Henry–Peters [50,51] made it possible to determine similarities between images.
The application of near set theory to algebraic structures began in 2012 with the study of İnan and Öztürk on groups and semigroups in nearness approximation spaces, marking a turning point in the development of this area [52,53]. Following these studies, Bağırmaz [54,55] investigated ideals and approximate structures on near semigroups, while Davvaz and collaborators [56,57] extended this theory to ring and module structures, thereby broadening the scope of the algebraic framework. In 2019, Öztürk and his co-authors defined near rings and Γ -semigroups on weak nearness approximation spaces and obtained fundamental results concerning the prime ideals of these structures [58,59,60]. These developments enabled the systematic construction of nearness-based algebraic structures. Subsequently, the near d-algebras defined by Öztürk in 2021 introduced a new perspective into the literature, and the theoretical foundations of these structures were examined in detail [61]. In recent years, studies have focused on more complex and generalized algebraic structures. In particular, during the 2022–2023 period, Mostafavi and Davvaz investigated hyperstructures such as near polygroups, near semi-hypergroups, and Krasner hyper-rings, and systematically examined their fundamental properties [62,63,64]. In addition, applications of the nearness approach to Cayley graphs were demonstrated, thereby revealing the combinatorial and graphical aspects of the theory [65]. As of 2024, the literature has been substantially enriched by advanced notions such as normal substructures in near groups [66], the concept of modulo and near cosets [67], and ordered near semigroups [68]. In the most recent studies, Jokar and Davvaz [69] extended the notion of near approximation to lattice structures by presenting new characterizations in the context of upper and lower rough ideals; they also further expanded the theoretical framework on quotient Γ -near rings and ordered Γ -near semigroups [70].
The interaction between soft set theory and near set theory was initially addressed through structures proposed on the basis of integrating soft sets with rough sets, and this approach was brought into a systematic framework through studies presented in the relevant literature [33]. In this direction, the ensuing theoretical development paved the way for the emergence of hybrid structures integrating the parametric flexibility of soft sets with the perceptual sensitivity of near sets; eventually, the concept of NSSs emerged in the literature as a natural consequence of this need [71,72].
Studies on NSSs have rapidly expanded into different areas. In this context, in line with the algebraic development of NSSs, structures serving as a basis for continuous transformations were developed through binary operations defined on nonempty near soft elements; in particular, near soft elements and near soft topological groups were defined [73]. On the other hand, recent studies clearly demonstrate the application power of NSSs, showing that these structures are effectively employed in areas such as bipolar near soft sets used in multi-criteria decision-making problems [74] and advanced near-soft matrix-based cryptosystems designed to ensure the secure transmission of hidden information [75].
In modern decision-making processes and pattern recognition problems, the modeling of uncertainty in data and the analysis of perceptual nearness among objects have generally been treated independently. While soft set theory, developed by Molodtsov, offers significant flexibility in representing uncertainty through its parametric structure, near set theory, developed by Pawlak, Henry, and Peters, focuses on the classification of objects based on attribute-based similarities. Nevertheless, the absence of an integrated framework combining the advantages of these two approaches leads to significant limitations in complex and multi-dimensional decision-making problems.
In this context, the following main gaps stand out in the literature:
Lack of Parametric Perception: Classical soft set models are successful in representing objects through parameters; however, they lack a nearness measurement mechanism capable of quantitatively evaluating the perceptual similarity among objects under these parameters. This situation leads to similar but not exactly overlapping alternatives being overlooked.
Limitation of Static Classification: Although near set theory can effectively model attribute-based similarities, it cannot provide the parametric flexibility required for multi-criteria and dynamic decision-making processes. This deficiency makes it difficult to integrate varying criteria across different decision contexts.
Operational Insufficiency: The AND and OR product operations defined on soft sets do not take into account perceptual overlap among objects during the aggregation of multi-parameter information. This may lead to information loss and reduced decision quality, particularly in high-sensitivity applications such as image processing and medical diagnosis.
Original Value of This Study: This study advances the NSS model to a higher theoretical level by integrating the perceptual nearness measure developed by Henry and Peters with the parametric structure of soft sets. In particular, the proposed AND and OR product operations not only enable data aggregation but also aim to preserve the perceptual consistency of the resulting structures. In this way, during the integration of different information sources (e.g., different expert opinions or different image processing outputs), attribute-based similarities among objects are mathematically preserved through the Nearness Measure (NM).
Contributions of the Study. The main contributions of this study can be summarized as follows:
  • AND and OR product operations on near soft sets are formally defined under nearness approximation spaces;
  • It is shown that the proposed near soft products preserve fundamental algebraic properties such as idempotency and absorption laws;
  • The concept of perceptual nearness is directly integrated into the soft product structure, thereby enabling similarity-based decision-making;
  • The boundary frequency indicator $b_r(u)$ is introduced for the quantitative evaluation of an object's participation in boundary (uncertainty) regions;
  • An improved Uni–Int decision-making mechanism based on lower and upper near approximations is proposed;
  • The computational complexity of the proposed framework is analyzed, and its scalability for large-scale datasets is demonstrated.

2. Preliminaries

In this section, we present the fundamental definitions and terminology related to near sets and soft sets, which constitute the mathematical foundation of this study. By reviewing the structural properties of nearness approximation spaces and parameterized families of sets, we establish the necessary background for the development of the near soft set framework.

2.1. Near Sets

In this subsection, we recall the fundamental concepts and notations of near set theory introduced in [4,5].
Definition 1 (Characteristic of an Object).
The characteristic of an object $O$ is a mapping $f : X \to \mathbb{R}$ defined by
$$f(X) = k \in \mathbb{R}.$$
Definition 2 (Characteristically Near Shapes).
Let $C = \{\varphi_1, \ldots, \varphi_i, \ldots, \varphi_n\}$ be a set of characteristics of shapes $X, Y$. A pair of shapes $X, Y$ are characteristically near, provided
$$\exists \varphi_i \in C : \varphi_i(X) = \varphi_i(Y) = k \in \mathbb{R}.$$
Example 1 (Characteristically Near Shapes).
Assume that both shapes X , Y in Figure 1 have the following characteristic:
$\varphi(\mathrm{shape}\ A) = 1$ if shape $A$ has a surface tangent to another surface; otherwise, $\varphi(\mathrm{shape}\ A) = 0$. Observe that
$$\varphi(\mathrm{shape}\ X) = \varphi(\mathrm{shape}\ Y) = 1.$$
Hence, shapes X , Y in Figure 1 are characteristically near.
The characterization of an object depends on both the quantity and the quality of the available information obtained through a collection of probe functions that measure object characteristics. Each object $p \in \mathcal{O}$ is described by a feature vector $\Phi(p)$, whose components are determined by a selected subset of features.
Let $F$ denote the set of all available probe functions. A subset $B \subseteq F$ is selected according to the relevance of features to the problem domain. The functions $\phi_i \in B$ form the descriptive basis of the objects and typically represent measurements obtained via sensors or specific evaluation criteria.
The symbols and notations used throughout this study are summarized in Table 1. Accordingly, each object $p \in \mathcal{O}$ is represented by
$$\Phi(p) = \big(\phi_1(p), \phi_2(p), \ldots, \phi_L(p)\big),$$
where $\phi_i \in B$ for $i = 1, \ldots, L$. This representation serves as the mathematical foundation for nearness-based approximation and pattern analysis.
We now formalize the fundamental notions of nearness through indiscernibility relations.
Let
$$\sim_B = \{(p, p') \in \mathcal{O} \times \mathcal{O} : \forall \phi_i \in B, \ \Delta\phi_i(p, p') = 0\}$$
be the indiscernibility relation induced by the feature set $B$, where
$$\Delta\phi_i(p, p') = |\phi_i(p) - \phi_i(p')|$$
denotes the difference between the feature values of the objects $p$ and $p'$.
For each $p \in \mathcal{O}$, the equivalence class of $p$ with respect to $\sim_B$ is defined as
$$[p]_B = \{p' \in \mathcal{O} : (p, p') \in \sim_B\}.$$
The family of all such equivalence classes forms the quotient set
$$\mathcal{O}/\sim_B = \{[p]_B : p \in \mathcal{O}\},$$
which induces a partition $\xi_B$ on the universe of objects $\mathcal{O}$.
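The quotient construction $\mathcal{O}/\sim_B$ can be computed by grouping objects on their feature vectors. Below is a minimal Python sketch; the probe functions and object values are hypothetical, not taken from the paper:

```python
# Partition a universe of objects into equivalence classes under the
# indiscernibility relation ~_B: p ~ p' iff phi_i(p) == phi_i(p') for all phi_i in B.
# The feature table below is illustrative only.

def quotient(universe, features):
    """Return the partition O/~_B as a list of equivalence classes."""
    classes = {}
    for p in universe:
        signature = tuple(phi(p) for phi in features)  # feature vector Phi(p)
        classes.setdefault(signature, []).append(p)
    return list(classes.values())

# Illustrative probe functions on a toy universe.
values = {"p1": (0, 1), "p2": (1, 0), "p3": (1, 1), "p4": (1, 1), "p5": (1, 0)}
phi1 = lambda p: values[p][0]
phi2 = lambda p: values[p][1]

partition = quotient(["p1", "p2", "p3", "p4", "p5"], [phi1, phi2])
print(partition)  # objects with identical feature vectors fall into one class
```

Objects sharing the same feature vector (here p3 and p4, and likewise p2 and p5) land in the same class, exactly as the indiscernibility relation prescribes.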
Definition 3
([52]). Let $p, p' \in \mathcal{O}$ and $B \subseteq F$. The objects $p$ and $p'$ are said to be minimally near to each other if there exists at least one probe function $\phi_i \in B$ such that
$$p \sim_{\phi_i} p' \iff \Delta\phi_i(p, p') = 0.$$
This principle is referred to as the Nearness Description Principle (NDP). Under this condition, the objects share at least one common description and may belong to the same equivalence class $[p]_B \in \xi_B$.
Definition 4
([52]). Let $X \subseteq \mathcal{O}$ and $B \subseteq F$. If every object in $X$ is near to itself with respect to $B$, then $X$ is called a near set relative to itself (reflexive nearness).

2.2. Nearness Approximation Space (NAS)

A Nearness Approximation Space (NAS) is a formal mathematical structure that models perceptual similarity among objects based on feature-wise indiscernibility relations. Introduced by Peters [4,5], NAS generalizes classical Pawlak [2] approximation spaces by incorporating a family of nearness relations.
A nearness approximation space is defined as a quintuple:
$$\mathrm{NAS} = \big(\mathcal{O}, F, B_r, N_r(B), \nu_{N_r}\big).$$
The symbols and their respective interpretations within this space are summarized in Table 2.
The nearness approximation space enables a granular analysis of the universe $\mathcal{O}$ by considering multiple feature combinations. The lower approximation $N_r(X)$ contains objects that certainly belong to $X$ based on the descriptions in $B_r$, whereas the upper approximation $N_r^*(X)$ contains objects that possibly belong to $X$. The boundary region $\mathrm{Bnd}_{N_r(B)}(X) = N_r^*(X) \setminus N_r(X)$ represents the region of perceptual uncertainty.
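Computationally, the lower and upper approximations reduce to class-wise containment and intersection tests against the nearness classes. The sketch below uses an illustrative partition and target set (not data from the paper) and also extracts the boundary region:

```python
# Lower/upper approximation of a subset X of the universe with respect to
# a partition into nearness classes (illustrative toy data).

def lower(partition, X):
    # Union of all classes entirely contained in X: objects certainly in X.
    return {p for cls in partition if cls <= X for p in cls}

def upper(partition, X):
    # Union of all classes meeting X: objects possibly in X.
    return {p for cls in partition if cls & X for p in cls}

partition = [{"p1"}, {"p2", "p5"}, {"p3", "p4"}]
X = {"p1", "p3"}

lo, up = lower(partition, X), upper(partition, X)
boundary = up - lo  # region of perceptual uncertainty
print(lo, up, boundary)
```

Here the class {p3, p4} meets X without being contained in it, so p3 and p4 fall into the boundary region.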

2.3. Near Soft Sets

In this subsection, we introduce the notion of a near soft set by integrating soft set theory with nearness approximation spaces. This framework incorporates perceptual similarity into parameterized families of subsets, thereby providing a robust structure for handling uncertain data.
Let $\mathrm{NAS} = (\mathcal{O}, F, B_r, N_r, \nu_{N_r})$ be a nearness approximation space, where $\mathcal{O}$ is the universe of discourse, and let $E$ denote the set of parameters.
Definition 5
([3]). Let $\mathcal{O}$ be a universal set and $E$ be a set of parameters. For any non-empty subset $B \subseteq E$, a soft set over $\mathcal{O}$ is an ordered pair $(F, B)$, denoted by $F_B$, where
$$F : B \to \mathcal{P}(\mathcal{O})$$
is a set-valued mapping.
Definition 6
([71]). Let $\mathrm{NAS} = (\mathcal{O}, F, B_r, N_r, \nu_{N_r})$ be a nearness approximation space and let $\sigma = (F, B)$ be a soft set over $\mathcal{O}$. The lower and upper near approximations of $\sigma$ with respect to $\mathrm{NAS}$ are denoted by $N_r(\sigma) = (F_*, B)$ and $N_r^*(\sigma) = (F^*, B)$, respectively, where $F_*$ and $F^*$ are set-valued mappings defined as follows:
$$F_*(\phi) = N_r(F(\phi)) = \{p \in \mathcal{O} : [p]_{B_r} \subseteq F(\phi)\},$$
$$F^*(\phi) = N_r^*(F(\phi)) = \{p \in \mathcal{O} : [p]_{B_r} \cap F(\phi) \neq \emptyset\},$$
for all $\phi \in B$. The operators $N_r$ and $N_r^*$ are called the lower and upper near approximation operators on soft sets, respectively. If $\mathrm{Bnd}_{N_r(B)}(\sigma) \neq \emptyset$, then the soft set $\sigma$ is called a near soft set.
Example 2. 
Let $\mathcal{O} = \{p_1, p_2, p_3, p_4, p_5\}$ and $B = \{\phi_1, \phi_2\} \subseteq F = \{\phi_1, \phi_2, \phi_3\}$. The object descriptions are given in Table 3.
Define the soft set $F_B$ as
$$F_B = \{(\phi_1, \{p_1, p_3\}), (\phi_2, \{p_1, p_2, p_3\})\}.$$
  • Case: r = 1
The nearness classes are as follows:
  • For $\phi_1$: $[p_1]_{\phi_1} = \{p_1\}$, $[p_2]_{\phi_1} = \{p_2, p_5\}$, $[p_3]_{\phi_1} = \{p_3, p_4\}$.
  • For $\phi_2$: $[p_1]_{\phi_2} = \{p_1\}$, $[p_2]_{\phi_2} = \{p_2\}$, $[p_3]_{\phi_2} = \{p_3, p_4\}$, $[p_5]_{\phi_2} = \{p_5\}$.
The approximations are
$$N_r(F_B) = \{(\phi_1, \{p_1\}), (\phi_2, \{p_1, p_2\})\}, \quad N_r^*(F_B) = \{(\phi_1, \{p_1, p_3, p_4\}), (\phi_2, \{p_1, p_2, p_3, p_4\})\}.$$
Since the boundary region is well-defined and nonempty,
$$\mathrm{Bnd}_{N_r(B)}(F_B) \neq \emptyset,$$
the soft set $F_B$ is a near soft set.
  • Case: r = 2
For $B_r = \{\phi_1, \phi_2\}$, the classes are $[p_1]_{B_r} = \{p_1\}$, $[p_2]_{B_r} = \{p_2\}$, $[p_3]_{B_r} = \{p_3, p_4\}$, and $[p_5]_{B_r} = \{p_5\}$. The approximations remain consistent, preserving the NSS structure.
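The $r = 1$ computations of Example 2 can be reproduced mechanically. In the sketch below, each value set $F(\phi)$ is approximated in the nearness classes that the example lists for $\phi$; the class data are copied from the example, while the helper functions are our own reading of the computation:

```python
# Reproduce the r = 1 lower/upper approximations of Example 2.
# Nearness classes per parameter, as listed in the example.
classes = {
    "phi1": [{"p1"}, {"p2", "p5"}, {"p3", "p4"}],
    "phi2": [{"p1"}, {"p2"}, {"p3", "p4"}, {"p5"}],
}
F = {"phi1": {"p1", "p3"}, "phi2": {"p1", "p2", "p3"}}

def lower(partition, X):
    return {p for cls in partition if cls <= X for p in cls}

def upper(partition, X):
    return {p for cls in partition if cls & X for p in cls}

F_lower = {phi: lower(classes[phi], F[phi]) for phi in F}
F_upper = {phi: upper(classes[phi], F[phi]) for phi in F}

# lower: phi1 -> {p1},          phi2 -> {p1, p2}
# upper: phi1 -> {p1, p3, p4},  phi2 -> {p1, p2, p3, p4}
print(F_lower)
print(F_upper)
```

The resulting sets match the approximations stated in the example, and the nonempty gap between upper and lower approximations confirms that $F_B$ is a near soft set.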
Definition 7
([71,72,76]). Let $F_B, G_B \in \mathrm{NSS}(\mathcal{O})$ be near soft sets with parameter set $B \subseteq F$. Then:
(i)
$F_B$ is a null near soft set, denoted by $F_\emptyset$, if
$$F(\phi) = \emptyset \ \text{for all}\ \phi \in B.$$
(ii)
$F_B$ is a $B$-universal near soft set, denoted by $F_{\tilde{B}}$, if
$$F(\phi) = \mathcal{O} \ \text{for all}\ \phi \in B.$$
If $B = F$, then $F_{\tilde{B}}$ is called the universal near soft set, denoted by $F_{\tilde{F}}$.
(iii)
$F_B$ is a near soft subset of $G_B$, denoted by $F_B \subseteq G_B$, if
$$N_r(F(\phi)) \subseteq N_r(G(\phi)) \ \text{for all}\ \phi \in B.$$
(iv)
$F_B$ is a near soft superset of $G_B$, denoted by $F_B \supseteq G_B$, if $G_B \subseteq F_B$.
(v)
$F_B$ and $G_B$ are equal, denoted by $F_B = G_B$, if and only if
$$F_B \subseteq G_B \ \text{and}\ G_B \subseteq F_B.$$
(vi)
The difference of $F_B$ and $G_B$, denoted by $F_B \setminus G_B = H_B$, is defined by
$$H(\phi) = F(\phi) \setminus G(\phi) \ \text{for all}\ \phi \in B.$$
Definition 8
([71]). Let $F_A, G_B \in \mathrm{NSS}(\mathcal{O})$ be two near soft sets with parameter sets $A$ and $B$, respectively, and let $C = A \cup B$.
(i)
The intersection $F_A \sqcap G_B = H_C$ is defined for each $\phi \in C$ by
$$H(\phi) = \begin{cases} F(\phi), & \phi \in A \setminus B, \\ G(\phi), & \phi \in B \setminus A, \\ F(\phi) \cap G(\phi), & \phi \in A \cap B. \end{cases}$$
(ii)
The union $F_A \sqcup G_B = H_C$ is defined for each $\phi \in C$ by
$$H(\phi) = \begin{cases} F(\phi), & \phi \in A \setminus B, \\ G(\phi), & \phi \in B \setminus A, \\ F(\phi) \cup G(\phi), & \phi \in A \cap B. \end{cases}$$
(iii)
The complement of $F_B$, denoted by $F_B^c$, is defined by
$$F^c(\phi) = \mathcal{O} \setminus F(\phi) \ \text{for all}\ \phi \in B.$$
In this case, $F_B^c$ is called the near soft complement of $F_B$, and it satisfies
$$(F_B^c)^c = F_B, \qquad F_\emptyset^c = F_{\tilde{F}}.$$
It is important to distinguish the proposed near soft product operations from existing generalized soft products in the literature, such as fuzzy or intuitionistic soft products. While most generalized models focus on extending the membership degree of parameters, they still operate under a crisp indiscernibility assumption—meaning that objects are either distinct or identical. In contrast, the near soft AND and OR products introduced here are fundamentally different, as they are defined over Nearness Approximation Spaces (NASs). This enables our operations to account for the “nearness” of objects that are descriptively similar but not identical. Unlike standard soft products, which produce a single resultant set, our near soft products yield lower and upper approximations, thereby providing a dual-layer output that captures the boundary uncertainty inherent in human perception.

2.4. Near Soft AND and OR Products

In this subsection, we extend the fundamental product operations of soft set theory to the setting of nearness approximation spaces. The near soft AND and OR products serve as powerful tools for integrating multi-parameter information while preserving the perceptual indiscernibility of objects. By formalizing these operations, we establish a framework that not only combines different data sources but also refines the resulting sets through nearness approximations, thereby providing a more granular representation of uncertainty.
Definition 9.
Let $\mathrm{NAS} = (\mathcal{O}, F, B_r, N_r, \nu_{N_r})$ be a nearness approximation space, and let $\sigma_1 = (F_1, B_1)$ and $\sigma_2 = (F_2, B_2)$ be two near soft sets over $\mathcal{O}$.
(i)
The near soft AND operation of $\sigma_1$ and $\sigma_2$, denoted by $\sigma_1 \wedge_{N_r} \sigma_2$, is defined as
$$\sigma_1 \wedge_{N_r} \sigma_2 = (H, B_1 \times B_2),$$
where $H = (H_*, H^*)$ and for each $(\varphi, \psi) \in B_1 \times B_2$:
$$H_*(\varphi, \psi) = N_r(F_1(\varphi) \cap F_2(\psi)), \qquad H^*(\varphi, \psi) = N_r^*(F_1(\varphi) \cap F_2(\psi)).$$
(ii)
The near soft OR operation of $\sigma_1$ and $\sigma_2$, denoted by $\sigma_1 \vee_{N_r} \sigma_2$, is defined as
$$\sigma_1 \vee_{N_r} \sigma_2 = (K, B_1 \times B_2),$$
where $K = (K_*, K^*)$ and for each $(\varphi, \psi) \in B_1 \times B_2$:
$$K_*(\varphi, \psi) = N_r(F_1(\varphi) \cup F_2(\psi)), \qquad K^*(\varphi, \psi) = N_r^*(F_1(\varphi) \cup F_2(\psi)).$$
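The mechanics of Definition 9 can be prototyped directly. The sketch below uses a single fixed partition as a stand-in for the $N_r$ operators, with entirely hypothetical parameter names and value sets; it illustrates how parameter pairs are formed and their combined value sets approximated, and is not the paper's own computation:

```python
# Sketch of the near soft AND/OR products of Definition 9 (toy data).
# One fixed class partition stands in for the N_r operators; the parameter
# names "a", "b", "c" and the value sets are illustrative, not from the paper.
from itertools import product

partition = [{"p1"}, {"p2", "p5"}, {"p3", "p4"}]

def lower(X):
    return {p for cls in partition if cls <= X for p in cls}

def upper(X):
    return {p for cls in partition if cls & X for p in cls}

F1 = {"a": {"p1", "p3"}}
F2 = {"b": {"p3", "p4"}, "c": {"p2"}}

def near_soft_product(F, G, combine):
    """Map each (phi, psi) to the (lower, upper) approximations of combine(F(phi), G(psi))."""
    return {
        (phi, psi): (lower(combine(F[phi], G[psi])), upper(combine(F[phi], G[psi])))
        for phi, psi in product(F, G)
    }

AND = near_soft_product(F1, F2, set.intersection)  # yields (H_*, H^*) per pair
OR = near_soft_product(F1, F2, set.union)          # yields (K_*, K^*) per pair
print(AND[("a", "b")])  # intersection {p3}: empty lower, upper {p3, p4}
```

Note the dual-layer output: each parameter pair carries both a lower and an upper approximation, so boundary uncertainty survives the aggregation step rather than being collapsed into a single set.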
Example 3. 
Let $F_B$ be the near soft set introduced in Example 2, and let $G_B$ be another near soft set constructed from the same information system (Table 3), defined by
$$G_B = \{(\phi_1, \{p_2, p_5\}), (\phi_2, \{p_3\})\}.$$
In this example, we investigate the behavior of the near soft AND and OR product operations between F B and G B under different values of the nearness parameter r.
  • Case r = 1 .
1. Near Soft AND Product $(F_B \wedge_{N_1} G_B)$. For each $(\varphi, \psi) \in B \times B$, the intersection $F(\varphi) \cap G(\psi)$ is first computed, and then its lower and upper approximations are determined.
  • $(\phi_1, \phi_1)$: $F(\phi_1) \cap G(\phi_1) = \emptyset \Rightarrow H_* = \emptyset,\ H^* = \emptyset$.
  • $(\phi_1, \phi_2)$: $F(\phi_1) \cap G(\phi_2) = \{p_3\} \Rightarrow H_* = \emptyset,\ H^* = \{p_3, p_4\}$.
  • $(\phi_2, \phi_1)$: $F(\phi_2) \cap G(\phi_1) = \{p_2\} \Rightarrow H_* = \emptyset,\ H^* = \{p_2, p_5\}$.
  • $(\phi_2, \phi_2)$: $F(\phi_2) \cap G(\phi_2) = \{p_3\} \Rightarrow H_* = \emptyset,\ H^* = \{p_3, p_4\}$.
2. Near Soft OR Product $(F_B \vee_{N_1} G_B)$. Similarly, the union-based approximations are obtained as follows:
  • $(\phi_1, \phi_1)$: $F(\phi_1) \cup G(\phi_1) = \{p_1, p_2, p_3, p_5\} \Rightarrow K_* = \{p_1, p_2, p_5\},\ K^* = \mathcal{O}$.
  • $(\phi_1, \phi_2)$: $F(\phi_1) \cup G(\phi_2) = \{p_1, p_3\} \Rightarrow K_* = \{p_1\},\ K^* = \{p_1, p_3, p_4\}$.
  • $(\phi_2, \phi_1)$: $F(\phi_2) \cup G(\phi_1) = \{p_1, p_2, p_3, p_5\} \Rightarrow K_* = \{p_1, p_2, p_5\},\ K^* = \mathcal{O}$.
  • $(\phi_2, \phi_2)$: $F(\phi_2) \cup G(\phi_2) = \{p_1, p_2, p_3\} \Rightarrow K_* = \{p_1, p_2\},\ K^* = \{p_1, p_2, p_3, p_4\}$.
  • Case r = 2. We now repeat the above computations by considering the 2-neighborhoods induced by the nearness relation $B_r$.
1. Near Soft AND Product $(F_B \wedge_{N_2} G_B)$.
  • $(\phi_1, \phi_1)$: $F(\phi_1) \cap G(\phi_1) = \emptyset \Rightarrow H_*^{(2)} = \emptyset,\ H^{*(2)} = \emptyset$.
  • $(\phi_1, \phi_2)$: $F(\phi_1) \cap G(\phi_2) = \{p_3\} \Rightarrow H_*^{(2)} = \emptyset,\ H^{*(2)} = \{p_2, p_3, p_4, p_5\}$.
  • $(\phi_2, \phi_1)$: $F(\phi_2) \cap G(\phi_1) = \{p_2\} \Rightarrow H_*^{(2)} = \emptyset,\ H^{*(2)} = \{p_1, p_2, p_5\}$.
  • $(\phi_2, \phi_2)$: $F(\phi_2) \cap G(\phi_2) = \{p_3\} \Rightarrow H_*^{(2)} = \emptyset,\ H^{*(2)} = \{p_2, p_3, p_4, p_5\}$.
2. Near Soft OR Product $(F_B \vee_{N_2} G_B)$.
  • $(\phi_1, \phi_1)$: $F(\phi_1) \cup G(\phi_1) = \{p_1, p_2, p_3, p_5\} \Rightarrow K_*^{(2)} = \{p_1, p_2, p_5\},\ K^{*(2)} = \mathcal{O}$.
  • $(\phi_1, \phi_2)$: $F(\phi_1) \cup G(\phi_2) = \{p_1, p_3\} \Rightarrow K_*^{(2)} = \{p_1\},\ K^{*(2)} = \{p_1, p_2, p_3, p_4, p_5\}$.
  • $(\phi_2, \phi_1)$: $F(\phi_2) \cup G(\phi_1) = \{p_1, p_2, p_3, p_5\} \Rightarrow K_*^{(2)} = \{p_1, p_2, p_5\},\ K^{*(2)} = \mathcal{O}$.
  • $(\phi_2, \phi_2)$: $F(\phi_2) \cup G(\phi_2) = \{p_1, p_2, p_3\} \Rightarrow K_*^{(2)} = \{p_1, p_2\},\ K^{*(2)} = \{p_1, p_2, p_3, p_4, p_5\}$.
Remark 1. 
The extension presented in Example 3 shows that increasing the nearness parameter r enlarges the boundary regions of near soft products by expanding the upper approximations, while the lower approximations preserve their conservative behavior. Moreover, it is observed that the near soft AND product generally produces larger boundary regions due to its intersection-based nature, whereas the near soft OR product tends to enlarge the lower approximation, thereby reducing certain forms of perceptual uncertainty. These structural behaviors are fully consistent with the philosophy of nearness approximation spaces and demonstrate the flexibility of near soft logical products in modeling similarity-based information.
Remark 2. 
The computational behavior analyzed in Example 3 forms the operational foundation of the near soft decision-making framework developed in the subsequent section. Since boundary-oriented sets such as $B_K(\sigma)$ and the Uni–Int decision set $D(\sigma)$ are directly influenced by the size of boundary regions, different choices of the nearness parameter $r$ may lead to different admissible decision objects. Therefore, the parameter $r$ can be interpreted as a decision-sensitivity control parameter within near soft information systems.
Theorem 1. 
Let $\mathrm{NAS} = (\mathcal{O}, F, B_r, N_r, \nu_{N_r})$ be a nearness approximation space, and let $\sigma_1 = (F_1, B_1)$ and $\sigma_2 = (F_2, B_2)$ be two near soft sets over $\mathcal{O}$. Then,
(i)
The near soft AND product $\sigma_1 \wedge_{N_r} \sigma_2$ is a near soft set over $B_1 \times B_2$;
(ii)
The near soft OR product $\sigma_1 \vee_{N_r} \sigma_2$ is a near soft set over $B_1 \times B_2$.
Proof. 
We prove the first statement for the AND operation; the proof for the OR case follows analogously.
Step 1: Structural Definition. By Definition 9, the near soft AND product is defined as $\sigma_1 \wedge_{N_r} \sigma_2 = (H, B_1 \times B_2)$, where the mapping $H : B_1 \times B_2 \to \mathcal{P}(\mathcal{O}) \times \mathcal{P}(\mathcal{O})$ assigns each pair $(\varphi, \psi)$ to $(H_*(\varphi, \psi), H^*(\varphi, \psi))$. Here,
$$H_*(\varphi, \psi) = N_r(F_1(\varphi) \cap F_2(\psi)), \qquad H^*(\varphi, \psi) = N_r^*(F_1(\varphi) \cap F_2(\psi)).$$
Step 2: Verification of Approximation Properties. In any nearness approximation space, for every subset $X \subseteq \mathcal{O}$, the approximation operators satisfy
$$N_r(X) \subseteq X \subseteq N_r^*(X).$$
Setting $X = F_1(\varphi) \cap F_2(\psi)$ yields
$$N_r(F_1(\varphi) \cap F_2(\psi)) \subseteq N_r^*(F_1(\varphi) \cap F_2(\psi)).$$
Hence, for each $(\varphi, \psi) \in B_1 \times B_2$, we obtain
$$H_*(\varphi, \psi) \subseteq H^*(\varphi, \psi),$$
which ensures that the corresponding boundary region is well-defined.
Step 3: Conclusion. Since $H_*$ and $H^*$ are well-defined set-valued mappings into $\mathcal{P}(\mathcal{O})$ and satisfy the required approximation properties, it follows that $\sigma_1 \wedge_{N_r} \sigma_2$ is a near soft set over the parameter set $B_1 \times B_2$.
The proof for the OR operation ($\sigma_1 \vee_{N_r} \sigma_2$) proceeds in an analogous manner by replacing the intersection $F_1(\varphi) \cap F_2(\psi)$ with the union $F_1(\varphi) \cup F_2(\psi)$ and applying the same approximation properties. □
Proposition 1. 
Let $\mathrm{NAS} = (\mathcal{O}, F, B_r, N_r, \nu_{N_r})$ be a nearness approximation space and let $\sigma_1 = (F_1, B)$, $\sigma_2 = (F_2, B)$, and $\sigma_3 = (F_3, C)$ be near soft sets over $\mathcal{O}$. Then the following properties hold:
(i)
Weak idempotency:
σ 1 N r σ 1 σ 1 and σ 1 N r σ 1 σ 1 ,
where ≅ denotes near soft equivalence under a natural parameter projection.
(ii)
Monotonicity:  If σ 1 σ 2 , then
σ 1 N r σ 3 σ 2 N r σ 3 and σ 1 N r σ 3 σ 2 N r σ 3 .
(iii)
Projected containment:  There exists a natural projection π : B × C B such that
π ( σ 1 N r σ 3 ) σ 1 and σ 1 π ( σ 1 N r σ 3 ) .
Proof. 
(i)
Let σ_1 = (F, B). By Definition 9,
σ_1 ∧_Nr σ_1 = (H, B × B),
where
H_*(φ, ψ) = N_r(F(φ) ∩ F(ψ)), H^*(φ, ψ) = N_r^*(F(φ) ∩ F(ψ)).
Restricting to the diagonal Δ_B = {(φ, φ) : φ ∈ B} yields
H_*(φ, φ) = N_r(F(φ)), H^*(φ, φ) = N_r^*(F(φ)),
which coincides with σ_1. Hence, σ_1 ∧_Nr σ_1 is near soft equivalent to σ_1. The proof for the OR operation is analogous.
(ii)
Suppose σ_1 ⊆ σ_2. Then F_1(φ) ⊆ F_2(φ) for all φ ∈ B. For any ψ ∈ C,
F_1(φ) ∩ F_3(ψ) ⊆ F_2(φ) ∩ F_3(ψ),
and
F_1(φ) ∪ F_3(ψ) ⊆ F_2(φ) ∪ F_3(ψ).
By the monotonicity of N_r and N_r^*, these inclusions are preserved, yielding the desired result.
(iii)
Define π : B × C → B by π(φ, ψ) = φ. Since
F_1(φ) ∩ F_3(ψ) ⊆ F_1(φ) and F_1(φ) ⊆ F_1(φ) ∪ F_3(ψ),
applying N_r and N_r^* preserves these inclusions. Hence,
π(σ_1 ∧_Nr σ_3) ⊆ σ_1 and σ_1 ⊆ π(σ_1 ∨_Nr σ_3). □
Theorem 2 (De Morgan Laws).
Let σ_1 = (F_1, B_1) and σ_2 = (F_2, B_2) be two near soft sets. Then, for all (φ, ψ) ∈ B_1 × B_2, the following identities hold:
(σ_1 ∧_Nr σ_2)^c = σ_1^c ∨_Nr σ_2^c, (σ_1 ∨_Nr σ_2)^c = σ_1^c ∧_Nr σ_2^c,
where all operations are taken over the parameter set B_1 × B_2.
Proof. 
Let (φ, ψ) ∈ B_1 × B_2. By the definition of the near soft AND operation,
(F_1 ∧_Nr F_2)(φ, ψ) = N_r(F_1(φ) ∩ F_2(ψ)).
Taking the relative complement in O and applying the classical De Morgan laws, we obtain
O ∖ (F_1(φ) ∩ F_2(ψ)) = (O ∖ F_1(φ)) ∪ (O ∖ F_2(ψ)) = F_1^c(φ) ∪ F_2^c(ψ).
This coincides with the core mapping of σ_1^c ∨_Nr σ_2^c over B_1 × B_2. Hence,
(σ_1 ∧_Nr σ_2)^c = σ_1^c ∨_Nr σ_2^c.
The second identity follows analogously. □
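At the set level, the proof combines the classical De Morgan law with the standard complement duality of partition-based approximations, (N_r(X))^c = N_r^*(X^c), which is what allows the complement to exchange the AND product with the OR product. A small check under that partition assumption (the sets are illustrative):

```python
# Verify the two set-level facts used in Theorem 2 on sample data.
O = {"u1", "u2", "u3", "u4"}
classes = [{"u1", "u2"}, {"u3", "u4"}]

def lower(X):
    # N_r(X): union of classes contained in X
    return {x for c in classes if c <= X for x in c}

def upper(X):
    # N_r^*(X): union of classes meeting X
    return {x for c in classes if c & X for x in c}

A, B = {"u1", "u3"}, {"u2", "u3"}

# Classical De Morgan law applied inside the proof:
demorgan = (O - (A & B)) == ((O - A) | (O - B))

# Rough complement duality (partition-based case):
duality = all((O - lower(X)) == upper(O - X) for X in (A, B, A & B))
```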
Theorem 3 (Absorption Laws for Near Soft Operations).
Let NAS = (O, F, B_r, N_r, ν_Nr) be a nearness approximation space, and let σ_1 = (F_1, B) and σ_2 = (F_2, B) be two near soft sets over the same parameter set B. Then the following absorption laws hold:
σ_1 ∧_Nr (σ_1 ∨_Nr σ_2) = σ_1, σ_1 ∨_Nr (σ_1 ∧_Nr σ_2) = σ_1.
Proof. (i) AND–OR absorption.
By Definition 9, the near soft OR operation σ_1 ∨_Nr σ_2 is given by
(σ_1 ∨_Nr σ_2)(ψ) = N_r^*(F_1(ψ) ∪ F_2(ψ)), ψ ∈ B.
Hence, the near soft AND operation σ_1 ∧_Nr (σ_1 ∨_Nr σ_2) is defined by
(σ_1 ∧_Nr (σ_1 ∨_Nr σ_2))(φ) = N_r(F_1(φ) ∩ (F_1(φ) ∪ F_2(φ))), φ ∈ B.
Using the classical absorption law of set theory,
F_1(φ) ∩ (F_1(φ) ∪ F_2(φ)) = F_1(φ),
we obtain
(σ_1 ∧_Nr (σ_1 ∨_Nr σ_2))(φ) = N_r(F_1(φ)).
Since N_r(F_1(φ)) = F_1(φ) by the definition of a near soft set, it follows that
σ_1 ∧_Nr (σ_1 ∨_Nr σ_2) = σ_1.
(ii) OR–AND absorption. Similarly, by Definition 9,
(σ_1 ∧_Nr σ_2)(ψ) = N_r(F_1(ψ) ∩ F_2(ψ)), ψ ∈ B.
Therefore,
(σ_1 ∨_Nr (σ_1 ∧_Nr σ_2))(φ) = N_r^*(F_1(φ) ∪ (F_1(φ) ∩ F_2(φ))), φ ∈ B.
Applying the classical absorption law,
F_1(φ) ∪ (F_1(φ) ∩ F_2(φ)) = F_1(φ),
we obtain
(σ_1 ∨_Nr (σ_1 ∧_Nr σ_2))(φ) = N_r^*(F_1(φ)).
By the definition of the upper near approximation operator, N_r^*(F_1(φ)) = F_1(φ). Hence,
σ_1 ∨_Nr (σ_1 ∧_Nr σ_2) = σ_1.
This completes the proof. □
This completes the proof. □
Theorem 4 (Distributive Laws for Near Soft Operations).
Let NAS = (O, F, B_r, N_r, ν_Nr) be a nearness approximation space, and let σ_1 = (F_1, B), σ_2 = (F_2, B), and σ_3 = (F_3, B) be near soft sets defined over the same parameter set B. Then, the following distributive laws hold:
σ_1 ∧_Nr (σ_2 ∨_Nr σ_3) = (σ_1 ∧_Nr σ_2) ∨_Nr (σ_1 ∧_Nr σ_3), σ_1 ∨_Nr (σ_2 ∧_Nr σ_3) = (σ_1 ∨_Nr σ_2) ∧_Nr (σ_1 ∨_Nr σ_3).
Proof. 
We prove the first identity; the second follows analogously.
(i) 
AND–OR distributivity. Let φ ∈ B be arbitrary. By Definition 9,
(σ_2 ∨_Nr σ_3)(φ) = N_r^*(F_2(φ) ∪ F_3(φ)).
Hence,
(σ_1 ∧_Nr (σ_2 ∨_Nr σ_3))(φ) = N_r(F_1(φ) ∩ (F_2(φ) ∪ F_3(φ))).
Using the classical distributive law in O,
F_1(φ) ∩ (F_2(φ) ∪ F_3(φ)) = (F_1(φ) ∩ F_2(φ)) ∪ (F_1(φ) ∩ F_3(φ)).
Applying the near approximation operators to both sides of this identity, we obtain
N_r((F_1(φ) ∩ F_2(φ)) ∪ (F_1(φ) ∩ F_3(φ))) = ((σ_1 ∧_Nr σ_2) ∨_Nr (σ_1 ∧_Nr σ_3))(φ).
Therefore,
σ_1 ∧_Nr (σ_2 ∨_Nr σ_3) = (σ_1 ∧_Nr σ_2) ∨_Nr (σ_1 ∧_Nr σ_3).
(ii)
OR–AND distributivity. The second equality is obtained analogously by using the classical identity
F_1(φ) ∪ (F_2(φ) ∩ F_3(φ)) = (F_1(φ) ∪ F_2(φ)) ∩ (F_1(φ) ∪ F_3(φ)),
together with the definitions of the near soft OR and AND operations. □
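Both proofs bottom out in classical set algebra before the approximation operators are applied. These underlying identities can be confirmed directly on sample sets (the sets themselves are illustrative):

```python
# Absorption and distributive laws of crisp set algebra, as used in
# the proofs of Theorems 3 and 4, checked on illustrative sets.
F1, F2, F3 = {"u1", "u3"}, {"u2", "u3"}, {"u3", "u4"}

laws = [
    (F1 & (F1 | F2)) == F1,                          # AND-OR absorption
    (F1 | (F1 & F2)) == F1,                          # OR-AND absorption
    (F1 & (F2 | F3)) == (F1 & F2) | (F1 & F3),       # AND over OR
    (F1 | (F2 & F3)) == (F1 | F2) & (F1 | F3),       # OR over AND
]
```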
It is important to clarify that, although the algebraic identities established in this section (e.g., Theorem 1 and Proposition 1) may appear formally similar to classical soft product properties, their validity within the near soft framework is not a trivial adaptation. In classical soft set theory, these laws are derived from standard set-theoretic intersection and union operations. In the near soft setting, however, every operation is governed by nearness approximation operators (N_r and N_r^*), which are non-linear and satisfy specific monotonicity properties. The preservation of these identities demonstrates that the proposed near soft products constitute well-behaved extensions that maintain algebraic integrity, even when the underlying universe is partitioned into indiscernibility classes. This structural consistency is a crucial result, as it ensures that nearness-based granularity does not compromise the fundamental logic of IA, while still providing the additional advantage of boundary-driven decision analysis.

3. Near Soft Decision-Making Method

In this section, we propose a practical decision-making algorithm based on near soft product operations. The method employs the lower and upper near approximations of AND–OR products to evaluate alternatives, thereby incorporating similarity-based uncertainty into the final decision-making process.
Let NAS = (O, F, B_r, N_r, ν_Nr) be a nearness approximation space. Let σ_1 = (F_1, B_1) and σ_2 = (F_2, B_2) be two near soft sets over O. Denote by S_P the family of all near soft products obtained from σ_1 and σ_2 via the AND and OR operations.
Step 1.
(Near Soft Product Construction). Construct a near soft product
σ = σ_1 e_Nr σ_2 ∈ S_P,
where e_Nr ∈ {∧_Nr, ∨_Nr}. The resulting near soft set is defined on B_1 × B_2 by
H(φ, ψ) = (H_*(φ, ψ), H^*(φ, ψ)), (φ, ψ) ∈ B_1 × B_2.
Step 2.
(Boundary Decision Function). Define the boundary-based decision function
BK : S_P → P(O)
by
BK(σ) = ⋃_{(φ, ψ) ∈ B_1 × B_2} (H^*(φ, ψ) ∖ H_*(φ, ψ)).
This set represents the collective boundary region capturing perceptual indiscernibility and uncertainty among the objects.
Step 3.
(Uni–Int Decision Set). Compute the global upper and lower regions as
Uni(σ) = ⋃_{(φ, ψ) ∈ B_1 × B_2} H^*(φ, ψ), Int(σ) = ⋃_{(φ, ψ) ∈ B_1 × B_2} H_*(φ, ψ).
The Uni–Int decision set is defined by
D(σ) = Uni(σ) ∖ Int(σ).
Step 4.
(Decision Rule). An object x ∈ O is called an admissible (or acceptable) decision object if
x ∈ D(σ).
Objects belonging to Int(σ) represent strong decisions, while those outside Uni(σ) are excluded from consideration.
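Steps 1–4 can be sketched directly in code. The sketch below assumes partition-based approximations and reads Int(σ) as the union of the lower regions (consistent with the note in Section 4.2); all identifiers and the sample data are illustrative.

```python
# A minimal implementation of the four-step Uni-Int mechanism for the
# AND product (assumption: partition-based approximations).

def lower(X, classes):
    # N_r(X): union of indiscernibility classes contained in X
    return {x for c in classes if c <= X for x in c}

def upper(X, classes):
    # N_r^*(X): union of indiscernibility classes meeting X
    return {x for c in classes if c & X for x in c}

def uni_int_decision(F1, F2, classes):
    # Step 1: AND product H(phi, psi) = (N_r(F1 ∩ F2), N_r^*(F1 ∩ F2))
    H = {(p, q): (lower(F1[p] & F2[q], classes),
                  upper(F1[p] & F2[q], classes))
         for p in F1 for q in F2}
    # Step 2: aggregated boundary region BK(sigma)
    BK = set().union(*(hi - lo for lo, hi in H.values()))
    # Step 3: global regions and the Uni-Int decision set
    Uni = set().union(*(hi for _, hi in H.values()))
    Int = set().union(*(lo for lo, _ in H.values()))
    D = Uni - Int
    # Step 4: D holds admissible objects, Int the strong ones
    return BK, Uni, Int, D

# Illustrative call:
classes = [{"u1", "u2"}, {"u3", "u4"}]
BK, Uni, Int, D = uni_int_decision({"p": {"u1", "u3", "u4"}},
                                   {"q": {"u3", "u4"}, "q2": {"u1"}},
                                   classes)
```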
Example 4 (Near Soft Decision-Making for Real Estate Selection).
Let
U = {u_1, u_2, u_3, u_4, u_5, u_6}
be the set of houses offered for sale by a real estate agency, and
E = {e_1, e_2, e_3, e_4, e_5, e_6, e_7, e_8, e_9}
be the parameter set defined as
e_1: expensive, e_2: beautiful, e_3: close to the city center, e_4: cheap, e_5: with a garden, e_6: modern, e_7: large, e_8: medium size, e_9: small.
Assume the nearness approximation space
NAS = (U, F, B_r, N_r, ν_Nr),
where the indiscernibility classes are given by
[u_1]_Br = [u_2]_Br = {u_1, u_2}, [u_3]_Br = [u_4]_Br = {u_3, u_4}, [u_5]_Br = [u_6]_Br = {u_5, u_6}.

3.1. Step 1. Near Soft Product Construction

According to Step 1 of the proposed method, the customer considers two groups of criteria.
Cost and location parameters:
B_1 = {e_1, e_3, e_4}, σ_1 = (F_1, B_1),
where
F_1(e_1) = {u_1, u_3}, F_1(e_3) = {u_2, u_3, u_4}, F_1(e_4) = {u_5, u_6}.
Structural and aesthetic parameters:
B_2 = {e_5, e_6, e_7}, σ_2 = (F_2, B_2),
where
F_2(e_5) = {u_1, u_4, u_5}, F_2(e_6) = {u_2, u_3, u_6}, F_2(e_7) = {u_3, u_4}.
Both σ_1 and σ_2 are valid near soft sets, since each parameter induces a nonempty boundary region.
We construct the near soft AND product
σ = σ_1 ∧_Nr σ_2 = (H, B_1 × B_2),
where
H(φ, ψ) = (H_*(φ, ψ), H^*(φ, ψ)),
with
H_*(φ, ψ) = N_r(F_1(φ) ∩ F_2(ψ)), H^*(φ, ψ) = N_r^*(F_1(φ) ∩ F_2(ψ)).

3.2. Step 2. Boundary Decision Function

Following Step 2 of the method, the boundary-based decision function is computed as
BK(σ) = ⋃_{(e_i, e_j) ∈ B_1 × B_2} (H^*(e_i, e_j) ∖ H_*(e_i, e_j)).
Since each object appears in at least one boundary region, we obtain
BK(σ) = U.
This indicates that all houses participate in perceptual uncertainty and, thus, necessitates a refined decision mechanism.

3.3. Step 3. Uni–Int Decision Set

According to Step 3, we compute
Uni(σ) = ⋃_{(e_i, e_j)} H^*(e_i, e_j) = U, Int(σ) = ⋃_{(e_i, e_j)} H_*(e_i, e_j) = {u_3, u_4}.
Hence, the Uni–Int decision set is
D(σ) = Uni(σ) ∖ Int(σ) = {u_1, u_2, u_5, u_6}.

3.4. Step 4. Final Decision

By Step 4 of the proposed method, an object is admissible if it belongs to D(σ). Therefore, the optimal houses for the customer are
{u_1, u_2, u_5, u_6}.
The houses u_3 and u_4 belong to Int(σ) and thus represent strong decision objects; accordingly, they are excluded from the boundary-based decision set.
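The outcome of Example 4 can be reproduced end to end. The partition-based approximations used below are an assumption consistent with the stated indiscernibility classes:

```python
# Reproduce Example 4: AND product, boundary set BK, Uni, Int, and D.
classes = [{"u1", "u2"}, {"u3", "u4"}, {"u5", "u6"}]
F1 = {"e1": {"u1", "u3"}, "e3": {"u2", "u3", "u4"}, "e4": {"u5", "u6"}}
F2 = {"e5": {"u1", "u4", "u5"}, "e6": {"u2", "u3", "u6"}, "e7": {"u3", "u4"}}
U = {f"u{i}" for i in range(1, 7)}

def lower(X):
    # N_r(X): union of classes contained in X
    return {x for c in classes if c <= X for x in c}

def upper(X):
    # N_r^*(X): union of classes meeting X
    return {x for c in classes if c & X for x in c}

H = {(p, q): (lower(F1[p] & F2[q]), upper(F1[p] & F2[q]))
     for p in F1 for q in F2}
BK  = set().union(*(hi - lo for lo, hi in H.values()))
Uni = set().union(*(hi for _, hi in H.values()))
Int = set().union(*(lo for lo, _ in H.values()))
D   = Uni - Int
# BK = U, Uni = U, Int = {u3, u4}, D = {u1, u2, u5, u6}, as reported.
```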

4. Comparative and Sensitivity Analysis

This section strengthens the practical component of the paper by (i) comparing the proposed near soft decision mechanism with the corresponding classical soft-product construction and (ii) studying the sensitivity of the resulting decision sets with respect to the nearness parameter r. The analysis is carried out using the real-estate case study in Example 4.

4.1. Classical Soft AND-Product Baseline

Let U = {u_1, …, u_6}, B_1 = {e_1, e_3, e_4}, B_2 = {e_5, e_6, e_7}, and σ_1 = (F_1, B_1), σ_2 = (F_2, B_2) be as in Example 4. The classical soft AND-product (without nearness approximations) is defined by
σ_1 ∧ σ_2 = (H, B_1 × B_2), H(e_i, e_j) = F_1(e_i) ∩ F_2(e_j).
A standard and transparent way to obtain a crisp ranking in the classical setting is to use an occurrence (choice) function:
c(u) = |{(e_i, e_j) ∈ B_1 × B_2 : u ∈ H(e_i, e_j)}|, u ∈ U.
Using the intersections listed in Example 4, we obtain
c(u_1) = 1, c(u_2) = 1, c(u_3) = 4, c(u_4) = 2, c(u_5) = 1, c(u_6) = 1.
Hence, under the classical soft-product baseline, the alternative u_3 is the strongest candidate (largest choice value), followed by u_4, whereas the remaining objects are tied. This baseline will be contrasted with the proposed near soft model, which explicitly incorporates perceptual indiscernibility and yields a structured, boundary-driven admissibility set.
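The baseline choice values can be recomputed from the raw intersections, with no approximation operators involved:

```python
# Classical soft AND-product baseline: occurrence counts c(u).
F1 = {"e1": {"u1", "u3"}, "e3": {"u2", "u3", "u4"}, "e4": {"u5", "u6"}}
F2 = {"e5": {"u1", "u4", "u5"}, "e6": {"u2", "u3", "u6"}, "e7": {"u3", "u4"}}

c = {f"u{i}": 0 for i in range(1, 7)}
for p in F1:
    for q in F2:
        for u in F1[p] & F2[q]:      # crisp cell H(e_i, e_j)
            c[u] += 1
# c = {u1: 1, u2: 1, u3: 4, u4: 2, u5: 1, u6: 1}
```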

4.2. Near Soft Boundary-Driven Comparison (Same Inputs)

In the near soft setting, the same intersections F_1(e_i) ∩ F_2(e_j) are refined via the lower and upper near approximations:
H_{r,*}(e_i, e_j) = N_r(F_1(e_i) ∩ F_2(e_j)), H_r^*(e_i, e_j) = N_r^*(F_1(e_i) ∩ F_2(e_j)).
The boundary contribution of each parameter pair is then
B_r(e_i, e_j) = H_r^*(e_i, e_j) ∖ H_{r,*}(e_i, e_j),
and the aggregated boundary set is BK_r(σ) = ⋃_{(e_i, e_j)} B_r(e_i, e_j). To quantify how much each object participates in perceptual uncertainty, define the boundary frequency
b_r(u) = |{(e_i, e_j) ∈ B_1 × B_2 : u ∈ B_r(e_i, e_j)}|, u ∈ U.
Finally, the global near soft regions are defined by
U_r(σ) = ⋃_{(e_i, e_j) ∈ B_1 × B_2} H_r^*(e_i, e_j), Int_r(σ) = ⋃_{(e_i, e_j) ∈ B_1 × B_2} H_{r,*}(e_i, e_j),
and the admissible (boundary-driven) decision set is
D_r(σ) = U_r(σ) ∖ Int_r(σ).
(Notice that Int_r(σ) is taken as the union of the lower regions, which is consistent with the output {u_3, u_4} reported in Example 4.)
Interpretation. The classical baseline c measures how often an object satisfies the crisp AND-combination of parameters. In contrast, b_r measures how often an object lies in boundary regions induced by nearness, thereby capturing similarity-based indiscernibility (i.e., "near misses" that are relevant in perceptual decision environments). This yields a decision set D_r(σ) that is not merely a winner-take-all output but a structured set of admissible choices under controlled uncertainty.

4.3. Sensitivity with Respect to the Nearness Parameter r

We now demonstrate how the near soft output changes as the nearness tolerance increases. Following the philosophy summarized in Table 4 (larger r yields more flexible similarity), we consider two levels:
  • Strict nearness (r = 1). The indiscernibility classes are those given in Example 4: [u_1]_{r=1} = [u_2]_{r=1} = {u_1, u_2}, [u_3]_{r=1} = [u_4]_{r=1} = {u_3, u_4}, [u_5]_{r=1} = [u_6]_{r=1} = {u_5, u_6}.
  • More tolerant nearness (r = 2). To model increased perceptual tolerance, we use a coarser partition (objects become indiscernible in larger groups), e.g., [u_1]_{r=2} = {u_1, u_2, u_3, u_4} and [u_5]_{r=2} = {u_5, u_6}. This is a natural sensitivity scenario: as r increases, the upper approximations enlarge and boundary regions become wider.
The calculated values of the near soft AND-product σ = σ_1 ∧_Nr σ_2 and its corresponding lower (H_*) and upper (H^*) approximations are presented in Table 5.
Table 6 highlights a key modeling advantage of the near soft approach: as similarity tolerance increases, the model becomes more conservative in declaring "strong" objects (the certain region Int_r(σ) shrinks), while the admissible boundary region expands. In particular, when nearness is highly tolerant, many singleton intersections do not contain entire indiscernibility blocks, causing lower approximations to vanish and shifting the decision emphasis to boundary-driven admissibility.
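This qualitative claim, that the certain region shrinks while admissibility expands, can be checked by recomputing the regions under both partitions; the r = 2 blocks below are the assumed coarsening described above.

```python
# Sensitivity of Uni/Int/D to the nearness partition (Example 4 data).
F1 = {"e1": {"u1", "u3"}, "e3": {"u2", "u3", "u4"}, "e4": {"u5", "u6"}}
F2 = {"e5": {"u1", "u4", "u5"}, "e6": {"u2", "u3", "u6"}, "e7": {"u3", "u4"}}

def regions(classes):
    def lower(X):
        return {x for c in classes if c <= X for x in c}
    def upper(X):
        return {x for c in classes if c & X for x in c}
    cells = [F1[p] & F2[q] for p in F1 for q in F2]
    Uni = set().union(*(upper(X) for X in cells))
    Int = set().union(*(lower(X) for X in cells))
    return Uni, Int, Uni - Int

Uni1, Int1, D1 = regions([{"u1", "u2"}, {"u3", "u4"}, {"u5", "u6"}])  # r = 1
Uni2, Int2, D2 = regions([{"u1", "u2", "u3", "u4"}, {"u5", "u6"}])    # r = 2
# Coarsening empties the certain region and enlarges admissibility:
# Int1 = {u3, u4}, D1 = {u1, u2, u5, u6}; Int2 = {}, D2 = U.
```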

4.4. Compact Comparative Summary

For completeness, we summarize the baseline and near soft outcomes:
  • Classical soft baseline (no nearness). Choice values: c(u_3) is maximal, followed by u_4; the remaining objects are tied at 1. This yields a sharp ranking but ignores perceptual indiscernibility.
  • Near soft output (with nearness). For r = 1, the model distinguishes strong objects {u_3, u_4} (certain region) from admissible boundary objects {u_1, u_2, u_5, u_6}. For r = 2, admissibility expands (reflecting higher tolerance), as shown in Table 6.
Therefore, the proposed near soft framework complements the crisp classical baseline by introducing a tunable similarity parameter r and by producing boundary-aware admissibility sets that are particularly suitable for similarity-based decision environments.

4.5. Boundary Frequency Analysis

To further quantify the contribution of individual alternatives to perceptual uncertainty, we introduce a boundary frequency indicator that measures how often an object appears in the boundary regions induced by near soft product operations.
Definition 10.
For a fixed nearness parameter r and a near soft product σ defined on B_1 × B_2, the boundary frequency of an object u ∈ U is defined by
b_r(u) = |{(e_i, e_j) ∈ B_1 × B_2 : u ∈ H_r^*(e_i, e_j) ∖ H_{r,*}(e_i, e_j)}|.
Intuitively, b_r(u) counts the number of parameter pairs for which u lies in the near soft boundary region and therefore represents a borderline, similar-but-uncertain alternative.
Boundary frequencies for r = 1. Using the boundary regions computed in Example 4, we obtain the following values in Table 7:
The boundary frequency values reveal an additional layer of structure that is not visible in classical soft decision models. In particular, although u_3 and u_4 belong to the certain region Int_1(σ), they still participate in boundary regions for other parameter pairs, reflecting residual similarity-based uncertainty. Conversely, the alternatives u_1 and u_2, which belong to the admissible decision set D_1(σ), also exhibit high boundary frequencies, indicating that they are consistently close to satisfying multiple parameter combinations without being decisively certain.
Interpretation. The indicator b_r(u) can be viewed as a soft confidence index:
  • Large b_r(u) values correspond to alternatives that are repeatedly involved in borderline cases and may be preferred when flexibility is desired;
  • Small b_r(u) values correspond to alternatives that are either strongly accepted or strongly rejected.
This analysis demonstrates that near soft decision-making not only produces admissible decision sets but also enables a refined ranking of alternatives based on their participation in perceptual uncertainty, which is not achievable using classical soft product operations.
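Definition 10 can be evaluated directly on the Example 4 data. The counts obtained below follow from the partition-based approximations assumed throughout this sketch and are this sketch's own computation; they need not coincide with Table 7 if a different nearness structure is used.

```python
# Boundary frequency b_r(u) for r = 1 under partition-based
# approximations (assumption), using the Example 4 inputs.
classes = [{"u1", "u2"}, {"u3", "u4"}, {"u5", "u6"}]
F1 = {"e1": {"u1", "u3"}, "e3": {"u2", "u3", "u4"}, "e4": {"u5", "u6"}}
F2 = {"e5": {"u1", "u4", "u5"}, "e6": {"u2", "u3", "u6"}, "e7": {"u3", "u4"}}

def lower(X):
    return {x for c in classes if c <= X for x in c}

def upper(X):
    return {x for c in classes if c & X for x in c}

b = {f"u{i}": 0 for i in range(1, 7)}
for p in F1:
    for q in F2:
        X = F1[p] & F2[q]
        for u in upper(X) - lower(X):    # boundary region of this cell
            b[u] += 1
# Every object has b(u) > 0, i.e., BK(sigma) = U, as in Example 4.
```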

4.6. Computational Complexity and Scalability

While the practical utility of the near soft Uni–Int mechanism is demonstrated through a real-estate case study, it is essential to consider its scalability for large-scale decision systems. The computational complexity of the proposed framework is primarily governed by two factors: (i) the construction of nearness equivalence classes and (ii) the computation of near approximations for the AND-product. For a universe U of size n and a parameter set E of size m, the worst-case complexity of generating indiscernibility classes is approximately O(n^2). Once the nearness structure is established, the near soft product operations follow an O(n · m^2) complexity, consistent with classical soft product operations. The boundary frequency calculation b_r(u) introduces a linear overhead of O(n) during the aggregation phase. For very large datasets (n > 10^5), scalability can be further improved by utilizing hashing-based indiscernibility algorithms or by parallelizing the boundary region unions, ensuring that the near soft framework remains computationally viable for high-dimensional information analysis.
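The hashing-based shortcut mentioned above exploits the fact that two objects are indiscernible exactly when their probe-function signatures over the selected features coincide, so a single pass over the objects suffices. A sketch with illustrative probe values (the function name and data are assumptions, not from the paper):

```python
from collections import defaultdict

def indiscernibility_classes(signatures):
    # signatures: object -> tuple of probe-function values over B_r.
    # Grouping by signature is expected O(n * |B_r|) rather than the
    # O(n^2) pairwise comparison of objects.
    buckets = defaultdict(set)
    for obj, sig in signatures.items():
        buckets[sig].add(obj)        # one hash lookup per object
    return list(buckets.values())

# Illustrative probe values for four objects over two features:
probe = {"u1": (1, 0), "u2": (1, 0), "u3": (0, 1), "u4": (0, 1)}
classes = indiscernibility_classes(probe)
```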

5. Conclusions

In this study, a rigorous theoretical and applicative framework for near soft sets is developed by systematically investigating the algebraic behavior and decision-making capability of near soft AND and OR product operations. By integrating the parameterized structure of soft sets with the nearness approximation spaces of near set theory, the proposed framework addresses a significant gap in the literature related to similarity-based uncertainty modeling.
From a theoretical perspective, the main contributions of the study can be summarized as follows. First, near soft AND and OR product operations are formally defined within the context of nearness approximation spaces. Second, it is shown that these operations preserve fundamental algebraic properties, including idempotency, absorption, distributivity, and De Morgan-type laws. These results demonstrate that near soft sets are not merely a technical extension of classical soft sets, but rather constitute a structurally consistent and mathematically well-founded framework in which perceptual nearness is directly embedded into the algebraic structure of IA.
From an applicative standpoint, the proposed Uni–Int decision-making mechanism extends classical soft decision models by explicitly incorporating near soft boundary regions. The comparative analysis reveals that, unlike classical soft product-based approaches, which produce crisp rankings, the near soft framework generates structured decision regions that distinguish strong, admissible, and uncertain alternatives. In addition, the sensitivity analysis with respect to the nearness parameter r indicates that increasing similarity tolerance enlarges boundary regions and shifts the decision emphasis from certainty toward admissibility, thereby providing a controllable mechanism for modeling decision flexibility.
Furthermore, an additional quantitative layer is introduced through the boundary frequency indicator, which measures how frequently individual alternatives participate in perceptual uncertainty across parameter combinations. This indicator enables a refined evaluation of admissible decision objects by capturing their consistent involvement in boundary regions, a feature that is not accessible within classical soft product frameworks.
Overall, the proposed near soft product-based decision model provides a flexible, interpretable, and mathematically grounded approach to similarity-aware decision-making. The framework is particularly suitable for applications in which object similarity, tolerance control, and boundary-driven admissibility are essential, such as medical diagnosis, pattern recognition, and information analysis.
Future research directions are expected to be developed along several lines. In particular, the proposed framework may be extended to more generalized uncertainty environments, such as fuzzy, intuitionistic fuzzy, and neutrosophic near soft structures. In addition, the adaptation of the model to real-world large-scale applications—such as image classification, medical decision support systems, and multi-criteria industrial optimization problems—is anticipated to be explored in future studies. Finally, the development of efficient computational algorithms and parallelizable implementations for large datasets is expected to constitute a promising direction for enhancing the practical applicability of the near soft decision-making framework.

Author Contributions

Conceptualization, A.Ö., J.P., F.Ö., M.D. and M.E.; methodology, A.Ö. and J.P.; software, A.Ö., J.P., F.Ö., M.D. and M.E.; validation, A.Ö., J.P., F.Ö., M.D. and M.E.; formal analysis, A.Ö., J.P., F.Ö., M.D. and M.E.; investigation, A.Ö., J.P., F.Ö. and M.D.; resources, A.Ö., J.P. and M.D.; data curation, A.Ö., J.P., F.Ö., M.D. and M.E.; writing—original draft preparation, A.Ö., J.P., F.Ö. and M.D.; writing—review and editing, A.Ö. and J.P.; visualization, A.Ö., J.P. and F.Ö.; supervision, A.Ö. and J.P.; project administration, A.Ö. and J.P.; funding acquisition, M.E. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Data is contained within the article.

Acknowledgments

The authors are highly grateful to the Referees for their constructive suggestions.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353. [Google Scholar] [CrossRef]
  2. Pawlak, Z. Rough sets. Int. J. Comput. Inf. Sci. 1982, 11, 341–356. [Google Scholar] [CrossRef]
  3. Molodtsov, D. Soft set theory—First results. Comput. Math. Appl. 1999, 37, 19–31. [Google Scholar] [CrossRef]
  4. Peters, J.F. Near sets: Special theory about nearness of objects. Fundam. Inform. 2007, 75, 407–433. [Google Scholar]
  5. Peters, J.F. Near sets: General theory about nearness of objects. Appl. Math. Sci. 2007, 1, 2609–2629. [Google Scholar]
  6. Maji, P.K.; Roy, A.R.; Biswas, R. An application of soft sets in a decision making problem. Comput. Math. Appl. 2002, 44, 1077–1083. [Google Scholar] [CrossRef]
  7. Chen, D.; Tsang, E.C.C.; Yeung, D.S. Some notes on the parameterization reduction of soft sets. Int. Conf. Mach. Learn. Cyber. 2003, 3, 1442–1445. [Google Scholar]
  8. Chen, D.; Tsang, E.C.C.; Yeung, D.S.; Wang, X. The parametrization reduction of soft sets and its applications. Comput. Math. Appl. 2005, 49, 757–763. [Google Scholar] [CrossRef]
  9. Xiao, Z.; Chen, L.; Zhong, B.; Ye, S. Recognition for Soft Information Based on the Theory of Soft Sets. In Proceedings of the ICSSSM’05—2005 International Conference on Services Systems and Services Management, Chongqing, China, 13–15 June 2005; Chen, J., Ed.; IEEE: Piscataway, NJ, USA, 2005; Volume 2, pp. 1104–1106. [Google Scholar]
  10. Herawan, T. Soft set-based decision making for patients suspected influenza-like illness. Int. J. Mod. Phys. Conf. Ser. 2010, 9, 259–270. [Google Scholar] [CrossRef]
  11. Kharal, A. Soft approximations and uni-int decision making. Sci. World J. 2014, 2014, 327408. [Google Scholar] [CrossRef]
  12. Dauda, M.K.; Mamat, M.; Waziri, M.Y. An application of soft set in decision making. J. Teknol. Sci. Eng. 2015, 77, 119–122. [Google Scholar] [CrossRef]
  13. Çağman, N.; Enginoğlu, S. Soft set theory and uni-int decision making. Eur. J. Oper. Res. 2010, 207, 848–855. [Google Scholar] [CrossRef]
  14. Çağman, N.; Enginoğlu, S. Soft matrix theory and its decision making. Comput. Math. Appl. 2010, 59, 3308–3314. [Google Scholar] [CrossRef]
  15. Feng, F.; Li, Y.; Çağman, N. Generalized uni-int decision making schemes based on choice value soft sets. Eur. J. Oper. Res. 2012, 220, 162–170. [Google Scholar] [CrossRef]
  16. Feng, Q.; Zhou, Y. Soft discernibility matrix and its applications in decision making. Appl. Soft Comput. 2014, 24, 749–756. [Google Scholar] [CrossRef]
  17. Riaz, M.; Razzaq, A.; Aslam, M.; Pamucar, D. M-parameterized N-soft topology-based TOPSIS approach for multi-attribute decision making. Symmetry 2021, 13, 748. [Google Scholar] [CrossRef]
  18. Sangodapo, T.O.; Onasanya, B.O.; Mayerova-Hoskova, S. Decision-making with fuzzy soft matrix using a revised method: A case of medical diagnosis of diseases. Mathematics 2021, 9, 2327. [Google Scholar] [CrossRef]
  19. Xu, Z.; Chen, C.; Yang, Y. Generalized fuzzy soft power Bonferroni mean operators and their application in decision making. Symmetry 2021, 13, 810. [Google Scholar] [CrossRef]
  20. Kalaiselvan, S.; Vijayabalaji, S. Soft expert symmetric group and its application in MCDM problem. Symmetry 2022, 14, 2685. [Google Scholar] [CrossRef]
  21. Saeed, M.; Saeed, M.H.; Shafaqat, R.; Sessa, S.; Ishtiaq, U.; Di Martino, F. A theoretical development of cubic Pythagorean fuzzy soft set with its application in multi-attribute decision making. Symmetry 2022, 14, 2639. [Google Scholar] [CrossRef]
  22. Ahmmad, J.; Mahmood, T. Picture fuzzy soft prioritized aggregation operators and their applications in medical diagnosis. Symmetry 2023, 15, 861. [Google Scholar] [CrossRef]
  23. Musa, S.Y.; Alajlan, A.I.; Asaad, B.A.; Ameen, Z.A. N-bipolar soft expert sets and their applications in robust multi-attribute group decision-making. Mathematics 2025, 13, 530. [Google Scholar] [CrossRef]
  24. Maji, P.K.; Biswas, R.; Roy, A.R. Soft set theory. Comput. Math. Appl. 2003, 45, 555–562. [Google Scholar] [CrossRef]
  25. Pei, D.; Miao, D. From sets to information systems. In Proceedings of the IEEE International Conference on Granular Computing, Beijing, China, 25–27 July 2005; pp. 617–621. [Google Scholar]
  26. Yang, C.F. A note on: Soft set theory. Comput. Math. Appl. 2008, 56, 1135–1137. [Google Scholar]
  27. Jiang, Y.; Tang, Y.; Chen, Q.; Wang, J.; Tang, S. Extending soft sets with description logics. Comput. Math. Appl. 2010, 59, 2087–2096. [Google Scholar] [CrossRef]
  28. Ali, M.I.; Shabir, M.; Naz, M. Algebraic structures of soft sets associated with new operations. Comput. Math. Appl. 2011, 61, 2647–2654. [Google Scholar] [CrossRef]
  29. Sezgin, A.; Atagün, A.O. On operations of soft sets. Comput. Math. Appl. 2011, 61, 1457–1467. [Google Scholar] [CrossRef]
  30. Fu, L. Notes on soft set operations. ARPN J. Syst. Softw. 2011, 1, 205–208. [Google Scholar]
  31. Ge, X.; Yang, S. Investigations on some operations of soft sets. World Acad. Sci. Eng. Technol. 2011, 75, 1113–1116. [Google Scholar]
  32. Ali, M.I.; Feng, F.; Liu, X.; Min, W.K.; Shabir, M. On some new operations in soft set theory. Comput. Math. Appl. 2009, 57, 1547–1553. [Google Scholar] [CrossRef]
  33. Feng, F.; Li, Y.M.; Davvaz, B.; Ali, M.I. Soft sets combined with fuzzy sets and rough sets: A tentative approach. Soft Comput. 2010, 14, 899–911. [Google Scholar] [CrossRef]
  34. Chen, X.; Yadav, P.; Singh, R.; Islam, S.M.N. ES structure based on soft J-subset. Mathematics 2023, 11, 853. [Google Scholar] [CrossRef]
  35. Jun, Y.B.; Yang, X. A note on the paper “Combination of interval-valued fuzzy set and soft set”. Comput. Math. Appl. 2011, 61, 1468–1470. [Google Scholar] [CrossRef]
  36. Liu, X.Y.; Feng, F.; Jun, Y.B. A note on generalized soft equal relations. Comput. Math. Appl. 2012, 64, 572–578. [Google Scholar] [CrossRef]
  37. Ali, B.; Saleem, N.; Sundus, N.; Khaleeq, S.; Saeed, M.; George, R.A. A contribution to the theory of soft sets via generalized relaxed operations. Mathematics 2022, 10, 2636. [Google Scholar] [CrossRef]
  38. Sezgin, A.; Kökçü, H.; Atagün, A.O. A comprehensive study on restricted and extended intersection operations of soft sets. Nat. Appl. Sci. J. 2025, 8, 44–111. [Google Scholar] [CrossRef]
  39. Sezgin, A.; Kökçü, H.; Stojanović, N.; Luzum, M. An extensive investigation of restricted and extended union operations of soft sets. Int. J. Eng. Sci. Technol. 2025, 17, 1–43. [Google Scholar] [CrossRef]
  40. Sezgin, A.; Kökçü, H.; Riaz, M. A detailed analysis of restricted and extended difference operations of soft sets. Int. J. Eng. Sci. Technol. 2025, 17, 1–38. [Google Scholar]
  41. Feng, F.; Li, Y. Soft subsets and soft product operations. Inf. Sci. 2013, 232, 44–57. [Google Scholar] [CrossRef]
  42. Sezgin, A.; Atagün, A.O.; Çağman, N. A complete study on and-product of soft sets. Sigma J. Eng. Nat. Sci. 2025, 43, 1–14. [Google Scholar] [CrossRef]
  43. Orbay, K.; Orbay, M.; Sezgin, A. An Exhaustive Analysis of the OR-Product of Soft Sets: A Symmetry Perspective. Symmetry 2025, 17, 1661. [Google Scholar] [CrossRef]
  44. Sezgin, A.; Çam, N.H. Soft difference-product: A new product for soft sets with its decision making. J. Amasya Univ. Inst. Sci. Technol. 2024, 5, 114–137. [Google Scholar] [CrossRef]
  45. Durak, I.; Sezgin, A. Soft union–star product of groups. Eurasian J. Sci. Eng. Tech. 2026, 7, 1–8. [Google Scholar] [CrossRef]
  46. Pawlak, Z. Classification of Objects by Means of Attributes; Tech. Rep. PAS 429; Institute for Computer Science, Polish Academy of Sciences: Warsaw, Poland, 1981. [Google Scholar]
  47. Pawlak, Z.; Peters, J.F.; Blisko, J. Jak blisko (How near). In Systemy Wspomagania Decyzji; University of Silesia: Katowice, Poland, 2007; Volume I, pp. 57–109. ISBN 83-920730-4-5. [Google Scholar]
  48. Peters, J.F.; Wasilewski, P. Foundations of near sets. Inf. Sci. 2009, 179, 3091–3109.
  49. Peters, J.F. Tolerance near sets and image correspondence. Int. J. Biomim. Robot. Comput. Devices 2009, 1, 239–245.
  50. Henry, C.J. Near Sets: Theory and Applications. Ph.D. Thesis, Department of Electrical & Computer Engineering, University of Manitoba, Winnipeg, MB, Canada, 2010.
  51. Henry, C.; Peters, J.F. Image pattern recognition using near sets. In International Workshop on Rough Sets, Fuzzy Sets, Data Mining, and Granular-Soft Computing; Springer: Berlin/Heidelberg, Germany, 2007; pp. 475–482.
  52. Inan, E.; Ozturk, M.A. Near groups on nearness approximation spaces. Hacet. J. Math. Stat. 2012, 41, 545–558.
  53. İnan, E.; Öztürk, M.A. Near semigroups on nearness approximation spaces. Ann. Fuzzy Math. Inform. 2015, 10, 287–297.
  54. Bağırmaz, N. Near ideals in near semigroups. Eur. J. Pure Appl. Math. 2018, 11, 505–516.
  55. Bağırmaz, N. Near approximations in groups. Appl. Algebra Eng. Commun. Comput. 2019, 30, 285–297.
  56. Davvaz, B.; Setyawati, D.W.; Soleha, S.; Mukhlash, I. Near approximations in modules. Found. Comput. Decis. Sci. 2021, 46, 319–337.
  57. Davvaz, B.; Soleha, S.; Setyawati, D.W.; Mukhlash, I. Near approximation in rings. Appl. Algebra Eng. Commun. Comput. 2021, 32, 701–721.
  58. Öztürk, M.A.; İnan, E. Nearness rings. Ann. Fuzzy Math. Inform. 2019, 17, 115–132.
  59. Öztürk, M.A.; Jun, Y.B.; İz, A. Gamma semigroups on weak nearness approximation spaces. J. Int. Math. Virtual Inst. 2019, 9, 53–72.
  60. Öztürk, M.A. Prime ideals of gamma semigroups on weak nearness approximation spaces. Asian-Eur. J. Math. 2019, 12, 1950080.
  61. Öztürk, M.A. Nearness d-algebras. J. Algebr. Hyperstruct. Log. Algebr. 2021, 2, 73–84.
  62. Mostafavi, M.; Davvaz, B. Near Krasner hyperrings on nearness approximation spaces. J. Algebr. Hyperstruct. Log. Algebr. 2022, 3, 1–33.
  63. Mostafavi, M.; Davvaz, B. Near polygroups on nearness approximation spaces. New Math. Nat. Comput. 2022, 18, 593–613.
  64. Mostafavi, M.; Davvaz, B. Near semi hypergroups on nearness approximation spaces. Comput. Appl. Math. 2023, 42, 344.
  65. Setyawati, D.W.; Soleha, S.; Subiono; Mukhlash, I.; Rinwati; Davvaz, B. Application of near approximations in Cayley graphs. Discret. Math. Algorithms Appl. 2023, 15, 2250180.
  66. Öztürk, M.A.; Tekin, Ö.; Jun, Y.B. Normal nearness subgroups. Commun. Algebra 2024, 52, 102–115.
  67. Öztürk, M.A. Nearness cosets of the nearness groups. New Math. Nat. Comput. 2024, 20, 935–944.
  68. Öztürk, M.A.; Muhiuddin, G.; Jun, Y.B. Ordered nearness semigroups. Ann. Commun. Math. 2024, 7, 176–185.
  69. Jokar, F.; Davvaz, B. A new approach to approximations in lattices. J. Algebr. Hyperstruct. Log. Algebr. 2025, 6, 39–61.
  70. Tekin, Ö.; Öztürk, M.A. Ordered gamma nearness semigroups. Ric. Mat. 2025; online first.
  71. Taşbozan, H.; Icen, I.; Bagirmaz, N.; Ozcan, A.F. Soft sets and soft topology on nearness approximation spaces. Filomat 2017, 31, 4117–4125.
  72. Ozkan, A. On near soft sets. Turk. J. Math. 2019, 43, 1005–1017.
  73. Tasbozan, H. Near soft topological groups based on near soft element. Turk. J. Sci. 2022, 7, 53–57.
  74. Tasbozan, H. Application of bipolar near soft sets. Ikonion J. Math. 2024, 6, 21–29.
  75. Tasbozan, H. Near soft sets based cryptosystem. Turk. J. Math. Comput. Sci. 2026, 18, 289–295.
  76. Demirtaş, N.; Dalkıç, O.; Demirtaş, A. Separation axioms on near soft topological spaces. J. Univ. Math. 2023, 6, 227–238.
Figure 1. Representation of near shapes in a topological space. The sphere X and the cylinder Y satisfy φ ( X ) = φ ( Y ) = 1 , indicating nearness under a common feature.
Table 1. Symbols and descriptions used in near set theory.

| Symbol | Interpretation |
|---|---|
| O | Set of perceptual objects |
| X | Sample set of objects (X ⊆ O) |
| p | An individual object (p ∈ O) |
| F | Set of all available probe (feature) functions |
| A, B | Selected subsets of features (A, B ⊆ F) |
| Φ | Object description mapping, Φ : O → R^L |
| L | Description length (L = |B|) |
| Φ(p) | Feature vector (ϕ_1(p), ϕ_2(p), …, ϕ_L(p)) |
Table 2. Symbols and descriptions in a nearness approximation space [5].

| Symbol | Interpretation |
|---|---|
| B | A subset of probe functions selected from F (B ⊆ F). |
| B_r | A subfamily of r probe functions used to define localized nearness relations (B_r ⊆ B, |B_r| = r). |
| ∼_{B_r} | Indiscernibility relation: p ∼_{B_r} q ⇔ ϕ(p) = ϕ(q) for all ϕ ∈ B_r. |
| [p]_{B_r} | Equivalence class of an object p ∈ O with respect to ∼_{B_r}. |
| O/∼_{B_r} | The quotient set (partition ξ_{O,B_r}) of O induced by ∼_{B_r}. |
| r | The cardinality of the feature subset, where 1 ≤ r ≤ |B|. |
| N_r(B) | The family of all partitions ξ_{O,B_r} generated by all possible r-element subsets B_r ⊆ B; this constitutes the nearness structure. |
| ν_{N_r} | Overlap function ν_{N_r} : P(O) × P(O) → [0, 1], measuring the degree of perceptual overlap. |
| N_r*(X) | Lower near approximation: N_r*(X) = ∪ {[p]_{B_r} ∈ O/∼_{B_r} : [p]_{B_r} ⊆ X}. |
| N_r^*(X) | Upper near approximation: N_r^*(X) = ∪ {[p]_{B_r} ∈ O/∼_{B_r} : [p]_{B_r} ∩ X ≠ ∅}. |
| Bnd_{N_r(B)}(X) | Boundary region: Bnd_{N_r(B)}(X) = N_r^*(X) \ N_r*(X). |
Table 3. Values of parameter functions ϕ_i over objects p_j.

| | p_1 | p_2 | p_3 | p_4 | p_5 |
|---|---|---|---|---|---|
| ϕ_1 | 1 | 2 | 4 | 4 | 2 |
| ϕ_2 | 3 | 1 | 2 | 2 | 0 |
| ϕ_3 | 2 | 1 | 3 | 1 | 2 |
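The approximation operators of Table 2 can be computed directly from the feature values in Table 3. Below is a minimal Python sketch, not an implementation from the paper: the function names (`partition`, `approximations`) and the sample set `X` are illustrative choices. It builds the quotient sets O/∼_{B_r} for every r-element feature subset and unions the resulting lower and upper approximations.

```python
from itertools import combinations

# Feature values from Table 3 (phi_i over objects p_1..p_5).
features = {
    "phi1": {"p1": 1, "p2": 2, "p3": 4, "p4": 4, "p5": 2},
    "phi2": {"p1": 3, "p2": 1, "p3": 2, "p4": 2, "p5": 0},
    "phi3": {"p1": 2, "p2": 1, "p3": 3, "p4": 1, "p5": 2},
}
objects = ["p1", "p2", "p3", "p4", "p5"]

def partition(subset):
    """Quotient set O/~_{B_r}: group objects with identical feature vectors on `subset`."""
    classes = {}
    for p in objects:
        key = tuple(features[f][p] for f in subset)
        classes.setdefault(key, set()).add(p)
    return list(classes.values())

def approximations(X, r):
    """Lower/upper near approximations: union over all r-element feature subsets B_r."""
    lower, upper = set(), set()
    for B_r in combinations(features, r):
        for cls in partition(B_r):
            if cls <= X:       # class fully inside X -> contributes to lower approximation
                lower |= cls
            if cls & X:        # class meets X -> contributes to upper approximation
                upper |= cls
    return lower, upper

X = {"p2", "p3", "p4"}          # an illustrative sample set of objects
lo, up = approximations(X, r=1)
bnd = up - lo                   # boundary region Bnd = upper \ lower
# lo  -> {'p2', 'p3', 'p4'}
# up  -> {'p2', 'p3', 'p4', 'p5'}
# bnd -> {'p5'}
```

With r = 1 each single feature induces its own partition, so the union of class-wise tests already recovers the familiar rough-set behavior; raising r simply enlarges the family of partitions considered.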
Table 4. Comparison of near soft AND/OR product behaviors under different nearness parameters.

| Aspect | r = 1 | r = 2 |
|---|---|---|
| Nearness tolerance | Strict similarity | More flexible similarity |
| Lower approximation | Mostly small or empty | Largely unchanged |
| Upper approximation | Limited expansion | Significantly expanded |
| Boundary region | Narrower | Wider |
| Decision sensitivity | Low | High |
Table 5. Calculated values of the near soft AND-product σ = σ_1 ∧_{N_r} σ_2 and its corresponding lower (H_*) and upper (H^*) approximations.

| (e_i, e_j) | F_1(e_i) ∩ F_2(e_j) | H_* | H^* |
|---|---|---|---|
| (e_1, e_5) | {u_1} | ∅ | {u_1, u_2} |
| (e_1, e_6) | {u_3} | ∅ | {u_3, u_4} |
| (e_1, e_7) | {u_3} | ∅ | {u_3, u_4} |
| (e_3, e_5) | {u_4} | ∅ | {u_3, u_4} |
| (e_3, e_6) | {u_2, u_3} | ∅ | {u_1, u_2, u_3, u_4} |
| (e_3, e_7) | {u_3, u_4} | {u_3, u_4} | {u_3, u_4} |
| (e_4, e_5) | {u_5} | ∅ | {u_5, u_6} |
| (e_4, e_6) | {u_6} | ∅ | {u_5, u_6} |
| (e_4, e_7) | ∅ | ∅ | ∅ |
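The value column of Table 5 comes from intersecting the value sets of the two near soft sets at each parameter pair, while the OR-product uses unions instead. A minimal sketch follows; the dictionaries `F1` and `F2` hold hypothetical illustrative value sets, not the actual σ_1, σ_2 of the paper's example.

```python
# Hypothetical near soft sets: parameter -> set of objects
# (illustrative values only, not those used to build Table 5).
F1 = {"e1": {"u1", "u3"}, "e3": {"u2", "u3", "u4"}}
F2 = {"e5": {"u1", "u4", "u5"}, "e6": {"u3", "u6"}}

# AND-product: (F1 AND F2)(e_i, e_j) = F1(e_i) ∩ F2(e_j)
and_product = {(ei, ej): F1[ei] & F2[ej] for ei in F1 for ej in F2}

# OR-product: (F1 OR F2)(e_i, e_j) = F1(e_i) ∪ F2(e_j)
or_product = {(ei, ej): F1[ei] | F2[ej] for ei in F1 for ej in F2}

# and_product[("e1", "e5")] -> {'u1'}
# or_product[("e1", "e6")]  -> {'u1', 'u3', 'u6'}
```

The product is indexed over the Cartesian product of the two parameter sets, which is why Table 5 lists pairs (e_i, e_j); applying the lower/upper approximation operators to each resulting value set yields the H_* and H^* columns.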
Table 6. Sensitivity of the near soft decision regions to the nearness tolerance r (Example 4).

| Nearness Level | U_r(σ) | Int_r(σ) | D_r(σ) = U_r(σ) \ Int_r(σ) |
|---|---|---|---|
| r = 1 (strict) | U | {u_3, u_4} | {u_1, u_2, u_5, u_6} |
| r = 2 (tolerant) | U | U | ∅ |
Table 7. Boundary frequency values b_1(u) for the near soft AND-product in Example 4.

| Object | b_1(u) |
|---|---|
| u_1 | 3 |
| u_2 | 3 |
| u_3 | 2 |
| u_4 | 2 |
| u_5 | 2 |
| u_6 | 2 |
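The boundary frequency indicator counts, for each object u, how many parameter pairs place u inside a boundary region Bnd = H^* \ H_*; lower counts suggest more reliable objects under perceptual uncertainty. A minimal counting sketch follows, using hypothetical boundary regions (illustrative values, not the exact regions behind Table 7).

```python
from collections import Counter

# Hypothetical boundary regions Bnd(e_i, e_j) = H^* \ H_* per parameter pair
# (illustrative values only).
boundaries = {
    ("e1", "e5"): {"u1", "u2"},
    ("e1", "e6"): {"u3", "u4"},
    ("e3", "e5"): {"u1", "u3"},
}

# b(u): number of parameter pairs whose boundary region contains u.
b = Counter(u for region in boundaries.values() for u in region)
# b["u1"] -> 2, b["u2"] -> 1, b["u3"] -> 2, b["u4"] -> 1
```

Ranking objects by ascending b(u) then gives a reliability ordering that complements the Uni–Int decision set.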
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Özkan, A.; Peters, J.; Özger, F.; Duman, M.; Ersoy, M. Advances in Near Soft Sets and Their Applications in Similarity-Based Decision Making. Symmetry 2026, 18, 611. https://doi.org/10.3390/sym18040611


