Article

Fuzzy Rough Set Models Based on Fuzzy Similarity Relation and Information Granularity in Multi-Source Mixed Information Systems

by Pengfei Zhang, Yuxin Zhao, Dexian Wang, Yujie Zhang and Zheng Yu *
School of Intelligent Medicine, Chengdu University of Traditional Chinese Medicine, Chengdu 611137, China
* Author to whom correspondence should be addressed.
Mathematics 2024, 12(24), 4039; https://doi.org/10.3390/math12244039
Submission received: 4 December 2024 / Revised: 21 December 2024 / Accepted: 22 December 2024 / Published: 23 December 2024

Abstract:
As a pivotal research method in the field of granular computing (GrC), fuzzy rough sets (FRSs) have garnered significant attention because they overcome the limitations of traditional rough sets in handling continuous data. This paper explores the application potential of FRS models within the framework of multi-source complex information systems, which holds profound research significance. Firstly, a novel multi-source mixed information system (MsMIS), encompassing five distinct data types, is introduced, thereby enriching the dimensions of data processing. Subsequently, a similarity function, designed according to the characteristics of each data type, is utilized to accurately quantify the similarity relations among objects. Building on this foundation, fuzzy T-norm operators are employed to integrate the similarity matrices derived from the different data types into a cohesive whole. This integration not only lays a solid foundation for subsequent model construction but also highlights the value of multi-source information fusion in the analysis of the MsMIS. The integrated results are then utilized to develop FRS models. Through rigorous examination from the perspective of information granularity, the rationality of the FRS model is proven and its mathematical properties are explored. This paper contributes to the theoretical advancement of FRS models in GrC and offers promising prospects for their practical implementation.

1. Introduction

The fuzzy rough set (FRS) [1] is an extension of the Pawlak rough set (PRS), designed to address uncertainty and fuzziness. Rough set theory (RST), proposed by Zdzisław Pawlak in 1982, is utilized for data analysis and decision-making problems under incomplete knowledge [2]. It relies on upper and lower approximations, enabling effective handling of classification and attribute reduction.
Granular computing (GrC) is a computational model and methodology that uses “granules” as the fundamental units for processing information. Breaking down information into smaller, more meaningful “granules” simplifies the analysis and resolution of complex problems. Granules can encompass collections of datasets, objects, knowledge, and other forms of information. The central concept of GrC is to enable simplification and efficient processing by exploiting information granularity [3]. FRSs and PRSs are important approaches to granular computing. In PRSs, lower and upper approximations are defined as unions of some sets, which demonstrate a distinct granular structure among those sets. However, in the existing theory of FRSs, the definitions of lower and upper approximations are based on fuzzy membership functions, rather than being constructed as unions of some fuzzy sets. For instance, Jensen and Shen introduced the concept of attribute reduction using special fuzzy rough sets; however, the reduction structure in fuzzy rough sets is not as clearly defined as it is in classical rough sets [4]. This distinction between fuzzy and classical rough sets has inspired us to develop a theory of granular fuzzy sets, which we apply to capture the granular structure of fuzzy rough sets. In fuzzy set theory, a fuzzy point is defined as a fuzzy set containing only one element with a non-zero membership value [5], and any non-empty fuzzy set can be seen as the union of several fuzzy points. This implies that fuzzy points in fuzzy set theory serve a similar function to crisp points in classical set theory [6]. This insight has motivated us to construct information granules around fuzzy points. In particular, within a specialized type of information system, such as a multi-source mixed information system (MsMIS), there has been limited research exploring the underlying granular structure using the approach of GrC. 
In this paper, a multi-source mixed information system can be called a hybrid information system or a mixed information system. The purpose of adding the prefix ‘multi-source’ is to highlight the consideration of a hybrid/mixed information system composed of data collected from different sensors or information sources, viewed from a specific application perspective. The research work on hybrid/mixed information systems mainly focuses on applications and rarely focuses on the theoretical aspects of granular structure. For example, Zeng et al. introduced a new hybrid distance in a hybrid information system, and a novel fuzzy rough set was constructed by combining the hybrid distance and the Gaussian kernel. This fuzzy rough set model mainly solves the problem of feature selection in incremental situations [7]. Li et al. presented a novel three-way decision approach in a hybrid information system with images and applied it to medical diagnosis [8]. Yuan et al. introduced a fuzzy rough set method to deal with the problem of outlier detection in mixed attribute data [9].
The MsMIS can be found everywhere in real life, such as in electronic medical record systems, health record systems, and some common databases. These systems contain rich multi-source information, and research on this type of data helps to mine useful information from big data. In this paper, each data type of an MsMIS is considered, and the fuzzy rough set needs to calculate the similarity between objects according to the characteristics of each data type, so as to construct a fuzzy similarity relation. The fusion of all fuzzy similarity relations is also a great challenge. In [10], the authors demonstrate that multi-kernel learning can be used to transform complex data into a unified framework.
Therefore, it would be advisable to employ a multi-kernel approach to integrate the information derived from various fuzzy similarity relations. Kernel functions are utilized to measure similarity and are subsequently combined through a specific fusion approach. A collection of kernel functions has been devised to assess the similarity among samples characterized by diverse attribute types: an equivalence relation is established using a match kernel [11]; a string kernel is applied to determine the similarity of two strings in genetic analysis [12]; and a histogram intersection kernel is employed for image matching [13]. Recently, numerous studies have been conducted on multi-kernel learning [14], attribute reduction [10,15,16], and classification tasks [17,18]. However, applying fuzzy similarity relations to construct fuzzy information granularity and to reveal its mathematical properties within the framework of an MsMIS, particularly in combination with FRS theory, remains largely unexplored and has significant exploratory value [19].
The work presented in this paper makes several meaningful contributions to the domain of fuzzy granulation, as outlined below.
(1) The concept of an MsMIS is defined, which adeptly combines multi-kernel learning techniques with similarity measures among samples to establish basic information granules within the framework of granular computing.
(2) By drawing on the principles of multi-source information fusion, this paper achieves effective integration of different information granules.
(3) This work validates the rationale and effectiveness of the proposed information granules within the MsMIS framework, developing a novel FRS model based on this validation.
(4) This model not only enriches the theoretical system of fuzzy sets and granular computing but also provides robust tools and support for subsequent researchers to conduct more in-depth explorations in attribute reduction.
In summary, the presented work opens new perspectives and pathways for future research, carrying significant theoretical and practical implications.
The rest of this paper is organized as follows. Section 2 provides some fundamental concepts about Pawlak rough sets and fuzzy rough sets. In Section 3, the concept of multi-source mixed information systems is defined and explained with an example. Section 4 provides definitions and proofs related to fuzzy information granularity in the context of FRSs, and the concepts of the classical attribute reduction method based on FRS are given. Finally, Section 5 concludes this paper.

2. Preliminaries

In this section, we first introduce some concepts of Pawlak rough sets and fuzzy rough sets.
Definition 1 
([2]). Given an information system $(U, A, V, f)$, $U = \{x_1, x_2, \ldots, x_n\}$ is a non-empty finite set of objects, $A$ is a non-empty finite set of attributes, $V = \bigcup_{a_i \in A} V_{a_i}$ is the union of the attribute domains, and $f: U \times A \to V$ is an information function such that $\forall x \in U$ and $a_i \in A$, $f(x, a_i) \in V_{a_i}$. For the sake of simplicity, $f(x, a_i)$ can be expressed as $a_i(x)$.
In general, the domain $U$ is divided into a set of equivalence classes by an equivalence relation $R$, denoted as $[x]_R$. For each $x_i, x_j, x_k \in U$, the equivalence relation $R$ satisfies (1) reflexivity ($R(x_i, x_i) = 1$); (2) symmetry ($R(x_i, x_j) = R(x_j, x_i)$); and (3) transitivity ($R(x_i, x_j) = 1$ and $R(x_j, x_k) = 1$ imply $R(x_i, x_k) = 1$). Given $X \subseteq U$, the lower and upper approximations can be denoted as
$$\underline{R}(X) = \{x \in U \mid [x]_R \subseteq X\}; \quad \overline{R}(X) = \{x \in U \mid [x]_R \cap X \neq \emptyset\},$$
where $[x]_R = \{y \in U \mid x\, R\, y\}$ refers to the equivalence class containing $x$.
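The crisp approximations above can be illustrated with a short computation. This is a minimal sketch over a toy universe (the partition and target set below are illustrative, not taken from the paper):

```python
# Pawlak lower/upper approximations from an equivalence relation,
# represented by its partition of U into equivalence classes.

def pawlak_approximations(partition, X):
    """partition: list of equivalence classes (sets); X: crisp subset of U."""
    lower = {x for c in partition if c <= X for x in c}   # classes inside X
    upper = {x for c in partition if c & X for x in c}    # classes meeting X
    return lower, upper

# Toy universe U = {1,...,6} with three equivalence classes.
partition = [{1, 2}, {3, 4, 5}, {6}]
X = {1, 2, 3}
lower, upper = pawlak_approximations(partition, X)
# lower = {1, 2}; upper = {1, 2, 3, 4, 5}
```

The class $\{3,4,5\}$ meets $X$ but is not contained in it, so it enters only the upper approximation.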
Definition 2 
([20]). Given the universe $U = \{x_1, x_2, \ldots, x_n\}$, let $A$ be a mapping from $U$ to $[0,1]$; then $A(x)$ is known as the membership function of the fuzzy set $A$. The fuzzy set can be written as $A = \frac{A(x_1)}{x_1} + \frac{A(x_2)}{x_2} + \cdots + \frac{A(x_n)}{x_n}$.
Let $F(U)$ be the set of all fuzzy sets on $U$. Given $A, B \in F(U)$, the following operations of fuzzy sets are defined for all $x \in U$:
(1) $A = B \Leftrightarrow A(x) = B(x)$;
(2) $A \subseteq B \Leftrightarrow A(x) \leq B(x)$;
(3) $(A \cup B)(x) = \max\{A(x), B(x)\} = A(x) \vee B(x)$;
(4) $(A \cap B)(x) = \min\{A(x), B(x)\} = A(x) \wedge B(x)$;
(5) $A^c(x) = 1 - A(x)$.
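The operations of Definition 2 can be sketched directly on fuzzy sets stored as membership dictionaries (the membership values below are illustrative):

```python
# Fuzzy-set operations of Definition 2 on dicts mapping objects to
# membership degrees in [0, 1] (both sets share the same universe).

def f_union(A, B):   return {x: max(A[x], B[x]) for x in A}   # (3)
def f_inter(A, B):   return {x: min(A[x], B[x]) for x in A}   # (4)
def f_compl(A):      return {x: 1 - A[x] for x in A}          # (5)
def f_subset(A, B):  return all(A[x] <= B[x] for x in A)      # (2)

A = {'x1': 0.2, 'x2': 0.7, 'x3': 1.0}
B = {'x1': 0.5, 'x2': 0.7, 'x3': 0.9}
f_union(A, B)  # {'x1': 0.5, 'x2': 0.7, 'x3': 1.0}
f_inter(A, B)  # {'x1': 0.2, 'x2': 0.7, 'x3': 0.9}
```

As expected, the intersection is contained in the union in the sense of operation (2).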
Definition 3 
([21]). Let the binary operator $T: [0,1] \times [0,1] \to [0,1]$ be a triangular norm, called a $T$-norm, if for any $x_i, x_j, x_k, x_z \in [0,1]$ it satisfies (1) commutativity ($T(x_i, x_j) = T(x_j, x_i)$); (2) associativity ($T(T(x_i, x_j), x_k) = T(x_i, T(x_j, x_k))$); (3) monotonicity ($x_i \leq x_j, x_k \leq x_z \Rightarrow T(x_i, x_k) \leq T(x_j, x_z)$); and (4) the boundary condition ($T(x_i, 1) = T(1, x_i) = x_i$).
If the mapping $S: [0,1] \times [0,1] \to [0,1]$ fulfills commutativity, associativity, and monotonicity as in Definition 3, together with the boundary condition $S(x_i, 0) = x_i$, then $S$ is termed a triangular conorm, abbreviated as a $t$-conorm. If $N$ is a monotone decreasing mapping from $[0,1]$ to $[0,1]$ with $N(0) = 1$ and $N(1) = 0$, then $N$ is called a complement operator (negator); $N_s(x_i) = 1 - x_i$ is the standard complement operator. For any $X \in F(U)$, $co_N X(x_i) = N(X(x_i))$ is known as the fuzzy complement of $X$ determined by a negator $N$. If for any $x_i \in [0,1]$, $N(N(x_i)) = x_i$, then
$$S(x_i, x_j) = N(T(N(x_i), N(x_j))), \quad T(x_i, x_j) = N(S(N(x_i), N(x_j))),$$
where $T$ and $S$ are dual w.r.t. $N$.
Let $A, B, C \in F(U)$. We have (1) if for any $x \in U$, $C(x) = S(A(x), B(x))$, then $C$ is the $S$-norm union of $A$ and $B$, denoted as $C = A \cup_S B$; and (2) if for any $x \in U$, $C(x) = T(A(x), B(x))$, then $C$ is the $T$-norm intersection of $A$ and $B$, denoted as $C = A \cap_T B$. In addition, let $J$ be a non-empty index set. For any $j \in J$ and $x, x_j \in [0,1]$, we call $T$ a left-continuous $t$-norm if and only if $T(\bigvee_{j \in J} x_j, x) = \bigvee_{j \in J} T(x_j, x)$. Similarly, $S$ is called a right-continuous $t$-conorm if and only if $S(\bigwedge_{j \in J} x_j, x) = \bigwedge_{j \in J} S(x_j, x)$.
Given a $T$-norm $T$ with dual $t$-conorm $S$, for any $x_i, x_j \in [0,1]$,
$$\theta(x_i, x_j) = \sup\{x_k \in [0,1] : T(x_i, x_k) \leq x_j\}, \quad \sigma(x_i, x_j) = \inf\{x_k \in [0,1] : S(x_i, x_k) \geq x_j\},$$
where $\theta$ is the R-implicator induced by the $T$-norm, and $\sigma$ is the $T$-coimplicator induced by the $T$-norm and negation. It is noteworthy that the fuzzy intersection "min" and the fuzzy union "max" constitute a special pair of $T$-norm and $T$-conorm operators. These operators are widely utilized and further extended in various contexts. Hu et al. [22] comprehensively summarized several common $T$-norm and $T$-conorm operators, which we will not reiterate here.
For nominal variables, the relationship between objects is established by checking whether their attribute values are equal, which forms a partition of the domain. However, if the variables are numerical or fuzzy, a crisp equivalence relation is not enough to accurately describe the relationship between objects. Just as the equivalence relation is the most basic concept in the Pawlak rough set, the fuzzy equivalence relation is fundamental for fuzzy rough sets. Given a non-empty finite set $U$, a fuzzy relation $R$ is called a fuzzy equivalence relation on $U$ if for any $x_i, x_j, x_k \in U$ it satisfies three conditions, namely, (1) reflexivity ($R(x_i, x_i) = 1$); (2) symmetry ($R(x_i, x_j) = R(x_j, x_i)$); and (3) transitivity ($R(x_i, x_j) \geq \min\{R(x_i, x_k), R(x_k, x_j)\}$) [21]. More generally, we call $R$ a fuzzy $T$-equivalence relation if for any $x_i, x_j, x_k \in U$, $R$ satisfies reflexivity, symmetry, and $T$-transitivity ($T(R(x_i, x_j), R(x_j, x_k)) \leq R(x_i, x_k)$).
Definition 4 
([1]). Let $R$ be a fuzzy equivalence relation on $U$. For $x_i \in U$, $[x_i]_R = \frac{r_{i1}}{x_1} + \frac{r_{i2}}{x_2} + \cdots + \frac{r_{in}}{x_n}$ denotes a fuzzy equivalence class. If $X \in F(U)$, the lower and upper approximations of $X$ are a pair of fuzzy sets on $U$ whose membership functions are, respectively,
$$\underline{R}_S X(x) = \inf_{y \in U} S(N(R(x, y)), X(y)), \quad \overline{R}_T X(x) = \sup_{y \in U} T(R(x, y), X(y)),$$
where $(\underline{R}_S X, \overline{R}_T X)$ are called the $(S, T)$-fuzzy rough sets of $X$.

3. Multi-Source Mixed Information Systems

Definition 5. 
A multi-source mixed information system (MsMIS) can be expressed as $(U, A \cup \{d\}, V, f)$, where $U$ is the object set and $A = A_c \cup A_n \cup A_{im} \cup A_{sv} \cup A_{iv}$, where $A_c$ is the set of categorical attributes, $A_n$ the numerical attributes, $A_{im}$ the image attributes, $A_{sv}$ the set-valued attributes, and $A_{iv}$ the interval-valued attributes. $d$ denotes the decision attribute (sometimes written as a set $D = \{d_1, d_2, \ldots, d_l\}$). For the sake of simplicity, $A = \{a_1, a_2, \ldots, a_m\}$ is used to represent the attribute set, where $m$ represents the number of attributes.
Example 1. 
Table 1 illustrates an MsMIS, wherein $U = \{x_1, x_2, x_3, x_4\}$ represents the collection of four patients and $A = \{a_1, a_2, a_3, a_4, a_5\}$ signifies the set comprising five different attributes. In this table, $a_1$ is the categorical attribute ("-" and "+" represent negative and positive, respectively), $a_2$ is the numerical attribute, $a_3$ is the set-valued attribute, $a_4$ is the interval-valued attribute, $a_5$ is the image attribute, and $d$ denotes the set of decision attributes.

3.1. Similarity Relations

In an MsMIS, mixed attributes can encompass a diverse range of data types, including Boolean values, categorical information, real-valued data, set-valued data, interval-valued data, text content, image files, video clips, audio recordings, and sensor signals [23]. In this article, we consider the similarity between two attributes, and their related definitions [11,24,25,26,27] are as follows.
Definition 6. 
Let $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ be an MsMIS. For $x_i, x_j \in U$, the similarity of categorical data is denoted as
$$S_c(x_i, x_j) = \begin{cases} 0 & \text{if } x_i \neq x_j; \\ 1 & \text{if } x_i = x_j. \end{cases}$$
From Definition 6, the similarity function $S_c(x_i, x_j)$ can be applied to extract an equivalence relation from categorical data.
Definition 7. 
Let $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ be an MsMIS. For $x_i, x_j \in U$, the similarity of numerical data is denoted by
$$S_n(x_i, x_j) = \exp\left(-\frac{\|x_i - x_j\|^2}{\delta^2}\right).$$
The similarity function $S_n(x_i, x_j)$ is called a Gaussian kernel, which can be used to extract information from numerical data.
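A minimal sketch of this Gaussian-kernel similarity (the kernel width $\delta = 0.5$ below matches the value used later in Example 2):

```python
import math

# Gaussian-kernel similarity for a numerical attribute (Definition 7).
def s_numerical(xi, xj, delta=0.5):
    return math.exp(-abs(xi - xj) ** 2 / delta ** 2)

round(s_numerical(0.5, 0.2), 2)  # 0.7, as computed for (x2, x4) in Example 2
```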
Definition 8. 
Let $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ be an MsMIS. The similarity of image data is denoted as
$$S_{im}(x_i, x_j) = \sum_{k=1}^{P} \min(x_i^k, x_j^k).$$
The similarity function $S_{im}(x_i, x_j)$ is a histogram intersection kernel, which is used to compute the similarity between histograms of image data. $x_i^k$ and $x_j^k$ express the proportion of pixels whose colors fall in the $k$-th of $P$ fixed color ranges, where $\sum_{k=1}^{P} x_i^k = 1$ and $\sum_{k=1}^{P} x_j^k = 1$. In image data, a color histogram indicates the distribution of colors.
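A sketch of the histogram intersection kernel on two normalized 4-bin histograms (the bin values are those used for the pair $(x_2, x_4)$ in Example 2):

```python
# Histogram intersection kernel (Definition 8): sum of bin-wise minima
# of two normalized color histograms.
def s_image(hi, hj):
    return sum(min(a, b) for a, b in zip(hi, hj))

s_image([0.6, 0.1, 0.2, 0.1], [0.5, 0.1, 0.2, 0.2])  # ≈ 0.9, as in Example 2
```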
Definition 9. 
Let $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ be an MsMIS. The similarity of set-valued data is denoted by
$$S_{sv}(x_i, x_j) = \frac{|a(x_i) \cap a(x_j)|}{|a(x_i) \cup a(x_j)|},$$
where $a(x_i)$ is the value set of object $x_i$ on attribute $a$. The similarity function $S_{sv}(x_i, x_j)$ can measure the similarity between set-valued data.
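This set-valued similarity is a Jaccard index over the two value sets; a sketch (the sets below are those of the pair $(x_2, x_4)$ in Example 2):

```python
# Set-valued similarity (Definition 9): |intersection| / |union|.
def s_setvalued(si, sj):
    return len(si & sj) / len(si | sj)

round(s_setvalued({'M', 'U', 'P'}, {'U'}), 2)  # 0.33, as in Example 2
```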
Definition 10. 
Let $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ be an MsMIS. The similarity of interval-valued data is denoted as
$$S_{iv}(x_{ij}, x_{lj}) = \exp\left(-\frac{1}{2}\|x_{ij} - x_{lj}\|^2\right) \quad (i, l = 1, 2, \ldots, k),$$
where $\|x_{ij} - x_{lj}\|^2 = (a_{ij} - a_{lj})^2 + (b_{ij} - b_{lj})^2$ and the interval-valued datum is $x_{ij} = (a_{ij}, b_{ij})$ with $a_{ij} \leq b_{ij}$. The similarity function $S_{iv}$ can normalize each interval-valued variable.
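A sketch of the interval-valued similarity, with an interval stored as an endpoint pair $(a, b)$; the intervals below are those of the pair $(x_2, x_4)$ in Example 2:

```python
import math

# Interval-valued similarity (Definition 10) based on the squared
# Euclidean distance between the two endpoint pairs.
def s_interval(xi, xj):
    a1, b1 = xi
    a2, b2 = xj
    return math.exp(-0.5 * ((a1 - a2) ** 2 + (b1 - b2) ** 2))

round(s_interval((1, 2), (0, 2)), 2)  # 0.61, as in Example 2
```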

3.2. Kernel Functions Fusion Based on Similarity

In rough sets, the relation between two attributes can be calculated by an intersection operation, while in fuzzy rough sets, T-norm operations are used [10]. In this paper, the min operation is applied to the fusion of multiple similarities in terms of T-norm.
Definition 11. 
Let $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ be an MsMIS. The matrix of the $m$-th similarity function $S_m$ can be denoted by
$$M(S_m(x_i, x_j))_{n \times n} = \begin{pmatrix} S_m(x_1, x_1) & S_m(x_1, x_2) & \cdots & S_m(x_1, x_n) \\ S_m(x_2, x_1) & S_m(x_2, x_2) & \cdots & S_m(x_2, x_n) \\ \vdots & \vdots & \ddots & \vdots \\ S_m(x_n, x_1) & S_m(x_n, x_2) & \cdots & S_m(x_n, x_n) \end{pmatrix},$$
where $S_m$ refers to a similarity function, calculated for attributes $a_i$, $i = 1, 2, \ldots, m$.
Definition 12. 
Let $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ be an MsMIS. If two similarity functions $S_i$ and $S_j$ are used to calculate the attributes $a_i$ and $a_j$, respectively, then, for $x, y \in U$, new operators based on fuzzy $T$-norm operators are denoted as follows:
(1) Standard min operator
$$K_{T_m}(x, y) = \min\left(S_i(x, y), S_j(x, y)\right).$$
(2) Algebraic product
$$K_{T_P}(x, y) = S_i(x, y) \times S_j(x, y).$$
(3) Łukasiewicz $T$-norm
$$K_{T_l}(x, y) = \max\left(S_i(x, y) + S_j(x, y) - 1,\ 0\right).$$
(4) $T_{\cos}$-norm
$$K_{T_{\cos}}(x, y) = \max\left(S_i(x, y)\, S_j(x, y) - \sqrt{1 - S_i(x, y)^2}\sqrt{1 - S_j(x, y)^2},\ 0\right).$$
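The four fusion operators of Definition 12 can be sketched pointwise on two similarity values (the pair below, 0.70 and 0.33, is taken from the numerical and set-valued similarities of Example 2):

```python
import math

# The four t-norm fusion operators of Definition 12, applied to two
# similarity values a = S_i(x, y) and b = S_j(x, y).
def k_tm(a, b):   return min(a, b)                       # standard min
def k_tp(a, b):   return a * b                           # algebraic product
def k_tl(a, b):   return max(a + b - 1, 0)               # Lukasiewicz
def k_tcos(a, b):                                        # T_cos-norm
    return max(a * b - math.sqrt(1 - a ** 2) * math.sqrt(1 - b ** 2), 0)

a, b = 0.70, 0.33
k_tm(a, b)            # 0.33
round(k_tp(a, b), 2)  # 0.23
round(k_tl(a, b), 2)  # 0.03
k_tcos(a, b)          # 0 (the penalty term dominates for this pair)
```

Note how $K_{T_l}$ and $K_{T_{\cos}}$ are much stricter than $K_{T_m}$: moderate similarities are quickly pushed toward 0, which is visible in the matrices of Example 2.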
Proposition 1. 
Given two similarity functions $S_i, S_j \in [0,1]$ for any two conditional attributes $a_i$ and $a_j$, then
$$K_{*} \leq S_i \quad \text{and} \quad K_{*} \leq S_j,$$
where $* = T_m, T_P, T_l, T_{\cos}$, and $S_i(x, x) = 1$, $S_j(x, x) = 1$.
Proof of Proposition 1. 
Proposition 1 is easy to prove. For example, let $* = T_m$. Suppose that $A = \{a_i, a_j\}$ and $K_{T_m} = \min\{S_i, S_j\}$. We have $S_i(x, y) \geq K_{T_m}(x, y)$ and $S_j(x, y) \geq K_{T_m}(x, y)$. Therefore, $K_{T_m} \leq S_i$ and $K_{T_m} \leq S_j$. In the same way, we can prove $K_{T_P} \leq S_i$ and $K_{T_P} \leq S_j$, $K_{T_l} \leq S_i$ and $K_{T_l} \leq S_j$, and $K_{T_{\cos}} \leq S_i$ and $K_{T_{\cos}} \leq S_j$. □
Example 2. 
Figure 1 indicates the process of multi-source mixed data fusion. Four objects are described using the five attributes $a_1, a_2, a_3, a_4$, and $a_5$. Thus, the similarity of categorical data $S_c$, the similarity of numerical data $S_n$, the similarity of set-valued data $S_{sv}$, the similarity of interval-valued data $S_{iv}$, and the similarity of image data $S_{im}$ are considered to calculate the fuzzy relation between objects. We use the objects $x_2$ and $x_4$ as an example, i.e.,
(1) For categorical data: $S_c(x_2, x_4) = 1$;
(2) For numerical data: $S_n(x_2, x_4) = \exp\left(-\frac{\|0.5 - 0.2\|^2}{0.5^2}\right) = 0.70$;
(3) For set-valued data: $S_{sv}(x_2, x_4) = \frac{|\{M, U, P\} \cap \{U\}|}{|\{M, U, P\} \cup \{U\}|} = \frac{1}{3} = 0.33$;
(4) For interval-valued data: $S_{iv}(x_2, x_4) = \exp\left\{-\frac{1}{2}\left((1 - 0)^2 + (2 - 2)^2\right)\right\} = 0.61$;
(5) For image data: $S_{im}(x_2, x_4) = \min(0.6, 0.5) + \min(0.1, 0.1) + \min(0.2, 0.2) + \min(0.1, 0.2) = 0.9$. Then, these five attributes are combined by $K_{T_m}$:
$$K_{T_m}(x_2, x_4) = \min(1, 0.70, 0.33, 0.61, 0.9) = 0.33.$$
Therefore,
$$M(K_{T_m}(x_i, x_j))_{4 \times 4} = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0.37 & 0.33 \\ 0 & 0.37 & 1 & 0.14 \\ 0 & 0.33 & 0.14 & 1 \end{pmatrix};$$
Similarly,
$$M(K_{T_P}(x_i, x_j))_{4 \times 4} = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0.19 & 0.08 \\ 0 & 0.19 & 1 & 0.02 \\ 0 & 0.08 & 0.02 & 1 \end{pmatrix};$$
$$M(K_{T_l}(x_i, x_j))_{4 \times 4} = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix};$$
$$M(K_{T_{\cos}}(x_i, x_j))_{4 \times 4} = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}.$$
$M(K_{T_m})_{4 \times 4}$, $M(K_{T_P})_{4 \times 4}$, $M(K_{T_l})_{4 \times 4}$, and $M(K_{T_{\cos}})_{4 \times 4}$ are the fuzzy $T$-equivalence relation matrices in terms of the fuzzy $T$-norm operators $K_{T_m}(x, y)$, $K_{T_P}(x, y)$, $K_{T_l}(x, y)$, and $K_{T_{\cos}}(x, y)$, respectively. In addition, this example easily verifies Proposition 1.
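The $K_{T_m}$ fusion step of Example 2 amounts to a single minimum across the five per-attribute similarities of the pair $(x_2, x_4)$:

```python
# Fusing the five per-attribute similarities of (x2, x4) from Example 2
# with the standard min operator K_Tm.
sims = {'categorical': 1.0, 'numerical': 0.70, 'set_valued': 0.33,
        'interval': 0.61, 'image': 0.90}
k_tm_24 = min(sims.values())  # 0.33, the (x2, x4) entry of M(K_Tm)
```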
From Figure 1, it is evident that feature information is extracted by calculating the similarity function for each data type. This extracted information can then be fused using the operator $K_{T_m}$. This computational process can be viewed as multi-source information fusion, in which information from different sensors or sources is integrated and processed in a unified manner.

3.3. Two Types of FRSs Based on Similarity Function

Let $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ be an MsMIS. Given $P \subseteq A$ with $P = \{a_i, a_j\}$, if two similarity functions $S_i$ and $S_j$ are derived from $a_i$ and $a_j$, respectively, then we can fuse the similarity functions ($K_T^P = f_T(S_i, S_j)$) based on a $T$-norm to obtain a fuzzy relation, which is called the fuzzy $T$-equivalence relation. Based on the work in [10,22], we can derive two types of FRSs as shown below.
Definition 13. 
Let $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ be an MsMIS. Given $P \subseteq A$, the fuzzy $S$-lower and $T$-upper approximations of $X$ are defined by
$$\underline{K^P_{T,S}}(X)(x) = \inf_{y \in U} S(N(K_T^P(x, y)), X(y)), \quad \overline{K^P_{T,T}}(X)(x) = \sup_{y \in U} T(K_T^P(x, y), X(y)),$$
where $K_T^P = f_T(S_i, S_j)$ and $(\underline{K^P_{T,S}} X, \overline{K^P_{T,T}} X)$ are known as the $(S, T)$-fuzzy rough sets.
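A minimal sketch of Definition 13 with the standard choices $T = \min$, $S = \max$, and $N(a) = 1 - a$ (the relation matrix and fuzzy set below are illustrative, not taken from the paper; $K$ is assumed to be a fuzzy $T$-equivalence relation):

```python
# (S, T)-fuzzy rough approximations of Definition 13 with T = min,
# S = max, N(a) = 1 - a.  K is an n x n fused relation matrix, X a
# fuzzy set, both indexed by 0..n-1.
def st_approximations(K, X):
    n = len(K)
    lower = [min(max(1 - K[x][y], X[y]) for y in range(n)) for x in range(n)]
    upper = [max(min(K[x][y], X[y]) for y in range(n)) for x in range(n)]
    return lower, upper

K = [[1.0, 0.3],
     [0.3, 1.0]]
X = [1.0, 0.2]
st_approximations(K, X)  # lower ≈ [0.7, 0.2], upper ≈ [1.0, 0.3]
```

The lower approximation of each object is dragged down by its similar neighbors with low membership in $X$, and the upper approximation is pulled up by similar neighbors with high membership.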
Definition 14. 
Let $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ be an MsMIS. Given $P \subseteq A$, the fuzzy $\theta$-lower and $\sigma$-upper approximations of $X$ are defined by
$$\underline{K^P_{T,\theta}}(X)(x) = \inf_{y \in U} \theta(K_T^P(x, y), X(y)), \quad \overline{K^P_{T,\sigma}}(X)(x) = \sup_{y \in U} \sigma(N(K_T^P(x, y)), X(y)),$$
where $K_T^P = f_T(S_i, S_j)$, and $(\underline{K^P_{T,\theta}} X, \overline{K^P_{T,\sigma}} X)$ are known as the $(\theta, \sigma)$-fuzzy rough sets.

4. FRSs Based on Fuzzy Information Granularity

According to Equations (2) and (3), T, S, and N are discussed based on logical operators. In this section, we will study them from the perspective of fuzzy information granularity.
Definition 15 
([6]). For any $x \in U$ and $\vartheta \in [0,1]$, the fuzzy point $x_\vartheta$ is the fuzzy set defined by
$$x_\vartheta(y) = \begin{cases} \vartheta, & y = x, \\ 0, & y \neq x. \end{cases}$$
Definition 16 
([6]). Let $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ be an MsMIS. Given $P \subseteq A$, if $K_T^P$ is a fuzzy $T$-equivalence relation on $U$, for $X \in F(U)$ and $\vartheta \in [0,1]$, then several fuzzy information granularities can be defined as
$$[x_\vartheta]^T_{K_T^P}(y) = T(K_T^P(x, y), \vartheta), \quad [x_\vartheta]^S_{K_T^P}(y) = S(N(K_T^P(x, y)), N(\vartheta));$$
$$[x_X]^T_{K_T^P}(y) = T(K_T^P(x, y), X(x)), \quad [x_X]^S_{K_T^P}(y) = S(N(K_T^P(x, y)), N(X(x))).$$
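The $T$-granule of Definition 16 can be sketched for $T = \min$: the granule is the fuzzy point $x_\vartheta$ blurred by the relation (the relation matrix below is illustrative):

```python
# Granule [x_theta]^T of Definition 16 with T = min: the membership of
# y is min(K(x, y), theta).
def granule_T(K, x, theta):
    return [min(K[x][y], theta) for y in range(len(K))]

K = [[1.0, 0.4, 0.0],
     [0.4, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
granule_T(K, 0, 0.6)  # [0.6, 0.4, 0.0]; the center x keeps degree theta
```

Since $K$ is reflexive, the granule's value at its own center is exactly $\vartheta$, which is the content of Proposition 2 below.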

4.1. The Rationality of Fuzzy Information Granularities in an MsMIS

Definition 17. 
Assume that $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ is an MsMIS. Given $P \subseteq A$, let $\mathcal{A}^T_{K_T^P} = \{[x_\vartheta]^T_{K_T^P} : x \in U, \vartheta \in [0,1]\}$ and $\mathcal{A}^S_{K_T^P} = \{[x_\vartheta]^S_{K_T^P} : x \in U, \vartheta \in [0,1]\}$. Then, we have:
(1) $\forall x \in U$, $[x_\vartheta]^T_{K_T^P}(y) = T([x]^T_{K_T^P}(y), \vartheta)$ and $[x_\vartheta]^S_{K_T^P}(y) = S([x]^S_{K_T^P}(y), N(\vartheta))$.
(2) $\forall x \in U$, if $\vartheta < \vartheta'$, then $[x_\vartheta]^T_{K_T^P} \subseteq [x_{\vartheta'}]^T_{K_T^P}$ and $[x_{\vartheta'}]^S_{K_T^P} \subseteq [x_\vartheta]^S_{K_T^P}$.
(3) $\forall x \in U$, $[x_\vartheta]^T_{K_T^P} = \bigcup_{\vartheta' < \vartheta} [x_{\vartheta'}]^T_{K_T^P}$ and $[x_\vartheta]^S_{K_T^P} = \bigcap_{\vartheta' < \vartheta} [x_{\vartheta'}]^S_{K_T^P}$.
(4) $co_N([x_\vartheta]^T_{K_T^P}) = [x_\vartheta]^S_{K_T^P}$ and $co_N([x_\vartheta]^S_{K_T^P}) = [x_\vartheta]^T_{K_T^P}$.
(Proof of Definition 17). 
(1) Note that $K_T^P(x, y) = [x]^T_{K_T^P}(y)$ and $N(K_T^P(x, y)) = [x]^S_{K_T^P}(y)$. Hence, the proof is obtained.
(2) Since $\vartheta < \vartheta'$, it is easy to prove that $[x_\vartheta]^T_{K_T^P} \subseteq [x_{\vartheta'}]^T_{K_T^P}$ holds. As $T$ and $S$ are dual w.r.t. $N$, $[x_{\vartheta'}]^S_{K_T^P} \subseteq [x_\vartheta]^S_{K_T^P}$ also holds.
(3) Obvious.
(4) See (1) and (2). □
Definition 17 exhibits the relationships between $[x_\vartheta]^T_{K_T^P}$ and $[x]^T_{K_T^P}$, and between $[x_\vartheta]^S_{K_T^P}$ and $[x]^S_{K_T^P}$, respectively. This means $[x_\vartheta]^T_{K_T^P}$ and $[x_\vartheta]^S_{K_T^P}$ can be defined through $[x]^T_{K_T^P}$ and $[x]^S_{K_T^P}$, respectively. Therefore, it can be used as an axiom to characterize the elements of $\mathcal{A}^T_{K_T^P}$ and $\mathcal{A}^S_{K_T^P}$.
Proposition 2. 
Assume that $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ is an MsMIS. Given $P \subseteq A$, if $K_T^P$ is a fuzzy $T$-equivalence relation on $U$, for any $x \in U$ and $\vartheta \in [0,1]$, then the following conditions are equivalent.
(1) $K_T^P$ is reflexive;
(2) $[x_\vartheta]^T_{K_T^P}(x) = \vartheta$;
(3) $[x_\vartheta]^S_{K_T^P}(x) = N(\vartheta)$.
Proof of Proposition 2. 
It is obvious that (1) $\Rightarrow$ (2) and (1) $\Rightarrow$ (3).
"(2) $\Rightarrow$ (1)": Suppose $K_T^P(x, x) < 1$ and pick $\vartheta$ with $K_T^P(x, x) < \vartheta$. Then $[x_\vartheta]^T_{K_T^P}(x) \leq K_T^P(x, x) < \vartheta$, a contradiction; so $K_T^P(x, x) = 1$.
"(3) $\Rightarrow$ (1)": If $K_T^P(x, x) < 1$, then $N(K_T^P(x, x)) > 0$. Pick $\vartheta$ with $K_T^P(x, x) < \vartheta$; then $N(K_T^P(x, x)) > N(\vartheta)$. Hence, $[x_\vartheta]^S_{K_T^P}(x) = S(N(K_T^P(x, x)), N(\vartheta)) \geq N(K_T^P(x, x)) > N(\vartheta)$, a contradiction. Therefore, $K_T^P(x, x) = 1$. □
From Proposition 2, if $K_T^P$ is reflexive, we have $x_\vartheta \subseteq [x_\vartheta]^T_{K_T^P}$ and $x_{N(\vartheta)} \subseteq [x_\vartheta]^S_{K_T^P}$. As a result, $[x_\vartheta]^T_{K_T^P}$ and $[x_\vartheta]^S_{K_T^P}$ can be referred to as fuzzy information granularities with $x_\vartheta$ and $x_{N(\vartheta)}$ as their centers, respectively.
Proposition 3. 
Assume that $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ is an MsMIS. Given $P \subseteq A$, if $K_T^P$ is a fuzzy $T$-equivalence relation on $U$, for any $x \in U$, then the following conditions are equivalent.
(1) $K_T^P$ is symmetric;
(2) $[x]^T_{K_T^P}(y) = [y]^T_{K_T^P}(x)$;
(3) $[x]^S_{K_T^P}(y) = [y]^S_{K_T^P}(x)$.
Proof of Proposition 3. 
Since $[x]^T_{K_T^P}(y) = K_T^P(x, y)$ and $[x]^S_{K_T^P}(y) = N(K_T^P(x, y))$, the equivalences can be easily proved. □
Proposition 4. 
Assume that $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ is an MsMIS. Given $P \subseteq A$, if $K_T^P$ is a fuzzy $T$-equivalence relation on $U$, for any $x \in U$, then the following conditions are equivalent.
(1) $K_T^P$ is $T$-transitive;
(2) $[x]^T_{K_T^P}(y) \geq \sup_{z \in U} T([x]^T_{K_T^P}(z), [z]^T_{K_T^P}(y))$;
(3) $[x]^S_{K_T^P}(y) \leq \inf_{z \in U} S([x]^S_{K_T^P}(z), [z]^S_{K_T^P}(y))$.
Proof of Proposition 4. 
"(1) $\Rightarrow$ (2)": $\forall x, y, z \in U$, $[x]^T_{K_T^P}(y) = K_T^P(x, y) \geq T(K_T^P(x, z), K_T^P(z, y)) = T([x]^T_{K_T^P}(z), [z]^T_{K_T^P}(y))$. Hence, $[x]^T_{K_T^P}(y) \geq \sup_{z \in U} T([x]^T_{K_T^P}(z), [z]^T_{K_T^P}(y))$.
"(2) $\Rightarrow$ (1)": It is easily proved from the definition of $T$-transitivity.
"(1) $\Rightarrow$ (3)": $\forall x, y, z \in U$, $[x]^S_{K_T^P}(y) = N(K_T^P(x, y)) \leq N(T(K_T^P(x, z), K_T^P(z, y))) = S(N(K_T^P(x, z)), N(K_T^P(z, y))) = S([x]^S_{K_T^P}(z), [z]^S_{K_T^P}(y))$. Hence, we have the conclusion: $[x]^S_{K_T^P}(y) \leq \inf_{z \in U} S([x]^S_{K_T^P}(z), [z]^S_{K_T^P}(y))$.
"(3) $\Rightarrow$ (1)": It can be easily proved according to $T$-transitivity and duality. □
Theorem 1. 
Assume that $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ is an MsMIS. Given $P \subseteq A$, if $K_T^P$ is a fuzzy $T$-equivalence relation on $U$, for any $x, y, z \in U$ and $\vartheta \in [0,1]$, then
(1) $[x_\vartheta]^T_{K_T^P}(x) = \vartheta$, $[x]^T_{K_T^P}(y) = [y]^T_{K_T^P}(x)$, and $[x]^T_{K_T^P}(y) = \sup_{z \in U} T([x]^T_{K_T^P}(z), [z]^T_{K_T^P}(y))$.
(2) $[x_\vartheta]^S_{K_T^P}(x) = N(\vartheta)$, $[x]^S_{K_T^P}(y) = [y]^S_{K_T^P}(x)$, and $[x]^S_{K_T^P}(y) = \inf_{z \in U} S([x]^S_{K_T^P}(z), [z]^S_{K_T^P}(y))$.
Proof of Theorem 1. 
As $K_T^P$ is reflexive, symmetric, and $T$-transitive, it can be easily proved. □
Proposition 5. 
Assume that $\mathrm{MsMIS} = (U, A \cup \{d\}, V, f)$ is an MsMIS. Given $P \subseteq A$, if $K_T^P$ is a fuzzy $T$-equivalence relation on $U$, then we have:
(1) If $\vartheta = \vartheta'$, $T(K_T^P(x, y), \vartheta) = \vartheta$, and $S(N(K_T^P(x, y)), N(\vartheta)) = N(\vartheta)$, then $[x_\vartheta]^T_{K_T^P} = [y_{\vartheta'}]^T_{K_T^P}$ and $[x_\vartheta]^S_{K_T^P} = [y_{\vartheta'}]^S_{K_T^P}$.
(2) $[x]^T_{K_T^P}$ and $[x]^S_{K_T^P}$ are the maximal element in $\mathcal{A}^T_{K_T^P}$ and the minimal element in $\mathcal{A}^S_{K_T^P}$, respectively.
(3) $[x_\vartheta]^T_{K_T^P} = \bigcap\{[y_\nu]^T_{K_T^P} : x_\vartheta \subseteq [y_\nu]^T_{K_T^P}\}$, $[x_\vartheta]^S_{K_T^P} = \bigcup\{[y_\nu]^S_{K_T^P} : [y_\nu]^S_{K_T^P}(x) \leq N(\vartheta)\}$.
(4) $[x_\vartheta]^T_{K_T^P} = \bigcup_{z \in U} [z_\nu]^T_{K_T^P}$, where $\nu = [z_\vartheta]^T_{K_T^P}(x)$; $[x_\vartheta]^S_{K_T^P} = \bigcap_{z \in U} [z_\iota]^S_{K_T^P}$, where $\iota = [z_\vartheta]^S_{K_T^P}(x)$.
(5) $y_\xi \subseteq [x_\vartheta]^T_{K_T^P} \Leftrightarrow [y_\xi]^T_{K_T^P} \subseteq [x_\vartheta]^T_{K_T^P}$; $[x_\vartheta]^S_{K_T^P} \subseteq co_N(y_\xi) \Leftrightarrow [x_\vartheta]^S_{K_T^P} \subseteq [y_\xi]^S_{K_T^P}$.
Proof of Proposition 5. 
(1) If $[x_\vartheta]^T_{K_T^P} = [y_{\vartheta'}]^T_{K_T^P}$, then $\vartheta = \vartheta'$, and by Theorem 1(1), $T(K_T^P(x, y), \vartheta) = [x_\vartheta]^T_{K_T^P}(y) = [y_{\vartheta'}]^T_{K_T^P}(y) = \vartheta' = \vartheta$ holds. Conversely, suppose $\vartheta = \vartheta'$ and $T(K_T^P(x, y), \vartheta) = \vartheta$. For any $z \in U$, $[y_\vartheta]^T_{K_T^P}(z) = T(K_T^P(y, z), \vartheta) = T(K_T^P(y, z), T(K_T^P(x, y), \vartheta)) = T(T(K_T^P(x, y), K_T^P(y, z)), \vartheta) \leq T(K_T^P(x, z), \vartheta) = [x_\vartheta]^T_{K_T^P}(z)$. Hence $[y_\vartheta]^T_{K_T^P} \subseteq [x_\vartheta]^T_{K_T^P}$; by symmetry, $[x_\vartheta]^T_{K_T^P} \subseteq [y_\vartheta]^T_{K_T^P}$. In the same way, we can prove that $[x_\vartheta]^S_{K_T^P} = [y_{\vartheta'}]^S_{K_T^P}$.
(2) Suppose $[x]^T_{K_T^P} \subseteq [y]^T_{K_T^P}$; then $K_T^P(x, y) = 1$, which implies $[x]^T_{K_T^P} = [y]^T_{K_T^P}$. Suppose $[y]^S_{K_T^P} \subseteq [x]^S_{K_T^P}$; then $N(K_T^P(x, y)) = 0$, which implies $[x]^S_{K_T^P} = [y]^S_{K_T^P}$.
(3) Suppose $x_\vartheta \subseteq [y_\nu]^T_{K_T^P}$; then $\vartheta \leq T(K_T^P(x, y), \nu)$. For any $z \in U$, $[x_\vartheta]^T_{K_T^P}(z) = T(K_T^P(x, z), \vartheta) \leq T(K_T^P(x, z), T(K_T^P(x, y), \nu)) = T(T(K_T^P(z, x), K_T^P(x, y)), \nu) \leq T(K_T^P(y, z), \nu) = [y_\nu]^T_{K_T^P}(z)$. Hence $[x_\vartheta]^T_{K_T^P} \subseteq [y_\nu]^T_{K_T^P}$, which implies $[x_\vartheta]^T_{K_T^P} = \bigcap\{[y_\nu]^T_{K_T^P} : x_\vartheta \subseteq [y_\nu]^T_{K_T^P}\}$. Suppose $[y_\nu]^S_{K_T^P}(x) \leq N(\vartheta)$; then $S(N(K_T^P(x, y)), N(\nu)) \leq N(\vartheta)$. Hence $[x_\vartheta]^S_{K_T^P}(z) = S(N(K_T^P(x, z)), N(\vartheta)) \geq S(N(K_T^P(x, z)), S(N(K_T^P(x, y)), N(\nu))) \geq S(N(K_T^P(y, z)), N(\nu)) = [y_\nu]^S_{K_T^P}(z)$, which implies $[x_\vartheta]^S_{K_T^P} = \bigcup\{[y_\nu]^S_{K_T^P} : [y_\nu]^S_{K_T^P}(x) \leq N(\vartheta)\}$.
(4) According to the $T$-transitivity of $K_T^P$, $K_T^P(x, y) \geq T(K_T^P(x, z), K_T^P(z, y))$. This implies $T(K_T^P(x, y), \vartheta) \geq \sup_{z \in U} T(K_T^P(z, y), T(K_T^P(x, z), \vartheta))$, and taking $z = x$ shows that equality holds. Hence, $[x_\vartheta]^T_{K_T^P} = \bigcup_{z \in U} [z_\nu]^T_{K_T^P}$ with $\nu = [z_\vartheta]^T_{K_T^P}(x)$. Similarly, we can prove that $[x_\vartheta]^S_{K_T^P} = \bigcap_{z \in U} [z_\iota]^S_{K_T^P}$, where $\iota = [z_\vartheta]^S_{K_T^P}(x)$.
(5) It is straightforward. □

4.2. The Lower and Upper Approximations of Fuzzy Sets Are Constructed Using Information Granularities

In this subsection, we combine fuzzy information granularity from the perspective of granular computing. Initially, fuzzy information granularities are employed to form the lower and upper approximations of fuzzy sets. These approximations represent a seamless extension of the precise approximations found in traditional rough sets and align perfectly with those in FRSs.
Definition 18. 
Assume that $MsMIS=(U,A\cup\{d\},V,f)$ is an MsMIS. Given $P\subseteq A$, if $K_T^P$ is a fuzzy T-equivalence relation on $U$, then for any $X\in F(U)$ and $\vartheta\in[0,1]$,
$$\underline{G_{K_T,T}^P}(X)=\bigcup\{[x_\vartheta]_{K_T^P}^T : x\in U,\ \vartheta\in[0,1],\ [x_\vartheta]_{K_T^P}^T\subseteq X\},$$
$$\overline{G_{K_T,S}^P}(X)=\bigcap\{[x_\vartheta]_{K_T^P}^S : x\in U,\ \vartheta\in[0,1],\ X\subseteq[x_\vartheta]_{K_T^P}^S\},$$
where $\underline{G_{K_T,T}^P}(X)$ is referred to as the T-lower approximation of $X$ and $\overline{G_{K_T,S}^P}(X)$ as the S-upper approximation of $X$. We call the pair $(\underline{G_{K_T,T}^P}(X),\overline{G_{K_T,S}^P}(X))$ a granular $(T,S)$-fuzzy rough set.
Theorem 2. 
Assume that $MsMIS=(U,A\cup\{d\},V,f)$ is an MsMIS. Given $P\subseteq A$, if $K_T^P$ is a fuzzy T-equivalence relation on $U$, then for any $X\in F(U)$:
(1) $\underline{G_{K_T,T}^P}(X)(x)=\underline{K_{T,\theta}^P}(X)(x)$;
(2) $\overline{G_{K_T,S}^P}(X)(x)=\overline{K_{T,\sigma}^P}(X)(x)$.
Proof of Theorem 2. 
(1) We first show that $[x_\vartheta]_{K_T^P}^T\subseteq\underline{G_{K_T,T}^P}(X)$ for $\vartheta=\underline{G_{K_T,T}^P}(X)(x)$. By Definitions 16 and 18, for any $y\in U$, $[x_\vartheta]_{K_T^P}^T(y)=T(K_T^P(x,y),\vartheta)=T(K_T^P(x,y),\sup\{[z_\nu]_{K_T^P}^T(x):[z_\nu]_{K_T^P}^T\subseteq X\})=\sup\{T(K_T^P(x,y),T(K_T^P(z,x),\nu)):[z_\nu]_{K_T^P}^T\subseteq X\}$. By the T-transitivity of $K_T^P$, $T(K_T^P(x,y),T(K_T^P(z,x),\nu))\le T(K_T^P(z,y),\nu)$. Hence $[x_\vartheta]_{K_T^P}^T(y)\le\sup\{T(K_T^P(z,y),\nu):[z_\nu]_{K_T^P}^T\subseteq X\}=\underline{G_{K_T,T}^P}(X)(y)$. Therefore, $[x_\vartheta]_{K_T^P}^T\subseteq\underline{G_{K_T,T}^P}(X)$.
For any $\vartheta'>\vartheta$, we have $[x_{\vartheta'}]_{K_T^P}^T\not\subseteq X$; otherwise $\underline{G_{K_T,T}^P}(X)(x)\ge\vartheta'>\vartheta$, a contradiction. This implies $[x_\vartheta]_{K_T^P}^T$ is maximal in $\{[x_\omega]_{K_T^P}^T:\omega\in[0,1],\ [x_\omega]_{K_T^P}^T\subseteq X\}$. Consequently, $\vartheta=[x_\vartheta]_{K_T^P}^T(x)=\inf_{y\in U}\sup\{\omega:T(K_T^P(x,y),\omega)\le X(y)\}=\inf_{y\in U}\theta(K_T^P(x,y),X(y))=\underline{K_{T,\theta}^P}(X)(x)$. Therefore, $\underline{G_{K_T,T}^P}(X)(x)=\underline{K_{T,\theta}^P}(X)(x)$.
(2) $\overline{G_{K_T,S}^P}(X)=co_N\underline{G_{K_T,T}^P}(co_N X)=co_N\underline{K_{T,\theta}^P}(co_N X)=\overline{K_{T,\sigma}^P}(X)$. □
Example 3. 
By Example 2, the fuzzy T-equivalence relation $K_{T_m}^P$ obtained with the min operator $T_m$ is
$$K_{T_m}^P(x_i,x_j)=\begin{pmatrix}1&0&0&0\\0&1&0.37&0.33\\0&0.37&1&0.14\\0&0.33&0.14&1\end{pmatrix}.$$
Then, for any $X\in F(U)$, let $T(x,y)=\max\{0,x+y-1\}$ and $S(x,y)=\min\{1,x+y\}$. By Definition 15, $\overline{K_{T,\sigma}^P}(X)(x)=\sup_{y\in U}\max\{0,K_T^P(x,y)+X(y)-1\}$ and $\underline{K_{T,\theta}^P}(X)(x)=\inf_{y\in U}\min\{1,1-K_T^P(x,y)+X(y)\}$. Assume that $X=\frac{0.1}{x_1}+\frac{0.2}{x_2}+\frac{0.7}{x_3}+\frac{0.9}{x_4}$. We have:
$$\overline{K_{T,\sigma}^P}(X)=\frac{0.1}{x_1}+\frac{0.23}{x_2}+\frac{0.7}{x_3}+\frac{0.9}{x_4};$$
$$\underline{K_{T,\theta}^P}(X)=\frac{0.1}{x_1}+\frac{0.2}{x_2}+\frac{0.7}{x_3}+\frac{0.87}{x_4}.$$
For $x\in U$ and $\vartheta\in[0,1]$, $[x_\vartheta]_{K_{T_m}^P}^T(y)=\max\{0,K_{T_m}^P(x,y)+\vartheta-1\}$. Taking $\vartheta_i=\underline{K_{T,\theta}^P}(X)(x_i)$ for each $x_i$, we obtain:
$$[(x_1)_{0.1}]_{K_{T_m}^P}^T=\frac{0.1}{x_1}+\frac{0}{x_2}+\frac{0}{x_3}+\frac{0}{x_4};\qquad [(x_2)_{0.2}]_{K_{T_m}^P}^T=\frac{0}{x_1}+\frac{0.2}{x_2}+\frac{0}{x_3}+\frac{0}{x_4};$$
$$[(x_3)_{0.7}]_{K_{T_m}^P}^T=\frac{0}{x_1}+\frac{0.07}{x_2}+\frac{0.7}{x_3}+\frac{0}{x_4};\qquad [(x_4)_{0.87}]_{K_{T_m}^P}^T=\frac{0}{x_1}+\frac{0.2}{x_2}+\frac{0.01}{x_3}+\frac{0.87}{x_4}.$$
From Example 3, $\underline{K_{T,\theta}^P}(X)$ is exactly the union of these four fuzzy granules ($[(x_1)_{0.1}]_{K_{T_m}^P}^T$, $[(x_2)_{0.2}]_{K_{T_m}^P}^T$, $[(x_3)_{0.7}]_{K_{T_m}^P}^T$, and $[(x_4)_{0.87}]_{K_{T_m}^P}^T$), i.e.,
$$\underline{G_{K_T,T}^P}(X)=[(x_1)_{0.1}]_{K_{T_m}^P}^T\cup[(x_2)_{0.2}]_{K_{T_m}^P}^T\cup[(x_3)_{0.7}]_{K_{T_m}^P}^T\cup[(x_4)_{0.87}]_{K_{T_m}^P}^T=\underline{K_{T,\theta}^P}(X).$$
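The computation in Example 3 can be checked numerically. The sketch below (illustrative; the variable names are ours, not from the paper) builds the Łukasiewicz lower and upper approximations from the fused relation and verifies that the union of the four granules reproduces the lower approximation:

```python
import numpy as np

# Fused fuzzy T-equivalence relation from Example 3 and the fuzzy set X.
K = np.array([[1.0, 0.0,  0.0,  0.0 ],
              [0.0, 1.0,  0.37, 0.33],
              [0.0, 0.37, 1.0,  0.14],
              [0.0, 0.33, 0.14, 1.0 ]])
X = np.array([0.1, 0.2, 0.7, 0.9])

# Lukasiewicz pair: T(a, b) = max(0, a+b-1), residual theta(a, b) = min(1, 1-a+b).
lower = np.min(np.minimum(1.0, 1.0 - K + X[None, :]), axis=1)   # theta-lower
upper = np.max(np.maximum(0.0, K + X[None, :] - 1.0), axis=1)   # sigma-upper

# Granules [x_theta]^T(y) = T(K(x,y), theta); their union recovers the lower approximation.
granules = np.maximum(0.0, K + lower[:, None] - 1.0)   # row i = [(x_i)_{lower_i}]^T
union = granules.max(axis=0)

print(lower.round(2))   # [0.1  0.2  0.7  0.87]
print(upper.round(2))   # [0.1  0.23 0.7  0.9 ]
print(np.allclose(union, lower))   # True
```

The union-of-granules representation thus agrees with the pointwise formula of Theorem 2 on this example.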
Definition 19. 
Assume that $MsMIS=(U,A\cup\{d\},V,f)$ is an MsMIS. Given $P\subseteq A$, if $K_T^P$ is a fuzzy T-equivalence relation on $U$, then for any $X\in F(U)$,
$$\underline{G_{K_T,\theta}^P}(X)=\bigcap\{[x_{N(X(x))}]_{K_T^P}^S : x\in U\},\qquad \overline{G_{K_T,\sigma}^P}(X)=\bigcup\{[x_{X(x)}]_{K_T^P}^T : x\in U\},$$
where $\underline{G_{K_T,\theta}^P}$ is referred to as the θ-lower approximation of $X$ and $\overline{G_{K_T,\sigma}^P}$ as the σ-upper approximation of $X$. We call the pair $(\underline{G_{K_T,\theta}^P}(X),\overline{G_{K_T,\sigma}^P}(X))$ a granular $(\theta,\sigma)$-fuzzy rough set.
Theorem 3. 
Assume that $MsMIS=(U,A\cup\{d\},V,f)$ is an MsMIS. Given $P\subseteq A$, if $K_T^P$ is a fuzzy T-equivalence relation on $U$, then for any $X\in F(U)$:
(1) $\underline{G_{K_T,\theta}^P}(X)(x)=\underline{K_{T,S}^P}(X)(x)$;
(2) $\overline{G_{K_T,\sigma}^P}(X)(x)=\overline{K_{T,T}^P}(X)(x)$.
Proof of Theorem 3. 
(1) For any $y\in U$, using the involutiveness of $N$, $\underline{G_{K_T,\theta}^P}(X)(y)=\inf_{x\in U}[x_{N(X(x))}]_{K_T^P}^S(y)=\inf_{x\in U}S(N(K_T^P(x,y)),X(x))=\underline{K_{T,S}^P}(X)(y)$.
(2) For any $y\in U$, $\overline{G_{K_T,\sigma}^P}(X)(y)=(\bigcup\{[x_{X(x)}]_{K_T^P}^T:x\in U\})(y)=\sup_{x\in U}[x_{X(x)}]_{K_T^P}^T(y)=\sup_{x\in U}T(K_T^P(x,y),X(x))=\overline{K_{T,T}^P}(X)(y)$. □
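Theorem 3 can likewise be checked on the Example 3 data. The sketch below assumes the standard negator $N(a)=1-a$ and the Łukasiewicz pair, under which the θ-lower and σ-upper approximations coincide with the values computed in Example 3 (the symmetry of $K$ makes the sup/inf over $x$ and over $y$ interchangeable here):

```python
import numpy as np

K = np.array([[1.0, 0.0,  0.0,  0.0 ],
              [0.0, 1.0,  0.37, 0.33],
              [0.0, 0.37, 1.0,  0.14],
              [0.0, 0.33, 0.14, 1.0 ]])
X = np.array([0.1, 0.2, 0.7, 0.9])

# sigma-upper (Definition 19): union over x of [x_{X(x)}]^T,
# i.e. sup_x T(K(x,y), X(x)) with T(a, b) = max(0, a+b-1).
upper_sigma = np.max(np.maximum(0.0, K + X[:, None] - 1.0), axis=0)

# theta-lower (Definition 19): intersection over x of [x_{N(X(x))}]^S,
# i.e. inf_x S(N(K(x,y)), X(x)) with S(a, b) = min(1, a+b) and N(a) = 1-a.
lower_theta = np.min(np.minimum(1.0, (1.0 - K) + X[:, None]), axis=0)

print(upper_sigma.round(2))   # matches the upper approximation of Example 3
print(lower_theta.round(2))   # matches the lower approximation of Example 3
```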

4.3. Continuation of Classical Attribute Reduction in the FRS Model

Definition 20. 
Let $MsMIS=(U,A\cup\{d\},V,f)$ be an MsMIS. Given $P\subseteq A$ and $x,y\in U$, let $K_T^P$ be the fuzzy T-equivalence relation on $U$ computed over $P$ by $K_T^P(x,y)$. For any $\vartheta\in[0,1]$, the fuzzy positive region of $D$ w.r.t. $P$ is defined by
$$POS_{T,\vartheta}^P(D)=\bigcup_{i=1}^{l}\underline{G_{K_T,\vartheta}^P}(d_i),$$
where $D=\{d_1,d_2,\dots,d_l\}$.
Definition 21. 
Let $MsMIS=(U,A\cup\{d\},V,f)$ be an MsMIS. Given $P\subseteq A$, let $K_T^P$ be the fuzzy T-equivalence relation on $U$ computed over $P$ by $K_T^P(x,y)$. For any $\vartheta\in[0,1]$, the quality of approximating classification is defined as
$$\gamma_{T,\vartheta}^P(D)=\frac{|POS_{T,\vartheta}^P(D)|}{|U|},$$
where $D=\{d_1,d_2,\dots,d_l\}$ and $|POS_{T,\vartheta}^P(D)|=\sum_{i=1}^{l}\sum_{x\in d_i}\underline{G_{K_T,\vartheta}^P}(d_i)(x)$.
The fuzzy approximation quality is a fuzzy extension of the approximation quality in Pawlak rough sets. Since the lower approximation reflects the degree to which a sample necessarily belongs to its decision class in the fuzzy approximation space, the fuzzy approximation quality measures the average degree to which samples necessarily belong to their decision classes in the feature fuzzy approximation space. The larger this value, the more certain the classification in the fuzzy approximation space, the stronger the ability to approximate the decision classes, and the greater the dependency of the decision on the attribute subset. For this reason, the approximation quality is also referred to as the degree of dependence of the decision on the condition attributes.
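As an illustration of Definition 21, the sketch below computes the approximation quality for the Example 3 relation. It rests on our own assumptions, not on a computation given in the paper: the decision column $d=(1,0,0,1)$ is read from Table 1, the decision classes are treated as crisp fuzzy sets, and the Łukasiewicz lower approximation is used:

```python
import numpy as np

K = np.array([[1.0, 0.0,  0.0,  0.0 ],
              [0.0, 1.0,  0.37, 0.33],
              [0.0, 0.37, 1.0,  0.14],
              [0.0, 0.33, 0.14, 1.0 ]])
d = np.array([1, 0, 0, 1])   # decision column of Table 1

def luk_lower(K, Xc):
    # theta-lower approximation with theta(a, b) = min(1, 1 - a + b)
    return np.min(np.minimum(1.0, 1.0 - K + Xc[None, :]), axis=1)

pos = 0.0
for label in np.unique(d):
    Xc = (d == label).astype(float)             # crisp decision class as a fuzzy set
    pos += luk_lower(K, Xc)[d == label].sum()   # membership of each sample in its own class

gamma = pos / len(d)
print(round(gamma, 2))   # 0.8 for this toy system
```

Each sample contributes the degree to which it necessarily belongs to its own decision class, and the mean of these degrees is the approximation quality.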
Theorem 4. 
Let $MsMIS=(U,A\cup\{d\},V,f)$ be an MsMIS. Given $P,Q\subseteq A$ with $P=\{a_1,a_2,\dots,a_p\}$, $Q=\{a_1,a_2,\dots,a_q\}$, and $P\subseteq Q$ (so $p\le q$), let $K_T^P$ and $K_T^Q$ be the fuzzy T-equivalence relations computed by the kernel fusion function $K_{T_m}(x,y)$ in the feature spaces $P$ and $Q$, respectively. For any $\vartheta\in[0,1]$, we have:
(1) $K_T^Q\subseteq K_T^P$;
(2) $\underline{K_{T,\vartheta}^P}(d_i)\subseteq\underline{K_{T,\vartheta}^Q}(d_i)$;
(3) $\overline{K_{T,\vartheta}^P}(d_i)\supseteq\overline{K_{T,\vartheta}^Q}(d_i)$;
(4) $POS_{T,\vartheta}^P(D)\subseteq POS_{T,\vartheta}^Q(D)$;
(5) $\gamma_{T,\vartheta}^P(D)\le\gamma_{T,\vartheta}^Q(D)$.
Proof of Theorem 4. 
(1) Obvious.
(2) Since $P\subseteq Q$, we have $p\le q$. Writing $K_{T_m}^P=\min\{K_1,K_2,\dots,K_p\}=a$, we obtain $K_{T_m}^Q=\min\{a,K_{p+1},K_{p+2},\dots,K_q\}$. Hence $K_{T_m}^Q\subseteq K_{T_m}^P$, and the same argument applies to the other T-norms. Therefore, for any $\vartheta\in[0,1]$, we easily obtain $\underline{K_{T,\vartheta}^P}(d_i)\subseteq\underline{K_{T,\vartheta}^Q}(d_i)$ and $\overline{K_{T,\vartheta}^P}(d_i)\supseteq\overline{K_{T,\vartheta}^Q}(d_i)$.
(3) See (2).
(4) By Definition 20 and (2), the claim follows from the monotonicity of the lower approximation.
(5) By Definition 21 and (4), the claim follows immediately. □
Theorem 4 shows that the approximation quality increases monotonically as attributes are added. The addition of new attributes introduces new classification information, making the granulation structure of the approximation space finer. Consequently, the approximation ability is enhanced, and the approximation of the classification decision may become more accurate.
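The monotonicity behind Theorem 4 can be observed numerically: min-fusing additional similarity matrices can only shrink the fused relation, and the Łukasiewicz lower approximation is antitone in the relation. The randomized check below is purely illustrative (the data are random symmetric similarity matrices, not T-transitive relations; the elementwise monotonicity holds regardless):

```python
import numpy as np

rng = np.random.default_rng(0)
n, q = 6, 5

# One symmetric similarity matrix per attribute, with unit diagonal.
kernels = []
for _ in range(q):
    M = rng.random((n, n))
    M = np.minimum(M, M.T)
    np.fill_diagonal(M, 1.0)
    kernels.append(M)

X = rng.random(n)   # an arbitrary fuzzy set

def luk_lower(K, X):
    # theta-lower approximation with theta(a, b) = min(1, 1 - a + b)
    return np.min(np.minimum(1.0, 1.0 - K + X[None, :]), axis=1)

K_P = np.minimum.reduce(kernels[:3])   # P = {a1, a2, a3}
K_Q = np.minimum.reduce(kernels)       # Q = {a1, ..., a5}, so P ⊆ Q

print((K_Q <= K_P).all())                                  # True: finer relation
print((luk_lower(K_Q, X) >= luk_lower(K_P, X)).all())      # True: larger lower approximation
```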
Definition 22. 
Let $MsMIS=(U,A\cup\{d\},V,f)$ be an MsMIS and $D=\{d_1,d_2,\dots,d_l\}$. For any $\vartheta\in[0,1]$ and $a\in P\subseteq A$, if $\gamma_{T,\vartheta}^P(D)=\gamma_{T,\vartheta}^{P\setminus\{a\}}(D)$, then $a$ is redundant in $P$ w.r.t. $D$; otherwise, $a$ is indispensable.
Definition 23. 
Let $MsMIS=(U,A\cup\{d\},V,f)$ be an MsMIS and $D=\{d_1,d_2,\dots,d_l\}$. For any $\vartheta\in[0,1]$ and $P\subseteq A$, $P$ is a relative reduct w.r.t. $D$ if it satisfies:
(1) $\gamma_{T,\vartheta}^P(D)=\gamma_{T,\vartheta}^A(D)$;
(2) $\forall a\in P$, $a$ is indispensable in $P$ w.r.t. the decision attribute $D$.
Definitions 22 and 23 follow the classical attribute reduction approach: attributes are evaluated through the approximation quality, which captures the indiscernibility relation they induce. This is a direct continuation of attribute reduction in classical rough set theory, and it applies equally within the FRS models.
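Definitions 22 and 23 suggest a straightforward backward-elimination procedure: drop an attribute whenever the approximation quality is unchanged. A hedged sketch follows; the `gamma` and `reduct` helpers are ours, not from the paper, and min-fusion of per-attribute similarity matrices with the Łukasiewicz lower approximation is assumed:

```python
import numpy as np

def gamma(K, d):
    """Approximation quality (Definition 21) under the Lukasiewicz lower approximation."""
    pos = 0.0
    for label in np.unique(d):
        Xc = (d == label).astype(float)
        low = np.min(np.minimum(1.0, 1.0 - K + Xc[None, :]), axis=1)
        pos += low[d == label].sum()
    return pos / len(d)

def reduct(kernels, d, tol=1e-12):
    """Backward elimination: remove an attribute if gamma stays unchanged (Definition 22)."""
    fuse = lambda idx: np.minimum.reduce([kernels[i] for i in idx])
    P = list(range(len(kernels)))
    full = gamma(fuse(P), d)
    for a in list(P):
        rest = [i for i in P if i != a]
        if rest and abs(gamma(fuse(rest), d) - full) <= tol:
            P = rest   # a is redundant in P w.r.t. D
    return P
```

By construction, the returned subset preserves the full approximation quality, matching condition (1) of Definition 23; minimality in the sense of condition (2) is guaranteed only relative to the chosen elimination order.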

5. Conclusions

In this paper, the multi-source mixed information system (MsMIS) is defined as a complex data representation that integrates diverse data types. For each data type, a similarity function is designed based on its inherent characteristics; it plays a role analogous to a kernel function in computing the similarity between two samples. Within the framework of granular computing, we further use fuzzy similarity relations to construct information granules and thereby build a fuzzy rough set model tailored to the MsMIS. We have shown that the lower and upper approximations of this model can be represented by basic information granules and have examined its mathematical properties. The proposed fuzzy rough set model holds significant implications for both theoretical research and practical applications.
In future work, we aim to explore the application of the proposed FRS models to two key areas, attribute reduction and outlier detection. By leveraging FRS models, we intend to streamline datasets by identifying and eliminating redundant or irrelevant attributes, thus improving computational efficiency and model interpretability. Additionally, we will apply the models to outlier detection, helping to identify data points that deviate significantly from the norm, which could enhance the accuracy and robustness of predictive models. This approach will be evaluated to verify the superior performance of the FRS models in these tasks.

Author Contributions

Conceptualization, P.Z.; Formal analysis, P.Z. and Y.Z. (Yujie Zhang); Funding acquisition, Y.Z. (Yuxin Zhao); Investigation, D.W. and Y.Z. (Yuxin Zhao); Methodology, Y.Z. (Yujie Zhang) and D.W.; Project administration, Y.Z. (Yuxin Zhao); Software, D.W. and Y.Z. (Yuxin Zhao); Supervision, P.Z. and Y.Z. (Yujie Zhang); Validation, P.Z. and Z.Y.; Writing—original draft, Z.Y.; Writing—review and editing, P.Z. and Z.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the ’Sichuan Science and Technology Program’ grant number 2022YFS0036, the ’Postdoctoral Fellowship Program of CPSF’ grant numbers GZB20230092 and GZB20230091, and the ’China Postdoctoral Science Foundation’ grant numbers 2023M740383, 2023M730378 and 2023MD744127.

Data Availability Statement

No data were used in the preparation of this manuscript.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Figure 1. The calculation process of multi-source mixed data fusion.
Table 1. An MsMIS.

U     a1    a2    a3           a4      a5            d
x1    -     50    {M, P}       [1,3]   (image i001)  1
x2    +     36    {M, U, P}    [1,2]   (image i002)  0
x3    +     28    {M, U}       [2,3]   (image i003)  0
x4    +     16    {U}          [0,2]   (image i004)  1

