Abstract
A set-valued information system (SIS) is a generalization of a single-valued information system. This article explores uncertainty measurement for a SIS by means of the Gaussian kernel. The fuzzy T_cos-equivalence relation induced by a SIS is first obtained using the Gaussian kernel. Then, information structures in this SIS are described by set vectors. Next, dependence between information structures is defined and properties of information structures are investigated. Lastly, uncertainty measures of a SIS are presented by means of its information structures, and an effectiveness analysis is carried out to assess the feasibility of the presented measures. The results of this article will help us understand the intrinsic properties of uncertainty in a SIS.
1. Introduction
1.1. Research Background and Related Works
Granular computing (GrC), a fundamental issue in knowledge representation and data mining, was introduced by Zadeh [1,2,3,4]. Information granulation, organization and causation are its basic notions. An information granule is a collection of objects that are drawn together by some constraints, and the process of building information granules is referred to as information granulation. Information granulation partitions a universe into a family of disjoint information granules. A granular structure is a collection of information granules in which the internal structure of each granule is visible as a sub-structure. Lin [5,6,7] and Yao [8,9,10] illustrated the importance of GrC, which aroused wide interest in it. To date, research on GrC has followed four main approaches: rough set theory (RST) [11], fuzzy set theory [12], concept lattices [13,14] and quotient spaces [15].
RST is an effective tool to manage uncertainty. An information system (IS) on the basis of RST was presented by Pawlak [11,16,17,18,19]. Many applications of RST, for instance, uncertainty modeling [20,21,22,23], reasoning with uncertainty [8,24,25], rule extraction [26,27,28,29], classification and feature selection [30,31,32,33,34] are related to ISs.
In GrC in an IS, the study of information structures is a significant research topic. An equivalence relation captures a particular kind of similarity between two objects from a dataset. Each attribute subset determines an equivalence relation which partitions the object set into disjoint classes. These disjoint classes are called equivalence classes, and each of them may be regarded as an information granule consisting of indistinguishable objects [26]. The collection of all these information granules constitutes an information structure, expressed as a set vector, in the given IS induced by this attribute subset.
Uncertainty measurement is an important issue in many fields, such as machine learning [35], pattern recognition [36,37], image processing [38], medical diagnosis [39] and data mining [40]. Scholars have explored this topic from several angles. For example, Yao et al. [9] presented a granularity measure from the viewpoint of granulation; Wierman [29] provided measures of uncertainty and granularity in RST; Bianucci et al. [41,42] explored entropy and co-entropy approaches for uncertainty measurement of coverings; Yao [25] studied several types of information-theoretical measures for attribute importance in RST; Beaubouef et al. [43] proposed a method for measuring the uncertainty of rough sets; Liang et al. [44,45] investigated information granulation in complete information systems; Dai et al. [46] researched entropy and granularity measures for SISs; Qian et al. [47,48] presented the axiomatic definition of information granulation in a knowledge base and examined the information granularity of a fuzzy relation by using its fuzzy granular structure; Xu et al. [49] considered knowledge granulation in ordered information systems; Dai et al. [50] studied the uncertainty of incomplete interval-valued information systems based on α-weak similarity; Xie et al. [51] put forward new uncertainty measures for an interval-valued information system; Zhang et al. [52] measured the uncertainty of a fully fuzzy information system.
1.2. Motivation and Inspiration
A SIS has inherent uncertainty, and how to find uncertainty measures for a SIS is a meaningful research issue. However, until now, the study of uncertainty measurement for a SIS has not been reported. The purpose of this article is to address uncertainty measurement in a SIS by using its information structures. The information granule of each object in a given SIS is first constructed by means of the Gaussian kernel, and information structures are consequently proposed. The uncertainty of the SIS is then measured by using its information structures. To evaluate the performance of the presented measures, an effectiveness analysis is performed by means of elementary statistical methods.
Why do we investigate uncertainty measurement for a SIS? Because a SIS itself carries uncertainty. Why do we use information structures to measure the uncertainty of a SIS? Because it is otherwise hard to compare uncertainty measure values across SISs; once dependence between two information structures is established, the measure values can be compared through that dependence.
The remaining sections of this article proceed as follows: Section 2 reviews some notions about fuzzy sets, fuzzy relations and SISs. Section 3 proposes a distance between two objects in a SIS. Section 4 obtains the fuzzy T_cos-equivalence relation induced by a SIS using the Gaussian kernel. Section 5 investigates information structures in a SIS. Section 6 gives some tools for assessing the uncertainty of a SIS. Section 7 summarizes this article.
2. Preliminaries
Some notions about fuzzy sets, fuzzy relations and SISs are reviewed.
Throughout this article, U denotes a finite set, 2^U denotes the family of all subsets of U, and I denotes the unit interval [0, 1].
Put
2.1. Fuzzy Sets and Fuzzy Relations
Fuzzy sets are extensions of ordinary sets [12]. A fuzzy set P in U is defined as a function P: U → I assigning to each element u of U a value P(u); P(u) is referred to as the membership degree of u in the fuzzy set P.
In this article, denotes the set of all fuzzy sets in U. The cardinality of can be calculated with
If R is a fuzzy set in U × U, then R is referred to as a fuzzy relation on U. In this article, expresses the set of all fuzzy relations on U.
Let . Then R may be represented by
where each matrix entry expresses the similarity between the corresponding pair of objects.
If the matrix of R is the identity matrix, then R is said to be a fuzzy identity relation; if every entry of the matrix equals 1, then R is said to be a fuzzy universal relation.
Let . , a fuzzy set is addressed as
Then can be viewed as the information granule of the point u [48].
Definition 1
([53]). A function T: I × I → I is called a t-norm if it satisfies:
- (1)
- Commutativity: T(a, b) = T(b, a) for all a, b ∈ I;
- (2)
- Associativity: T(a, T(b, c)) = T(T(a, b), c) for all a, b, c ∈ I;
- (3)
- Monotonicity: T(a, b) ≤ T(a, c) whenever b ≤ c;
- (4)
- Boundary condition: T(a, 1) = a for all a ∈ I.
Example 1.
For any a, b ∈ I, denote T_cos(a, b) = max(ab − √(1 − a²)·√(1 − b²), 0).
Then T_cos is a t-norm.
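Since the formula in Example 1 did not survive extraction, the sketch below assumes it is Moser's t-norm T_cos, the t-norm under which Gaussian kernels are known to be transitive; the function name is illustrative.

```python
import math

def t_cos(a, b):
    """T_cos(a, b) = max(ab - sqrt(1 - a^2) * sqrt(1 - b^2), 0).

    Commutativity and the boundary condition T_cos(a, 1) = a are easy
    to check directly; Moser [55] showed Gaussian kernels are
    T_cos-transitive.
    """
    return max(a * b - math.sqrt(1 - a * a) * math.sqrt(1 - b * b), 0.0)
```

For instance, T_cos(a, 1) = a because the square-root term vanishes, which is exactly the boundary condition of Definition 1.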
Definition 2
([54]). Suppose that T is a t-norm. Suppose . Then R is a fuzzy T-equivalence relation on U if it satisfies:
- (1)
- Reflexivity: R(u, u) = 1 for all u ∈ U;
- (2)
- Symmetry: R(u, v) = R(v, u) for all u, v ∈ U;
- (3)
- T-transitivity: T(R(u, v), R(v, w)) ≤ R(u, w) for all u, v, w ∈ U.
Proposition 1
([55]). Suppose that satisfies for all Then ,
Corollary 1.
Given a fuzzy relation R obtained from a Gaussian kernel. If R is reflexive, then R is T_cos-transitive.
2.2. Set-Valued Information Systems
Definition 3
([11]). Suppose that U is a finite set of objects and A is a finite set of attributes. Then (U, A) is referred to as an IS if each attribute a ∈ A determines an information function a: U → V_a, where V_a is the set of values of a.
If B ⊆ A, then (U, B) is referred to as a subsystem of (U, A).
Definition 4
([56]). Let (U, A) be an IS. If a(u) is a set for every u ∈ U and a ∈ A, then (U, A) is referred to as a set-valued information system (SIS).
If B ⊆ A, then (U, B) is referred to as a subsystem of (U, A).
3. The Distance between Two Objects in a SIS
Definition 5
([54]). Suppose that (U, A) is a SIS. For each attribute a ∈ A and objects u, v ∈ U, the distance between the set values a(u) and a(v) is defined as
where .
According to the above definition, the distance between two objects in a SIS is given in the following.
Definition 6.
Assume that (U, A) is a SIS. Given u, v ∈ U and B ⊆ A, the distance between u and v in (U, B) is defined as
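The formulas of Definitions 5 and 6 did not survive extraction, so the following is only a sketch of the idea under stated assumptions: a Jaccard-style dissimilarity between two set values, averaged over the chosen attributes. The paper's exact formulas may differ, and both function names are hypothetical.

```python
def set_distance(s, t):
    """Assumed per-attribute distance between two set values
    (Definition 5): 1 - |s ∩ t| / |s ∪ t|, i.e. Jaccard dissimilarity."""
    s, t = set(s), set(t)
    if not (s | t):          # two empty set values are identical
        return 0.0
    return 1.0 - len(s & t) / len(s | t)

def sis_distance(u, v, attrs):
    """Assumed object-level distance (Definition 6): the average of the
    per-attribute distances over the attribute subset `attrs`.
    Objects are dicts mapping attribute name -> set of values."""
    return sum(set_distance(u[a], v[a]) for a in attrs) / len(attrs)
```

Under this instantiation the distance always lies in [0, 1] and equals 0 exactly when the two objects take the same set value on every attribute in the subset.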
Proposition 2.
Let be a SIS. Given . Then
Proof.
Then
Thus
□
Example 3.
Calculate in Table 1. We have
Then
4. The Fuzzy T_cos-Equivalence Relation Induced by a SIS
A Gaussian kernel maps data into a feature space in which linear techniques apply, which simplifies classification tasks [57,58]. Hu et al. [59,60] established connections between rough sets and the Gaussian kernel, through which fuzzy relations can be obtained from the Gaussian kernel. In this section, a fuzzy T_cos-equivalence relation on the object set of a SIS is extracted by using a Gaussian kernel.
The Gaussian kernel k(u, v) = exp(−‖u − v‖² / (2δ²)) computes the similarity between objects u and v, where ‖u − v‖ is the Euclidean distance between u and v and δ > 0 is a threshold (the kernel width). In this article, pick .
Let (U, A) be a SIS. Given B ⊆ A and δ > 0. Since the distance of Definition 6 expresses how far apart objects u and v are in (U, B), it can replace the Euclidean distance in the Gaussian kernel. The resulting kernel value measures the similarity between objects u and v in (U, B), and therefore defines a fuzzy relation on U. The specific definition is as follows.
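As a minimal sketch of the substitution just described, the Gaussian kernel can be applied to any precomputed SIS distance; the function name is illustrative.

```python
import math

def gaussian_similarity(dist, delta):
    """Gaussian kernel similarity exp(-d^2 / (2 * delta^2)) applied to a
    precomputed distance `dist` between two objects of a SIS."""
    return math.exp(-dist ** 2 / (2 * delta ** 2))
```

Identical objects (distance 0) get similarity 1, and similarity decays toward 0 as the distance grows, with δ controlling how quickly.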
Definition 7.
Let be a SIS. Given and , denote
Then this matrix is said to be the Gaussian kernel matrix of (U, B) relative to δ.
Theorem 1.
Let (U, A) be a SIS. Given B ⊆ A and δ > 0. Then the relation of Definition 7 is a fuzzy T_cos-equivalence relation on U.
Proof.
This holds by Corollary 1. □
Definition 8.
Let (U, A) be a SIS. Given B ⊆ A and δ > 0. Then the relation of Definition 7 is referred to as the fuzzy T_cos-equivalence relation induced by (U, B) relative to δ.
Example 4.
(Continued from Example 2) Pick , we have
Then is the fuzzy -equivalence relation induced by the subsystem with respect to δ.
Given B ⊆ A and δ > 0, an algorithm for computing the fuzzy T_cos-equivalence relation is designed as follows (Algorithm 1).
Algorithm 1: The fuzzy T_cos-equivalence relation.
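The pseudocode of Algorithm 1 exists only as an image in the source, so the following is a rough Python reconstruction under the same assumptions used earlier (Jaccard-style set distance averaged over attributes, then the Gaussian kernel); the function names and data layout (each object as a dict mapping attribute to a set of values) are illustrative.

```python
import math

def fuzzy_relation_matrix(objects, attrs, delta):
    """Build the n x n Gaussian kernel matrix of a SIS subsystem:
    for every pair of objects compute their (assumed Jaccard-averaged)
    distance, then the Gaussian kernel value exp(-d^2 / (2 * delta^2))."""
    def set_dist(s, t):
        s, t = set(s), set(t)
        return 0.0 if not (s | t) else 1.0 - len(s & t) / len(s | t)

    def dist(u, v):
        return sum(set_dist(u[a], v[a]) for a in attrs) / len(attrs)

    n = len(objects)
    return [[math.exp(-dist(objects[i], objects[j]) ** 2 / (2 * delta ** 2))
             for j in range(n)] for i in range(n)]
```

The resulting matrix is reflexive (ones on the diagonal) and symmetric by construction, matching Theorem 1.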
5. Information Structures in a SIS
In this section, information structures in a SIS are investigated.
5.1. Some Concepts of Information Structures in a SIS
Given . Then for each i, can be viewed as the fuzzy neighborhood or the information granule of the point [48]. According to this view, Qian et al. [48] defined the fuzzy granular structure of R as follows:
Let be a SIS. Given and . Then, by Theorem 1, is a fuzzy -equivalence relation on U. For each i, can be viewed as the fuzzy neighborhood or the information granule of the point . Based on Qian’s idea, we have the following definition.
Definition 9.
Let be a SIS. and , denote
Then this set vector is referred to as the δ-information structure of (U, B).
Example 5.
(Continued from Example 4)
is -information structure of .
Definition 10.
Let be a SIS. Given . Put
Then is referred to as δ-information structure base of .
Definition 11.
Assume that is a SIS. Given and . If , , then and are called the same, which is written as .
Below, dependence between information structures is proposed.
Definition 12.
Let be a SIS. Given and .
- (1)
- is said to be dependent on , if , , which is written as .
- (2)
- is said to be strictly dependent on , if and , which is written as .
5.2. Properties of Information Structures in a SIS
Theorem 2.
Let be a SIS. Given and . Then
Proof.
Obviously. □
Theorem 3.
Let be a SIS. Given and . Then
Proof.
Clearly. □
Corollary 2.
Let be a SIS. Given and . Then
Proof.
This follows from Theorems 2 and 3. □
Theorem 4.
Let be a SIS.
- (1)
- If , then , ;
- (2)
- If , then , .
Proof.
(1) For any , it is clear that
Then
So
By Theorem 3,
(2) By Definition 7,
Then
So
Thus, by Theorem 3,
□
Corollary 3.
Let be a SIS. Given and . Then
, .
Proof.
This holds by Theorem 4. □
6. Measuring Uncertainty of a SIS
In this section, some tools for evaluating uncertainty of a SIS are proposed.
6.1. Granulation Measures for a SIS
Definition 13.
Let be a SIS. Suppose that is a function. Given . Then is referred to as an information granulation function in with respect to δ, if satisfies the following conditions:
- (1)
- Non-negativity: , ;
- (2)
- Invariability: , if , then ;
- (3)
- Monotonicity: , if , then .
Here, is referred to as δ-information granulation of .
Similar to Definition 5 in [48], the definition of -information granulation of a SIS is given in the following.
Definition 14.
Let be a SIS. Given . Then , δ-information granulation of is addressed as
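The formula of Definition 14 was lost in extraction; the sketch below assumes the Qian-style form G = (1/n²)·Σᵢⱼ R(xᵢ, xⱼ), which is consistent with Proposition 3 (value 1/n for the fuzzy identity relation, 1 for the fuzzy universal relation). The paper's exact formula may differ.

```python
def information_granulation(R):
    """Assumed delta-information granulation of a fuzzy relation matrix R:
    G = (1 / n^2) * sum of all entries. Each row sum |R(x_i)| is the size
    of the information granule of x_i, so G averages granule sizes."""
    n = len(R)
    return sum(sum(row) for row in R) / (n * n)
```

Under this form, the identity relation (every granule a single object) gives the minimum 1/n and the universal relation gives the maximum 1, as Proposition 3 states.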
Example 6.
(Continued from Example 4)
Proposition 3.
Suppose that is a SIS. Then and ,
Moreover, the lower bound is attained when the induced relation is the fuzzy identity relation, and the upper bound 1 is attained when it is the fuzzy universal relation.
Proof.
Since , , . By Definition 14,
If , then , . So .
If , then , . So . □
Proposition 4.
Let be a SIS. Given and . Then
- (1)
- If , then ;
- (2)
- If , then .
Proof.
(1) Since , , we have . Then . By Definition 14,
Thus
(2) Since , we have and .
Then, , and , .
So, , and , .
Hence . □
Proposition 5.
Let be a SIS.
- (1)
- If , then , .
- (2)
- If , then , .
Proof.
This holds by Theorem 4 and Proposition 4(1). □
Example 7.
Let , . Then
We have
Thus
Corollary 4.
Let be a SIS. Given and . Then
Proof.
The result is a consequence of Proposition 5. □
Theorem 5.
is an information granulation function.
Proof.
This holds by Definition 14 and Proposition 4. □
6.2. Entropy Measures for a SIS
Similar to Definition 8 in [61], we have the following definition.
Definition 15.
Suppose that is a SIS. Then , δ-information entropy of is addressed as
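The formula of Definition 15 is also missing; the sketch below assumes the common form H = −(1/n)·Σᵢ log₂(|R(xᵢ)|/n), with |R(xᵢ)| = Σⱼ R(xᵢ, xⱼ), used in Liang- and Zhang-style entropy definitions. This is an assumption, not necessarily the paper's exact formula.

```python
import math

def information_entropy(R):
    """Assumed delta-information entropy of a fuzzy relation matrix R:
    H = -(1/n) * sum_i log2(|R(x_i)| / n), where |R(x_i)| is the i-th
    row sum (the size of x_i's information granule)."""
    n = len(R)
    return -sum(math.log2(sum(row) / n) for row in R) / n
```

Under this form, H is maximal (log₂ n) for the fuzzy identity relation, where every object is fully distinguishable, and 0 for the fuzzy universal relation.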
Example 8.
(Continued from Example 4)
Theorem 6.
Let be a SIS. Given and . Then
(1) If , then ;
(2) If , then .
Proof.
(1) Obviously.
(2) Please note that . Then by Proposition 4, we have , and ,
Then ,
and ,
Hence . □
Proposition 6.
Let be a SIS.
- (1)
- If , then , ;
- (2)
- If , then , .
Proof.
It is proved by Theorems 4 and 6(1). □
Corollary 5.
Let be a SIS. Given and . Then
Proof.
The result is a consequence of Proposition 6. □
Rough entropy, proposed by Yao [25], evaluates granularity of a given partition. Similar to Definition 10 in [61], we have the following definition.
Definition 16.
Let be a SIS. Given and . δ-rough entropy of is addressed as
Example 9.
(Continued from Example 4)
Proposition 7.
Let be a SIS. Given and . Then
Moreover, the lower bound 0 is attained when the induced relation is the fuzzy identity relation, and the upper bound is attained when it is the fuzzy universal relation.
Proof.
Please note that is a fuzzy equivalence relation on U. Then ,
So , ,
.
Then .
By Definition 16,
If , then , . So .
If , then , . So . □
Proposition 8.
Let be a SIS. Given and . Then
- (1)
- If , then ;
- (2)
- If , then .
Proof.
(1) Obviously.
(2) Please note that . Then by the proof of Proposition 4(2), we have , and ,
Then ,
and ,
Hence . □
Proposition 9.
Let be a SIS.
- (1)
- If , then , ;
- (2)
- If , then , .
Proof.
It is easy to prove by Theorem 4 and Proposition 8(1). □
From Propositions 8 and 9, we conclude that the more certain the δ-information structure is, the smaller the δ-rough entropy value becomes.
Corollary 6.
Let be a SIS. Given and . Then
Proof.
The result is a consequence of Proposition 9. □
Theorem 7.
is an information granulation function.
Proof.
This holds by Definition 16 and Proposition 8. □
Theorem 8.
Assume that is a SIS. Given and . Then
Proof.
□
Corollary 7.
Let be a SIS. Given and . Then
Proof.
By Proposition 7, .
By Theorem 8, .
Thus . □
6.3. Information Amounts in a SIS
Similar to Definition 10 in [61], the following definition is presented.
Definition 17.
Let be a set-valued information system. Given and . δ-information amount of is addressed as
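Since the printed formula of Definition 17 is missing, the sketch below assumes the common form E = (1/n)·Σᵢ (1 − |R(xᵢ)|/n), which is 1 − 1/n for the fuzzy identity relation and 0 for the fuzzy universal relation; the paper's exact formula may differ.

```python
def information_amount(R):
    """Assumed delta-information amount of a fuzzy relation matrix R:
    E = (1/n) * sum_i (1 - |R(x_i)| / n), where |R(x_i)| is the i-th
    row sum. Smaller granules (finer knowledge) give a larger E."""
    n = len(R)
    return sum(1 - sum(row) / n for row in R) / n
```

This form makes the information amount the complement of a granulation-style average of granule sizes, so it grows as the structure becomes more certain, matching the remark after Proposition 10.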
Example 10.
(Continued from Example 4)
Theorem 9.
Let be a SIS. Given and . Then
- (1)
- If , then ;
- (2)
- If , then .
Proof.
(1) Obviously.
(2) Please note that . Then by the proof of Proposition 4(2), we have , and ,
Hence . □
Proposition 10.
Let be a SIS.
- (1)
- If , then , ;
- (2)
- If , then , .
From Theorem 9 and Proposition 10, it can be concluded that the more certain the δ-information structure is, the bigger the δ-information amount value becomes.
Corollary 8.
Let be a SIS. Given and . Then
Proof.
This holds by Proposition 10. □
Theorem 10.
Assume that is a SIS. Given and . Then
Proof.
□
Corollary 9.
Let be a SIS. Given and . Then
Proof.
By Proposition 3, .
By Theorem 10, .
Thus . □
Example 11.
(Continued from Example 2)
Let , , and be four subsystems of . Pick The following results are obtained:
- (1)
- If only monotonicity is considered, then δ-information granulation and δ-rough entropy are both monotonically increasing as the δ value grows, which means that the uncertainty of the four subsystems increases as the δ value increases. Meanwhile, δ-information amount and δ-information entropy are both monotonically decreasing as the δ value grows, which means that the uncertainty of the four subsystems decreases as the δ value increases (see Figure 1, Figure 2, Figure 3 and Figure 4).
Figure 1. Uncertainty measurement of a SIS.
Figure 2. Uncertainty measurement of a SIS.
Figure 3. Uncertainty measurement of a SIS.
Figure 4. Uncertainty measurement of a SIS. - (2)
- If δ is fixed, then for δ-information granulation and δ-rough entropy the ordering is obtained, which shows that the larger the subsystem, the smaller the measured value. For δ-information amount and δ-information entropy, we have the reverse ordering, which shows that the measured value of the larger subsystem is greater than that of the smaller one (see Figure 5).
Figure 5. Uncertainty measures of subsystems with fixed δ.
6.4. Effectiveness Analysis
In this subsection, we perform an effectiveness analysis from a statistical point of view.
6.4.1. Dispersion Analysis
Below, the coefficient of variation is used for the effectiveness analysis.
Given a data set, its average value is the mean of its elements, its standard deviation measures their spread about the mean, and its coefficient of variation is the ratio of the standard deviation to the average value.
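The dispersion computation can be sketched as follows; whether the paper uses the population or sample standard deviation is not recoverable here, so the population form is assumed.

```python
import math

def coefficient_of_variation(xs):
    """CV = standard deviation / mean. Population standard deviation
    (divide by n) is assumed; the sample form (n - 1) would also work."""
    mean = sum(xs) / len(xs)
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / len(xs))
    return sd / mean
```

A smaller CV indicates less dispersion, which is why the measure with the smallest CV is judged to have the best stability in the analysis above.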
Example 12.
(Continued from Example 4) Denote
Then
So (see Figure 6).
Figure 6.
-values for measuring uncertainty of the subsystems.
So
This means that the dispersion degrees of and are the smallest.
From Figure 1, Figure 2, Figure 3, Figure 4, Figure 5 and Figure 6, the following results are obtained:
- (1)
- if only monotonicity is required, then G, , H and E can all evaluate the uncertainty of a SIS.
- (2)
- if only the dispersion degree is considered, then E performs better for measuring the uncertainty of a SIS.
6.4.2. Association Analysis
The Pearson correlation coefficient is used to assess the strength of the linear correlation between data sets.
Given data sets X and Y, the Pearson correlation coefficient between X and Y is defined as
where , .
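A self-contained sketch of the Pearson correlation coefficient used in this association analysis:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length data
    sets: covariance divided by the product of standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

Values of +1 and −1 correspond to the "completely positive correlation" and "completely negative correlation" entries of Table 3.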
Example 13.
(Continued from Example 4) We have the following results shown in Table 2.
Table 2.
r-values of sixteen pairs of measure-value sets for measuring the uncertainty of the subsystems.
From Table 2, the following conclusions are drawn, as shown in Table 3, where “CPC”, “CNC”, “HPC” and “HNC” mean “completely positive correlation”, “completely negative correlation”, “high positive correlation” and “high negative correlation”, respectively.
Table 3.
The correlation between two measures.
7. Conclusions
In this article, information structures in a SIS have been described as set vectors. In light of this, dependence between two information structures has been characterized, and properties of information structures have been provided. By using information structures, granularity and entropy measures for a SIS have been investigated. Moreover, the amount of information in a SIS has also been considered. In future work, three-way decision in a SIS will be studied.
Author Contributions
The authors discuss the results of this paper. J.H. designs the overall structure of this paper and improves the language; P.W. collects the data; Z.L. writes the paper.
Funding
This work is supported by High Level Innovation Team Program from Guangxi Higher Education Institutions of China (Document No. [2018] 35), Natural Science Foundation of Guangxi (2018GXNSFDA294003, 2018GXNSFDA281028, 2018JJA180014), Key Laboratory of Software Engineering in Guangxi University for Nationalities (2018-18XJSY-03) and Engineering Project of Undergraduate Teaching Reform of Higher Education in Guangxi (2017JGA179).
Conflicts of Interest
The authors declare no conflict of interest.
References
- Zadeh, L.A. Fuzzy logic equals computing with words. IEEE Trans. Fuzzy Syst. 1996, 4, 103–111. [Google Scholar] [CrossRef]
- Zadeh, L.A. Toward a theory of fuzzy information granulation and its centrality in human reasoning and fuzzy logic. Fuzzy Sets Syst. 1997, 90, 111–127. [Google Scholar] [CrossRef]
- Zadeh, L.A. Some reflections on soft computing, granular computing and their roles in the conception, design and utilization of information intelligent systems. Soft Comput. 1998, 2, 23–25. [Google Scholar] [CrossRef]
- Zadeh, L.A. A new direction in AI-Toward a computational theory of perceptions. AI Mag. 2001, 22, 73–84. [Google Scholar]
- Lin, T.Y. Granular computing on binary relations I: Data mining and neighborhood systems. In Rough Sets in Knowledge Discovery; Skowron, A., Polkowski, L., Eds.; Physica-Verlag: Heidelberg, Germany, 1998; pp. 107–121. [Google Scholar]
- Lin, T.Y. Granular computing on binary relations II: Rough set representations and belief functions. In Rough Sets in Knowledge Discovery; Skowron, A., Polkowski, L., Eds.; Physica-Verlag: Heidelberg, Germany, 1998; pp. 121–140. [Google Scholar]
- Lin, T.Y. Granular computing: Fuzzy logic and rough sets. In Computing with Words in Information Intelligent Systems; Zadeh, L.A., Kacprzyk, J., Eds.; Physica-Verlag: Heidelberg, Germany, 1999; pp. 183–200. [Google Scholar]
- Yao, Y.Y. Information granulation and rough set approximation. Int. J. Intell. Syst. 2001, 16, 87–104. [Google Scholar] [CrossRef]
- Yao, Y.Y. Probabilistic approaches to rough sets. Expert Syst. 2003, 20, 287–297. [Google Scholar] [CrossRef]
- Yao, Y.Y. Perspectives of Granular computing. In Proceedings of the 2005 IEEE International Conference on Granular Computing, Beijing, China, 25–27 July 2005; Volume 1, pp. 85–90. [Google Scholar]
- Pawlak, Z. Rough sets. Int. J. Comput. Inf. Sci. 1982, 11, 341–356. [Google Scholar] [CrossRef]
- Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353. [Google Scholar] [CrossRef]
- Ma, J.; Zhang, W.; Leung, Y.; Song, X. Granular computing and dual Galois connection. Inf. Sci. 2007, 177, 5365–5377. [Google Scholar] [CrossRef]
- Wu, W.Z.; Leung, Y.; Mi, J. Granular computing and knowledge reduction in formal contexts. IEEE Trans. Knowl. Data Eng. 2009, 21, 1461–1474. [Google Scholar]
- Zhang, L.; Zhang, B. Theory and Application of Problem Solving-Theory and Application of Granular Computing in Quotient Spaces; Tsinghua University Publishers: Beijing, China, 2007. [Google Scholar]
- Pawlak, Z. Rough Sets: Theoretical Aspects of Reasoning about Data; Kluwer Academic Publishers: Dordrecht, The Netherlands, 1991. [Google Scholar]
- Pawlak, Z.; Skowron, A. Rough sets and boolean reasoning. Inf. Sci. 2007, 177, 41–73. [Google Scholar] [CrossRef]
- Pawlak, Z.; Skowron, A. Rough sets: Some extensions. Inf. Sci. 2007, 177, 28–40. [Google Scholar] [CrossRef]
- Pawlak, Z.; Skowron, A. Rudiments of rough sets. Inf. Sci. 2007, 177, 3–27. [Google Scholar] [CrossRef]
- Cornelis, C.; Jensen, R.; Martin, G.H.; Slezak, D. Attribute selection with fuzzy decision reducts. Inf. Sci. 2010, 180, 209–224. [Google Scholar] [CrossRef]
- Dubois, D.; Prade, H. Rough fuzzy sets and fuzzy rough sets. Int. J. Gen. Syst. 1990, 17, 191–209. [Google Scholar] [CrossRef]
- Swiniarski, R.W.; Skowron, A. Rough set methods in feature selection and recognition. Pattern Recognit. Lett. 2003, 24, 833–849. [Google Scholar] [CrossRef]
- Slowinski, R.; Vanderpooten, D. A generalized definition of rough approximations based on similarity. IEEE Trans. Knowl. Data Eng. 2000, 12, 331–336. [Google Scholar] [CrossRef]
- Greco, S.; Inuiguchi, M.; Slowinski, R. Fuzzy rough sets and multiple-premise gradual decision rules. Int. J. Approx. Reason. 2006, 41, 179–211. [Google Scholar] [CrossRef]
- Yao, Y.Y. Relational interpretations of neighborhood operators and rough set approximation operators. Inf. Sci. 1998, 111, 239–259. [Google Scholar] [CrossRef]
- Blaszczynski, J.; Slowinski, R.; Szelag, M. Sequential covering rule induction algorithm for variable consistency rough set approaches. Inf. Sci. 2011, 181, 987–1002. [Google Scholar] [CrossRef]
- Kryszkiewicz, M. Rules in incomplete information systems. Inf. Sci. 1999, 113, 271–292. [Google Scholar] [CrossRef]
- Mi, J.S.; Leung, Y.; Wu, W.Z. An uncertainty measure in partition-based fuzzy rough sets. Int. J. Gen. Syst. 2005, 34, 77–90. [Google Scholar] [CrossRef]
- Wierman, M.J. Measuring uncertainty in rough set theory. Int. J. Gen. Syst. 1999, 28, 283–297. [Google Scholar] [CrossRef]
- Hu, Q.H.; Pedrycz, W.; Yu, D.R.; Lang, J. Selecting discrete and continuous features based on neighborhood decision error minimization. IEEE Trans. Syst. Man Cybern. Part B 2010, 40, 137–150. [Google Scholar]
- Jensen, R.; Shen, Q. Semantics-preserving dimensionality reduction: Rough and fuzzy rough based approaches. IEEE Trans. Knowl. Data Eng. 2004, 16, 1457–1471. [Google Scholar] [CrossRef]
- Jensen, R.; Shen, Q. New approaches to fuzzy-rough feature selection. IEEE Trans. Fuzzy Syst. 2009, 17, 824–838. [Google Scholar] [CrossRef]
- Qian, Y.H.; Liang, J.Y.; Pedrycz, W.; Dang, C.Y. An accelerator for attribute reduction in rough set theory. Artif. Intell. 2010, 174, 597–618. [Google Scholar] [CrossRef]
- Thangavel, S.; Pethalakshmi, A. Dimensionality reduction based on rough set theory: A review. Appl. Soft Comput. 2009, 9, 1–12. [Google Scholar] [CrossRef]
- Xie, S.D.; Wang, Y.X. Construction of tree network with limited delivery latency in homogeneous wireless sensor networks. Wirel. Pers. Commun. 2014, 78, 231–246. [Google Scholar] [CrossRef]
- Cament, L.A.; Castillo, L.E.; Perez, J.P.; Galdames, F.J.; Perez, C.A. Fusion of local normalization and Gabor entropy weighted features for face identification. Pattern Recognit. 2014, 47, 568–577. [Google Scholar] [CrossRef]
- Gu, B.; Sheng, V.S.; Wang, Z.J.; Ho, D.; Osman, S. Incremental learning for v-support vector regression. Neural Netw. 2015, 67, 140–150. [Google Scholar] [CrossRef] [PubMed]
- Navarrete, J.; Viejo, D.; Cazorla, M. Color smoothing for RGB-D data using entropy information. Appl. Soft Comput. 2016, 46, 361–380. [Google Scholar] [CrossRef]
- Hempelmann, C.F.; Sakoglu, U.; Gurupur, V.P.; Jampana, S. An entropy-based evaluation method for knowledge bases of medical information systems. Expert Syst. Appl. 2016, 46, 262–273. [Google Scholar] [CrossRef]
- Delgado, A.; Romero, I. Environmental conflict analysis using an integrated grey clustering and entropy-weight method: A case study of a mining project in Peru. Environ. Model. Softw. 2016, 77, 108–121. [Google Scholar] [CrossRef]
- Bianucci, D.; Cattaneo, G. Information entropy and granulation co-entropy of partitions and coverings: A summary. Trans. Rough Sets 2009, 10, 15–66. [Google Scholar]
- Bianucci, D.; Cattaneo, G.; Ciucci, D. Entropies and co-entropies of coverings with application to incomplete information systems. Fundam. Informaticae 2007, 75, 77–105. [Google Scholar]
- Beaubouef, T.; Petry, F.E.; Arora, G. Information-theoretic measures of uncertainty for rough sets and rough relational databases. Inf. Sci. 1998, 109, 185–195. [Google Scholar] [CrossRef]
- Liang, J.Y.; Shi, Z.Z. The information entropy, rough entropy and knowledge granulation in rough set theory. Int. J. Uncertain. Fuzziness Knowl.-Based Syst. 2004, 12, 37–46. [Google Scholar] [CrossRef]
- Liang, J.Y.; Shi, Z.Z.; Li, D.Y.; Wierman, M.J. The information entropy, rough entropy and knowledge granulation in incomplete information systems. Int. J. Gen. Syst. 2006, 35, 641–654. [Google Scholar] [CrossRef]
- Dai, J.H.; Tian, H.W. Entropy measures and granularity measures for set-valued information systems. Inf. Sci. 2013, 240, 72–82. [Google Scholar] [CrossRef]
- Qian, Y.H.; Liang, J.Y.; Wu, W.Z.; Dang, C.Y. Knowledge structure, knowledge granulation and knowledge distance in a knowledge base. Int. J. Approx. Reason. 2009, 50, 174–188. [Google Scholar] [CrossRef]
- Qian, Y.H.; Liang, J.Y.; Wu, W.Z.; Dang, C.Y. Information granularity in fuzzy binary GrC model. IEEE Trans. Fuzzy Syst. 2011, 19, 253–264. [Google Scholar] [CrossRef]
- Xu, W.H.; Zhang, X.Y.; Zhang, W.X. Knowledge granulation, knowledge entropy and knowledge uncertainty measure in ordered information systems. Appl. Soft Comput. 2009, 9, 1244–1251. [Google Scholar]
- Dai, J.H.; Wei, B.J.; Zhang, X.H.; Zhang, Q.L. Uncertainty measurement for incomplete interval-valued information systems based on α-weak similarity. Knowl.-Based Syst. 2017, 136, 159–171. [Google Scholar] [CrossRef]
- Xie, N.X.; Liu, M.; Li, Z.W.; Zhang, G.Q. New measures of uncertainty for an interval-valued information system. Inf. Sci. 2019, 470, 156–174. [Google Scholar] [CrossRef]
- Zhang, G.Q.; Li, Z.W.; Wu, W.Z.; Liu, X.F.; Xie, N.X. Information structures and uncertainty measures in a fully fuzzy information system. Int. J. Approx. Reason. 2018, 101, 119–149. [Google Scholar] [CrossRef]
- Moser, B. On representing and generating kernels by fuzzy equivalence relations. J. Mach. Learn. Res. 2006, 7, 2603–2630. [Google Scholar]
- Zeng, A.P.; Li, T.R.; Liu, D.; Zhang, J.B.; Chen, H.M. A fuzzy rough set approach for incremental feature selection on hybrid information systems. Fuzzy Sets Syst. 2015, 258, 39–60. [Google Scholar] [CrossRef]
- Moser, B. On the T-transitivity of kernels. Fuzzy Sets Syst. 2006, 157, 1787–1796. [Google Scholar] [CrossRef]
- Yao, Y.Y.; Noroozi, N. A unified framework for set-based computations. In Proceedings of the 3rd International Workshop on Rough Sets and Soft Computing, San Jose, CA, USA, 10–12 November 1994; pp. 10–12. [Google Scholar]
- Shawe-Taylor, J.; Cristianini, N. Kernel Methods for Pattern Analysis; Cambridge University Press: Cambridge, UK, 2004. [Google Scholar]
- Yang, S.; Yan, S.; Zhang, C.; Tang, X. Bilinear analysis for kernel selection and nonlinear feature extraction. IEEE Trans. Neural Netw. 2007, 18, 1442–1452. [Google Scholar] [CrossRef]
- Hu, Q.H.; Xie, Z.X.; Yu, D.R. Hybrid attribute reduction based on a novel fuzzy-rough model and information granulation. Pattern Recognit. 2007, 40, 3509–3521. [Google Scholar] [CrossRef]
- Hu, Q.H.; Zhang, L.; Chen, D.G.; Pedrycz, W.; Yu, D.R. Gaussian kernel based fuzzy rough sets: Model, uncertainty measures and applications. Int. J. Approx. Reason. 2010, 51, 453–471. [Google Scholar] [CrossRef]
- Liang, J.Y.; Qu, K.S. Information measures of roughness of knowledge and rough sets for information systems. J. Syst. Sci. Syst. Eng. 2002, 10, 95–103. [Google Scholar]
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
