Abstract
Concept-cognitive learning reveals the principle of human cognition by simulating the brain’s process of learning and processing concepts. Nevertheless, for neighborhood similarity granules, the average information of objects regarding all attributes is not considered, which may lead to unbalanced acquisition of knowledge. On the other hand, there are some unnecessary concepts in the extension of fuzzy concepts, which results in poor classification learning. To tackle these challenges, we present a forgetting-based concept-cognitive learning model for classification in a fuzzy formal decision context. Firstly, the fuzzy concept space is established based on the correlation coefficient matrix. Then, to delete unnecessary objects that are in the zone of proximal development, we construct the forgetting fuzzy concept space by selecting the concept corresponding to the maximum similarity. Subsequently, a forgetting-based concept-cognitive learning model (FCCLM) is proposed. In the end, experimental results on eight datasets validate the feasibility and efficiency of the proposed learning mechanism through classification performance assessment.
Keywords:
concept-cognitive learning; fuzzy formal concept analysis; correlation coefficient; forgetting fuzzy concept; knowledge fusion
MSC:
03C45
1. Introduction
Cognitive computing [1], as an important applied branch of cognitive science, can employ computational approaches to develop innovative solutions for complex real-world challenges [2,3]. Concepts are created by humans via the generalization and abstraction of the fundamental attributes of objects. As fundamental building blocks of human knowledge systems, they provide a structured approach to knowledge representation and organization, which is crucial for cognitive computing.
Wille [4,5] introduced formal concept analysis (FCA for short) to illustrate the formal context and formal concepts from a mathematical standpoint. With the development of FCA, some extended models such as fuzzy concepts, interval set concepts, three-way concepts, and multi-granularity concepts have been proposed by scholars [6,7,8,9]. It should be noted that fuzzy concepts generated from a fuzzy formal context are a special case of Chu space [10,11]. The basic form of a fuzzy Chu space is a triple $(A, r, X)$, where $A$ and $X$ are sets, $r: A \times X \to k$ is a mapping, and $k$ is a truth value set. When $k$ takes values in the unit interval $[0, 1]$, the fuzzy Chu space is reduced to fuzzy formal concept analysis. In recent years, the combination of FCA and cognitive computing has led to the emergence of a popular research direction, namely concept-cognitive learning (CCL for short). CCL denotes the acquisition of concepts from provided clues via specific cognitive models, aiming to uncover the systematic regularity of how the human brain learns concepts. For example, a novel concept-cognitive model based on granular computing was first studied by Zhang et al. [12], and the experiments show that the model is feasible. Yao [13] put forward a three-level framework of concept cognition from three aspects: the abstract layer, the brain layer, and the machine layer. Qiu et al. [14] proposed a concept-cognitive system that can realize multiple concept cognitions. Additionally, Kumar et al. [15] investigated the concept learning approach to analyze functionalities of bidirectional associative memory. Thereafter, Li et al. [16] discussed the concept learning mechanism from the perspective of philosophy and cognitive psychology. Therefore, the above-mentioned results in concept-cognitive systems have mainly been obtained by means of granular computing.
Subsequently, common concept-cognitive learning models mainly include information similarity matching [17,18,19]. Xu et al. [17] designed a two-way learning method of fuzzy concepts from the necessary or sufficient information granules in fuzzy datasets. Then Xu et al. [18] further proposed the dynamic learning of necessary and sufficient concepts in a dynamic environment. Meanwhile, many scholars have presented achievements in the combination of CCL and machine learning, which mainly address the problems of cognitive classification of data and knowledge fusion. For example, Mi et al. [20,21,22] introduced an incremental CCL model [20], fuzzy-based CCL model [21], and concurrent CCL model [22]. Furthermore, Yuan et al. [23] presented an incremental learning mechanism based on progressive fuzzy three-way concepts. Zhang et al. [24] discussed a knowledge discovery and concept classification method by assigning different weights to the fuzzy formal context. Next, Wang et al. [25] studied an innovative multi-view fuzzy CCL model to integrate concepts through multiple views. Liang et al. [26] investigated an incremental CCL method based on concept reduction for dynamic classification in the formal context. In [23,24], note that neighborhood similarity granules induced by the Euclidean distance or cosine theorem do not consider the average information of objects. To this end, the correlation coefficient matrix is discussed to improve classification performance, which is a motivation for this paper.
The advantage of CCL is that it can integrate past experiences into itself to describe dynamic data, and some achievements have already been made. For instance, Guo et al. [27] introduced a memory-based CCL model to mine knowledge by combining the recalling and forgetting mechanisms. It is worth noting that the forgetting mechanism in the work of Guo is to forget the maximum concept in the inclusion relationship while retaining the remaining concepts. However, the principle of forgetting is that short-term memories are usually forgotten. This also means that objects in the zone of proximal development are easily forgotten. Therefore, the formulation of forgetting rules in fuzzy concepts remains an open research question requiring further investigation. We address this challenge in Section 3 through a novel proposed solution.
To overcome the aforementioned constraints, we propose a forgetting-based concept-cognitive learning model (FCCLM for short) for classification. Concretely, the fuzzy concept space is constructed with the correlation coefficient matrix. It is known that objects in the zone of proximal development are easily forgotten; removing them can improve the classification performance of knowledge discovery. Afterwards, to remove such easily forgotten objects, we introduce a forgetting-based concept-cognitive learning mechanism for fusing concepts and class prediction. The overall workflow diagram is shown in Figure 1. Meanwhile, this paper makes the following innovations.
- The fuzzy concept based on the correlation coefficient matrix describes the average information embedded in data, which further decreases cognitive biases.
- It designs a concept-forgetting method by removing some unnecessary objects, which can improve the classification performance of concept recognition.
The remainder of this article is organized as follows. We recall some related notations about fuzzy formal concept analysis in Section 2. Section 3 studies the construction of the forgetting fuzzy concept space with the correlation coefficient and concept classification. The experimental results are presented and analyzed in Section 4. Finally, Section 5 concludes this work and outlines directions for future research.
Figure 1.
The overall workflow diagram.
2. Preliminaries
This section provides a concise overview of fundamental notions in fuzzy formal concept analysis. More detailed information can be found in [4,5,6].
Fuzzy Formal Concept Analysis
Consider a universe $G$; a fuzzy set $\widetilde{X}$ of $G$ is characterized by a mapping $\widetilde{X}: G \to [0, 1]$. For any $g \in G$, the value $\widetilde{X}(g)$ represents the degree of membership of $g$ belonging to $\widetilde{X}$. We denote by $\mathcal{F}(G)$ the collection of all fuzzy sets on $G$.
Let $E$ and $F$ be two fuzzy sets on $G$. If $E(g) \le F(g)$ for all $g \in G$, then $E$ is a subset of $F$; that is to say, $E \subseteq F$. Specifically, the collection of all crisp sets defined on $G$ is denoted as $\mathcal{P}(G)$.
A triplet $(G, A, \widetilde{I})$ is referred to as a fuzzy formal context, where $G = \{x_1, x_2, \ldots, x_n\}$ and $A = \{a_1, a_2, \ldots, a_m\}$ are the sets of objects and attributes, respectively. Let $\widetilde{I}$ be a fuzzy relation between $G$ and $A$ (i.e., $\widetilde{I} \in \mathcal{F}(G \times A)$), with each $\widetilde{I}(x, a)$ indicating the degree to which object $x$ possesses attribute $a$.
Definition 1.
Given a fuzzy formal context $(G, A, \widetilde{I})$, for $Y \in \mathcal{P}(G)$ and $\widetilde{B} \in \mathcal{F}(A)$, two learning operators $*: \mathcal{P}(G) \to \mathcal{F}(A)$ and $\diamond: \mathcal{F}(A) \to \mathcal{P}(G)$ are described as follows:
$$Y^{*}(a) = \min_{x \in Y} \widetilde{I}(x, a), \ a \in A; \qquad \widetilde{B}^{\diamond} = \{x \in G \mid \widetilde{I}(x, a) \ge \widetilde{B}(a), \ \forall a \in A\},$$
in which the pair $(Y, \widetilde{B})$ is a fuzzy concept satisfying $Y^{*} = \widetilde{B}$ and $\widetilde{B}^{\diamond} = Y$. In general, $Y$ and $\widetilde{B}$ are named extent and intent, respectively.
For any $Y, Y_1, Y_2 \in \mathcal{P}(G)$ and $\widetilde{B}, \widetilde{B}_1, \widetilde{B}_2 \in \mathcal{F}(A)$, we derive the following characteristic properties:
- $Y_1 \subseteq Y_2 \Rightarrow Y_2^{*} \subseteq Y_1^{*}$;
- $\widetilde{B}_1 \subseteq \widetilde{B}_2 \Rightarrow \widetilde{B}_2^{\diamond} \subseteq \widetilde{B}_1^{\diamond}$;
- $Y \subseteq Y^{*\diamond}$, $\widetilde{B} \subseteq \widetilde{B}^{\diamond *}$;
- $Y^{*} = Y^{*\diamond *}$, $\widetilde{B}^{\diamond} = \widetilde{B}^{\diamond *\diamond}$;
- $(Y_1 \cup Y_2)^{*} = Y_1^{*} \cap Y_2^{*}$;
- $(Y^{*\diamond}, Y^{*})$, $(\widetilde{B}^{\diamond}, \widetilde{B}^{\diamond *})$ are fuzzy concepts.
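To make these operators concrete, the following sketch instantiates them in Python under the standard fuzzy-FCA choice (attribute-wise minimum for intents, membership dominance for extents). The function names `up` and `down` and the toy matrix `I` are our own illustrative assumptions; the paper's exact formulas were lost in extraction.

```python
def up(I, Y):
    # Intent of a crisp object set Y: attribute-wise minimum membership.
    return [min(I[x][a] for x in Y) for a in range(len(I[0]))]

def down(I, B):
    # Extent of a fuzzy attribute set B: objects whose row dominates B.
    return [x for x in range(len(I)) if all(I[x][a] >= B[a] for a in range(len(B)))]

# Toy fuzzy formal context: rows are objects, columns are attributes.
I = [[0.7, 0.3, 0.4, 0.5],
     [0.1, 0.8, 0.5, 0.9]]
B = up(I, [0, 1])   # attribute-wise minima over both objects
Y = down(I, B)      # both objects dominate B, so (Y, B) is a fuzzy concept
```

With this instantiation, `up(I, down(I, B)) == B`, mirroring the closure property listed above.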
In addition, suppose $(G, A, \widetilde{I})$ and $(G, D, \widetilde{J})$ are two fuzzy formal contexts, where $\widetilde{I} \in \mathcal{F}(G \times A)$ and $\widetilde{J} \in \mathcal{F}(G \times D)$. Then a fuzzy formal decision context (FFDC) is characterized by $(G, A, \widetilde{I}, D, \widetilde{J})$, where $A \cap D = \emptyset$. $G/D = \{G_1, G_2, \ldots, G_r\}$ is regarded as a decision division based on decision attribute set $D$ with $\bigcup_{t=1}^{r} G_t = G$.
3. The Construction of a Fuzzy Concept with the Correlation Coefficient
3.1. Constructing the Fuzzy Concept Space
This section introduces a correlation coefficient matrix to construct the correlation similarity granule for achieving concept characterization.
Definition 2.
Given a fuzzy formal context $(G, A, \widetilde{I})$, for any $x_i, x_j \in G$, the correlation coefficient between $x_i$ and $x_j$ is defined as follows:
$$\rho(x_i, x_j) = \frac{\sum_{a \in A}\big(\widetilde{I}(x_i, a) - \overline{I}(x_i)\big)\big(\widetilde{I}(x_j, a) - \overline{I}(x_j)\big)}{\sqrt{\sum_{a \in A}\big(\widetilde{I}(x_i, a) - \overline{I}(x_i)\big)^2}\sqrt{\sum_{a \in A}\big(\widetilde{I}(x_j, a) - \overline{I}(x_j)\big)^2}},$$
where $\widetilde{I}(x_i, a)$ denotes the membership degree of object $x_i$ with respect to attribute $a$, while $\overline{I}(x_i)$ represents the average membership degree of $x_i$ across the attribute set $A$. That is to say, $\overline{I}(x_i) = \frac{1}{|A|}\sum_{a \in A}\widetilde{I}(x_i, a)$. The value of $\rho(x_i, x_j)$ explains the similarity degree between $x_i$ and $x_j$. The pairwise correlation coefficients form a symmetric matrix $Q$, where $Q_{ij} = \rho(x_i, x_j)$ quantifies the linear relationship between objects $x_i$ and $x_j$. Meanwhile, this correlation coefficient matrix can be used to construct the correlation similarity granule.
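Definition 2 can be sketched as the usual Pearson correlation computed over each object's membership row. The helper below is an illustrative assumption (the names `correlation_matrix` and `I` are ours), consistent with the description of average membership degrees and a symmetric matrix $Q$.

```python
import math

def correlation_matrix(I):
    """Pairwise Pearson-style correlation between the object rows of a
    fuzzy formal context I (a list of membership-degree rows)."""
    n = len(I)
    means = [sum(row) / len(row) for row in I]
    Q = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            num = sum((I[i][k] - means[i]) * (I[j][k] - means[j])
                      for k in range(len(I[i])))
            den_i = math.sqrt(sum((v - means[i]) ** 2 for v in I[i]))
            den_j = math.sqrt(sum((v - means[j]) ** 2 for v in I[j]))
            # Guard against constant rows, where the coefficient is undefined.
            Q[i][j] = num / (den_i * den_j) if den_i * den_j else 0.0
    return Q

# Membership rows of objects x1..x4 from Table 1 (decision class 1).
I = [[0.7, 0.3, 0.4, 0.5],
     [0.1, 0.8, 0.5, 0.9],
     [0.3, 0.5, 0.6, 0.7],
     [0.4, 0.3, 0.4, 0.4]]
Q = correlation_matrix(I)
```

As expected of a correlation matrix, `Q` is symmetric with a unit diagonal and entries in $[-1, 1]$.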
Definition 3.
Given an FFDC $(G, A, \widetilde{I}, D, \widetilde{J})$, for any $x, y \in G_t$, the correlation similarity granule of object $x$ is given by
$$[x]_{\beta} = \{\, y \in G_t \mid \rho(x, y) \ge \beta \,\},$$
where $\beta$ is a threshold and $\rho(x, y)$ is the correlation similarity degree between $x$ and $y$.
Definition 4.
Given an FFDC $(G, A, \widetilde{I}, D, \widetilde{J})$, where $G/D = \{G_1, G_2, \ldots, G_r\}$, for any $x \in G_t$, the fuzzy concept subspace about $G_t$ is denoted as follows:
$$\mathcal{S}_t = \{\, ([x]_{\beta}, [x]_{\beta}^{*}) \mid x \in G_t \,\},$$
where $\beta$ is a threshold and $\rho(x, y)$ is the correlation similarity degree between $x$ and $y$.
Subsequently, we denote by $\mathcal{S} = \{\mathcal{S}_1, \mathcal{S}_2, \ldots, \mathcal{S}_r\}$ the fuzzy concept space, where $\mathcal{S}_t$ is named a fuzzy subspace of $\mathcal{S}$. It should be emphasized that comprehensive feature learning of each individual object serves as a prerequisite for achieving optimal classification performance. Then we propose the procedure of constructing the fuzzy concept space in Algorithm 1 with a time complexity of $O(|G|^2|A|)$, dominated by computing the correlation coefficient matrix $Q$.
| Algorithm 1: Constructing fuzzy concept space (CFCS). |
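Assuming the correlation coefficient matrix $Q$ from Definition 2, the granule and subspace construction of Definitions 3 and 4 can be sketched as follows. The hard-coded correlation row `q_row` uses illustrative values only, and the intent operator is the attribute-wise minimum, a common instantiation rather than necessarily the paper's exact one.

```python
def similarity_granule(q_row, beta):
    """Correlation similarity granule: objects whose correlation with the
    reference object reaches the threshold beta (Definition 3)."""
    return sorted(j for j, rho in q_row.items() if rho >= beta)

def intent(I, extent):
    """Fuzzy intent: attribute-wise minimum membership over the extent
    (a common instantiation of the derivation operator)."""
    return [min(I[j][a] for j in extent) for a in range(len(I[0]))]

# Membership rows of objects x1..x4 from Table 1 (decision class 1).
I = [[0.7, 0.3, 0.4, 0.5],
     [0.1, 0.8, 0.5, 0.9],
     [0.3, 0.5, 0.6, 0.7],
     [0.4, 0.3, 0.4, 0.4]]
# Hypothetical correlation row of one object against its class members.
q_row = {0: 0.31, 1: 0.86, 2: 1.00, 3: 0.42}
extent = similarity_granule(q_row, beta=0.8)   # objects 1 and 2 qualify
concept = (extent, intent(I, extent))          # one member of the subspace
```

Repeating this for every object of a decision class yields that class's fuzzy concept subspace.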
Example 1.
Table 1 is an FFDC , where and . d is the decision attribute that partitions all objects into two classes, with and . The correlation coefficient matrix Q is computed as follows:
Given , for the objects in decision class , we can have the correlation similarity granule , , , and . Next, fuzzy concept subspace .
For the objects in decision class , their correlation similarity granule and fuzzy concept subspace are as follows: , and . .
Table 1.
An FFDC.
| U | a1 | a2 | a3 | a4 | d |
|---|---|---|---|---|---|
| x1 | 0.7 | 0.3 | 0.4 | 0.5 | 1 |
| x2 | 0.1 | 0.8 | 0.5 | 0.9 | 1 |
| x3 | 0.3 | 0.5 | 0.6 | 0.7 | 1 |
| x4 | 0.4 | 0.3 | 0.4 | 0.4 | 1 |
| x5 | 0.6 | 0.4 | 0.3 | 0.9 | 2 |
| x6 | 0.3 | 0.5 | 0.6 | 0.4 | 2 |
| x7 | 0.2 | 0.5 | 0.4 | 0.3 | 2 |
| x8 | 0.1 | 0.7 | 0.8 | 0.8 | 2 |
3.2. Constructing the Forgetting Fuzzy Concept Space
Section 3.1 discusses the construction process of the fuzzy concept space through the correlation similarity granule and a pair of cognitive operators and . However, since , some of the objects in extent may be forgotten because of knowledge forgetting.
Definition 5.
Given an FFDC , for arbitrary , the inner fringe of X is denoted as
where and for .
Definition 5 introduces a method for forgetting the minimal object set that is proximate to a certain object in the extent . Then the above forgetting minimal object set can be represented by a Boolean matrix, shown as Proposition 1.
Proposition 1.
Let be an FFDC. is an inner fringe of . Then, we know the following:
where and
for .
Proof.
For any , means that
That is to say, for any , it implies that . Hence, for any , if , then it is obvious that . □
Proposition 2.
Let be an FFDC. is an inner fringe of . For any , is a fuzzy concept.
Proof.
It is obvious from Definition 1. □
In fact, there are multiple elements in . After knowledge forgetting is applied, the obtained forgetting fuzzy concept is defined as the one maximally similar to the original fuzzy concept . For two concepts and , the similarity degree is .
Definition 6.
Let be an FFDC. For , the collection of forgetting fuzzy concepts is defined as follows:
where .
In a fuzzy formal decision context and , there are some fuzzy subconcepts of ; that is, certain fuzzy concepts can be obtained from after knowledge forgetting through the inner fringe . From Proposition 2, for any , is a fuzzy concept obtained from after knowledge forgetting. It is obvious that . The higher the similarity between and , the higher the probability that objects in are forgotten.
In addition, given , the forgetting fuzzy concept subspace is denoted as . At the same time, the process of constructing the forgetting fuzzy concept space is shown as Algorithm 2, where is run in Steps 3–11, which can be taken in , and some unnecessary objects are reduced in Steps 12–25, which can be measured in , where means the zone of proximal development of an object. Hence, the time complexity of Algorithm 2 is .
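A minimal sketch of the forgetting step follows, under the assumption that a single object is dropped and the surviving sub-concept is the one whose intent stays closest to the original. The paper's exact similarity measure was lost in extraction, so mean absolute intent difference stands in for it here; the names `forget` and `intent` are ours.

```python
def intent(I, extent):
    # Attribute-wise minimum membership over the extent (common instantiation).
    return [min(I[j][a] for j in extent) for a in range(len(I[0]))]

def forget(I, extent):
    """Remove one object from the extent and keep the sub-concept whose
    intent is most similar to the original (similarity sketched as
    1 minus the mean absolute intent difference)."""
    base = intent(I, extent)
    def sim(sub):
        b = intent(I, sub)
        return 1 - sum(abs(u - v) for u, v in zip(base, b)) / len(base)
    best = max(([x for x in extent if x != g] for g in extent), key=sim)
    return best, intent(I, best)

# Membership rows of objects x1..x4 from Table 1 (decision class 1).
I = [[0.7, 0.3, 0.4, 0.5],
     [0.1, 0.8, 0.5, 0.9],
     [0.3, 0.5, 0.6, 0.7],
     [0.4, 0.3, 0.4, 0.4]]
ext, itn = forget(I, [1, 2])   # forgetting one object from extent {x2, x3}
```

Dropping object 1 keeps the sub-concept whose intent is closer to the original, so the forgotten concept retains only object 2 here.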
Example 2.
Continue with Example 1. With respect to class , we now discuss the forgetting fuzzy concept of in . In fact, , which implies that . Then we have two fuzzy concepts, and . Subsequently, and . Hence, we know that . Similarly, . Finally, we have the forgetting fuzzy concept subspace .
In addition, the forgetting fuzzy concept subspace in is .
Meanwhile, the process of learning the forgetting fuzzy concept is shown in Figure 2.
Figure 2.
The process of learning the forgetting fuzzy concept.
| Algorithm 2: Constructing forgetting fuzzy concept space (CFFCS). |
3.3. Fusing Concept
Mutual information exists among forgetting fuzzy concepts, manifested through their bidirectional influence mechanisms. In order to overcome the limitations of individual cognition and the incomplete cognitive environment [28], we propose a new method to fuse concepts through the forgetting fuzzy concept space.
Definition 7.
Given an FFDC $(G, A, \widetilde{I}, D, \widetilde{J})$, for a forgetting fuzzy concept subspace, if there exist forgetting fuzzy concepts $(X_1, \widetilde{B}_1), (X_2, \widetilde{B}_2), \ldots, (X_n, \widetilde{B}_n)$ whose extents are contained in a common supremum fuzzy concept, then the fusing forgetting fuzzy pseudo-concept is defined as follows:
$$\Big(\bigcup_{i=1}^{n} X_i, \ \sum_{i=1}^{n} \frac{|X_i|}{\sum_{j=1}^{n} |X_j|}\, \widetilde{B}_i\Big),$$
where $n$ means the number of forgetting fuzzy concepts.
Generally speaking, the fusing forgetting fuzzy concept space is denoted as , where , in which n is the number of pseudo-concepts in . The intent explicitly characterizes the magnitude of the pseudo-concept, in which the intents of subconcepts have been assigned different weights according to their corresponding extents. In other words, the greater the extent is, the larger the weight of its corresponding intent is. Finally, Algorithm 3 describes the clustering process of the fusing forgetting fuzzy concept space with the worst-case time complexity of .
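The weighted fusion described above can be sketched as follows, assuming the fused extent is the union of the member extents and each intent is weighted by its extent's relative size. The helper `fuse` and the toy concepts are illustrative, not taken from the paper's examples.

```python
def fuse(concepts):
    """Fuse forgetting fuzzy concepts into one pseudo-concept: union the
    extents and average the intents with weights proportional to extent
    size, so larger extents contribute more to the fused intent."""
    total = sum(len(ext) for ext, _ in concepts)
    m = len(concepts[0][1])
    fused_intent = [sum(len(ext) * itn[a] for ext, itn in concepts) / total
                    for a in range(m)]
    fused_extent = sorted(set().union(*(ext for ext, _ in concepts)))
    return fused_extent, fused_intent

# Two hypothetical forgetting fuzzy concepts (extent, intent).
c1 = ([0, 3], [0.4, 0.3, 0.4, 0.4])
c2 = ([2],    [0.3, 0.5, 0.6, 0.7])
ext, itn = fuse([c1, c2])
```

Since `c1` covers two objects and `c2` only one, the fused intent lies twice as close to `c1`'s intent as to `c2`'s.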
| Algorithm 3: Cognitive process of fusing forgetting fuzzy concept space. |
Example 3.
Continuing with Example 2, according to Definition 7, the fusing forgetting fuzzy concept spaces are represented as follows:
It should be noted that there are two fusing forgetting fuzzy concepts in each decision class. It is evident that these concepts preserve the initial information while eliminating the redundant forgetting fuzzy concepts, which significantly enhances the efficiency of concept-cognitive learning.
3.4. Class Prediction
It should be noted that the class prediction of a testing sample is primarily determined based on the Euclidean distance between the testing sample and existing fuzzy concept clustering space [27,29,30].
Definition 8.
Let $x^{+}$ be a testing sample. $(\{x^{+}\}, \widetilde{B}^{+})$ is a new fuzzy concept; then the distance between $\widetilde{B}^{+}$ and pseudo-concept $(X_i, \widetilde{B}_i)$ in the fusing forgetting fuzzy concept space is defined as follows:
$$d(\widetilde{B}^{+}, \widetilde{B}_i) = \sqrt{\sum_{a \in A}\big(\widetilde{B}^{+}(a) - \widetilde{B}_i(a)\big)^2}.$$
Here, the distance serves as a similarity measure: the smaller the distance value, the stronger the correlation between two fuzzy concepts. Consequently, class determination for a new sample can be achieved by computing this similarity. Algorithm 4 describes the class prediction with a time complexity of $O(n|A|)$, where $n$ is the number of pseudo-concepts.
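Class prediction by nearest pseudo-concept can be sketched as below, assuming plain Euclidean distance between the test sample's membership vector and each pseudo-concept intent. The dictionary layout `intents_by_class` is our own illustrative structure.

```python
import math

def predict(intents_by_class, sample):
    """Assign the class whose pseudo-concept intent is nearest to the
    test sample's membership vector (Euclidean distance)."""
    best_class, best_d = None, float("inf")
    for label, intents in intents_by_class.items():
        for b in intents:
            d = math.dist(sample, b)   # Euclidean distance (Python 3.8+)
            if d < best_d:
                best_class, best_d = label, d
    return best_class

# Hypothetical fused intents per decision class.
spaces = {1: [[0.4, 0.3, 0.4, 0.4]],
          2: [[0.2, 0.6, 0.4, 0.9]]}
label = predict(spaces, [0.2, 0.6, 0.4, 0.9])   # nearest intent is class 2's
```

The sample coincides with the class-2 intent, so its distance is zero and it is classified into class 2.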
| Algorithm 4: Class prediction of testing sample. |
Example 4.
In the FFDC of Table 2, where $x_1, x_2, \ldots, x_8$ are from Example 1, $x_9$ and $x_{10}$ are two new testing objects.
Regarding the object , the membership degree about is , while its true label is 2. Subsequently, the Euclidean distance between and the existing fusing forgetting fuzzy concept space is computed as follows: , , , and . Indeed, the distance between and attains its minimum in . Consequently, should be classified into decision class , which perfectly matches its real label 2.
Based on the above discussion, Figure 3 describes the overall flowchart of the FCCLM, which includes four parts, namely (1) constructing a fuzzy concept space; (2) constructing a forgetting fuzzy concept space; (3) constructing a fusing forgetting fuzzy concept space; and (4) class prediction. In summary, the total time complexity of our model FCCLM is .
Figure 3.
The overall procedure of the proposed method.
Table 2.
An FFDC.
| U | a1 | a2 | a3 | a4 | d |
|---|---|---|---|---|---|
| x1 | 0.7 | 0.3 | 0.4 | 0.5 | 1 |
| x2 | 0.1 | 0.8 | 0.5 | 0.9 | 1 |
| x3 | 0.3 | 0.5 | 0.6 | 0.7 | 1 |
| x4 | 0.4 | 0.3 | 0.4 | 0.4 | 1 |
| x5 | 0.6 | 0.4 | 0.3 | 0.9 | 2 |
| x6 | 0.3 | 0.5 | 0.6 | 0.4 | 2 |
| x7 | 0.2 | 0.5 | 0.4 | 0.3 | 2 |
| x8 | 0.1 | 0.7 | 0.8 | 0.8 | 2 |
| x9 | 0.6 | 0.5 | 0.1 | 0.6 | 2 |
| x10 | 0.2 | 0.6 | 0.4 | 0.9 | 2 |
4. Experimental Results
To evaluate the classification efficacy of the FCCLM model, a comparative analysis is conducted against both concept-cognitive learning and machine learning classification algorithms. A total of eight datasets are selected from the UCI (dataset source: http://archive.ics.uci.edu/, accessed on 14 March 2015) and Gene (dataset source: https://jundongl.github.io/scikit-feature/datasets.html, accessed on 14 March 2015) dataset repositories, as detailed in Table 3.
Table 3.
Data description.
4.1. Experimental Setting
As a preprocessing step, Max–Min normalization is applied to all attributes to guarantee dataset compatibility with fuzzy processing requirements, specifically given by
$$\widetilde{I}(x_i, a_j) = \frac{v_{ij} - \min(a_j)}{\max(a_j) - \min(a_j)},$$
where $v_{ij}$ is the initial value of object $x_i$ for attribute $a_j$. Additionally, $\max(a_j)$ and $\min(a_j)$ are the maximum and minimum values of attribute $a_j$ across all objects, respectively.
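The Max–Min normalization step can be sketched per attribute column as follows; the degenerate constant-column case, which the formula leaves undefined, is mapped to 0 here as an assumption.

```python
def max_min_normalize(column):
    """Scale one attribute column to [0, 1]: (v - min) / (max - min).
    A constant column is mapped to all zeros (an assumed convention)."""
    lo, hi = min(column), max(column)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in column]

col = [2.0, 4.0, 6.0]
scaled = max_min_normalize(col)   # smallest value -> 0.0, largest -> 1.0
```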
In the experiment, we compare FCCLM with three concept-cognitive learning classification algorithms, namely, DMPWFC [23], ILMPFTC [24], and FCLM [21]. Furthermore, we also compare FCCLM with four machine learning classification algorithms, including Complex Tree (CT), Fuzzy-Root-Sum-Square (F-RSS), Decision Tree (DT), and Fuzzy Classification and Regression Tree (F-CART). Notice that the parameter $\beta$ plays an important role in constructing correlation similarity granules. In this investigation, $\beta$ is varied systematically from 0.55 to 1.0 with a step size of 0.05 to influence the correlation similarity granules. All analyses are conducted using five-fold cross-validation to guarantee unbiased performance estimation. Final performance metrics are computed as the mean values obtained from the five cross-validation test partitions. All results are generated in MATLAB 2021a using standardized hardware (Intel(R) Core(TM) i7-4790 CPU @ 3.6 GHz, 16 GB (Lenovo, Quanzhou, China)) to ensure computational consistency.
4.2. Results and Analyses
The results of accuracy and the optimal parameter $\beta$ of the proposed algorithm FCCLM and seven classification algorithms on eight datasets are shown in Table 4. The bottom row displays the mean classification accuracy with the standard deviation, where bold–underline formatting highlights the algorithm’s statistically superior performance compared to all baseline methods. The results in the table indicate that ILMPFTC and CT each achieve optimal accuracy on one dataset, while FCCLM demonstrates superior performance across six datasets. Furthermore, FCCLM has a higher average accuracy and lower standard deviation than the other algorithms, which indicates that FCCLM outperforms the other seven algorithms. The average accuracy of FCCLM is increased by 9.48% when compared to F-CART on all selected datasets. In summary, the above results indicate that FCCLM is superior to the other models. Figure 4 depicts the accuracy comparison between the CCL-based methods and the other algorithms. This figure shows that the accuracy of FCCLM possesses a smaller fluctuation range than the other algorithms on most datasets.
Table 4.
Accuracy comparison (mean ± standard deviation%) of FCCLM and seven other algorithms.
Figure 4.
Accuracy comparison with CCL and other classification algorithms.
Subsequently, the results of recall for the proposed algorithm FCCLM and seven classification algorithms on eight datasets are presented in Table 5. The last row illustrates the average recall and standard deviation, with bold–underline formatting denoting statistically superior performance relative to comparator algorithms. This table indicates that ILMPFTC, CT, and F-CART achieve the best recall twice, once, and once, respectively, while FCCLM has the best performance on four datasets. Furthermore, FCCLM demonstrates statistically superior performance, exhibiting both a significantly higher average recall and lower standard deviation compared to the other algorithms, which illustrates that FCCLM outperforms the other seven algorithms.
Table 5.
Recall comparison (mean ± standard deviation%) of FCCLM and seven other algorithms.
Figure 5 describes the classification accuracy as $\beta$ changes. From this figure, we can observe that the accuracy shows a slow upward trend with fluctuations as the parameter $\beta$ varies, among which the fluctuation range is the largest in the interval [0.75, 0.85]. At the same time, most datasets achieve optimal accuracy in the range of [0.85, 0.95].
Figure 5.
Accuracy comparisons with varying $\beta$ on eight datasets.
Additionally, the statistical significance of the eight classification algorithms can be compared using the Friedman test [31] and the Bonferroni–Dunn test [32]. For the Friedman test, an F-distributed statistic measuring the performance of different algorithms is given by
$$F_F = \frac{(N-1)\chi_F^2}{N(k-1) - \chi_F^2}, \quad \text{where} \quad \chi_F^2 = \frac{12N}{k(k+1)}\left(\sum_{i=1}^{k} r_i^2 - \frac{k(k+1)^2}{4}\right).$$
Here, $N$ is the number of datasets, $k$ is the cardinality of different algorithms, and $r_i$ is the average rank of algorithm $i$ across all datasets. We first assume the null hypothesis that no statistically significant differences exist in predictive accuracy among the compared classification methods. If $F_F > F_{\alpha}(k-1, (k-1)(N-1))$, then the initial hypothesis will be rejected. The rank result of the eight models is represented in Table 6. In fact, the computed statistic exceeds the critical value of $F_{\alpha}(7, 49)$ at the given significance level $\alpha$. Therefore, we reject the null hypothesis and accept the alternative hypothesis that the classification performance of the eight models is remarkably different.
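The Friedman statistic and its F-form can be computed directly from the average ranks in Table 6; the sketch below follows the standard formulas (the function name `friedman_F` is ours).

```python
def friedman_F(avg_ranks, N):
    """Friedman chi-square and its F-distributed form, computed from the
    algorithms' average ranks r_i over N datasets. Note the F-form is
    undefined when chi2 equals N*(k-1)."""
    k = len(avg_ranks)
    chi2 = 12 * N / (k * (k + 1)) * (sum(r * r for r in avg_ranks)
                                     - k * (k + 1) ** 2 / 4)
    F = (N - 1) * chi2 / (N * (k - 1) - chi2)
    return chi2, F

# Sanity check: if all k = 8 algorithms tie at the mean rank (k + 1) / 2,
# the statistic vanishes and no difference is detected.
chi2, F = friedman_F([4.5] * 8, N=8)
```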
Table 6.
Rank of classification algorithms.
Subsequently, the Bonferroni–Dunn test is applied to conduct pairwise comparisons of classification performance among the five top-performing models. Assume that the difference in mean rank between two models exceeds the critical value
$$CD_{\alpha} = q_{\alpha}\sqrt{\frac{k(k+1)}{6N}};$$
then the above two models exhibit statistically significant differences in classification performance, in which $q_{\alpha}$ is denoted as the critical value in the test.
At the given level $\alpha$, the critical value $q_{\alpha}$ can be found in [32]. Subsequently, the critical difference $CD_{\alpha}$ is computed with $k = 8$ and $N = 8$. Therefore, we can see from Figure 6 that algorithm FCCLM outperforms algorithms F-RSS, FCLM, and DT at level $\alpha$.
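The Bonferroni–Dunn critical difference can be computed as below; the value $q_{0.05} = 2.690$ for $k = 8$ classifiers is taken from the standard table in the literature and is an assumption about the paper's chosen level, which was lost in extraction.

```python
import math

def critical_difference(q_alpha, k, N):
    """Bonferroni-Dunn critical difference for comparing average ranks
    of k algorithms over N datasets."""
    return q_alpha * math.sqrt(k * (k + 1) / (6 * N))

# k = 8 algorithms, N = 8 datasets; q_0.05 = 2.690 is the tabulated
# two-tailed critical value for 8 classifiers (assumed level).
cd = critical_difference(2.690, k=8, N=8)
```

Any pair of algorithms whose average ranks in Table 6 differ by more than `cd` is significantly different at the assumed level.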
Figure 6.
CD comparison of all classification algorithms with Bonferroni–Dunn test ().
4.3. Discussion
In the above subsection, we compare the performance of FCCLM with other algorithms using eight datasets. Experimental results demonstrate the effectiveness and efficiency of our proposed algorithm FCCLM from the perspective of accuracy and statistical results. Specifically, some advantages are reflected as follows: (1) We can balance the sensitivity and specificity of Algorithm 1, named CFCS, to capture rich information by adjusting the threshold $\beta$, thus facilitating the interpretability and understanding of the concept learning process. (2) Knowledge forgetting of the learned fuzzy concepts is performed through the inner fringe, thereby aligning with the logic of the Ebbinghaus forgetting curve and enhancing interpretability. (3) In the classification task, the fusing forgetting fuzzy pseudo-concept and class prediction can accelerate the acquisition of sample labels and reduce the time consumption of the FCCLM model. In summary, a novel forgetting-based concept-cognitive learning model for classification in a fuzzy formal decision context is introduced to enhance the cognitive process.
5. Conclusions
In the context of rapidly expanding data, how to mine knowledge and fuse information is an important issue. Some existing CCL methods in the fuzzy formal decision context mainly focus on forgetting unnecessary concepts when constructing the concept space, ignoring that objects are the key to constructing concepts. In this article, we have proposed a forgetting-based concept-cognitive learning model for classification tasks in a fuzzy formal decision context. Concretely, we first construct the fuzzy concept space with the correlation coefficient matrix. After that, in order to eliminate unnecessary objects in the zone of proximal development, we build a forgetting fuzzy concept space by choosing the concept with the highest similarity. Next, we propose a forgetting-based fuzzy concept model, which handles concept fusion and concept class prediction. Finally, to gain a better understanding of our proposed model, a series of comparative experiments on eight datasets are conducted to show that FCCLM can attain superior classification performance.
This article introduces a forgetting mechanism in the process of concept-cognitive learning, but does not consider recognition in the incremental learning process. Our future research will address this issue with the aim of enhancing the accuracy and efficiency of incremental learning approaches in the context of CCL.
Author Contributions
Conceptualization: X.L.; Software: C.Z.; Writing—original draft: C.S. All authors have read and agreed to the published version of the manuscript.
Funding
Work is supported by the Natural Science Foundation of Fujian Province (2024J01793) and the Natural Science Basic Research Program of Shaanxi Province (Grant No. 2024JC-YBQN-0025).
Data Availability Statement
The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.
Conflicts of Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
References
- Pylyshyn, Z. Computation and cognition: Issues in the foundations of cognitive science. Behav. Brain Sci. 1980, 3, 111–132. [Google Scholar] [CrossRef]
- Wang, Y.; Howard, N.; Kacprzyk, J.; Frieder, O.; Sheu, P.; Fiorini, R.A.; Gavrilova, M.L.; Patel, S.; Peng, J.; Widrow, B. Cognitive informatics: Towards cognitive machine learning and autonomous knowledge manipulation. Int. J. Cogn. Inform. Nat. Intell. 2018, 12, 1–13. [Google Scholar] [CrossRef]
- Feldman, J. Minimization of Boolean complexity in human concept learning. Nature 2000, 407, 630–633. [Google Scholar] [CrossRef] [PubMed]
- Ganter, B.; Wille, R. Formal Concept Analysis: Mathematical Foundations; Springer: New York, NY, USA, 1999. [Google Scholar]
- Wille, R. Restructuring lattice theory: An approach based on hierarchies of concepts. In Proceedings of the Formal Concept Analysis: 7th International Conference, ICFCA 2009, Darmstadt, Germany, 21–24 May 2009; Proceedings 7. Springer: Berlin/Heidelberg, Germany, 2009; pp. 314–339. [Google Scholar]
- Yahia, S.; Arour, K.; Slimani, A.; Jaoua, A. Discovery of compact rules in relational databases. Inf. Sci. 2000, 4, 497–511. [Google Scholar]
- Qi, J.; Qian, T.; Wei, L. The connections between three-way and classical concept lattices. Knowl.-Based Syst. 2016, 91, 143–151. [Google Scholar] [CrossRef]
- Xu, F.; Xing, Z.; Yin, H. Attribute reductions and concept lattices in interval-valued intuitionistic fuzzy rough set theory: Construction and properties. J. Intell. Fuzzy Syst. 2016, 30, 1231–1242. [Google Scholar] [CrossRef]
- Qi, J.; Wei, L.; Wan, Q. Multi-level granularity in formal concept analysis. Granul. Comput. 2019, 4, 351–362. [Google Scholar] [CrossRef]
- Papadopoulos, B.K.; Syropoulos, A. Fuzzy sets and fuzzy relational structures as Chu spaces. Int. J. Uncertain. Fuzziness Knowl.-Based Syst. 2000, 8, 471–479. [Google Scholar]
- Papadopoulos, B.; Syropoulos, A. Categorical relationships between Goguen sets and “two-sided” categorical models of linear logic. Fuzzy Sets Syst. 2005, 149, 501–508. [Google Scholar] [CrossRef]
- Zhang, W.; Xu, W. Cognitive model based on granular computing. Chin. J. Eng. Math. 2007, 24, 957–971. [Google Scholar]
- Yao, Y. Interpreting concept learning in cognitive informatics and granular computing. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 2009, 39, 855–866. [Google Scholar] [CrossRef]
- Qiu, G.; Ma, J.; Yang, H.; Zhang, W. A mathematical model for concept granular computing systems. Sci. China Inf. Sci. 2009, 39, 1239–1247. [Google Scholar] [CrossRef]
- Kumar, C.; Ishwarya, M.; Loo, C. Formal concept analysis approach to cognitive functionalities of bidirectional associative memory. Biol. Inspired Cogn. Archit. 2015, 12, 20–33. [Google Scholar]
- Li, J.; Mei, C.; Xu, W.; Qian, Y. Concept learning via granular computing: A cognitive viewpoint. Inf. Sci. 2015, 298, 447–467. [Google Scholar] [CrossRef] [PubMed]
- Xu, W.; Li, W. Granular computing approach to two-way learning based on formal concept analysis in fuzzy datasets. IEEE Trans. Cybern. 2014, 46, 366–379. [Google Scholar] [CrossRef]
- Xu, W.; Guo, D.; Qian, Y.; Ding, W. Two-way concept-cognitive learning method: A fuzzy-based progressive learning. IEEE Trans. Fuzzy Syst. 2022, 31, 1885–1899. [Google Scholar] [CrossRef]
- Xu, W.; Guo, D.; Mi, J.; Qian, Y.; Zheng, K.; Ding, W. Two-way concept-cognitive learning via concept movement viewpoint. IEEE Trans. Neural Netw. Learn. Syst. 2023, 34, 6798–6812. [Google Scholar] [CrossRef]
- Shi, Y.; Mi, Y.; Li, J.; Liu, W. Concept-cognitive learning model for incremental concept learning. IEEE Trans. Syst. Man Cybern. Syst. 2018, 51, 809–821. [Google Scholar] [CrossRef]
- Mi, Y.; Shi, Y.; Li, J.; Liu, W.; Yan, M. Fuzzy-based concept learning method: Exploiting data with fuzzy conceptual clustering. IEEE Trans. Cybern. 2020, 52, 582–593. [Google Scholar] [CrossRef]
- Shi, Y.; Mi, Y.; Li, J.; Liu, W. Concurrent concept-cognitive learning model for classification. Inf. Sci. 2019, 496, 65–81. [Google Scholar] [CrossRef]
- Yuan, K.; Xu, W.; Li, W.; Ding, W. An incremental learning mechanism for object classification based on progressive fuzzy three-way concept. Inf. Sci. 2022, 584, 127–147. [Google Scholar] [CrossRef]
- Zhang, C.; Tsang, E.C.; Xu, W.; Lin, Y.; Yang, L. Incremental concept-cognitive learning approach for concept classification oriented to weighted fuzzy concepts. Knowl.-Based Syst. 2023, 260, 110093. [Google Scholar] [CrossRef]
- Wang, J.; Xu, W.; Ding, W.; Qian, Y. Multi-view fuzzy concept-cognitive learning with high-order information fusion of fuzzy attributes. IEEE Trans. Fuzzy Syst. 2024, 32, 6965–6978. [Google Scholar] [CrossRef]
- Liang, T.; Lin, Y.; Li, J.; Lin, G.; Wang, Q. Incremental cognitive learning approach based on concept reduction. Int. J. Approx. Reason. 2025, 179, 109359. [Google Scholar] [CrossRef]
- Guo, D.; Xu, W.; Qian, Y.; Ding, W. M-FCCL: Memory-based concept-cognitive learning for dynamic fuzzy data classification and knowledge fusion. Inf. Fusion 2023, 100, 101962. [Google Scholar] [CrossRef]
- Li, J.; Mi, Y.; Liu, W. Incremental cognition of concepts: Theories and methods. Chin. J. Comput. 2019, 42, 1–19. [Google Scholar]
- Ding, Y.; Xu, W.; Ding, W.; Qian, Y. IFCRL: Interval-intent fuzzy concept re-cognition learning model. IEEE Trans. Fuzzy Syst. 2024, 32, 3581–3593. [Google Scholar] [CrossRef]
- Guo, D.; Xu, W. Fuzzy-based concept-cognitive learning: An investigation of novel approach to tumor diagnosis analysis. Inf. Sci. 2023, 639, 118998. [Google Scholar] [CrossRef]
- Friedman, M. A comparison of alternative tests of significance for the problem of m rankings. Ann. Math. Stat. 1940, 11, 86–92. [Google Scholar] [CrossRef]
- Dunn, O. Multiple comparisons among means. J. Am. Stat. Assoc. 1961, 56, 52–64. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).