Improved Selective Deep-Learning-Based Clustering Ensemble
Abstract
1. Introduction
- This paper studies the limitations of clustering ensemble and deep clustering, which is an interesting and important topic. To mitigate these limitations, we propose an improved selective deep clustering ensemble (ISDCE) method, which incorporates deep clustering into selective clustering ensemble to enhance robustness and clustering performance.
- ISDCE constructs quality and diversity evaluation metrics for the base clusterings, enabling it to select higher-quality, more diverse base clusterings and thereby improve ensemble performance. In addition, ISDCE measures the reliability of clusters and the local diversity of clusters within the same base clustering to further improve integration performance.
- Extensive experimental results on various types of datasets confirm that ISDCE performs significantly more robustly and accurately than existing clustering ensemble approaches.
2. Related Work
2.1. Clustering Ensemble
2.2. Deep Clustering
2.3. Selection Ensemble
3. Improved Selective Deep Clustering Ensemble
3.1. Deep Clustering Generation
Algorithm 1 Deep Autoencoder Clustering
Input: X: original input data; k: the number of clusters; S: the number of base deep clusterings; the maximum number of iterations; the auxiliary distribution update interval; the stopping threshold; the autoencoder layer parameters.
Output: S deep base partitions
for rounds = 1 to S do
  Step 1: Initialize the autoencoder network parameters and weights.
  for iter = 1 to the maximum number of iterations do
    if iter is a multiple of the update interval then
      Step 2: Compute all embedding points from the current network parameters for the corresponding dataset.
      Step 3: Obtain the initial clustering and cluster centroids using k-means.
      Step 4: Update the auxiliary target distribution P by Equations (2), (4), and (…).
      Step 5: Save the current cluster assignments pre.
      Step 6: Compute the new cluster assignments pre′.
      if the fraction of assignments changed between pre and pre′ is below the stopping threshold then
        Stop training.
      end if
    end if
    Step 7: Update the autoencoder weights and cluster centroids.
  end for
  Step 8: Save the rounds-th deep clustering.
end for
return the S deep base clusterings
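For concreteness, the following is a minimal PyTorch sketch of the DEC-style training loop that Algorithm 1 describes. The layer sizes, the optimizer, the reconstruction term, and the hyperparameter names (`max_iter`, `update_interval`, `delta`) are illustrative assumptions rather than the authors' exact configuration; the Student's-t soft assignment and the sharpened target distribution below stand in for the equations referenced in Step 4.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from sklearn.cluster import KMeans

class DAEClustering(nn.Module):
    """DEC-style deep autoencoder clustering (illustrative sketch)."""
    def __init__(self, d_in, d_z=10, k=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(d_in, 500), nn.ReLU(),
                                     nn.Linear(500, d_z))
        self.decoder = nn.Sequential(nn.Linear(d_z, 500), nn.ReLU(),
                                     nn.Linear(500, d_in))
        self.centroids = nn.Parameter(torch.zeros(k, d_z))  # cluster centers

    def soft_assign(self, z):
        # Student's-t kernel between embeddings and centroids (soft labels q)
        q = (1.0 + torch.cdist(z, self.centroids) ** 2) ** -1
        return q / q.sum(dim=1, keepdim=True)

def target_distribution(q):
    # Sharpened auxiliary distribution P (the update performed in Step 4)
    w = q ** 2 / q.sum(dim=0)
    return w / w.sum(dim=1, keepdim=True)

def train_dae_clustering(X, k, max_iter=200, update_interval=10, delta=1e-3):
    X = torch.as_tensor(X, dtype=torch.float32)
    model = DAEClustering(X.shape[1], k=k)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Steps 2-3: embed the data, then initialize centroids with k-means
    with torch.no_grad():
        z0 = model.encoder(X)
    km = KMeans(n_clusters=k, n_init=10).fit(z0.numpy())
    model.centroids.data = torch.tensor(km.cluster_centers_,
                                        dtype=torch.float32)
    pre = torch.as_tensor(km.labels_)
    for it in range(max_iter):
        if it % update_interval == 0:
            with torch.no_grad():
                q = model.soft_assign(model.encoder(X))
                p = target_distribution(q)            # Step 4
                pre_new = q.argmax(dim=1)             # Step 6
            # Steps 5-6: stop when few assignments changed since last check
            if it > 0 and (pre_new != pre).float().mean() < delta:
                break
            pre = pre_new
        z = model.encoder(X)
        q = model.soft_assign(z)
        loss = F.kl_div(q.log(), p, reduction='batchmean') \
               + F.mse_loss(model.decoder(z), X)  # clustering + reconstruction
        opt.zero_grad(); loss.backward(); opt.step()  # Step 7
    with torch.no_grad():
        return model.soft_assign(model.encoder(X)).argmax(dim=1)
```

Running `train_dae_clustering(X, k)` S times with different random initializations (Step 1) yields the S base partitions that the selection stage in Section 3.2 operates on.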
3.2. Selective Clustering Ensemble
3.2.1. Selective Strategy
- (1) Quality Strategy (QS). For a given ensemble E, the QS computes the SARI value of every base clustering in the ensemble, ranks them in descending order, and selects the top M (M ≤ S) base clusterings with the highest SARI values as the candidate ensemble. In general, base clusterings with higher SARI values show greater consistency with the overall trend, whereas base clusterings with lower SARI values can be considered outliers in the ensemble and may be unfavorable for inclusion.
- (2) Diversity Strategy (DS). The DS seeks to maximize ensemble diversity. For a given ensemble E, we select M (M ≤ S) base clusterings so as to minimize the PNMI. This objective can be viewed as the problem of finding a weight-maximizing subgraph, where the edge weights are derived from the pairwise NMI between base clusterings. However, this problem is NP-hard, so we approximate its solution using a simple greedy strategy (see the sketch after this list). First, the base clustering with the highest SARI value is selected to initialize a new ensemble E; then, base clusterings are added to E one at a time, each chosen to minimize the resulting PNMI value. This process is repeated until the number of base clusterings in E reaches M.
- (3) Balance Strategy (BS). The BS combines the above two strategies: the two metrics SARI and PNMI are merged into a new metric, the balance strategy index (BSI), which trades off ensemble quality against diversity.
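To make the selection strategies concrete, the following Python sketch implements the quality ranking and the greedy diversity selection described above. The `sari` and `pnmi` functions are stand-ins: the paper's exact SARI and BSI formulas are not reproduced in this extract, so `sari` is approximated here by a base clustering's average NMI agreement with the rest of the ensemble.

```python
import numpy as np
from itertools import combinations
from sklearn.metrics import normalized_mutual_info_score as nmi

def sari(labels, ensemble):
    # Stand-in quality score: average agreement of one base clustering with
    # all members of the ensemble (the exact SARI formula is assumed here).
    return np.mean([nmi(labels, other) for other in ensemble])

def pnmi(subset):
    # Average pairwise NMI of a candidate subset; lower means more diverse.
    return np.mean([nmi(subset[i], subset[j])
                    for i, j in combinations(range(len(subset)), 2)])

def quality_select(ensemble, M):
    # Quality Strategy (QS): keep the top-M base clusterings by SARI.
    order = np.argsort([sari(c, ensemble) for c in ensemble])[::-1]
    return list(order[:M])

def greedy_diversity_select(ensemble, M):
    # Diversity Strategy (DS): start from the highest-SARI member, then
    # greedily add whichever clustering keeps the subset's PNMI lowest.
    chosen = quality_select(ensemble, 1)
    remaining = set(range(len(ensemble))) - set(chosen)
    while len(chosen) < M:
        best = min(remaining,
                   key=lambda j: pnmi([ensemble[i] for i in chosen + [j]]))
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

Given `ensemble` as a list of 1-D label arrays produced by Algorithm 1, `greedy_diversity_select(ensemble, M)` returns the indices of the M members handed to the consensus step in Section 3.2.2.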
3.2.2. Consensus Ensemble
4. Experiments
4.1. Datasets and Evaluation Measures
- Cars: The Cars dataset comes from the 1983 ASA Data Exposition and contains mpg, cylinders, displacement, etc. (8 variables) for 392 different cars.
- Iris: The Iris dataset [42] contains 150 samples from 3 classes, where each class refers to a category of iris plant.
- Wine: The Wine dataset [43] contains the results of a chemical analysis of wines grown in the same region, which determined the quantities of 13 constituents found in each of the three types of wine.
- MNIST: The MNIST dataset is a well-known set of 70,000 handwritten-digit images (10 class labels) with 784 pixels each. Mnist5 is a subset of MNIST [44].
- Strain: The Strain dataset is a set of synthetic mock metagenome data [45], constructed to investigate the impact of strain-level variation on clustering.
- Species: The Species dataset is also a set of synthetic mock metagenome data [45], designed to resolve species-level variation in a complex community.
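The evaluation measures themselves are not reproduced in this extract; the negative entries in some of the result tables below are consistent with a chance-corrected index such as the Adjusted Rand Index (ARI), and NMI is a standard companion measure. A minimal sketch of computing both with scikit-learn, under that assumption:

```python
from sklearn.metrics import adjusted_rand_score, normalized_mutual_info_score

def evaluate(y_true, y_pred):
    # ARI is chance-corrected and can be negative; NMI lies in [0, 1].
    return {"ARI": adjusted_rand_score(y_true, y_pred),
            "NMI": normalized_mutual_info_score(y_true, y_pred)}
```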
4.2. Experimental Settings and Clustering Performance Comparison
4.3. ISDCE Component Ablation Experiment
4.4. ISDCE t-SNE Visualization
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Naeem, M.; Jamal, T.; Diaz-Martinez, J.; Butt, S.A.; Montesano, N.; Tariq, M.I.; De-la Hoz-Franco, E.; De-La-Hoz-Valdiris, E. Trends and future perspective challenges in big data. In Advances in Intelligent Data Analysis and Applications; Springer: Berlin/Heidelberg, Germany, 2022; pp. 309–325.
- Shi, Y. Advances in Big Data Analytics: Theory, Algorithms and Practices; Springer: Berlin/Heidelberg, Germany, 2022.
- Chamikara, M.A.P.; Bertók, P.; Liu, D.; Camtepe, S.; Khalil, I. Efficient privacy preservation of big data for accurate data mining. Inf. Sci. 2020, 527, 420–443.
- Ezhilmaran, D.; Vinoth Indira, D. A survey on clustering techniques in pattern recognition. In Proceedings of the Advances in Applicable Mathematics—ICAAM2020, Coimbatore, India, 21–22 February 2020; AIP Publishing LLC: Melville, NY, USA, 2020; Volume 2261, p. 030093.
- Ghosal, A.; Nandy, A.; Das, A.K.; Goswami, S.; Panday, M. A short review on different clustering techniques and their applications. In Emerging Technology in Modelling and Graphics; Springer: Berlin/Heidelberg, Germany, 2020; pp. 69–83.
- Hartigan, J.A.; Wong, M.A. Algorithm AS 136: A k-means clustering algorithm. J. R. Stat. Soc. Ser. C (Appl. Stat.) 1979, 28, 100–108.
- Ng, A.; Jordan, M.; Weiss, Y. On spectral clustering: Analysis and an algorithm. Adv. Neural Inf. Process. Syst. 2001, 14, 849–856.
- Reynolds, D.A. Gaussian mixture models. In Encyclopedia of Biometrics; Springer: Berlin/Heidelberg, Germany, 2009; Volume 741.
- Steinbach, M.; Ertöz, L.; Kumar, V. The challenges of clustering high dimensional data. In New Directions in Statistical Physics: Econophysics, Bioinformatics, and Pattern Recognition; Springer: Berlin/Heidelberg, Germany, 2004; pp. 273–309.
- Zhang, M. Weighted clustering ensemble: A review. Pattern Recognit. 2021, 124, 108428.
- Niu, H.; Khozouie, N.; Parvin, H.; Alinejad-Rokny, H.; Beheshti, A.; Mahmoudi, M.R. An ensemble of locally reliable cluster solutions. Appl. Sci. 2020, 10, 1891.
- Vega-Pons, S.; Ruiz-Shulcloper, J. A survey of clustering ensemble algorithms. Int. J. Pattern Recognit. Artif. Intell. 2011, 25, 337–372.
- Ren, Y.; Pu, J.; Yang, Z.; Xu, J.; Li, G.; Pu, X.; Yu, P.S.; He, L. Deep clustering: A comprehensive survey. arXiv 2022, arXiv:2210.04142.
- Min, E.; Guo, X.; Liu, Q.; Zhang, G.; Cui, J.; Long, J. A survey of clustering with deep learning: From the perspective of network architecture. IEEE Access 2018, 6, 39501–39514.
- Khan, A.; Hao, J.; Dong, Z.; Li, J. Adaptive Deep Clustering Network for Retinal Blood Vessel and Foveal Avascular Zone Segmentation. Appl. Sci. 2023, 13, 11259.
- Ru, T.; Zhu, Z. Deep Clustering Efficient Learning Network for Motion Recognition Based on Self-Attention Mechanism. Appl. Sci. 2023, 13, 2996.
- Zhou, S.; Xu, H.; Zheng, Z.; Chen, J.; Bu, J.; Wu, J.; Wang, X.; Zhu, W.; Ester, M. A comprehensive survey on deep clustering: Taxonomy, challenges, and future directions. arXiv 2022, arXiv:2206.07579.
- Li, Z.; Wu, X.M.; Chang, S.F. Segmentation using superpixels: A bipartite graph partitioning approach. In Proceedings of the 2012 IEEE Conference on Computer Vision and Pattern Recognition, Providence, RI, USA, 16–21 June 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 789–796.
- Huang, D.; Lai, J.H.; Wang, C.D. Robust ensemble clustering using probability trajectories. IEEE Trans. Knowl. Data Eng. 2015, 28, 1312–1326.
- Huang, D.; Lai, J.; Wang, C.D. Ensemble clustering using factor graph. Pattern Recognit. 2016, 50, 131–142.
- Yousefnezhad, M.; Zhang, D. Weighted spectral cluster ensemble. In Proceedings of the 2015 IEEE International Conference on Data Mining, Atlantic City, NJ, USA, 14–17 November 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 549–558.
- Liu, H.; Wu, J.; Liu, T.; Tao, D.; Fu, Y. Spectral ensemble clustering via weighted k-means: Theoretical and practical evidence. IEEE Trans. Knowl. Data Eng. 2017, 29, 1129–1143.
- Huang, D.; Wang, C.D.; Lai, J.H. Locally weighted ensemble clustering. IEEE Trans. Cybern. 2017, 48, 1460–1473.
- Jia, Y.; Liu, H.; Hou, J.; Zhang, Q. Clustering ensemble meets low-rank tensor approximation. AAAI Conf. Artif. Intell. 2021, 35, 7970–7978.
- Jia, Y.; Tao, S.; Wang, R.; Wang, Y. Ensemble Clustering via Co-association Matrix Self-enhancement. arXiv 2022, arXiv:2205.05937.
- Zhou, P.; Sun, B.; Liu, X.; Du, L.; Li, X. Active Clustering Ensemble with Self-Paced Learning. IEEE Trans. Neural Netw. Learn. Syst. 2023, 1–5.
- Xie, J.; Girshick, R.; Farhadi, A. Unsupervised deep embedding for clustering analysis. In Proceedings of the 33rd International Conference on Machine Learning, New York, NY, USA, 20–22 June 2016; pp. 478–487.
- Guo, X.; Gao, L.; Liu, X.; Yin, J. Improved deep embedded clustering with local structure preservation. In Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence, Melbourne, Australia, 19–25 August 2017; pp. 1753–1759.
- Guo, X.; Liu, X.; Zhu, E.; Yin, J. Deep clustering with convolutional autoencoders. In Proceedings of the International Conference on Neural Information Processing, Guangzhou, China, 14–18 November 2017; Springer: Berlin/Heidelberg, Germany, 2017; pp. 373–382.
- Zhang, R.; Li, X.; Zhang, H.; Nie, F. Deep fuzzy k-means with adaptive loss and entropy regularization. IEEE Trans. Fuzzy Syst. 2019, 28, 2814–2824.
- Chen, J.; Han, J.; Meng, X.; Li, Y.; Li, H. Graph convolutional network combined with semantic feature guidance for deep clustering. Tsinghua Sci. Technol. 2022, 27, 855–868.
- Dilokthanakul, N.; Mediano, P.A.; Garnelo, M.; Lee, M.C.; Salimbeni, H.; Arulkumaran, K.; Shanahan, M. Deep unsupervised clustering with gaussian mixture variational autoencoders. In Proceedings of the International Conference on Learning Representations, Toulon, France, 24–26 April 2017.
- Jiang, Z.; Zheng, Y.; Tan, H.; Tang, B.; Zhou, H. Variational deep embedding: An unsupervised and generative approach to clustering. arXiv 2017, arXiv:1611.05148.
- Lim, K.L.; Jiang, X.; Yi, C. Deep clustering with variational autoencoder. IEEE Signal Process. Lett. 2020, 27, 231–235.
- Harchaoui, W.; Mattei, P.A.; Bouveyron, C. Deep adversarial Gaussian mixture auto-encoder for clustering. In Proceedings of the International Conference on Learning Representations, Toulon, France, 24–26 April 2017.
- Makhzani, A.; Shlens, J.; Jaitly, N.; Goodfellow, I.; Frey, B. Adversarial autoencoders. arXiv 2015, arXiv:1511.05644.
- Mukherjee, S.; Asnani, H.; Lin, E.; Kannan, S. ClusterGAN: Latent space clustering in generative adversarial networks. AAAI Conf. Artif. Intell. 2019, 33, 4610–4617.
- Golalipour, K.; Akbari, E.; Hamidi, S.S.; Lee, M.; Enayatifar, R. From clustering to clustering ensemble selection: A review. Eng. Appl. Artif. Intell. 2021, 104, 104388.
- Hadjitodorov, S.T.; Kuncheva, L.I.; Todorova, L.P. Moderate diversity for better cluster ensembles. Inf. Fusion 2006, 7, 264–275.
- Fern, X.Z.; Lin, W. Cluster ensemble selection. Stat. Anal. Data Min. ASA Data Sci. J. 2008, 1, 128–141.
- Jia, J.; Xiao, X.; Liu, B.; Jiao, L. Bagging-based spectral clustering ensemble selection. Pattern Recognit. Lett. 2011, 32, 1456–1467.
- Iris; UCI Machine Learning Repository; University of California, Irvine: Irvine, CA, USA, 30 June 1988.
- Wine; UCI Machine Learning Repository; University of California, Irvine: Irvine, CA, USA, 30 June 1991.
- Zhou, J.; Zheng, H.; Pan, L. Ensemble clustering based on dense representation. Neurocomputing 2019, 357, 66–76.
- Alneberg, J.; Bjarnason, B.S.; De Bruijn, I.; Schirmer, M.; Quick, J.; Ijaz, U.Z.; Lahti, L.; Loman, N.J.; Andersson, A.F.; Quince, C. Binning metagenomic contigs by coverage and composition. Nat. Methods 2014, 11, 1144–1146.
- Ezugwu, A.E.; Ikotun, A.M.; Oyelade, O.O.; Abualigah, L.; Agushaka, J.O.; Eke, C.I.; Akinyelu, A.A. A comprehensive survey of clustering algorithms: State-of-the-art machine learning applications, taxonomy, challenges, and future research prospects. Eng. Appl. Artif. Intell. 2022, 110, 104743.
Dataset | Samples | Features | Classes
---|---|---|---
Cars | 392 | 8 | 3 |
Iris | 150 | 4 | 3 |
Wine | 178 | 13 | 3 |
Mnist5 | 3495 | 784 | 10 |
MNIST | 70,000 | 784 | 10 |
Strain | 9401 | 200 | 20 |
Species | 37,585 | 232 | 101 |
Method | Cars | Iris | Wine
---|---|---|---
WSCE | 0.244 (0.002) | 0.758 (0.001) | 0.417 (0.001) |
PTAAL | 0.145 (0.051) | 0.568 (0.209) | 0.352 (0.121) |
PTACL | 0.149 (0.048) | 0.576 (0.192) | 0.221 (0.148) |
PTASL | 0.138 (0.059) | 0.514 (0.232) | 0.396 (0.042) |
PTGP | 0.164 (0.032) | 0.737 (0.019) | 0.420 (0.013) |
SECWK | 0.126 (0.056) | 0.542 (0.241) | 0.261 (0.136) |
LWEA | 0.207 (0.003) | 0.751 (0.006) | 0.421 (0.016) |
LWGP | 0.209 (0.002) | 0.745 (0.005) | 0.404 (0.045) |
LRTA | 0.004 (0.002) | 0.018 (0.011) | 0.014 (0.006) |
ECCMS | 0.206 (0.003) | 0.744 (0.006) | 0.409 (0.023) |
SPACE | 0.286 (0.011) | 0.508 (0.041) | 0.385 (0.002) |
ISDCEQS | 0.258 (0.002) | 0.785 (0.005) | 0.542 (0.003) |
ISDCEDS | 0.278 (0.010) | 0.792 (0.010) | 0.465 (0.014) |
ISDCEBS | 0.256 (0.005) | 0.788 (0.001) | 0.523 (0.004) |
Method | Cars | Iris | Wine
---|---|---|---
WSCE | 0.052 (0.000) | 0.725 (0.001) | 0.359 (0.000) |
PTAAL | −0.021 (0.109) | 0.423 (0.186) | 0.298 (0.106) |
PTACL | −0.020 (0.110) | 0.441 (0.179) | 0.132 (0.100) |
PTASL | −0.016 (0.111) | 0.376 (0.191) | 0.337 (0.094) |
PTGP | −0.027 (0.106) | 0.679 (0.068) | 0.362 (0.015) |
SECWK | −0.038 (0.042) | 0.451 (0.239) | 0.216 (0.115) |
LWEA | 0.071 (0.002) | 0.721 (0.005) | 0.353 (0.040) |
LWGP | 0.072 (0.001) | 0.720 (0.006) | 0.326 (0.071) |
LRTA | −0.001 (0.003) | 0.005 (0.011) | 0.003 (0.004) |
ECCMS | 0.071 (0.002) | 0.723 (0.003) | 0.342 (0.045) |
SPACE | 0.054 (0.044) | 0.392 (0.122) | 0.247 (0.008) |
ISDCEQS | 0.062 (0.001) | 0.774 (0.007) | 0.525 (0.005) |
ISDCEDS | 0.218 (0.028) | 0.782 (0.015) | 0.449 (0.015) |
ISDCEBS | 0.063 (0.001) | 0.775 (0.001) | 0.494 (0.007) |
Method | Mnist5 | MNIST | Strain | Species
---|---|---|---|---
WSCE | 0.543 (0.008) | timeout | error | error |
PTAAL | 0.495 (0.012) | 0.497 (0.013) | 0.815 (0.045) | 0.899 (0.007) |
PTACL | 0.501 (0.011) | 0.491 (0.016) | 0.813 (0.040) | 0.894 (0.008) |
PTASL | 0.354 (0.080) | 0.038 (0.034) | 0.788 (0.050) | 0.914 (0.011) |
PTGP | 0.500 (0.011) | 0.494 (0.017) | 0.833 (0.038) | 0.860 (0.019) |
SECWK | 0.470 (0.028) | 0.470 (0.026) | 0.762 (0.056) | 0.836 (0.016) |
LWEA | 0.513 (0.006) | 0.506 (0.007) | 0.945 (0.007) | 0.981 (0.005) |
LWGP | 0.495 (0.005) | 0.498 (0.004) | 0.947 (0.008) | 0.986 (0.003) |
LRTA | 0.528 (0.007) | timeout | 0.950 (0.007) | timeout |
ECCMS | 0.345 (0.114) | timeout | 0.946 (0.006) | timeout |
SPACE | 0.444 (0.006) | timeout | 0.861 (0.011) | timeout |
ISDCEQS | 0.585 (0.010) | 0.801 (0.002) | 0.959 (0.001) | 0.993 (0.002) |
ISDCEDS | 0.584 (0.012) | 0.776 (0.003) | 0.958 (0.002) | 0.993 (0.001) |
ISDCEBS | 0.590 (0.003) | 0.800 (0.001) | 0.958 (0.001) | 0.993 (0.002) |
Method | Mnist5 | MNIST | Strain | Species
---|---|---|---|---
WSCE | 0.377 (0.016) | timeout | error | error |
PTAAL | 0.372 (0.020) | 0.382 (0.024) | 0.567 (0.142) | 0.463 (0.037) |
PTACL | 0.380 (0.019) | 0.372 (0.022) | 0.578 (0.126) | 0.449 (0.046) |
PTASL | 0.174 (0.084) | 0.003 (0.004) | 0.476 (0.137) | 0.582 (0.068) |
PTGP | 0.382 (0.018) | 0.377 (0.032) | 0.673 (0.106) | 0.353 (0.067) |
SECWK | 0.317 (0.048) | 0.346 (0.036) | 0.410 (0.138) | 0.175 (0.034) |
LWEA | 0.393 (0.009) | 0.397 (0.014) | 0.932 (0.024) | 0.846 (0.021) |
LWGP | 0.370 (0.006) | 0.384 (0.008) | 0.934 (0.024) | 0.869 (0.018) |
LRTA | 0.404 (0.013) | timeout | 0.932 (0.024) | timeout |
ECCMS | 0.170 (0.073) | timeout | 0.894 (0.036) | timeout |
SPACE | 0.319 (0.003) | timeout | 0.724 (0.037) | timeout |
ISDCEQS | 0.475 (0.008) | 0.811 (0.005) | 0.943 (0.002) | 0.972 (0.003) |
ISDCEDS | 0.470 (0.015) | 0.754 (0.008) | 0.944 (0.003) | 0.971 (0.005) |
ISDCEBS | 0.481 (0.001) | 0.808 (0.003) | 0.944 (0.001) | 0.972 (0.002) |
Method | Cars | Iris | Wine | Mnist5 | MNIST | Strain | Species
---|---|---|---|---|---|---|---
DAE | 0.228 (0.028) | 0.741 (0.055) | 0.416 (0.046) | 0.555 (0.034) | 0.750 (0.025) | 0.943 (0.007) | 0.987 (0.003) |
ISDCE_noSS | 0.246 (0.016) | 0.762 (0.012) | 0.448 (0.023) | 0.573 (0.018) | 0.758 (0.013) | 0.955 (0.005) | 0.989 (0.002) |
ISDCEQS | 0.258 (0.002) | 0.785 (0.005) | 0.542 (0.003) | 0.585 (0.010) | 0.801 (0.002) | 0.959 (0.001) | 0.993 (0.002) |
ISDCEDS | 0.278 (0.010) | 0.792 (0.010) | 0.465 (0.014) | 0.584 (0.012) | 0.776 (0.003) | 0.958 (0.002) | 0.993 (0.001) |
ISDCEBS | 0.256 (0.005) | 0.788 (0.001) | 0.523 (0.004) | 0.590 (0.003) | 0.800 (0.001) | 0.958 (0.001) | 0.993 (0.002) |
Method | Cars | Iris | Wine | Mnist5 | MNIST | Strain | Species
---|---|---|---|---|---|---|---
DAE | 0.062 (0.054) | 0.731 (0.056) | 0.397 (0.076) | 0.430 (0.042) | 0.739 (0.047) | 0.928 (0.030) | 0.948 (0.032) |
ISDCE_noSS | 0.075 (0.036) | 0.756 (0.013) | 0.434 (0.028) | 0.461 (0.017) | 0.745 (0.014) | 0.938 (0.010) | 0.961 (0.020) |
ISDCEQS | 0.062 (0.001) | 0.774 (0.007) | 0.525 (0.005) | 0.475 (0.008) | 0.811 (0.005) | 0.943 (0.002) | 0.972 (0.003) |
ISDCEDS | 0.218 (0.028) | 0.782 (0.015) | 0.449 (0.015) | 0.470 (0.015) | 0.754 (0.008) | 0.944 (0.003) | 0.971 (0.005) |
ISDCEBS | 0.063 (0.001) | 0.775 (0.001) | 0.494 (0.007) | 0.481 (0.001) | 0.808 (0.003) | 0.944 (0.001) | 0.972 (0.002) |
Qian, Y.; Yao, S.; Wu, T.; Huang, Y.; Zeng, L. Improved Selective Deep-Learning-Based Clustering Ensemble. Appl. Sci. 2024, 14, 719. https://doi.org/10.3390/app14020719