Article

ICIRD: Information-Principled Deep Clustering for Invariant, Redundancy-Reduced and Discriminative Cluster Distributions

Aiyu Zheng, Robert M. X. Wu, Yupeng Wang and Yanting He
1 School of Electronic Information and Engineering, Taiyuan University of Science and Technology, Taiyuan 030024, China
2 School of Computer Science and Technology, Taiyuan University of Science and Technology, Taiyuan 030024, China
3 Faculty of Engineering and Information Technology, University of Technology Sydney, Sydney 2007, Australia
* Author to whom correspondence should be addressed.
Entropy 2025, 27(12), 1200; https://doi.org/10.3390/e27121200
Submission received: 4 November 2025 / Revised: 21 November 2025 / Accepted: 25 November 2025 / Published: 26 November 2025
(This article belongs to the Section Information Theory, Probability and Statistics)

Abstract

Deep clustering aims to discover meaningful data groups by jointly learning representations and cluster probability distributions. Yet existing methods rarely consider the underlying information characteristics of these distributions, causing ambiguity and redundancy in cluster assignments, particularly when different augmented views are used. To address this issue, this paper proposes a novel information-principled deep clustering framework for learning invariant, redundancy-reduced, and discriminative cluster probability distributions, termed ICIRD. Specifically, ICIRD is built upon three complementary modules for cluster probability distributions: (i) conditional entropy minimization, which increases assignment certainty and discriminability; (ii) inter-cluster mutual information minimization, which reduces redundancy among cluster distributions and sharpens separability; and (iii) cross-view mutual information maximization, which enforces semantic consistency across augmented views. Additionally, a contrastive representation mechanism is incorporated to provide stable and reliable feature inputs for the cluster probability distributions. Together, these components enable ICIRD to jointly optimize both representations and cluster probability distributions in an information-regularized manner. Extensive experiments on five image benchmark datasets demonstrate that ICIRD outperforms most existing deep clustering methods, particularly on fine-grained datasets such as CIFAR-100 and ImageNet-Dogs.
Keywords: deep clustering; contrastive learning; discriminative learning; information-principled deep clustering; discriminative distribution sharpness; multi-view inter-cluster distribution redundancy reduction
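To make the three information-theoretic objectives in the abstract concrete, the following is a minimal, illustrative sketch of how each term could be estimated from soft cluster assignments of two augmented views. It is not the authors' released implementation: the column-decorrelation proxy for inter-cluster redundancy and the IIC-style joint-distribution estimator for cross-view mutual information are assumptions standing in for the paper's exact estimators.

```python
# Hedged sketch of the three ICIRD-style information terms, NOT the authors'
# code. p1, p2 are (N, K) row-stochastic soft assignment matrices for two
# augmented views of the same N samples over K clusters.
import torch

def conditional_entropy(p, eps=1e-8):
    # (i) H(C|X): mean per-sample assignment entropy; minimizing it sharpens
    # each sample's cluster distribution and increases assignment certainty.
    return -(p * (p + eps).log()).sum(dim=1).mean()

def inter_cluster_redundancy(p, eps=1e-8):
    # (ii) Decorrelation proxy (an assumption) for inter-cluster mutual
    # information: penalize off-diagonal overlap between L2-normalized
    # cluster columns, pushing distinct clusters toward non-redundancy.
    cols = p / (p.norm(dim=0, keepdim=True) + eps)   # unit-norm columns (N, K)
    overlap = cols.t() @ cols                        # (K, K) cluster similarity
    off_diag = overlap - torch.diag(torch.diag(overlap))
    k = p.shape[1]
    return off_diag.pow(2).sum() / (k * (k - 1))

def cross_view_mi(p1, p2, eps=1e-8):
    # (iii) IIC-style estimator (an assumption) of I(C1; C2) across views:
    # form a symmetric empirical joint over cluster pairs, then score the
    # mutual information of the two views' assignments.
    joint = (p1.t() @ p2) / p1.shape[0]              # (K, K) empirical joint
    joint = (joint + joint.t()) / 2                  # symmetrize
    pr = joint.sum(dim=1, keepdim=True)              # row marginal (K, 1)
    pc = joint.sum(dim=0, keepdim=True)              # column marginal (1, K)
    return (joint * ((joint + eps).log()
                     - (pr + eps).log()
                     - (pc + eps).log())).sum()

# Toy usage with random soft assignments (N=256 samples, K=10 clusters):
# minimize (i) and (ii), maximize (iii).
p1 = torch.softmax(torch.randn(256, 10), dim=1)
p2 = torch.softmax(torch.randn(256, 10), dim=1)
loss = conditional_entropy(p1) + inter_cluster_redundancy(p1) - cross_view_mi(p1, p2)
print(loss.item())
```

In a full training loop these terms would presumably be weighted and combined with the contrastive representation loss mentioned in the abstract; the weights and the exact estimators are the paper's design choices and are not reproduced here.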

Share and Cite

MDPI and ACS Style

Zheng, A.; Wu, R.M.X.; Wang, Y.; He, Y. ICIRD: Information-Principled Deep Clustering for Invariant, Redundancy-Reduced and Discriminative Cluster Distributions. Entropy 2025, 27, 1200. https://doi.org/10.3390/e27121200

AMA Style

Zheng A, Wu RMX, Wang Y, He Y. ICIRD: Information-Principled Deep Clustering for Invariant, Redundancy-Reduced and Discriminative Cluster Distributions. Entropy. 2025; 27(12):1200. https://doi.org/10.3390/e27121200

Chicago/Turabian Style

Zheng, Aiyu, Robert M. X. Wu, Yupeng Wang, and Yanting He. 2025. "ICIRD: Information-Principled Deep Clustering for Invariant, Redundancy-Reduced and Discriminative Cluster Distributions" Entropy 27, no. 12: 1200. https://doi.org/10.3390/e27121200

APA Style

Zheng, A., Wu, R. M. X., Wang, Y., & He, Y. (2025). ICIRD: Information-Principled Deep Clustering for Invariant, Redundancy-Reduced and Discriminative Cluster Distributions. Entropy, 27(12), 1200. https://doi.org/10.3390/e27121200

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
