Spectral Embedded Deep Clustering
Abstract

We propose a new clustering method based on a deep neural network. Given an unlabeled dataset and the number of clusters, our method groups the dataset directly into the given number of clusters in the original space. As a statistical model, we use a conditional discrete probability distribution defined by a deep neural network. Our strategy is first to estimate the cluster labels of unlabeled data points selected from a high-density region, and then to conduct semi-supervised learning to train the model using both the estimated cluster labels and the remaining unlabeled data points. Finally, using the trained model, we obtain estimated cluster labels for all of the given unlabeled data points. The advantage of our method is that it does not require the key conditions assumed by existing approaches: in particular, existing clustering methods with deep neural networks assume that the cluster balance of a given dataset is uniform, whereas ours does not. Moreover, our method can be applied to various data domains as long as the data are expressed by feature vectors. In addition, we observe that it is robust against outliers. The proposed method is therefore expected to perform better, on average, than previous methods. We conducted numerical experiments on five commonly used datasets to confirm the effectiveness of the proposed method.
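The three-step strategy described above can be sketched with simple NumPy stand-ins. This is a minimal illustration, not the paper's implementation: a plain k-means step replaces the deep network's pseudo-labeling of high-density points, and nearest-neighbor label propagation replaces the semi-supervised training of the deep model. All function names here are hypothetical.

```python
import numpy as np

def knn_density(X, k=5):
    # Density score per point: inverse of the mean distance to its k nearest neighbors.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    d.sort(axis=1)
    return 1.0 / (d[:, 1:k + 1].mean(axis=1) + 1e-12)

def kmeans(X, n_clusters, n_iter=50, seed=0):
    # Basic Lloyd's k-means, used here as a stand-in for pseudo-labeling.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(
            np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2), axis=1)
        for c in range(n_clusters):
            if np.any(labels == c):
                centers[c] = X[labels == c].mean(axis=0)
    return labels

def cluster(X, n_clusters, density_quantile=0.5):
    # Step 1: select points from the high-density region and pseudo-label them.
    dens = knn_density(X)
    core = dens >= np.quantile(dens, density_quantile)
    core_idx = np.flatnonzero(core)
    core_labels = kmeans(X[core_idx], n_clusters)
    # Steps 2-3: propagate labels to all points via nearest core neighbor
    # (a stand-in for training the model semi-supervised, then predicting).
    d = np.linalg.norm(X[:, None, :] - X[core_idx][None, :, :], axis=2)
    return core_labels[np.argmin(d, axis=1)]

# Toy data: two well-separated Gaussian blobs of 50 points each.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(5.0, 0.3, (50, 2))])
labels = cluster(X, 2)
```

On this toy dataset, the high-density core spans both blobs, so the propagated labels recover the two groups; the real method would instead train the deep network on the pseudo-labeled core plus the remaining unlabeled points.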
Cite This Article
Wada, Y.; Miyamoto, S.; Nakagama, T.; Andéol, L.; Kumagai, W.; Kanamori, T. Spectral Embedded Deep Clustering. Entropy 2019, 21, 795.
Wada Y, Miyamoto S, Nakagama T, Andéol L, Kumagai W, Kanamori T. Spectral Embedded Deep Clustering. Entropy. 2019; 21(8):795.

Chicago/Turabian Style
Wada, Yuichiro; Miyamoto, Shugo; Nakagama, Takumi; Andéol, Léo; Kumagai, Wataru; Kanamori, Takafumi. 2019. "Spectral Embedded Deep Clustering." Entropy 21, no. 8: 795.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.