Open Access Article

Spectral Embedded Deep Clustering

1. Graduate School of Information Science, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8601, Japan
2. Department of Systems Innovation, School of Engineering, The University of Tokyo, Hongo Campus, Eng. Bldg. No. 3, 2F, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan
3. Department of Mathematical and Computing Science, School of Computing, Tokyo Institute of Technology, 2-12-1 Ookayama, Meguro-ku, Tokyo 152-8552, Japan
4. Computer Science Department, Sorbonne Université, 4 place Jussieu, 75005 Paris, France
5. RIKEN AIP, Nihonbashi 1-chome Mitsui Building, 15th floor, 1-4-1 Nihonbashi, Chuo-ku, Tokyo 103-0027, Japan
* Author to whom correspondence should be addressed.
Entropy 2019, 21(8), 795; https://doi.org/10.3390/e21080795
Received: 13 June 2019 / Revised: 4 August 2019 / Accepted: 12 August 2019 / Published: 15 August 2019
Abstract

We propose a new clustering method based on a deep neural network. Given an unlabeled dataset and the number of clusters, our method directly groups the dataset into the given number of clusters in the original space. We use a conditional discrete probability distribution defined by a deep neural network as a statistical model. Our strategy is first to estimate the cluster labels of unlabeled data points selected from a high-density region, and then to train the model by semi-supervised learning, using the estimated cluster labels together with the remaining unlabeled data points. Finally, the trained model yields estimated cluster labels for all of the given unlabeled data points. The advantage of our method is that it does not rely on restrictive conditions: unlike existing clustering methods based on deep neural networks, it does not assume that the cluster balance of a given dataset is uniform. Moreover, it can be applied to various data domains as long as the data are expressed as feature vectors. In addition, our method is observed to be robust against outliers. The proposed method is therefore expected to perform, on average, better than previous methods. We conducted numerical experiments on five commonly used datasets to confirm its effectiveness.
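The three-step strategy in the abstract can be illustrated with a toy NumPy sketch. This is not the authors' implementation: the deep-network model and the semi-supervised training step are replaced here by plain spectral clustering of the dense subset and nearest-neighbour label propagation, and every name and parameter (`sedc_sketch`, `density_frac`, `n_neighbors`) is hypothetical.

```python
import numpy as np

def kmeans(Z, k, rng, iters=50):
    """Lloyd's algorithm with farthest-point initialisation."""
    centers = [Z[rng.integers(len(Z))]]
    for _ in range(k - 1):
        d = np.min([((Z - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(Z[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        assign = np.argmin(((Z[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(assign == j):
                centers[j] = Z[assign == j].mean(axis=0)
    return assign

def sedc_sketch(X, n_clusters, density_frac=0.5, n_neighbors=10, seed=0):
    """Toy version of the three-step strategy:
    1. select points from a high-density region,
    2. estimate their cluster labels (here: spectral clustering),
    3. propagate labels to the remaining points (a crude stand-in for
       the semi-supervised deep-network training used in the paper)."""
    rng = np.random.default_rng(seed)
    n = len(X)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Density proxy: distance to the k-th nearest neighbour (small = dense).
    kth = np.sort(D, axis=1)[:, n_neighbors]
    dense = np.argsort(kth)[: int(density_frac * n)]
    # Spectral clustering of the dense subset via the normalized Laplacian.
    Ds = D[np.ix_(dense, dense)]
    sigma = np.median(Ds)
    W = np.exp(-(Ds ** 2) / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    root_deg = np.sqrt(W.sum(axis=1))
    L = np.eye(len(dense)) - W / root_deg[:, None] / root_deg[None, :]
    _, vecs = np.linalg.eigh(L)                  # eigenvalues in ascending order
    emb = vecs[:, :n_clusters]
    emb = emb / (np.linalg.norm(emb, axis=1, keepdims=True) + 1e-12)
    labels_dense = kmeans(emb, n_clusters, rng)
    # Label propagation: each remaining point copies its nearest dense point.
    labels = np.empty(n, dtype=int)
    labels[dense] = labels_dense
    rest = np.setdiff1d(np.arange(n), dense)
    labels[rest] = labels_dense[np.argmin(D[np.ix_(rest, dense)], axis=1)]
    return labels
```

On two well-separated Gaussian blobs, for example, this sketch first labels the densest half of the points and then extends those labels to the rest, mirroring the estimate-then-propagate structure of the method.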
Keywords: clustering; deep neural networks; manifold learning; semi-supervised learning
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite This Article (MDPI and ACS Style)
Wada, Y.; Miyamoto, S.; Nakagama, T.; Andéol, L.; Kumagai, W.; Kanamori, T. Spectral Embedded Deep Clustering. Entropy 2019, 21, 795.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.