
Weighted Mutual Information for Aggregated Kernel Clustering

by Nezamoddin N. Kachouie 1,*,†,‡ and Meshal Shutaywi 2,‡
Department of Mathematical Sciences, Florida Institute of Technology, Melbourne, FL 32901, USA
Department of Mathematics, King Abdulaziz University, Rabigh 21911, Saudi Arabia
Author to whom correspondence should be addressed.
Current address: 150 W. University Blvd., Melbourne, FL, USA.
These authors contributed equally to this work.
Entropy 2020, 22(3), 351;
Received: 24 January 2020 / Revised: 19 February 2020 / Accepted: 13 March 2020 / Published: 18 March 2020
Background: A common task in machine learning is clustering data into different groups based on similarities. Clustering methods can be divided into two groups: linear and nonlinear. A commonly used linear clustering method is K-means. Its extension, kernel K-means, is a nonlinear technique that uses a kernel function to project the data into a higher-dimensional space, where the projected data are then clustered into different groups. Different kernels do not perform equally well when applied to different datasets. Methods: A kernel function may be well suited to one application but project the data poorly in another, so choosing the right kernel for an arbitrary dataset is a challenging task. To address this challenge, a potential approach is to aggregate the clustering results to obtain an impartial clustering result regardless of the selected kernel function. The main challenge is then how to aggregate the clustering results; a potential solution is to combine them using a weight function. In this work, we introduce Weighted Mutual Information (WMI), which calculates a weight for each clustering method based on its performance and combines the results accordingly. The performance of each method is evaluated using a training set with known labels. Results: We applied the proposed Weighted Mutual Information to four datasets that cannot be linearly separated, and we also tested the method under different noise conditions. Conclusions: Our results show that the proposed Weighted Mutual Information method is impartial, does not rely on a single kernel, and performs better than each individual kernel, especially under high noise.
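The aggregation idea in the abstract can be illustrated with a minimal, self-contained sketch. This is not the authors' exact algorithm; it is one plausible reading under stated assumptions: each candidate clustering (e.g., kernel K-means under a different kernel) is weighted by its normalized mutual information (NMI) with the known labels of a training subset, and the weighted clusterings are combined through a co-association (consensus) matrix, which sidesteps the arbitrariness of cluster IDs. All function names here are illustrative.

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy (in nats) of a labeling."""
    n = len(labels)
    return -sum((c / n) * math.log(c / n) for c in Counter(labels).values())

def nmi(a, b):
    """Normalized mutual information between two labelings of the same points.

    NMI is invariant to relabeling of cluster IDs, which makes it a
    convenient score for comparing a clustering against ground truth.
    """
    n = len(a)
    ca, cb, cab = Counter(a), Counter(b), Counter(zip(a, b))
    # MI = sum_{x,y} p(x,y) * log( p(x,y) / (p(x) p(y)) )
    mi = sum((c / n) * math.log(c * n / (ca[x] * cb[y]))
             for (x, y), c in cab.items())
    denom = math.sqrt(entropy(a) * entropy(b))
    return mi / denom if denom > 0 else 0.0

def weighted_coassociation(clusterings, train_idx, train_labels):
    """Weight each clustering by its NMI with known training labels,
    then build a weighted co-association matrix: entry (i, j) is the
    weighted fraction of clusterings that place points i and j together."""
    weights = [nmi([c[i] for i in train_idx], train_labels)
               for c in clusterings]
    total = sum(weights) or 1.0
    n = len(clusterings[0])
    return [[sum(w for c, w in zip(clusterings, weights) if c[i] == c[j]) / total
             for j in range(n)]
            for i in range(n)]
```

A final partition could then be obtained by running any standard clustering on the consensus matrix; a clustering produced by a poorly matched kernel receives a small weight and contributes little to the consensus.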
Keywords: weighted mutual information; aggregated clustering; kernel k-means; conditional entropy
MDPI and ACS Style

Kachouie, N.N.; Shutaywi, M. Weighted Mutual Information for Aggregated Kernel Clustering. Entropy 2020, 22, 351.


