(1) Problem Statement: The development of clustering algorithms for neural recordings has evolved significantly, reaching a mature stage with predominant approaches including partitional, hierarchical, probabilistic, fuzzy logic, density-based, and learning-based clustering. Despite this evolution, there remains a need for innovative clustering algorithms that can efficiently analyze neural spike data, particularly diverse and noise-contaminated neural recordings. (2) Methodology: This paper introduces a novel clustering algorithm named Gershgorin–nonmaximum suppression (G–NMS), which combines the principles of the Gershgorin circle theorem with nonmaximum suppression, a post-processing method widely used in deep learning. The performance of G–NMS was evaluated through extensive testing on two publicly available synthetic neural datasets. The evaluation comprised five distinct groups of experiments, totaling eleven individual experiments, comparing G–NMS against six established clustering algorithms. (3) Results: The results highlight the superior performance of G–NMS in three of the five experiment groups, with high average accuracy and minimal standard deviation (SD). Specifically, in Dataset 1, experiment S1 (various SNRs) recorded an accuracy of 99.94 ± 0.01, while Dataset 2 showed accuracies of 99.68 ± 0.15 in experiment E1 (Easy 1) and 99.27 ± 0.35 in experiment E2 (Easy 2). Despite a slight decrease in average accuracy in the remaining two experiments from Dataset 2, D1 (Difficult 1) and D2 (Difficult 2), compared with the top-performing algorithms in those categories, G–NMS maintained a lower SD, indicating more consistent performance. Additionally, G–NMS demonstrated robustness and efficiency across noise-contaminated neural recordings spanning low to high signal-to-noise ratios. (4) Conclusions: G–NMS's integration of deep learning techniques and eigenvalue inclusion theorems has proven highly effective, marking a significant advancement in the clustering domain. Its superior performance, characterized by high accuracy and low variability, opens new avenues for the development of high-performing clustering algorithms and contributes significantly to the body of research in this field.
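The abstract names two ingredients, the Gershgorin circle theorem and nonmaximum suppression, without detailing how G–NMS combines them. The following Python sketch only illustrates those two ingredients in isolation; it is not the authors' pipeline, and the function names (gershgorin_discs, nonmaximum_suppression_1d), the toy affinity matrix A, and the min_separation parameter are all hypothetical choices made for illustration.

import numpy as np

def gershgorin_discs(A):
    """Return Gershgorin disc centers and radii for a square matrix A.

    By the Gershgorin circle theorem, every eigenvalue of A lies in at
    least one disc |z - A[i, i]| <= sum_{j != i} |A[i, j]|.
    """
    A = np.asarray(A, dtype=float)
    centers = np.diag(A)
    radii = np.sum(np.abs(A), axis=1) - np.abs(centers)
    return centers, radii

def nonmaximum_suppression_1d(scores, positions, min_separation):
    """Greedy 1-D nonmaximum suppression.

    Keeps the highest-scoring candidates and discards any candidate
    closer than min_separation to one that has already been kept.
    """
    order = np.argsort(scores)[::-1]  # highest score first
    kept = []
    for idx in order:
        if all(abs(positions[idx] - positions[k]) >= min_separation for k in kept):
            kept.append(idx)
    return kept

# Toy symmetric matrix standing in for pairwise spike-waveform
# affinities (hypothetical data, chosen only to show the mechanics).
A = np.array([[4.0, 0.2, 0.1],
              [0.2, 1.0, 0.3],
              [0.1, 0.3, 3.5]])
centers, radii = gershgorin_discs(A)
# Purely for illustration, treat each disc center as a candidate cluster
# representative and its magnitude as a score, then suppress centers
# that sit too close to a stronger one.
kept = nonmaximum_suppression_1d(np.abs(centers), centers, min_separation=1.0)
print("centers:", centers, "radii:", radii, "kept indices:", kept)

The greedy suppression step mirrors how NMS in object detection keeps the highest-scoring candidate and discards near-duplicates; here it simply prevents nearly coincident Gershgorin disc centers from being counted as distinct representatives. How the actual G–NMS algorithm scores, localizes, and suppresses candidates is described in the full article.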