Open Access Article
Appl. Sci. 2018, 8(9), 1676; https://doi.org/10.3390/app8091676

Associative Memories to Accelerate Approximate Nearest Neighbor Search

1 Electronics Department, IMT Atlantique, 29200 Brest, France
2 Fachbereich Mathematik und Informatik, University of Münster, Einsteinstraße 62, 48149 Münster, Germany
3 Laboratoire de Mathématiques, Université de Bretagne Occidentale, UMR CNRS 6205, 29200 Brest, France
* Author to whom correspondence should be addressed.
Received: 22 August 2018 / Revised: 13 September 2018 / Accepted: 14 September 2018 / Published: 16 September 2018
(This article belongs to the Special Issue Advanced Intelligent Imaging Technology)
Abstract

Nearest neighbor search is a very active field in machine learning. It arises in many applications, including classification and object retrieval. In its naive implementation, the complexity of the search is linear in the product of the dimension and the cardinality of the collection of vectors over which the search is performed. Recently, many works have focused on reducing the dimension of vectors using quantization techniques or hashing, while providing an approximate result. In this paper, we focus instead on tackling the cardinality of the collection of vectors. Namely, we introduce a technique that partitions the collection of vectors and stores each part in its own associative memory. When a query vector is given to the system, associative memories are polled to identify which one contains the closest match. Then, an exhaustive search is conducted only on the part of vectors stored in the selected associative memory. We study the effectiveness of the system when the messages to store are generated from i.i.d. uniform ±1 random variables or sparse 0–1 i.i.d. random variables. We also conduct experiments on both synthetic data and real data and show that it is possible to achieve interesting trade-offs between complexity and accuracy.
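The two-stage search described in the abstract (poll the associative memories, then search exhaustively inside the selected part) can be illustrated with a toy sketch. All names here are illustrative, and the sum-based "memory readout" is a deliberate simplification standing in for the paper's actual associative-memory construction:

```python
import random

random.seed(0)
d, n, num_parts = 256, 400, 8

# Collection of i.i.d. uniform ±1 vectors, split into parts
# (here: a simple round-robin split; the paper studies how to partition).
data = [[random.choice((-1, 1)) for _ in range(d)] for _ in range(n)]
parts = [data[i::num_parts] for i in range(num_parts)]

# Each "associative memory" is summarized by the componentwise sum of its
# stored vectors; polling scores a query by its correlation with that sum.
summaries = [[sum(v[j] for v in part) for j in range(d)] for part in parts]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def search(query):
    # Stage 1: poll all memories and keep the most responsive one.
    best_part = max(range(num_parts), key=lambda i: dot(query, summaries[i]))
    # Stage 2: exhaustive search, but only inside the selected part.
    return max(parts[best_part], key=lambda v: dot(query, v))

query = data[42]                              # a stored vector as the query
approx = search(query)                        # two-stage approximate answer
exact = max(data, key=lambda v: dot(query, v))  # full exhaustive search
```

The complexity gain is the point of the construction: the exhaustive stage touches only n/num_parts vectors instead of n, at the cost of occasionally polling the wrong memory, which is the complexity–accuracy trade-off the abstract refers to.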
Keywords: similarity search; vector search; associative memory; exponential inequalities
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite This Article

Gripon, V.; Löwe, M.; Vermet, F. Associative Memories to Accelerate Approximate Nearest Neighbor Search. Appl. Sci. 2018, 8, 1676.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Appl. Sci. EISSN 2076-3417. Published by MDPI AG, Basel, Switzerland.