Open Access Article
Algorithms 2018, 11(10), 157; https://doi.org/10.3390/a11100157

LSTM Accelerator for Convolutional Object Identification

1 Computer Engineering and Informatics Department, University of Patras, Patras 26504, Greece
2 Department of Informatics, Ionian University, Corfu 49100, Greece
* Author to whom correspondence should be addressed.
Received: 4 August 2018 / Revised: 10 October 2018 / Accepted: 12 October 2018 / Published: 17 October 2018
(This article belongs to the Special Issue Humanistic Data Mining: Tools and Applications)

Abstract

Deep learning has dramatically advanced the state of the art in vision, speech, and many other areas. Recently, numerous deep learning algorithms have been proposed to solve traditional artificial intelligence problems. In this paper, convolutional networks of various depths are implemented in order to identify the configuration that provides the best trade-off between time and accuracy. Batch normalization is also considered, since it acts as a regularizer and achieves the same accuracy with fewer training steps. To reduce computational complexity while minimizing the loss of accuracy, LSTM neural network layers are incorporated into the process. The LSTM is shown to classify the image sequences faster while maintaining better precision. Concretely, the more complex the CNN, the higher the accuracy; moreover, beyond this increase in accuracy, the processing time was significantly decreased, which ultimately rendered the trade-off optimal. The average performance improvement across all models on both datasets amounted to 42%.
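To make the pipeline described in the abstract more concrete (a per-frame convolutional feature extractor with batch normalization whose outputs are aggregated by an LSTM for sequence classification), a minimal Keras sketch follows. The layer counts, filter sizes, input dimensions, and the build_cnn_lstm helper are illustrative assumptions only and do not reproduce the authors' exact models or datasets.

```python
# Illustrative sketch only: layer sizes, sequence length, and dataset shapes
# are assumptions, not the architecture evaluated in the paper.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn_lstm(frames=8, height=32, width=32, channels=3, num_classes=10):
    """Shallow CNN with batch normalization applied to each frame,
    followed by an LSTM that aggregates the per-frame features."""
    # Per-frame CNN feature extractor; its depth can be varied to trade
    # accuracy against training/inference time.
    cnn = models.Sequential([
        layers.Conv2D(32, 3, padding="same", activation="relu",
                      input_shape=(height, width, channels)),
        layers.BatchNormalization(),   # regularizes and speeds up convergence
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.BatchNormalization(),
        layers.MaxPooling2D(),
        layers.Flatten(),
    ])

    model = models.Sequential([
        # Apply the same CNN (shared weights) to every frame of the sequence.
        layers.TimeDistributed(cnn, input_shape=(frames, height, width, channels)),
        layers.LSTM(128),              # temporal aggregation of frame features
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_cnn_lstm()
model.summary()
```

Varying the depth of the cnn sub-model corresponds to the accuracy/time trade-off explored in the paper, while the TimeDistributed wrapper keeps the convolutional weights shared across all frames before the LSTM performs the sequence-level classification.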
Keywords: batch normalization; convolutional neural networks; deep learning; image classification; knowledge extraction; LSTM neural networks; recommendation systems
This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
MDPI and ACS Style

Savvopoulos, A.; Kanavos, A.; Mylonas, P.; Sioutas, S. LSTM Accelerator for Convolutional Object Identification. Algorithms 2018, 11, 157.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
