Open Access Article

A Self-Synthesis Approach to Perceptual Learning for Multisensory Fusion in Robotics

Neuroscientific System Theory Group, Department of Electrical and Computer Engineering, Technical University of Munich, Arcisstrasse 21, Munich 80333, Germany
* Author to whom correspondence should be addressed.
Academic Editors: Xue-Bo Jin, Feng-Bao Yang, Shuli Sun and Hong Wei
Sensors 2016, 16(10), 1751; https://doi.org/10.3390/s16101751
Received: 20 June 2016 / Revised: 11 October 2016 / Accepted: 12 October 2016 / Published: 20 October 2016
(This article belongs to the Special Issue Advances in Multi-Sensor Information Fusion: Theory and Applications)
Biological and technical systems operate in a rich multimodal environment. Because of the diversity of incoming sensory streams a system perceives and the variety of motor capabilities it exhibits, there is no single representation and no single unambiguous interpretation of such a complex scene. In this work we propose a novel sensory processing architecture inspired by the distributed macro-architecture of the mammalian cortex. The underlying computation is performed by a network of computational maps, each representing a different sensory quantity. The different sensory streams enter the system through multiple parallel channels, and the system autonomously associates and combines them into a coherent representation, given the incoming observations. These processes are adaptive and involve learning. The proposed framework introduces mechanisms for the self-creation and learning of the functional relations between the computational maps, which encode sensorimotor streams, directly from the data. Its intrinsic scalability, parallelisation, and automatic adaptation to unforeseen sensory perturbations make our approach a promising candidate for robust multisensory fusion in robotic systems. We demonstrate this by applying our model to 3D motion estimation on a quadrotor.
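To give a rough flavour of the idea described in the abstract, learning functional relations between computational maps from co-occurring sensory observations, the following is a minimal sketch, not the authors' implementation. It assumes Gaussian population codes for each map and a simple Hebbian-style correlation rule linking two maps; the class names, tuning width, and learning rate are hypothetical choices made only for this example.

```python
import numpy as np

class ComputationalMap:
    """Population code for one sensory quantity (simplified, hypothetical)."""
    def __init__(self, n_units=50, value_range=(-1.0, 1.0)):
        self.prefs = np.linspace(*value_range, n_units)  # preferred values of the units
        self.activity = np.zeros(n_units)

    def encode(self, value, sigma=0.1):
        # Gaussian tuning curves centred on each unit's preferred value
        self.activity = np.exp(-(self.prefs - value) ** 2 / (2 * sigma ** 2))
        self.activity /= self.activity.sum()
        return self.activity

class MapRelation:
    """Pairwise relation between two maps, learned by Hebbian-style correlation."""
    def __init__(self, map_a, map_b, lr=0.05):
        self.a, self.b, self.lr = map_a, map_b, lr
        self.W = np.zeros((len(map_a.prefs), len(map_b.prefs)))

    def learn(self, value_a, value_b):
        ra = self.a.encode(value_a)
        rb = self.b.encode(value_b)
        # Strengthen links between co-active units, then normalise the weights
        self.W += self.lr * np.outer(ra, rb)
        self.W /= max(self.W.max(), 1e-12)

    def infer_b(self, value_a):
        # Propagate activity from map A through the learned relation to map B
        rb = self.a.encode(value_a) @ self.W
        return self.b.prefs[np.argmax(rb)]

# Toy usage: two streams related by b = 2*a (e.g., two sensors observing related quantities)
np.random.seed(0)
A, B = ComputationalMap(), ComputationalMap(value_range=(-2.0, 2.0))
relation = MapRelation(A, B)
for _ in range(2000):
    a = np.random.uniform(-1.0, 1.0)
    relation.learn(a, 2.0 * a)
print(relation.infer_b(0.4))  # should print a value close to 0.8
```

In this sketch, once the relation matrix has been learned from paired observations, activity in one map can be propagated through it to estimate the quantity encoded by the other map; a fusion network of such maps would repeat this basic operation across all of its learned links.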
Keywords: self-construction; self-organization; correlation learning; multisensory fusion; cortically inspired network; mobile robotics
MDPI and ACS Style

Axenie, C.; Richter, C.; Conradt, J. A Self-Synthesis Approach to Perceptual Learning for Multisensory Fusion in Robotics. Sensors 2016, 16, 1751.

