Open Access Review
Appl. Sci. 2017, 7(4), 336; doi:10.3390/app7040336

Perception-Driven Obstacle-Aided Locomotion for Snake Robots: The State of the Art, Challenges and Possibilities

1 Department of Engineering Cybernetics, NTNU – Norwegian University of Science and Technology, 7491 Trondheim, Norway
2 Mathematics and Cybernetics, SINTEF Digital, 7465 Trondheim, Norway
* Author to whom correspondence should be addressed.
Received: 8 February 2017 / Revised: 8 March 2017 / Accepted: 25 March 2017 / Published: 29 March 2017
(This article belongs to the Special Issue Bio-Inspired Robotics)

Abstract

In nature, snakes can gracefully traverse a wide range of different and complex environments. Snake robots that can mimic this behaviour could be fitted with sensors and transport tools to hazardous or confined areas that other robots and humans are unable to access. In order to carry out such tasks, snake robots must have a high degree of awareness of their surroundings (i.e., perception-driven locomotion) and be capable of efficient obstacle exploitation (i.e., obstacle-aided locomotion) to gain propulsion. These aspects are pivotal in order to realise the large variety of possible snake robot applications in real-life operations such as fire-fighting, industrial inspection, search-and-rescue, and more. In this paper, we survey and discuss the state of the art, challenges, and possibilities of perception-driven obstacle-aided locomotion for snake robots. To this end, different levels of autonomy are identified for snake robots and categorised into environmental complexity, mission complexity, and external system independence. From this perspective, we present a step-wise approach on how to increment snake robot abilities within guidance, navigation, and control in order to target the different levels of autonomy. Pertinent to snake robots, we focus on current strategies for snake robot locomotion in the presence of obstacles. Moreover, we put obstacle-aided locomotion into the context of perception and mapping. Finally, we present an overview of relevant key technologies and methods within environment perception, mapping, and representation that constitute important aspects of perception-driven obstacle-aided locomotion.
Keywords: obstacle-aided locomotion; environment perception; snake robots

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Share & Cite This Article

MDPI and ACS Style

Sanfilippo, F.; Azpiazu, J.; Marafioti, G.; Transeth, A.A.; Stavdahl, Ø.; Liljebäck, P. Perception-Driven Obstacle-Aided Locomotion for Snake Robots: The State of the Art, Challenges and Possibilities. Appl. Sci. 2017, 7, 336.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
