Abstract
Biological principles attract attention in service robotics because robots performing everyday tasks face many of the same problems animals solve. Bioinspired perception, modeled on how animals sense and interpret their environment, is central to robotic perception. This paper reviews bioinspired perception and navigation for service robots in indoor environments, a popular application of civilian robotics. The navigation approaches are classified by perception type: vision-based, remote sensing, tactile, olfactory, sound-based, inertial, and multimodal navigation. State-of-the-art techniques are trending towards multimodal navigation, which combines several approaches. The main challenges in indoor navigation are precise localization and dynamic, complex environments with moving objects and people.
1. Introduction
Service robots have become popular in indoor environments such as offices, campuses, hotels, and homes. Modern robotic techniques enable autonomous operation in dynamically changing and unstructured environments, and one main line of development is autonomous, intelligent robots inspired by biological systems [1].
Biological principles drawn from animals can generate new ideas for improving robotic techniques. Animals usually have excellent navigation abilities and outstanding robustness, which outperform current techniques [2]. Animals’ decision-making, action selection, navigation, spatial representation, perception, and exploration make them capable of foraging, homing, and hunting [3].
Robotics aims to achieve complex goals and perform tasks in unknown environments, so nature has been a great source of inspiration. The goal-directed navigation of mobile robots resembles animals migrating, seeking food, and performing similar tasks [4]. Research on animal perception, cognition, body architecture, and behavior has brought biorobotics and neurorobotics to the attention of the robotics community. Bioinspired robots can perform several tasks, including transport [5], floor sweeping [6], security [7], caring [2], and exploration [8].
A robot needs to process sensory information, through extensive computation or active sensors, to obtain relevant environmental data [9]. Robotic perception is essential for navigation, as it constructs an environmental representation in the form of a precise metric map or sensory snapshots [4]. Perception makes a robot aware of its position and of how to maneuver around an unexpected obstacle blocking its path. Moreover, robots should identify the properties of objects to perform tasks safely and efficiently [10].
Reconfigurable robotic systems can respond efficiently and versatilely to the application scenario, and many robotic platforms are based on biomimetic designs drawn from naturally evolved mechanisms [11]. Intelligent robots can maximize extrinsic and intrinsic rewards and adapt to environmental changes [12]. Robots are being built with a high degree of autonomy through autonomous training and adaptation, for which exploration and navigation are essential.
Several sensors can be used in autonomous vehicles for localization and sensing; however, indoor environments and buildings are GNSS-denied, so vision-based approaches play a crucial role in high-precision sensing [13]. Tactile [14] and olfactory navigation [15] are also used in indoor environments. Multimodal navigation, which incorporates the strengths of each approach to enhance performance, is the trend in bioinspired perception and navigation.
This review examines robotic perception and navigation techniques in indoor environments and addresses potential research challenges. The contents are classified by the sensors involved in the approaches. Sections 2–9 present vision-based navigation, remote sensing navigation, multimodal navigation, and other approaches; Section 10 provides a discussion and conclusions.
9. Others
Corrales-Paredes et al. [158] proposed an environment signaling system based on radiofrequency identification (RFID) for social robot localization and navigation. It used human–robot interaction to gather information for the waymarking or signaling process and was successfully tested in a structured indoor environment. However, the robot could not learn from the environment or adapt to changes. Other environments, such as museums, hospitals, and shopping centers, were suggested as future extensions.
In [159], collective gradient perception for a robot swarm was enhanced through modulation of the desired speed and inter-robot distance. Social interactions among robots allowed the swarm to follow a gradient using an ultrawideband (UWB) sensor, assisted by a laser distance sensor and an optic-flow camera. The approach could be used for searching for and localizing sources, but it would need additional sensors to sense light, temperature, or gas particles. A simplified sketch of the speed-modulation idea is given below.
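The following Python sketch is a deliberate simplification of the speed-modulation mechanism, assuming scalar readings normalized to [0, 1] and a bare cohesion rule; the actual controller in [159] is considerably more sophisticated.

```python
import numpy as np

def desired_speed(local_reading, v_min=0.05, v_max=0.25):
    """Map a robot's local scalar reading (e.g., light intensity, assumed
    normalized to [0, 1]) to a commanded speed. Robots slow down where the
    reading is high, so a cohesive swarm drifts toward the source without
    any individual ever estimating a gradient."""
    return v_max - (v_max - v_min) * np.clip(local_reading, 0.0, 1.0)

def swarm_step(positions, readings, dt=0.1):
    """Toy swarm update: a cohesion pull toward the centroid keeps robots
    together, while per-robot speed modulation biases collective motion."""
    centroid = positions.mean(axis=0)
    for i, p in enumerate(positions):
        heading = centroid - p                       # cohesion pull
        heading /= np.linalg.norm(heading) + 1e-9
        positions[i] = p + desired_speed(readings[i]) * heading * dt
    return positions
```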
Le et al. [160] introduced a multirobot formation with sensor fusion of UWB, IMU, and wheel-encoder measurements in a cluttered unknown environment. The global path planning algorithm incorporated skid-steering kinematic path tracking, and dynamic linearization decoupled the dynamics to control the leader robot of the formation. However, the experiments were conducted only in static environments. A generic leader–follower controller of this kind is sketched below.
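As context for such formations, a proportional leader–follower controller can be sketched as follows; the gains and body-frame offset are assumed for illustration, and this is not the controller of [160].

```python
import math

def follower_cmd(leader_pose, follower_pose, offset=(-0.8, 0.4),
                 k_lin=1.0, k_ang=2.0):
    """Drive the follower toward a target point held at a fixed offset in
    the leader's body frame. Poses are (x, y, theta) in the world frame;
    returns (v, omega) velocity commands for a differential-drive base."""
    lx, ly, lth = leader_pose
    fx, fy, fth = follower_pose
    # Target point: the offset rotated into the world frame.
    tx = lx + offset[0] * math.cos(lth) - offset[1] * math.sin(lth)
    ty = ly + offset[0] * math.sin(lth) + offset[1] * math.cos(lth)
    dx, dy = tx - fx, ty - fy
    rho = math.hypot(dx, dy)                         # distance to target
    alpha = math.atan2(dy, dx) - fth                 # bearing error
    alpha = math.atan2(math.sin(alpha), math.cos(alpha))  # wrap to [-pi, pi]
    return k_lin * rho, k_ang * alpha
```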
10. Discussion and Conclusions
From the literature, vision-based, remote sensing, tactile-sensor-based, olfactory, sound-based, inertial, and multimodal navigation are the main bioinspired perception and navigation systems for service robots in indoor environments. They are inspired by animals such as rats, bats, and other mammals. Environmental information is gained through different sensors, depending on the application or surroundings. Vision-based approaches are the most popular among these methods, both on their own and in combination with other approaches for multimodal navigation.
More precisely, Table 1 lists the vision-based approaches with their contributions, sensors, and real-time performance; the most popular method is optic flow, which bees use. The vision-based applications include transport [9,18,39,46,50,60], exploration [8,19,24,26,40,41,45,49], tracking and assistance [31,56,58], security and surveillance [7,52], homing and searching [5,54,55], floor cleaning [6,43], and search and rescue [44]. Only 43.75% of the cited papers indicate real-time operation, achieved through fast processing or computation, a minimal computational load, or predefined parameters. The challenges of visual navigation include suboptimal paths, dynamic obstacles, the number of parameters and coefficients to tune, navigation accuracy, optimal implementation, restrictive assumptions, and the need for additional sensors or approaches. A minimal sketch of the bee-inspired optic-flow balancing strategy follows Table 1.
Table 1. Vision-based navigation.
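To make the optic-flow mechanism concrete, here is a minimal sketch of the bee-like balancing strategy: steer away from the side with faster image motion, since nearby surfaces produce larger flow. It assumes a dense flow field (e.g., from OpenCV's calcOpticalFlowFarneback on consecutive frames) and an illustrative gain `k`; it is not code from any cited work.

```python
import numpy as np

def centering_command(flow, k=1.5):
    """Bee-inspired optic-flow balancing. `flow` is an HxWx2 optic-flow
    field; the command steers toward the image half with the smaller
    average flow magnitude, i.e., the side with more free space."""
    mag = np.linalg.norm(flow, axis=2)               # per-pixel flow speed
    half = mag.shape[1] // 2
    left, right = mag[:, :half].mean(), mag[:, half:].mean()
    # Positive -> turn right (left wall closer); negative -> turn left.
    return k * (left - right) / (left + right + 1e-9)
```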
Additionally, Table 2 lists papers on remote sensing navigation, focusing on LiDAR, laser rangefinder, and infrared sensors. The tasks include exploration [63,64,66,71], wheelchair navigation [78], transport [65,73,74,75], fire services and search and rescue [70], floor cleaning [68], tracking [76], education [72], and caring and service [79]. Some approaches are combined with path planning algorithms, including metaheuristic algorithms, potential fields, and A* [66,68,70,78]; a minimal A* sketch follows Table 2. Only 29.41% of the remote sensing approaches achieve online performance, through their models, algorithms, or controllers. The weaknesses of the cited remote sensing techniques include unfiltered points, safety issues, real-world implementation, multirobot exploration, sensor limitations, control parameters, sensing systems, and environmental adaptability.
Table 2. Remote sensing navigation.
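For reference, the kind of A* planner paired with LiDAR occupancy grids in the works above can be sketched in textbook form; this is a generic 4-connected implementation over an assumed binary occupancy grid, not code from the cited papers.

```python
import heapq

def astar(grid, start, goal):
    """Minimal 4-connected A* on an occupancy grid (0 = free, 1 =
    occupied). `start` and `goal` are (row, col) tuples; returns the
    cell path or None if the goal is unreachable."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    open_set = [(h(start), start)]
    came_from = {start: None}
    g_cost = {start: 0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = []
            while cur is not None:                   # walk back to start
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dx, cur[1] + dy)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0
                    and g_cost[cur] + 1 < g_cost.get(nxt, float("inf"))):
                g_cost[nxt] = g_cost[cur] + 1
                came_from[nxt] = cur
                heapq.heappush(open_set, (g_cost[nxt] + h(nxt), nxt))
    return None
```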
Table 3 compares tactile-sensor-based, olfaction-sensor-based, sound-based, and inertial navigation. Tactile-sensor-based approaches achieve online performance through onboard processing, feedback, or dedicated technologies; future work on them concerns improving accuracy, and the applications are autonomous transport and perception [14,81,82,83]. Olfaction-based navigation approaches are implemented for transport [15,84,86,87], gas dispersion [85], and gas source localization [88]. The limitations of olfactory navigation include obstacle avoidance techniques, active searching, multiple odor plumes, and precise localization. The sound-based approaches focus on sonar or ultrasonic sensors; their primary tasks are exploration [89,92] and transport [11,90,91,93], and some are combined with metaheuristic algorithms for efficiency [91,93]. The challenges are sensor displacement, sensor fusion, online collision detection and planning, and dynamic environments. A simple sonar-based reactive controller is sketched after Table 3.
Table 3. Tactile sensor-based, olfaction sensor-based, sound-based, and inertial navigation approaches.
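The sonar-based reactive idea can be captured in a few lines. The Braitenberg-style sketch below, with an assumed safety distance and turning gain, slows near obstacles and turns toward the side reporting the longer range; it is an illustration, not a cited controller.

```python
def avoid_cmd(d_left, d_right, v_max=0.3, d_safe=0.5, k=1.0):
    """Reactive avoidance from two ultrasonic range readings (meters).
    Forward speed shrinks as either reading approaches zero; the turn
    rate steers toward the freer (longer-range) side."""
    v = v_max * min(d_left, d_right, d_safe) / d_safe
    omega = k * (d_left - d_right) / max(d_left + d_right, 1e-9)
    return v, omega          # positive omega = turn left (more space left)
```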
Inertial navigation is usually applied to reconfigurable robots [99,100,101,102], and some approaches use sensor fusion techniques [96,97,98]. Inertial navigation can be utilized for exploration [96,99], sweeping [100], and autonomous transport [97,101]. Other navigation systems implement RFID or UWB localization for entertainment [158] and searching [159]. The drawbacks of this research include reconfiguration optimization, autonomous capabilities, performance, sensor fusion, dynamic environments with dynamic obstacles, and the kinematic model. The online performance of the sound-based and inertial navigation systems is achieved through algorithmic or model efficiency. Improving the sensor framework and operating in dynamic environments are considered future work. The dead-reckoning core of inertial navigation is sketched below.
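As an illustration of that dead-reckoning core, the sketch below integrates an assumed encoder speed and gyro yaw rate into a planar pose; the unbounded drift it accumulates is precisely what the cited fusion methods correct.

```python
import math

def dead_reckon(pose, v_enc, gyro_z, dt):
    """Gyro-aided dead reckoning: wheel encoders give forward speed
    (m/s), the IMU gyro gives yaw rate (rad/s); integrating both updates
    the planar pose (x, y, theta). Drift grows without external fixes."""
    x, y, th = pose
    th += gyro_z * dt                    # heading from gyro integration
    x += v_enc * math.cos(th) * dt       # advance along current heading
    y += v_enc * math.sin(th) * dt
    return (x, y, th)
```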
Moreover, Table 4 lists the multimodal navigation approaches. Their applications consist of entertainment [143], security [133], transport [2,106,107,110,112,147], assistance [2,16,108], exploration [3,134,135,138,141], social applications [12,140], tracking [105,125,131,145,146], caring and monitoring [113], disaster monitoring and search and rescue [136,149], floor cleaning [137], wheeled robots [109], person following [129], and false-ceiling inspection [157]. The combination of visual sensors and neural networks is the most common in multimodal navigation, representing 77.19% of the cited research.
Table 4. Multimodal navigation.
Furthermore, 59.65% of the navigation approaches provide real-time feedback thanks to AI-based methods, such as learning algorithms, neural networks, fuzzy logic, and optimization; a toy fuzzy-logic controller is sketched below. Hardware, such as the architecture and controller, accounts for some of the remaining online performance. Challenges in multimodal navigation include complex environments, remapping and reuse across different environments, reducing computational resources, undesired responses, learning datasets, cognitive phenomena, parameter settings, layer and neuron design, energy, accuracy, dynamic obstacles, and real-world experiments.
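As a toy member of the fuzzy-logic family named above, the sketch below uses two triangular memberships (NEAR/FAR on an assumed 1 m range scale) and three hand-written rules; controllers in the cited works use far richer rule bases.

```python
def fuzzy_steer(d_front, d_left, d_right):
    """Mamdani-style toy avoidance from three range readings (meters).
    Returns a steering command in [-1, 1]; positive = turn left."""
    near = lambda d: max(0.0, min(1.0, 1.0 - d))     # 1 at 0 m, 0 at 1 m
    far = lambda d: 1.0 - near(d)
    # Rules: front NEAR -> turn toward the freer side;
    #        left NEAR -> turn right; right NEAR -> turn left.
    turn_right = max(min(near(d_front), far(d_right)), near(d_left))
    turn_left = max(min(near(d_front), far(d_left)), near(d_right))
    # Defuzzify by a weighted balance of the two rule strengths.
    return (turn_left - turn_right) / max(turn_left + turn_right, 1e-9)
```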
From this review of bioinspired perception and navigation, the main applications include autonomous transport, exploration, floor sweeping, and search and rescue. These strategies allow service robots to operate safely, estimating their states and positions relative to their surroundings. Multimodal navigation offers real-time performance thanks to AI-based approaches and combines multiple sensors for perception; the most popular pairing is visual sensors with neural networks.
In real implementations of robot perception and navigation, indoor environments are dynamic and changing, with moving objects or people, challenging lighting [25], and stairs or carpets [99]. Dynamic obstacles are unpredictable and hard to avoid. However, most papers do not consider dynamic environments, except [13,136,154], which tried to solve the problem with learning approaches. Some studies list dynamic environments as future work [13,93,97,124,157], but the problem of moving objects and dynamic obstacles remains unsolved. Navigation in dynamic environments requires real-time performance, high adaptability, quick decision-making, and object detection and avoidance.
Learning ability and adaptability are also future research directions. The trend of bioinspired perception is moving towards multimodal approaches, which are expected to provide real-time responses [132,143]. Learning enables a robot to use previous information to train a model, process new information, and respond quickly to changes in its surroundings. Neural networks and machine learning are considered as learning strategies, including SNNs [119], reinforcement learning [12], CNNs [105], and attention mechanisms [143]; the sketch below shows the simplest reinforcement-learning update. Continual detection and avoidance algorithms should also be considered, and fault detection and tolerance frameworks are expected in future research.
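The tabular Q-learning update is the simplest instance of the reinforcement-learning strategy cited above [12]; the sketch below uses hypothetical state and action counts and an illustrative step penalty.

```python
import numpy as np

def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.95):
    """One tabular Q-learning step: move Q(s, a) toward the reward plus
    the discounted best value of the next state."""
    Q[s, a] += alpha * (r + gamma * np.max(Q[s_next]) - Q[s, a])
    return Q

# Usage sketch: 100 grid cells, 4 actions (N/E/S/W), hypothetical numbers.
Q = np.zeros((100, 4))
Q = q_update(Q, s=0, a=1, r=-0.1, s_next=1)   # small per-step penalty
```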
Sensor fusion is one of the main research directions, incorporating several types of sensors: combined visual, tactile, and auditory sensors [143], a tactile model with a nine-DOF MEMS MARG [10], IMU and visual sensors [96,102], and further multimodal approaches (refer to Section 8). Because a single sensor easily accumulates bias, other sensory inputs can be used to correct these errors; the complementary filter below is the simplest example. A well-designed sensor fusion algorithm can provide accurate localization and navigation to determine the robot’s orientation and position. Dynamic and unpredictable environments require high accuracy in locating the robot and its surroundings, and accurate fusion is also crucial for swarm operation.
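A complementary filter is the textbook illustration of this correction principle; it is a generic sketch with an assumed blend factor, not a cited method. The drifting gyro integral is trusted at high frequency, and the noisy but unbiased magnetometer pulls the estimate back at low frequency.

```python
def fuse_heading(prev_est, gyro_z, mag_heading, dt, alpha=0.98):
    """Fuse a gyro yaw rate (rad/s) with a magnetometer heading (rad).
    The gyro prediction captures fast motion but drifts; blending in the
    magnetometer cancels the accumulated bias."""
    predicted = prev_est + gyro_z * dt               # gyro integration
    return alpha * predicted + (1 - alpha) * mag_heading
```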
Future research also focuses on swarm intelligence, in which many robots operate in large groups. Swarm navigation allows robots to execute complicated tasks, explore unknown areas, and improve efficiency. The communication between swarm individuals and the kinematics of different robots are significant challenges [61]. Sensor-based communication protocols must be addressed in physical swarm systems [62]. Optimization of the navigation algorithm, the decision-making strategy, energy consumption, and safety are essential for deployment [66,73]. The swarm size, behavior, and coordinated movements must also be considered [75]; the classic boids update below sketches such coordination.
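Reynolds' classic separation/alignment/cohesion rules provide a minimal sketch of such coordinated movement; the weights here are illustrative, and physical swarms replace the global neighbor lookup with local sensing and the communication protocols discussed above.

```python
import numpy as np

def boids_step(pos, vel, r_nbr=1.0, w=(0.05, 0.05, 0.3), dt=0.1):
    """One boids update for N robots. `pos` and `vel` are (N, 2) arrays;
    each robot blends cohesion (move together), alignment (match
    velocity), and separation (avoid crowding) over its neighborhood."""
    new_vel = vel.copy()
    for i in range(len(pos)):
        d = np.linalg.norm(pos - pos[i], axis=1)
        nbrs = (d < r_nbr) & (d > 0)
        if not nbrs.any():
            continue
        coh = pos[nbrs].mean(axis=0) - pos[i]        # toward neighbors
        ali = vel[nbrs].mean(axis=0) - vel[i]        # match velocities
        sep = (pos[i] - pos[nbrs]).sum(axis=0)       # push away
        new_vel[i] += w[0] * coh + w[1] * ali + w[2] * sep
    return pos + new_vel * dt, new_vel
```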
Real-world experiments remain a challenge [50,75,109]. Future research should test and validate approaches in diverse complex environments rather than being restricted to a specific or simple one. The representation of cells and obstacles should account for irregular shapes [97,132,156]. Hardware limitations and computational performance also restrict the deployment of bioinspired models [72,123]. Successfully integrating the developed approaches into a suitable robot with the required sensors remains challenging.
Author Contributions
Conceptualization, S.L.; methodology, S.L.; software, S.L.; validation, S.L.; formal analysis, S.L.; investigation, S.L.; resources, S.L.; data curation, S.L.; writing—original draft preparation, S.L.; writing—review and editing, S.L., A.L. and J.W.; visualization, S.L.; supervision, J.W.; project administration, J.W.; funding acquisition, J.W. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
Not applicable.
Conflicts of Interest
The authors declare no conflict of interest.
Abbreviations
The following abbreviations are used in this manuscript:
| SLAM | Simultaneous localization and mapping |
| ACO | Ant colony optimization |
| HSI | Hue, saturation, and intensity |
| HiLAM | Hierarchical look-ahead trajectory model |
| APF | Artificial potential field |
| PF | Particle filter |
| PSO | Particle swarm optimization |
| GDM | Gas distribution mapping |
| IMU | Inertial measurement unit |
| EKF | Extended Kalman filter |
| SLIC-SVM | Simple-linear-iterative-clustering-based support vector machine |
| SL | Supervised learning |
| DRL | Deep reinforcement learning |
| RFID | Radiofrequency identification |
| UWB | Ultrawideband |
| MDP | Markov decision process |
| AER | Address-event representation |
| SNN | Spiking neural network |
| CNN | Convolutional neural network |
| MEMS | Micro-electromechanical systems |
References
- Fukuda, T.; Chen, F.; Shi, Q. Special feature on bio-inspired robotics. Appl. Sci. 2018, 8, 817. [Google Scholar] [CrossRef]
- Metka, B.; Franzius, M.; Bauer-Wersing, U. Bio-inspired visual self-localization in real world scenarios using Slow Feature Analysis. PLoS ONE 2018, 13, e0203994. [Google Scholar] [CrossRef] [PubMed]
- Pardo-Cabrera, J.; Rivero-Ortega, J.D.; Hurtado-López, J.; Ramírez-Moreno, D.F. Bio-inspired navigation and exploration system for a hexapod robotic platform. Eng. Res. Express 2022, 4, 025019. [Google Scholar] [CrossRef]
- Milford, M.; Schulz, R. Principles of goal-directed spatial robot navigation in biomimetic models. Philos. Trans. R. Soc. B Biol. Sci. 2014, 369, 20130484. [Google Scholar] [CrossRef] [PubMed]
- Maravall, D.; De Lope, J.; Fuentes, J.P. Navigation and self-semantic location of drones in indoor environments by combining the visual bug algorithm and entropy-based vision. Front. Neurorobot. 2017, 11, 46. [Google Scholar] [CrossRef]
- Rao, J.; Bian, H.; Xu, X.; Chen, J. Autonomous Visual Navigation System Based on a Single Camera for Floor-Sweeping Robot. Appl. Sci. 2023, 13, 1562. [Google Scholar] [CrossRef]
- Ayuso, F.; Botella, G.; García, C.; Prieto, M.; Tirado, F. GPU-based acceleration of bio-inspired motion estimation model. Concurr. Comput. 2013, 25, 1037–1056. [Google Scholar] [CrossRef]
- Gibaldi, A.; Vanegas, M.; Canessa, A.; Sabatini, S.P. A Portable Bio-Inspired Architecture for Efficient Robotic Vergence Control. Int. J. Comput. Vis. 2017, 121, 281–302. [Google Scholar] [CrossRef]
- Meyer, H.G.; Klimeck, D.; Paskarbeit, J.; Rückert, U.; Egelhaaf, M.; Porrmann, M.; Schneider, A. Resource-efficient bio-inspired visual processing on the hexapod walking robot HECTOR. PLoS ONE 2020, 15, e0230620. [Google Scholar] [CrossRef]
- De Oliveira, T.E.A.; Cretu, A.M.; Petriu, E.M. Multimodal bio-inspired tactile sensing module for surface characterization. Sensors 2017, 17, 1187. [Google Scholar] [CrossRef]
- Rao, A.; Elara, M.R.; Elangovan, K. Constrained VPH+: A local path planning algorithm for a bio-inspired crawling robot with customized ultrasonic scanning sensor. Robot. Biomim. 2016, 3, 12. [Google Scholar] [CrossRef] [PubMed]
- Ramezani Dooraki, A.; Lee, D.J. An End-to-End Deep Reinforcement Learning-Based Intelligent Agent Capable of Autonomous Exploration in Unknown Environments. Sensors 2018, 18, 3575. [Google Scholar] [CrossRef]
- Wang, Y.; Shao, B.; Zhang, C.; Zhao, J.; Cai, Z. REVIO: Range- and Event-Based Visual-Inertial Odometry for Bio-Inspired Sensors. Biomimetics 2022, 7, 169. [Google Scholar] [CrossRef] [PubMed]
- Luneckas, M.; Luneckas, T.; Udris, D.; Plonis, D.; Maskeliūnas, R.; Damaševičius, R. A hybrid tactile sensor-based obstacle overcoming method for hexapod walking robots. Intell. Serv. Robot. 2021, 14, 9–24. [Google Scholar] [CrossRef]
- Villarreal, B.L.; Olague, G.; Gordillo, J.L. Synthesis of odor tracking algorithms with genetic programming. Neurocomputing 2016, 175, 1019–1032. [Google Scholar] [CrossRef]
- Gay, S.; Le Run, K.; Pissaloux, E.; Romeo, K.; Lecomte, C. Towards a Predictive Bio-Inspired Navigation Model. Information 2021, 12, 100. [Google Scholar] [CrossRef]
- Roubieu, F.L.; Serres, J.R.; Colonnier, F.; Franceschini, N.; Viollet, S.; Ruffier, F. A biomimetic vision-based hovercraft accounts for bees’ complex behaviour in various corridors. Bioinspir. Biomim. 2014, 9, 36003. [Google Scholar] [CrossRef]
- Bertrand, O.J.N.; Lindemann, J.P.; Egelhaaf, M. A Bio-inspired Collision Avoidance Model Based on Spatial Information Derived from Motion Detectors Leads to Common Routes. PLoS Comput. Biol. 2015, 11, e1004339. [Google Scholar] [CrossRef]
- Yadipour, M.; Billah, M.A.; Faruque, I.A. Optic flow enrichment via Drosophila head and retina motions to support inflight position regulation. J. Theor. Biol. 2023, 562, 111416. [Google Scholar] [CrossRef]
- Hyslop, A.; Krapp, H.G.; Humbert, J.S. Control theoretic interpretation of directional motion preferences in optic flow processing interneurons. Biol. Cybern. 2010, 103, 353–364. [Google Scholar] [CrossRef]
- Liu, S.C.; Delbruck, T.; Indiveri, G.; Whatley, A.; Douglas, R. Event-Based Neuromorphic Systems; John Wiley & Sons, Incorporated: New York, NY, USA, 2015. [Google Scholar]
- Gallego, G.; Delbruck, T.; Orchard, G.; Bartolozzi, C.; Taba, B.; Censi, A.; Leutenegger, S.; Davison, A.J.; Conradt, J.; Daniilidis, K.; et al. Event-Based Vision: A Survey. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 44, 154–180. [Google Scholar] [CrossRef] [PubMed]
- Paredes-Valles, F.; Scheper, K.Y.W.; de Croon, G.C.H.E. Unsupervised Learning of a Hierarchical Spiking Neural Network for Optical Flow Estimation: From Events to Global Motion Perception. IEEE Trans. Pattern Anal. Mach. Intell. 2020, 42, 2051–2064. [Google Scholar] [CrossRef] [PubMed]
- Xu, P.; Humbert, J.S.; Abshire, P. Analog VLSI Implementation of Wide-field Integration Methods. J. Intell. Robot. Syst. 2011, 64, 465–487. [Google Scholar] [CrossRef]
- Zhu, A.Z.; Yuan, L.; Chaney, K.; Daniilidis, K. EV-FlowNet: Self-Supervised Optical Flow Estimation for Event-based Cameras. arXiv 2018. [Google Scholar] [CrossRef]
- Ruffier, F.; Viollet, S.; Franceschini, N. Visual control of two aerial micro-robots by insect-based autopilots. Adv. Robot. 2004, 18, 771–786. [Google Scholar] [CrossRef]
- Li, J.; Lindemann, J.P.; Egelhaaf, M. Peripheral Processing Facilitates Optic Flow-Based Depth Perception. Front. Comput. Neurosci. 2016, 10, 111. [Google Scholar] [CrossRef]
- de Croon, G.C.H.E.; Dupeyroux, J.J.G.; de Wagter, C.; Chatterjee, A.; Olejnik, D.A.; Ruffier, F. Accommodating unobservability to control flight attitude with optic flow. Nature 2022, 610, 485–490. [Google Scholar] [CrossRef]
- Vanhoutte, E.; Mafrica, S.; Ruffier, F.; Bootsma, R.J.; Serres, J. Time-of-travel methods for measuring optical flow on board a micro flying robot. Sensors 2017, 17, 571. [Google Scholar] [CrossRef]
- Serres, J.R.; Ruffier, F. Optic flow-based collision-free strategies: From insects to robots. Arthropod Struct. Dev. 2017, 46, 703–717. [Google Scholar] [CrossRef]
- Igual, F.D.; Botella, G.; García, C.; Prieto, M.; Tirado, F. Robust motion estimation on a low-power multi-core DSP. EURASIP J. Adv. Signal Process. 2013, 2013, 99. [Google Scholar] [CrossRef]
- Gremillion, G.; Humbert, J.S.; Krapp, H.G. Bio-inspired modeling and implementation of the ocelli visual system of flying insects. Biol. Cybern. 2014, 108, 735–746. [Google Scholar] [CrossRef] [PubMed]
- Zufferey, J.C.; Klaptocz, A.; Beyeler, A.; Nicoud, J.D.; Floreano, D. A 10-gram vision-based flying robot. Adv. Robot. 2007, 21, 1671–1684. [Google Scholar] [CrossRef]
- Serres, J.; Dray, D.; Ruffier, F.; Franceschini, N. A vision-based autopilot for a miniature air vehicle: Joint speed control and lateral obstacle avoidance. Auton. Robot. 2008, 25, 103–122. [Google Scholar] [CrossRef]
- Serres, J.R.; Ruffier, F. Biomimetic Autopilot Based on Minimalistic Motion Vision for Navigating along Corridors Comprising U-shaped and S-shaped Turns. J. Bionics Eng. 2015, 12, 47–60. [Google Scholar] [CrossRef]
- Kobayashi, N.; Bando, M.; Hokamoto, S.; Kubo, D. Guidelines for practical navigation systems based on wide-field-integration of optic flow. Asian J. Control 2021, 23, 2381–2392. [Google Scholar] [CrossRef]
- Serres, J.; Ruffier, F.; Viollet, S.; Franceschini, N. Toward Optic Flow Regulation for Wall-Following and Centring Behaviours. Int. J. Adv. Robot. Syst. 2006, 3, 23. [Google Scholar] [CrossRef]
- McGuire, K.; de Croon, G.; De Wagter, C.; Tuyls, K.; Kappen, H. Efficient Optical Flow and Stereo Vision for Velocity Estimation and Obstacle Avoidance on an Autonomous Pocket Drone. IEEE Robot. Autom. Lett. 2017, 2, 1070–1076. [Google Scholar] [CrossRef]
- Mounir, A.; Rachid, L.; Ouardi, A.E.; Tajer, A. Workload Partitioning of a Bio-inspired Simultaneous Localization and Mapping Algorithm on an Embedded Architecture. Int. J. Adv. Comput. Sci. Appl. 2021, 12, 221–229. [Google Scholar] [CrossRef]
- Vidal, A.R.; Rebecq, H.; Horstschaefer, T.; Scaramuzza, D. Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High-Speed Scenarios. IEEE Robot. Autom. Lett. 2018, 3, 994–1001. [Google Scholar] [CrossRef]
- Ghosh, S.; Gallego, G. Multi-Event-Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion. Adv. Intell. Syst. 2022, 4, 2200221. [Google Scholar] [CrossRef]
- Gelen, A.G.; Atasoy, A. An Artificial Neural SLAM Framework for Event-Based Vision. IEEE Access 2023, 11, 58436–58450. [Google Scholar] [CrossRef]
- Pathmakumar, T.; Muthugala, M.A.V.J.; Samarakoon, S.M.B.P.; Gómez, B.F.; Elara, M.R. A Novel Path Planning Strategy for a Cleaning Audit Robot Using Geometrical Features and Swarm Algorithms. Sensors 2022, 22, 5317. [Google Scholar] [CrossRef] [PubMed]
- Nantogma, S.; Ran, W.; Liu, P.; Yu, Z.; Xu, Y. Immunized Token-Based Approach for Autonomous Deployment of Multiple Mobile Robots in Burnt Area. Remote Sens. 2021, 13, 4135. [Google Scholar] [CrossRef]
- Jacobson, A.; Chen, Z.; Milford, M. Autonomous Multisensor Calibration and Closed-loop Fusion for SLAM. J. Field Robot. 2015, 32, 85–122. [Google Scholar] [CrossRef]
- Wu, C.; Yu, S.; Chen, L.; Sun, R. An Environmental-Adaptability-Improved RatSLAM Method Based on a Biological Vision Model. Machines 2022, 10, 259. [Google Scholar] [CrossRef]
- Erdem, U.M.; Milford, M.J.; Hasselmo, M.E. A hierarchical model of goal directed navigation selects trajectories in a visual environment. Neurobiol. Learn. Mem. 2015, 117, 109–121. [Google Scholar] [CrossRef]
- Sadeghi Amjadi, A.; Raoufi, M.; Turgut, A.E. A self-adaptive landmark-based aggregation method for robot swarms. Adapt. Behav. 2022, 30, 223–236. [Google Scholar] [CrossRef]
- Yu, S.E.; Lee, C.; Kim, D. Analyzing the effect of landmark vectors in homing navigation. Adapt. Behav. 2012, 20, 337–359. [Google Scholar] [CrossRef]
- Yu, X.; Yu, H. A novel low-altitude reconnaissance strategy for smart UAVs: Active perception and chaotic navigation. Trans. Inst. Meas. Control 2011, 33, 610–630. [Google Scholar] [CrossRef]
- Mair, E.; Augustine, M.; Jäger, B.; Stelzer, A.; Brand, C.; Burschka, D.; Suppa, M. A biologically inspired navigation concept based on the Landmark-Tree map for efficient long-distance robot navigation. Adv. Robot. 2014, 28, 289–302. [Google Scholar] [CrossRef]
- Salih, T.A.; Ghazal, M.T.; Mohammed, Z.G. Development of a dynamic intelligent recognition system for a real-time tracking robot. IAES Int. J. Robot. Autom. 2021, 10, 161. [Google Scholar] [CrossRef]
- Cheng, Y.; Jiang, P.; Hu, Y.F. A biologically inspired intelligent environment architecture for mobile robot navigation. Int. J. Intell. Syst. Technol. Appl. 2012, 11, 138–156. [Google Scholar] [CrossRef]
- Li, H.; Wang, H.; Cui, L.; Li, J.; Wei, Q.; Xia, J. Design and Experiments of a Compact Self-Assembling Mobile Modular Robot with Joint Actuation and Onboard Visual-Based Perception. Appl. Sci. 2022, 12, 3050. [Google Scholar] [CrossRef]
- Mathai, N.J.; Zourntos, T.; Kundur, D. Vector Field Driven Design for Lightweight Signal Processing and Control Schemes for Autonomous Robotic Navigation. EURASIP J. Adv. Signal Process. 2009, 2009, 984752. [Google Scholar] [CrossRef]
- Boudra, S.; Berrached, N.E.; Dahane, A. Efficient and secure real-time mobile robots cooperation using visual servoing. Int. J. Electr. Comput. Eng. 2020, 10, 3022. [Google Scholar] [CrossRef]
- Ahmad, S.; Sunberg, Z.N.; Humbert, J.S. End-to-End Probabilistic Depth Perception and 3D Obstacle Avoidance using POMDP. J. Intell. Robot. Syst. 2021, 103, 33. [Google Scholar] [CrossRef]
- Nguyen, T.; Mann, G.K.I.; Gosine, R.G.; Vardy, A. Appearance-Based Visual-Teach-And-Repeat Navigation Technique for Micro Aerial Vehicle. J. Intell. Robot. Syst. 2016, 84, 217–240. [Google Scholar] [CrossRef]
- Sinha, A.; Tan, N.; Mohan, R.E. Terrain perception for a reconfigurable biomimetic robot using monocular vision. Robot. Biomim. 2014, 1, 1–23. [Google Scholar] [CrossRef]
- Montiel-Ross, O.; Sepúlveda, R.; Castillo, O.; Quiñones, J. Efficient Stereoscopic Video Matching and Map Reconstruction for a Wheeled Mobile Robot. Int. J. Adv. Robot. Syst. 2012, 9, 120. [Google Scholar] [CrossRef]
- Aznar, F.; Pujol, M.; Rizo, R.; Rizo, C. Modelling multi-rotor UAVs swarm deployment using virtual pheromones. PLoS ONE 2018, 13, e0190692. [Google Scholar] [CrossRef]
- Yang, J.; Wang, X.; Bauer, P. V-Shaped Formation Control for Robotic Swarms Constrained by Field of View. Appl. Sci. 2018, 8, 2120. [Google Scholar] [CrossRef]
- Ohradzansky, M.T.; Humbert, J.S. Lidar-Based Navigation of Subterranean Environments Using Bio-Inspired Wide-Field Integration of Nearness. Sensors 2022, 22, 849. [Google Scholar] [CrossRef]
- Lopes, L.; Bodo, B.; Rossi, C.; Henley, S.; Zibret, G.; Kot-Niewiadomska, A.; Correia, V. ROBOMINERS; developing a bio-inspired modular robot miner for difficult to access mineral deposits. Adv. Geosci. 2020, 54, 99–108. [Google Scholar] [CrossRef]
- Jiang, Y.; Peng, P.; Wang, L.; Wang, J.; Wu, J.; Liu, Y. LiDAR-Based Local Path Planning Method for Reactive Navigation in Underground Mines. Remote Sens. 2023, 15, 309. [Google Scholar] [CrossRef]
- Romeh, A.E.; Mirjalili, S. Multi-Robot Exploration of Unknown Space Using Combined Meta-Heuristic Salp Swarm Algorithm and Deterministic Coordinated Multi-Robot Exploration. Sensors 2023, 23, 2156. [Google Scholar] [CrossRef]
- Moreno, L.; Garrido, S.; Blanco, D. Mobile Robot Global Localization using an Evolutionary MAP Filter. J. Glob. Optim. 2007, 37, 381–403. [Google Scholar] [CrossRef]
- Le, A.V.; Prabakaran, V.; Sivanantham, V.; Mohan, R.E. Modified A-Star Algorithm for Efficient Coverage Path Planning in Tetris Inspired Self-Reconfigurable Robot with Integrated Laser Sensor. Sensors 2018, 18, 2585. [Google Scholar] [CrossRef] [PubMed]
- García, R.M.; Prieto-Castrillo, F.; González, G.V.; Tejedor, J.P.; Corchado, J.M. Stochastic navigation in smart cities. Energies 2017, 10, 929. [Google Scholar] [CrossRef]
- Saez-Pons, J.; Alboul, L.; Penders, J.; Nomdedeu, L. Multi-robot team formation control in the GUARDIANS project. Ind. Robot 2010, 37, 372–383. [Google Scholar] [CrossRef]
- Martinez, E.; Laguna, G.; Murrieta-Cid, R.; Becerra, H.M.; Lopez-Padilla, R.; LaValle, S.M. A motion strategy for exploration driven by an automaton activating feedback-based controllers. Auton. Robot. 2019, 43, 1801–1825. [Google Scholar] [CrossRef]
- Arvin, F.; Espinosa, J.; Bird, B.; West, A.; Watson, S.; Lennox, B. Mona: An Affordable Open-Source Mobile Robot for Education and Research. J. Intell. Robot. Syst. 2019, 94, 761–775. [Google Scholar] [CrossRef]
- Tarapore, D.; Christensen, A.L.; Timmis, J. Generic, scalable and decentralized fault detection for robot swarms. PLoS ONE 2017, 12, e0182058. [Google Scholar] [CrossRef] [PubMed]
- Ordaz-Rivas, E.; Rodriguez-Liñan, A.; Torres-Treviño, L. Autonomous foraging with a pack of robots based on repulsion, attraction and influence. Auton. Robot. 2021, 45, 919–935. [Google Scholar] [CrossRef]
- Gia Luan, P.; Truong Thinh, N. Self-Organized Aggregation Behavior Based on Virtual Expectation of Individuals with Wave-Based Communication. Electronics 2023, 12, 2220. [Google Scholar] [CrossRef]
- Baker, C.J.; Smith, G.E.; Balleri, A.; Holderied, M.; Griffiths, H.D. Biomimetic Echolocation With Application to Radar and Sonar Sensing. Proc. IEEE 2014, 102, 447–458. [Google Scholar] [CrossRef]
- Ordaz-Rivas, E.; Rodriguez-Liñan, A.; Aguilera-Ruíz, M.; Torres-Treviño, L. Collective Tasks for a Flock of Robots Using Influence Factor. J. Intell. Robot. Syst. 2019, 94, 439–453. [Google Scholar] [CrossRef]
- Bouraine, S.; Azouaoui, O. Safe Motion Planning Based on a New Encoding Technique for Tree Expansion Using Particle Swarm Optimization. Robotica 2021, 39, 885–927. [Google Scholar] [CrossRef]
- Martinez, F.; Rendon, A. Autonomous Motion Planning for a Differential Robot using Particle Swarm Optimization. Int. J. Adv. Comput. Sci. Appl. 2023, 14. [Google Scholar] [CrossRef]
- Arena, E.; Arena, P.; Strauss, R.; Patané, L. Motor-Skill Learning in an Insect Inspired Neuro-Computational Control System. Front. Neurorobot. 2017, 11, 12. [Google Scholar] [CrossRef]
- Xu, P.; Liu, J.; Liu, X.; Wang, X.; Zheng, J.; Wang, S.; Chen, T.; Wang, H.; Wang, C.; Fu, X.; et al. A bio-inspired and self-powered triboelectric tactile sensor for underwater vehicle perception. NPJ Flex. Electron. 2022, 6, 25. [Google Scholar] [CrossRef]
- Mulvey, B.W.; Lalitharatne, T.D.; Nanayakkara, T. DeforMoBot: A Bio-Inspired Deformable Mobile Robot for Navigation among Obstacles. IEEE Robot. Autom. Lett. 2023, 8, 3827–3834. [Google Scholar] [CrossRef]
- Yu, Z.; Sadati, S.M.H.; Perera, S.; Hauser, H.; Childs, P.R.N.; Nanayakkara, T. Tapered whisker reservoir computing for real-time terrain identification-based navigation. Sci. Rep. 2023, 13, 5213. [Google Scholar] [CrossRef] [PubMed]
- Rausyan Fikri, M.; Wibowo Djamari, D. Palm-sized quadrotor source localization using modified bio-inspired algorithm in obstacle region. Int. J. Electr. Comput. Eng. 2022, 12, 3494. [Google Scholar] [CrossRef]
- Ojeda, P.; Monroy, J.; Gonzalez-Jimenez, J. A Simulation Framework for the Integration of Artificial Olfaction into Multi-Sensor Mobile Robots. Sensors 2021, 21, 2041. [Google Scholar] [CrossRef]
- Yamada, M.; Ohashi, H.; Hosoda, K.; Kurabayashi, D.; Shigaki, S. Multisensory-motor integration in olfactory navigation of silkmoth, Bombyx mori, using virtual reality system. eLife 2021, 10, e72001. [Google Scholar] [CrossRef] [PubMed]
- Martinez, D.; Rochel, O.; Hugues, E. A biomimetic robot for tracking specific odors in turbulent plumes. Auton. Robot. 2006, 20, 185–195. [Google Scholar] [CrossRef]
- Soegiarto, D.; Trilaksono, B.R.; Adiprawita, W.; Idris Hidayat, E.M.; Prabowo, Y.A. Combining SLAM, GDM, and Anemotaxis for Gas Source Localization in Unknown and GPS-denied Environments. Int. J. Electr. Eng. Inform. 2022, 14, 514–551. [Google Scholar] [CrossRef]
- Schillebeeckx, F.; De Mey, F.; Vanderelst, D.; Peremans, H. Biomimetic Sonar: Binaural 3D Localization using Artificial Bat Pinnae. Int. J. Robot. Res. 2011, 30, 975–987. [Google Scholar] [CrossRef]
- Steckel, J.; Peremans, H. BatSLAM: Simultaneous localization and mapping using biomimetic sonar. PLoS ONE 2013, 8, e54076. [Google Scholar] [CrossRef]
- Abbasi, A.; MahmoudZadeh, S.; Yazdani, A.; Moshayedi, A.J. Feasibility assessment of Kian-I mobile robot for autonomous navigation. Neural Comput. Appl. 2022, 34, 1199–1218. [Google Scholar] [CrossRef]
- Tidoni, E.; Gergondet, P.; Kheddar, A.; Aglioti, S.M. Audio-visual feedback improves the BCI performance in the navigational control of a humanoid robot. Front. Neurorobot. 2014, 8, 20. [Google Scholar] [CrossRef] [PubMed]
- Ghosh, S.; Panigrahi, P.K.; Parhi, D.R. Analysis of FPA and BA meta-heuristic controllers for optimal path planning of mobile robot in cluttered environment. IET Sci. Meas. Technol. 2017, 11, 817–828. [Google Scholar] [CrossRef]
- Anumula, J.; Neil, D.; Delbruck, T.; Liu, S.C. Feature Representations for Neuromorphic Audio Spike Streams. Front. Neurosci. 2018, 12, 23. [Google Scholar] [CrossRef] [PubMed]
- Glackin, B.; Wall, J.A.; McGinnity, T.M.; Maguire, L.P.; McDaid, L.J. A spiking neural network model of the medial superior olive using spike timing dependent plasticity for sound localization. Front. Comput. Neurosci. 2010, 4, 18. [Google Scholar] [CrossRef]
- Kanoulas, D.; Tsagarakis, N.G.; Vona, M. Curved patch mapping and tracking for irregular terrain modeling: Application to bipedal robot foot placement. Robot. Auton. Syst. 2019, 119, 13–30. [Google Scholar] [CrossRef]
- Sabiha, A.D.; Kamel, M.A.; Said, E.; Hussein, W.M. Real-time path planning for autonomous vehicle based on teaching–learning-based optimization. Intell. Serv. Robot. 2022, 15, 381–398. [Google Scholar] [CrossRef]
- Chen, C.P.; Chen, J.Y.; Huang, C.K.; Lu, J.C.; Lin, P.C. Sensor data fusion for body state estimation in a bipedal robot and its feedback control application for stable walking. Sensors 2015, 15, 4925–4946. [Google Scholar] [CrossRef]
- Tan, N.; Mohan, R.E.; Elangovan, K. Scorpio: A biomimetic reconfigurable rolling–crawling robot. Int. J. Adv. Robot. Syst. 2016, 13. [Google Scholar] [CrossRef]
- Yi, L.; Le, A.V.; Hoong, J.C.C.; Hayat, A.A.; Ramalingam, B.; Mohan, R.E.; Leong, K.; Elangovan, K.; Tran, M.; Bui, M.V.; et al. Multi-Objective Instantaneous Center of Rotation Optimization Using Sensors Feedback for Navigation in Self-Reconfigurable Pavement Sweeping Robot. Mathematics 2022, 10, 3169. [Google Scholar] [CrossRef]
- Duivon, A.; Kirsch, P.; Mauboussin, B.; Mougard, G.; Woszczyk, J.; Sanfilippo, F. The Redesigned Serpens, a Low-Cost, Highly Compliant Snake Robot. Robotics 2022, 11, 42. [Google Scholar] [CrossRef]
- Kim, J.Y.; Kashino, Z.; Colaco, T.; Nejat, G.; Benhabib, B. Design and implementation of a millirobot for swarm studies–mROBerTO. Robotica 2018, 36, 1591–1612. [Google Scholar] [CrossRef]
- Fiack, L.; Cuperlier, N.; Miramond, B. Embedded and real-time architecture for bio-inspired vision-based robot navigation. J. Real-Time Image Process. 2015, 10, 699–722. [Google Scholar] [CrossRef]
- Hartbauer, M. Simplified bionic solutions: A simple bio-inspired vehicle collision detection system. Bioinspir. Biomim. 2017, 12, 026007. [Google Scholar] [CrossRef] [PubMed]
- Porod, W.; Werblin, F.; Chua, L.O.; Roska, T.; Rodríguez-Vázquez, Á.; Roska, B.; Fay, P.; Bernstein, G.H.; Huang, Y.F.; Csurgay, Á.I. Bio-Inspired Nano-Sensor-Enhanced CNN Visual Computer. Ann. N. Y. Acad. Sci. 2004, 1013, 92–109. [Google Scholar] [CrossRef]
- Colomer, S.; Cuperlier, N.; Bresson, G.; Gaussier, P.; Romain, O. LPMP: A Bio-Inspired Model for Visual Localization in Challenging Environments. Front. Robot. AI 2022, 8, 703811. [Google Scholar] [CrossRef]
- Tejera, G.; Llofriu, M.; Barrera, A.; Weitzenfeld, A. Bio-Inspired Robotics: A Spatial Cognition Model integrating Place Cells, Grid Cells and Head Direction Cells. J. Intell. Robot. Syst. 2018, 91, 85–99. [Google Scholar] [CrossRef]
- Jauffret, A.; Cuperlier, N.; Tarroux, P.; Gaussier, P. From self-assessment to frustration, a small step toward autonomy in robotic navigation. Front. Neurorobot. 2013, 7, 16. [Google Scholar] [CrossRef]
- Suzuki, M.; Floreano, D. Enactive Robot Vision. Adapt. Behav. 2008, 16, 122–128. [Google Scholar] [CrossRef]
- Li, W.; Wu, D.; Zhou, Y.; Du, J. A bio-inspired method of autonomous positioning using spatial association based on place cells firing. Int. J. Adv. Robot. Syst. 2017, 14, 172988141772801. [Google Scholar] [CrossRef]
- Yu, N.; Liao, Y.; Yu, H.; Sie, O. Construction of the rat brain spatial cell firing model on a quadruped robot. CAAI Trans. Intell. Technol. 2022, 7, 732–743. [Google Scholar] [CrossRef]
- Kyriacou, T. Using an evolutionary algorithm to determine the parameters of a biologically inspired model of head direction cells. J. Comput. Neurosci. 2012, 32, 281–295. [Google Scholar] [CrossRef]
- Montiel, H.; Martínez, F.; Martínez, F. Parallel control model for navigation tasks on service robots. J. Phys. Conf. Ser. 2021, 2135, 12002. [Google Scholar] [CrossRef]
- Yoo, H.; Cha, G.; Oh, S. Deep ego-motion classifiers for compound eye cameras. Sensors 2019, 19, 5275. [Google Scholar] [CrossRef]
- Skatchkovsky, N.; Jang, H.; Simeone, O. Spiking Neural Networks-Part III: Neuromorphic Communications. IEEE Commun. Lett. 2021, 25, 1746–1750. [Google Scholar] [CrossRef]
- Miskowicz, M. Send-On-Delta Concept: An Event-Based Data Reporting Strategy. Sensors 2006, 6, 49–63. [Google Scholar] [CrossRef]
- Tayarani-Najaran, M.H.; Schmuker, M. Event-Based Sensing and Signal Processing in the Visual, Auditory, and Olfactory Domain: A Review. Front. Neural Circuits 2021, 15, 610446. [Google Scholar] [CrossRef] [PubMed]
- Cheng, G.; Dean-Leon, E.; Bergner, F.; Rogelio Guadarrama Olvera, J.; Leboutet, Q.; Mittendorfer, P. A Comprehensive Realization of Robot Skin: Sensors, Sensing, Control, and Applications. Proc. IEEE 2019, 107, 2034–2051. [Google Scholar] [CrossRef]
- Cyr, A.; Thériault, F. Bio-inspired visual attention process using spiking neural networks controlling a camera. Int. J. Comput. Vis. Robot. 2019, 9, 39–55. [Google Scholar] [CrossRef]
- Floreano, D.; Zufferey, J.C.; Nicoud, J.D. From Wheels to Wings with Evolutionary Spiking Circuits. Artif. Life 2005, 11, 121–138. [Google Scholar] [CrossRef]
- Alnajjar, F.; Bin Mohd Zin, I.; Murase, K. A Hierarchical Autonomous Robot Controller for Learning and Memory: Adaptation in a Dynamic Environment. Adapt. Behav. 2009, 17, 179–196. [Google Scholar] [CrossRef]
- Arena, P.; De Fiore, S.; Fortuna, L.; Frasca, M.; Patané, L.; Vagliasindi, G. Reactive navigation through multiscroll systems: From theory to real-time implementation. Auton. Robot. 2008, 25, 123–146. [Google Scholar] [CrossRef]
- Botella, G.; Martín H, J.A.; Santos, M.; Meyer-Baese, U. FPGA-based multimodal embedded sensor system integrating low- and mid-level vision. Sensors 2011, 11, 8164–8179. [Google Scholar] [CrossRef] [PubMed]
- Elouaret, T.; Colomer, S.; De Melo, F.; Cuperlier, N.; Romain, O.; Kessal, L.; Zuckerman, S. Implementation of a Bio-Inspired Neural Architecture for Autonomous Vehicles on a Multi-FPGA Platform. Sensors 2023, 23, 4631. [Google Scholar] [CrossRef] [PubMed]
- Sanket, N.J.; Singh, C.D.; Ganguly, K.; Fermuller, C.; Aloimonos, Y. GapFlyt: Active Vision Based Minimalist Structure-Less Gap Detection For Quadrotor Flight. IEEE Robot. Autom. Lett. 2018, 3, 2799–2806. [Google Scholar] [CrossRef]
- Luan, H.; Fu, Q.; Zhang, Y.; Hua, M.; Chen, S.; Yue, S. A Looming Spatial Localization Neural Network Inspired by MLG1 Neurons in the Crab Neohelice. Front. Neurosci. 2022, 15, 787256. [Google Scholar] [CrossRef] [PubMed]
- Wang, J.; Yan, R.; Tang, H. Multi-Scale Extension in an Entorhinal-Hippocampal Model for Cognitive Map Building. Front. Neurorobot. 2021, 14, 592057. [Google Scholar] [CrossRef] [PubMed]
- Barrera, A.; Cáceres, A.; Weitzenfeld, A.; Ramirez-Amaya, V. Comparative Experimental Studies on Spatial Memory and Learning in Rats and Robots. J. Intell. Robot. Syst. 2011, 63, 361–397. [Google Scholar] [CrossRef]
- Pang, L.; Zhang, Y.; Coleman, S.; Cao, H. Efficient Hybrid-Supervised Deep Reinforcement Learning for Person Following Robot. J. Intell. Robot. Syst. 2020, 97, 299–312. [Google Scholar] [CrossRef]
- Zhu, Y.; Luo, K.; Ma, C.; Liu, Q.; Jin, B. Superpixel Segmentation Based Synthetic Classifications with Clear Boundary Information for a Legged Robot. Sensors 2018, 18, 2808. [Google Scholar] [CrossRef]
- Arena, P.; Fortuna, L.; Lombardo, D.; Patané, L. Perception for Action: Dynamic Spatiotemporal Patterns Applied on a Roving Robot. Adapt. Behav. 2008, 16, 104–121. [Google Scholar] [CrossRef]
- Zeng, T.; Si, B. Cognitive mapping based on conjunctive representations of space and movement. Front. Neurorobot. 2017, 11, 61. [Google Scholar] [CrossRef]
- Shrivastava, R.; Kumar, P.; Tripathi, S.; Tiwari, V.; Rajput, D.S.; Gadekallu, T.R.; Suthar, B.; Singh, S.; Ra, I.H. A Novel Grid and Place Neuron’s Computational Modeling to Learn Spatial Semantics of an Environment. Appl. Sci. 2020, 10, 5147. [Google Scholar] [CrossRef]
- Kazmi, S.M.A.M.; Mertsching, B. Gist+RatSLAM: An Incremental Bio-inspired Place Recognition Front-End for RatSLAM. EAI Endorsed Trans. Creat. Technol. 2016, 3, 27–34. [Google Scholar] [CrossRef]
- Yu, F.; Shang, J.; Hu, Y.; Milford, M. NeuroSLAM: A brain-inspired SLAM system for 3D environments. Biol. Cybern. 2019, 113, 515–545. [Google Scholar] [CrossRef]
- Ni, J.; Wang, C.; Fan, X.; Yang, S.X. A Bioinspired Neural Model Based Extended Kalman Filter for Robot SLAM. Math. Probl. Eng. 2014, 2014, 905826. [Google Scholar] [CrossRef]
- Ramalingam, B.; Le, A.V.; Lin, Z.; Weng, Z.; Mohan, R.E.; Pookkuttath, S. Optimal selective floor cleaning using deep learning algorithms and reconfigurable robot hTetro. Sci. Rep. 2022, 12, 15938. [Google Scholar] [CrossRef] [PubMed]
- Tai, L.; Li, S.; Liu, M. Autonomous exploration of mobile robots through deep neural networks. Int. J. Adv. Robot. Syst. 2017, 14, 1–9. [Google Scholar] [CrossRef]
- Chatty, A.; Gaussier, P.; Hasnain, S.K.; Kallel, I.; Alimi, A.M. The effect of learning by imitation on a multi-robot system based on the coupling of low-level imitation strategy and online learning for cognitive map building. Adv. Robot. 2014, 28, 731–743. [Google Scholar] [CrossRef]
- Martín, F.; Ginés, J.; Rodríguez-Lera, F.J.; Guerrero-Higueras, A.M.; Matellán Olivera, V. Client-Server Approach for Managing Visual Attention, Integrated in a Cognitive Architecture for a Social Robot. Front. Neurorobot. 2021, 15, 630386. [Google Scholar] [CrossRef]
- Huang, W.; Tang, H.; Tian, B. Vision enhanced neuro-cognitive structure for robotic spatial cognition. Neurocomputing 2014, 129, 49–58. [Google Scholar] [CrossRef]
- Kulvicius, T.; Tamosiunaite, M.; Ainge, J.; Dudchenko, P.; Wörgötter, F. Odor supported place cell model and goal navigation in rodents. J. Comput. Neurosci. 2008, 25, 481–500. [Google Scholar] [CrossRef]
- Marques-Villarroya, S.; Castillo, J.C.; Gamboa-Montero, J.J.; Sevilla-Salcedo, J.; Salichs, M.A. A Bio-Inspired Endogenous Attention-Based Architecture for a Social Robot. Sensors 2022, 22, 5248. [Google Scholar] [CrossRef]
- Zhu, D.; Li, W.; Yan, M.; Yang, S.X. The Path Planning of AUV Based on D-S Information Fusion Map Building and Bio-Inspired Neural Network in Unknown Dynamic Environment. Int. J. Adv. Robot. Syst. 2014, 11, 34. [Google Scholar] [CrossRef]
- Zhang, X.; Ding, W.; Wang, Y.; Luo, Y.; Zhang, Z.; Xiao, J. Bio-Inspired Self-Organized Fission–Fusion Control Algorithm for UAV Swarm. Aerospace 2022, 9, 714. [Google Scholar] [CrossRef]
- Yin, X.H.; Yang, C.; Xiong, D. Bio-inspired neurodynamics-based cascade tracking control for automated guided vehicles. Int. J. Adv. Manuf. Technol. 2014, 74, 519–530. [Google Scholar] [CrossRef]
- Rozsypálek, Z.; Broughton, G.; Linder, P.; Rouček, T.; Blaha, J.; Mentzl, L.; Kusumam, K.; Krajník, T. Contrastive Learning for Image Registration in Visual Teach and Repeat Navigation. Sensors 2022, 22, 2975. [Google Scholar] [CrossRef] [PubMed]
- Dasgupta, S.; Goldschmidt, D.; Wörgötter, F.; Manoonpong, P. Distributed recurrent neural forward models with synaptic adaptation and CPG-based control for complex behaviors of walking robots. Front. Neurorobot. 2015, 9, 10. [Google Scholar] [CrossRef] [PubMed]
- Hodge, V.J.; Hawkins, R.; Alexander, R. Deep reinforcement learning for drone navigation using sensor data. Neural Comput. Appl. 2021, 33, 2015–2033. [Google Scholar] [CrossRef]
- Al-Muteb, K.; Faisal, M.; Emaduddin, M.; Arafah, M.; Alsulaiman, M.; Mekhtiche, M.; Hedjar, R.; Mathkoor, H.; Algabri, M.; Bencherif, M.A. An autonomous stereovision-based navigation system (ASNS) for mobile robots. Intell. Serv. Robot. 2016, 9, 187–205. [Google Scholar] [CrossRef]
- Lazreg, M.; Benamrane, N. Intelligent System for Robotic Navigation Using ANFIS and ACOr. Appl. Artif. Intell. 2019, 33, 399–419. [Google Scholar] [CrossRef]
- Chen, C.; Richardson, P. Mobile robot obstacle avoidance using short memory: A dynamic recurrent neuro-fuzzy approach. Trans. Inst. Meas. Control 2012, 34, 148–164. [Google Scholar] [CrossRef]
- Nadour, M.; Boumehraz, M.; Cherroun, L.; Puig, V. Hybrid Type-2 Fuzzy Logic Obstacle Avoidance System based on Horn-Schunck Method. Electroteh. Electron. Autom. 2019, 67, 45–51. [Google Scholar]
- Singh, M.K.; Parhi, D.R. Path optimisation of a mobile robot using an artificial neural network controller. Int. J. Syst. Sci. 2011, 42, 107–120. [Google Scholar] [CrossRef]
- Arena, P.; Fortuna, L.; Lombardo, D.; Patanè, L.; Velarde, M.G. The winnerless competition paradigm in cellular nonlinear networks: Models and applications. Int. J. Circuit Theory Appl. 2009, 37, 505–528. [Google Scholar] [CrossRef]
- Liu, C.; Yang, J.; An, K.; Chen, Q. Rhythmic-Reflex Hybrid Adaptive Walking Control of Biped Robot. J. Intell. Robot. Syst. 2019, 94, 603–619. [Google Scholar] [CrossRef]
- Pathmakumar, T.; Sivanantham, V.; Anantha Padmanabha, S.G.; Elara, M.R.; Tun, T.T. Towards an Optimal Footprint Based Area Coverage Strategy for a False-Ceiling Inspection Robot. Sensors 2021, 21, 5168. [Google Scholar] [CrossRef]
- Corrales-Paredes, A.; Malfaz, M.; Egido-García, V.; Salichs, M.A. Waymarking in Social Robots: Environment Signaling Using Human–Robot Interaction. Sensors 2021, 21, 8145. [Google Scholar] [CrossRef] [PubMed]
- Karagüzel, T.A.; Turgut, A.E.; Eiben, A.E.; Ferrante, E. Collective gradient perception with a flying robot swarm. Swarm Intell. 2023, 17, 117–146. [Google Scholar] [CrossRef]
- Le, A.V.; Apuroop, K.G.S.; Konduri, S.; Do, H.; Elara, M.R.; Xi, R.C.C.; Wen, R.Y.W.; Vu, M.B.; Duc, P.V.; Tran, M. Multirobot Formation with Sensor Fusion-Based Localization in Unknown Environment. Symmetry 2021, 13, 1788. [Google Scholar] [CrossRef]
- Zhu, H.; Liu, H.; Ataei, A.; Munk, Y.; Daniel, T.; Paschalidis, I.C. Learning from animals: How to Navigate Complex Terrains. PLoS Comput. Biol. 2020, 16, e1007452. [Google Scholar] [CrossRef]
- de Croon, G.C.H.E.; De Wagter, C.; Seidl, T. Enhancing optical-flow-based control by learning visual appearance cues for flying robots. Nat. Mach. Intell. 2021, 3, 33–41. [Google Scholar] [CrossRef]
