Article

Self-Configurable Centipede-Inspired Rescue Robot

School of Automation, Beijing Information Science and Technology University, Beijing 100192, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(6), 2331; https://doi.org/10.3390/app14062331
Submission received: 30 January 2024 / Revised: 6 March 2024 / Accepted: 7 March 2024 / Published: 10 March 2024
(This article belongs to the Special Issue Recent Advances in Human-Robot Interactions)

Abstract

Drawing on characteristics of centipedes such as their low center of gravity, stable movement, adaptability to complex terrain, and ability to keep moving even after losing a limb, this paper presents a self-reconfigurable centipede-inspired rescue robot with high stability in motion. The robot can lift its body to traverse higher obstacles, and its multi-segmented structure enables self-disconnection and reconstruction for docking. The robot navigates diverse terrains and surmounts obstacles, and its camera sensor supports life recognition, terrain surveying, scene understanding, and obstacle avoidance, making it well suited to challenging ground rescue missions. Motion stability tests across various terrains show that the robot maintains a consistent movement path in rugged environments. Operating with a leg lift height of 0.02 m, the robot achieves a speed of 0.09 m/s. In simulated damage conditions, the robot can disconnect and reconnect its limbs swiftly, restoring movement within a single second. During environmental perception tasks, the robot processes and analyzes environmental data in real time at approximately 15 frames per second with an 80% confidence level. With an F1 score exceeding 93% and an average precision exceeding 98%, the robot demonstrates its reliability and efficiency.

1. Introduction

In light of the growing frequency of natural disasters and emergency situations, the development of intelligent rescue robots, particularly those taking inspiration from multi-legged organisms like centipedes, has become increasingly imperative. Traditional rescue robots often encounter challenges such as high risk, low efficiency, and a scarcity of human resources, particularly in complex and unpredictable environments. Addressing these challenges requires harnessing natural biological mechanisms and integrating intelligent technologies to enhance the adaptability, flexibility, and intelligence of rescue robots. Consequently, bio-inspired multi-legged robots, such as those resembling centipedes, have garnered attention for their superior adaptability in navigating complex terrains [1,2,3,4,5,6].
Centipede-inspired rescue robots derive significant benefits from research conducted on multi-legged biological robots and collective behaviors. Studies have illustrated that creatures, such as insects, fish, and birds, can solve tasks that are challenging for individuals through self-organization and environmental interaction. These insights have propelled the development of aerial, aquatic, and terrestrial robotic swarm systems, capable of proficient real-world navigation and task execution, including mapping, tracking, inspection, and transportation [7,8,9,10]. While these studies provide valuable theoretical foundations for robotic technology, they frequently focus on a single type of robot and lack comprehensive adaptability to complex environments.
Recent innovations in the design and control of multi-legged robots have emerged. For example, several studies [11,12,13,14,15,16,17,18] have demonstrated robots autonomously adjusting to various terrains by modifying body flexibility and gait. These advancements have notably improved robot efficiency in specific environments, yet they still exhibit limited adaptability in diverse and unpredictable rescue scenarios. Furthermore, breakthroughs in soft materials and actuation technologies [19,20,21,22,23,24] have offered enhanced mobility and environmental adaptability, albeit challenges persist in terms of durability and energy efficiency.
The evolution of multimodal sensing and communication technologies has significantly bolstered robots’ environmental perception and processing abilities. Studies [25,26,27,28,29,30,31] have highlighted the integration of diverse sensing methods to enhance robots’ environmental perception and decision-making capabilities, albeit stability and accuracy in complex environments still require improvement. Moreover, advancements in collective intelligence and collaboration technologies [32,33,34,35,36,37,38] empower robotic swarms to coordinate and execute tasks efficiently, though coordination and robustness in large-scale, dynamic environments remain challenging.
Insect-inspired robot design has emerged as a prominent area of research, encompassing optimizations of locust jumping mechanisms, explorations of insect-robot hybrid interactions, and breakthroughs in soft robotics mimicking insect locomotion. These studies not only drive forward the advancement of robot technology but also open up new research avenues in insect biology [39,40,41,42,43,44].
The present study consolidates recent research on insect-inspired robots and centipede-mimicking rescue robots, examining how closely they simulate biological behaviors and locomotion patterns. Through case studies and empirical research, it analyzes the methodologies and strategies these robots employ to simulate intricate movement patterns, perceive the environment, and make behavioral decisions akin to those of insects or centipedes [40,41,42,43,45,46,47,48]. This endeavor seeks to provide insights and inspiration for future robotic innovations and insect biology research. Specifically, it focuses on enhancing the efficiency and precision of rescue operations through the design and optimization of a centipede-inspired intelligent rescue robot.
Insect-inspired robots, particularly those that emulate the remarkable navigation [40,41,42,43,46,47,49] and localization abilities of select insects, offer crucial insights into the design of robots capable of autonomous navigation in complex environments. Similarly, the centipede-mimicking rescue robot, inspired by the centipede’s exceptional crawling abilities and terrain adaptability, promises efficient locomotion in challenging and rugged terrains. By emulating these biological traits, the environmental adaptability of robots is enhanced, unveiling unique potential for their applications in search and rescue [50], environmental monitoring, and disaster response.
Future research directions aim to augment the capabilities of these robots by incorporating deep learning [40,41,42,43,46,47,51] technologies to enhance their human–robot interaction capabilities. This involves developing deep learning modules that enable robots to recognize the postures of rescue personnel and assess their status. This technology will facilitate more accurate pinpointing of individuals in need of rescue and the formulation of precision-based rescue strategies informed by postural assessments. Intelligent interaction aims to optimize rescue operations while fostering trust and collaboration between rescue personnel and robots.
Moreover, ongoing research delves into the mechanisms of insect and centipede perception and decision-making [52,53] in their environments. This aims to introduce further innovations in robot design by unpacking how these biological organisms perceive subtle environmental changes and predict future scenarios. This research aims to endow robots with superior environmental adaptability, enabling them to respond flexibly to challenges across diverse scenarios.
While insect-inspired robots and centipede-mimicking rescue robots have demonstrated promising potential in terms of design and functionality, challenges remain in their practical applications. Comprehensive case studies and empirical research are necessary to assess the robots’ actual effectiveness in rescue operations and to continually improve and optimize their design based on practical needs.
In conclusion, insect-inspired robots and centipede-mimicking rescue robots exhibit tremendous potential and value in terms of design, functionality, and applications. Future research endeavors aim to further enhance the performance of these robots and foster their seamless integration with actual rescue needs, ultimately serving humanity to the best of their capabilities.
In more detail, the proposed robot will amalgamate the advantages of existing research, such as high adaptability, flexible body mechanisms, and advanced sensing capabilities, while addressing the limitations of current technologies, including stability in variable environments and energy efficiency issues. We will prioritize scene understanding, task coordination and planning, and terrain surveying to improve the rescue robot’s overall performance [54,55,56,57,58].
Therefore, the designed multi-legged robot diverges from traditional tracked and wheeled robots in terms of environmental adaptability. It adopts a structure akin to a six-legged robot, with each leg having three degrees of freedom (DoF). This design surpasses the limitations of typical bionic legged robots, which are often restricted to two DoF per leg, enabling the robot to negotiate taller obstacles and potentially climb stairs through adjustments in servo angles and gait control.
In terms of sensing, this robot departs from the norm by using a head-mounted camera rather than relying solely on mechanical or infrared sensors. Combined with Deep Learning (DL) algorithms, the camera endows the robot with enhanced capabilities, including scene understanding. Furthermore, unlike traditional legged robots that may become incapacitated upon losing a leg, this design incorporates a novel magnetic docking mechanism, allowing multiple segments to detach and continue operating independently.
In summary, this research endeavors to leverage multi-legged robot design principles, collective intelligence technologies, and advanced sensing systems to create a centipede-like rescue robot capable of operating effectively in diverse complex environments. This robot aims not only to improve rescue efficiency and accuracy but also to adapt to various rescue scenarios, playing a critical role in natural disasters and emergency rescue operations [57,58,59,60].

2. Materials and Methods

2.1. Mechanical Structure Design

A flat-structured robot was constructed by using SOLIDWORKS to design and analyze a multi-section centipede-inspired robot. Each component was first printed using 3D printing technology and then assembled into a complete robot. Throughout the design process, particular emphasis was placed on the robot's ability to traverse uneven terrain [61]. A three-DoF legged structure was therefore adopted, with servos driving the movement of each joint. This design enables the robot to navigate small obstacles efficiently, with each segmented limb measuring 11 cm in length and 32 cm, as illustrated in Figure 1. The centipede robot consists of multiple four-legged sections interconnected via magnetic docking devices. With the exception of the head section, which carries an additional camera and a Raspberry Pi 4B, all sections share an identical structure and are individually controlled by Raspberry Pi Pico microcontrollers. Communication between these sections is facilitated through GPIO ports.
The kinematics modeling and gait planning of the centipede-inspired robot were finalized, resulting in a two-part control system consisting of an upper computer and a lower computer. The upper computer, running Ubuntu on a Raspberry Pi 4B, processes environmental information. The lower computer, a Raspberry Pi Pico microcontroller based on the RP2040 chip, receives decision-making results from the upper computer, as illustrated in Figure 2. This configuration governs the overall movement and posture of the centipede robot, enabling it to execute specific tasks in the current scenario.
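For illustration, the following minimal Python sketch shows how such a decision hand-off from the upper to the lower computer might look; the serial port name, baud rate, and one-byte command protocol are hypothetical assumptions for this sketch, as the link-level protocol is not specified here.

```python
# Upper computer (Raspberry Pi 4B): forward vision decisions to the Pico.
# Sketch under assumptions -- the serial port, baud rate, and one-byte
# command protocol are illustrative, not the actual interface of this work.
import serial  # pyserial

COMMANDS = {"forward": b"F", "left": b"L", "right": b"R", "stop": b"S"}

def send_decision(link: serial.Serial, decision: str) -> None:
    """Translate a high-level decision into a one-byte motion command."""
    link.write(COMMANDS.get(decision, COMMANDS["stop"]))

if __name__ == "__main__":
    # /dev/ttyACM0 is a typical enumeration for a Pico attached to a Pi.
    with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as link:
        send_decision(link, "left")  # e.g., the camera model judged "left"
```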
Each four-legged module of the robot is equipped with a rechargeable battery, allowing it to carry out tasks autonomously. In addition, the Raspberry Pi 4B has a separate power module that can also be used to power the camera.
A magnetic docking device system has been designed to overcome the limitation of the robot’s four-legged structure in traversing large obstacles or stairs. Illustrated in Figure 3, this system facilitates the connection of multiple sections (three or more) of the four-legged structures. By emulating the crawling form of the centipede, the robot acquires the ability to negotiate higher obstacles. This innovation significantly enhances the robot’s flexibility and responsiveness.
The docking magnets within the docking system are designed as electromagnets, controlled by the Raspberry Pi Pico through the application and removal of electrical current. In the presented centipede robot, these magnetic docking devices serve to provide mechanical coupling, allowing for the transfer of mechanical forces and moments between adjacent robot sections. This ensures a secure and stable connection during movement. Additionally, the Raspberry Pi Pico microcontrollers facilitate the transmission of electrical power and data, enabling synchronized control and communication throughout the entire robot.
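A minimal MicroPython sketch of this current-switching logic is given below; the GPIO pin number and the assumption that the coil is driven through a transistor stage are illustrative, since the drive circuit is not specified here.

```python
# MicroPython on the Raspberry Pi Pico: switch a docking electromagnet.
# Sketch under assumptions -- GPIO 15 and the transistor driver stage
# are hypothetical choices for illustration.
from machine import Pin
import time

magnet = Pin(15, Pin.OUT)  # drives a transistor stage, not the coil directly

def dock():
    """Energize the electromagnet so adjacent sections attract and couple."""
    magnet.value(1)

def undock():
    """Cut the current so the sections separate (self-disconnection)."""
    magnet.value(0)

dock()
time.sleep(1)   # reconnection is reported to complete within about a second
undock()
```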
It is noteworthy that the magnetic docking device does not compromise the robot’s flexible body feature. Instead, it serves to provide mechanical coupling between adjacent robot sections, controlled by the Raspberry Pi Pico via electrical current manipulation. This mechanism ensures the secure and stable connection of the robot sections during movement. The robot’s flexible body feature is achieved through other design considerations, such as the use of joint configurations, enabling the robot to bend and adapt to its environment.
A camera has been installed on the front of the robot, as depicted in Figure 4, to facilitate environmental understanding, terrain surveying, and rescue missions. This camera enables the robot to analyze and recognize its surroundings in real-time, aiding in decision-making processes, such as obstacle avoidance and other responses. By perceiving and analyzing the environment, the robot can accurately identify the location and type of obstacles, thereby adjusting its actions to ensure safe movement.
The design not only emphasizes advanced mechanical structures and control systems but also integrates modern manufacturing and perception technologies. 3D printing enables the precise and efficient creation of complex mechanical parts, streamlining the robot's manufacture. Equipped with a camera and scene recognition software, the robot is endowed with autonomous perception capabilities, enabling it to adapt to various environmental challenges and enhancing its practicality and reliability in applications such as terrain exploration and disaster rescue operations.
The design and research of the multi-section centipede-inspired robot have culminated in a robot characterized by a flat structure and multi-legged maneuverability. Equipped with intelligent obstacle avoidance and scene recognition functions, the robot offers significant academic value and shows great potential for practical applications. It is well suited to terrain exploration and disaster rescue missions, providing safer and more efficient technical support for human endeavors.

2.2. Machine Vision Design

The vision system uses the Raspberry Pi 4B development board as the upper computer for deploying and running inference with the machine vision model. To enable the robot to crawl, avoid obstacles, and recognize objects in the terrain as required, a substantial number of images of the relevant environment are acquired through the Raspberry Pi 4B. The collected images are annotated and fed into the DL model for training and testing on a PC or cloud platform, which interacts remotely with the Raspberry Pi board. The trained model's framework and parameters are then deployed to the Raspberry Pi system, achieving real-time target classification and recognition on the board.
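As an illustration of the acquisition step, the sketch below collects labeled images with OpenCV on the Raspberry Pi 4B; the camera index, the per-class folder layout, and the sample counts are assumptions for this sketch rather than the procedure used in this work.

```python
# Collect labeled training images on the Raspberry Pi 4B with OpenCV.
# A sketch, assuming a V4L2-compatible camera at index 0; the folder
# layout (one directory per class label) is a common convention, not
# something this work prescribes.
import os
import cv2

def capture_samples(label: str, count: int, out_dir: str = "dataset") -> None:
    os.makedirs(os.path.join(out_dir, label), exist_ok=True)
    cam = cv2.VideoCapture(0)
    try:
        for i in range(count):
            ok, frame = cam.read()
            if not ok:
                break  # camera read failed; stop collecting
            cv2.imwrite(os.path.join(out_dir, label, f"{label}_{i:04d}.jpg"), frame)
    finally:
        cam.release()

capture_samples("left", 200)   # scenes where the robot should turn left
capture_samples("right", 200)  # scenes where it should turn right
```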
As illustrated in Figure 5, the MobileNetV3 lightweight classification network, implemented with PyTorch (0.8.2+cu110) [62], was chosen for obstacle avoidance and object recognition in the environment. This framework predominantly consists of Inverted Residual and Linear Bottleneck modules, which reduce computational complexity through smaller output sizes, fewer linear layers, and minimized bottlenecks. It also integrates Squeeze-and-Excitation (SE) modules that weight the significance of each channel, enhancing useful features while suppressing non-essential ones. The network balances small model size and low computational demand with high classification accuracy, making it suitable for deployment on edge computing devices such as embedded systems.
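A minimal PyTorch sketch of adapting MobileNetV3 to this two-class ('left'/'right') task is shown below; the use of torchvision's mobilenet_v3_small variant and a recent torchvision API are assumptions, as the exact variant and training configuration are not detailed here.

```python
# Build a two-class ("left"/"right") MobileNetV3 classifier in PyTorch.
# A sketch: torchvision's mobilenet_v3_small is an assumed backbone choice,
# and the weights=None keyword assumes a recent torchvision release.
import torch
import torch.nn as nn
from torchvision import models

model = models.mobilenet_v3_small(weights=None)  # train from scratch
# Replace the final linear layer so the head outputs two class scores.
in_features = model.classifier[-1].in_features
model.classifier[-1] = nn.Linear(in_features, 2)

x = torch.randn(1, 3, 224, 224)       # one RGB frame, ImageNet-sized
logits = model(x)
probs = torch.softmax(logits, dim=1)  # e.g., [P(left), P(right)]
print(probs)
```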

2.3. Machine Gait Design

First, the servo motor's rotation range was determined through a debugging test. The servo rotates counterclockwise up to a duty value of 8192 (about 95°) and clockwise down to a duty value of 1638 (about 95°), placing the servo's initial position at a duty value of 5012. Note that rotating the servo counterclockwise rotates the foot clockwise, while rotating it clockwise rotates the foot counterclockwise.
Second, adjustments were made for lifting and placing each foot independently. Each foot is controlled by three servo motors, with the top servo governing the foot's left-right swing. Decreasing the servo's duty cycle rotates the foot clockwise, and increasing it rotates the foot counterclockwise.
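The duty values quoted above are consistent with the 16-bit PWM interface of MicroPython on the RP2040; the sketch below positions one joint servo with those values, with the GPIO pin number being a hypothetical choice.

```python
# MicroPython on the Raspberry Pi Pico: position one joint servo by duty cycle.
# Sketch under assumptions -- GPIO 0 is hypothetical; the duty limits
# (1638 clockwise, 8192 counterclockwise, 5012 initial) follow the
# debugging values reported above.
from machine import Pin, PWM

servo = PWM(Pin(0))
servo.freq(50)  # standard 50 Hz hobby-servo frame

def set_position(duty: int) -> None:
    """Clamp to the measured travel limits, then command the servo."""
    servo.duty_u16(max(1638, min(8192, duty)))

set_position(5012)  # initial (neutral) position
```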
Third, the overall gait of the robot was adjusted. Each section of the robot has four feet, labeled feet 1 through 4. Based on observations of the gaits of quadruped robots and animals, a diagonal gait was adopted for forward movement [63]. As shown in Figure 6, in this gait the two feet opposite each other on a diagonal touch the ground simultaneously. During the stance phase, when a foot is on the ground, it moves from front to back; during the swing phase, when it is in the air, it moves from back to front. Because the diagonal pairs are coupled, each foot is on the ground for half of a step cycle. Thus, during forward movement, opposite feet alternate between lifting, rotating forward, and lowering.
In summary, the operation is as follows: first lift feet 1 and 4, rotate foot 1 clockwise and foot 4 counterclockwise, and lower feet 1 and 4; then lift feet 2 and 3, and rotate foot 2 counterclockwise and foot 3 clockwise. This cycle is executed repeatedly. Finally, adjustments were made for backward walking, left turning, and right turning. For backward walking, the clockwise and counterclockwise rotation directions used in forward walking are swapped. For left turning, the rotation direction of feet 1 and 4 is reversed while that of feet 2 and 3 remains unchanged; conversely, for right turning, the rotation direction of feet 2 and 3 is reversed while that of feet 1 and 4 remains unchanged. Through these gait adjustments, as sketched below, the robot can complete basic walking: forward movement, backward movement, left turns, and right turns.
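The following Python sketch summarizes one diagonal-gait cycle; lift, lower, and rotate stand in for the per-joint servo commands described above, and the half-period timing is an illustrative assumption.

```python
# Diagonal-gait step sequence for one four-legged section (sketch).
# lift/lower/rotate are placeholders for the per-joint servo commands
# described above; foot numbering and timing are illustrative.
import time

def step_cycle(lift, lower, rotate, half_period=0.5):
    # Phase 1: diagonal pair (1, 4) swings while pair (2, 3) stands.
    lift(1); lift(4)
    rotate(1, "cw"); rotate(4, "ccw")
    lower(1); lower(4)
    time.sleep(half_period)
    # Phase 2: diagonal pair (2, 3) swings while pair (1, 4) stands.
    lift(2); lift(3)
    rotate(2, "ccw"); rotate(3, "cw")
    lower(2); lower(3)
    time.sleep(half_period)

# Reversing every rotation direction yields backward walking; reversing
# only one diagonal pair produces a left or right turn.
```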

3. System Testing

3.1. Mechanical Structure Design

A series of experiments was conducted to verify the functionality and performance of the multi-segmented centipede robot. First, the robot's travel abilities were tested on a flat surface to assess movement qualities such as speed, stability, and flexibility. The robot demonstrated the ability to advance, retreat, and turn at different speeds, and to traverse both small and large obstacles. The findings show good travel performance and sufficient crossing ability from the servo-driven legs. As depicted in Figure 7, the robot exhibits good adaptability across various terrains.
The functionality of the magnetic docking device system was verified by integrating the robot’s multi-segmented quadrupedal parts to simulate scenarios with large obstacles. The robot’s capability to traverse larger obstacles was assessed to evaluate the effectiveness of the magnetic docking system. Experimental results, as depicted in Figure 8, confirm that the system can achieve multi-segmented connection, enabling the robot to surmount larger obstacles and improve its traversability in complex terrain. Furthermore, walking diagrams of 2-body and 4-body models show that an increase in body length does not impact the robot’s movement speed but improves stability. Notably, as shown in Figure 8, the multi-segmented body can continue walking independently and steadily after disconnection. These findings offer valuable theoretical insights and practical guidelines for employing multi-segmented robots in complex environments.

3.2. Machine Vision Design

The model was trained on 400 images, each assigned a label according to task requirements (e.g., 'left' and 'right'). The model's hyperparameters were tuned over multiple rounds of training. With a training set loss of 0.128 and a test set classification accuracy of 97.5%, the model accurately classifies the current environmental status. After training, the model's framework and parameters were saved for deployment and inference. Figure 9 presents performance evaluation curves for the binary classification model. The model's ability to classify the two categories, 'left' and 'right', was assessed using Precision-Recall (P-R) curves and Receiver Operating Characteristic (ROC) curves. For the 'left' category, the P-R curve shows a near-perfect Average Precision (AP) of 0.99, indicating high precision across all recall levels. Similarly, the ROC curve for the 'left' category shows an Area Under the Curve (AUC) of 0.99, reflecting a high true positive rate coupled with a low false positive rate. The 'right' category performs comparably, with a P-R AP of 0.98 and an ROC AUC of 0.99, indicating outstanding classification accuracy. The step-shaped curves reflect the discrete predictions of the classifier. Overall, the high AP and AUC values in these plots underscore the model's robust discriminatory power across both categories.
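For reference, P-R and ROC summary scores of this kind can be computed as in the brief sketch below; scikit-learn and the toy labels/scores are assumptions for illustration, not the evaluation code used in this work.

```python
# Compute the P-R and ROC summary scores reported for one class (sketch).
# y_true/y_score stand in for test labels and model confidences;
# scikit-learn is an assumed tooling choice, not named in this work.
import numpy as np
from sklearn.metrics import average_precision_score, roc_auc_score

y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0])  # 1 = 'left', 0 = 'right'
y_score = np.array([0.9, 0.2, 0.8, 0.7, 0.4, 0.95, 0.1, 0.3])

print("AP :", average_precision_score(y_true, y_score))  # P-R curve summary
print("AUC:", roc_auc_score(y_true, y_score))            # ROC curve summary
```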
Because the full model framework file is complex, slow at inference, and poorly matched to the Raspberry Pi 4B development environment, the model was converted to ONNX, a universal DL model exchange format. This format stores weights, structural information, the inputs and outputs of each network layer, and other auxiliary details, facilitating deployment and inference on the Raspberry Pi 4B. The conversion improves the model's flexibility, compatibility, and efficiency. Real-time inference on the Raspberry Pi camera stream achieves approximately 15 frames per second with an 80% confidence level on the acquired images, meeting the real-time inference requirements. Classification results for 'left' and 'right' are presented in Table 1; the high precision and recall yield F1 scores exceeding 93% and Average Precision exceeding 98%, affirming the reliability of the classification model.
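A minimal sketch of this export-and-deploy workflow is shown below, assuming onnxruntime as the inference engine on the Raspberry Pi 4B; the file name, input size, and tensor names are illustrative choices.

```python
# Export the trained classifier to ONNX and run it on the Raspberry Pi 4B.
# Sketch under assumptions: the file name, input size, tensor names, and
# the use of onnxruntime as the inference engine are illustrative.
import numpy as np
import torch
import torch.nn as nn
from torchvision import models
import onnxruntime as ort

# --- on the PC / cloud platform: export once after training ---
model = models.mobilenet_v3_small(weights=None)
model.classifier[-1] = nn.Linear(model.classifier[-1].in_features, 2)
model.eval()
dummy = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy, "centipede_cls.onnx",
                  input_names=["image"], output_names=["logits"])

# --- on the Raspberry Pi 4B: lightweight inference per camera frame ---
session = ort.InferenceSession("centipede_cls.onnx")
frame = np.random.rand(1, 3, 224, 224).astype(np.float32)  # stand-in frame
logits = session.run(["logits"], {"image": frame})[0]
print("left/right scores:", logits)
```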
The robot's intelligent obstacle avoidance function was validated through a scene recognition experiment. Images of the surrounding environment, captured by the onboard camera, underwent real-time processing by the computer system, whose real-time analysis identified and located obstacles. Obstacles of various types and sizes were used to test the robot's response and accuracy. The results show successful obstacle avoidance and accurate recognition of different obstacle types, such as walls and furniture, as illustrated in Figure 10.
In each experiment, essential data including the robot’s movement trajectory, response time, and obstacle avoidance accuracy were meticulously recorded. Analysis and comparison of this data were conducted to evaluate the robot’s performance and effectiveness. These findings facilitate further improvements and optimizations.
The design and functionality of the multi-segmented centipede robot underwent rigorous verification through extensive experiments. These tests conclusively showcased the robot’s adept travel capabilities, resilient obstacle-crossing ability, and effective scene understanding. Such experimental validations serve as a crucial foundation for future scientific research and technological advancements, driving forward the development of more sophisticated multi-legged robots.

3.3. Machine Gait Analysis

In robotics, gait design plays a crucial role in enabling robots to traverse diverse terrains with mobility and efficiency. To evaluate the effectiveness of the robot's gait design, experiments were conducted in both smooth and rough environments. A camera captured the entire experimental process, enabling detailed analysis of the robot's motion trajectory, speed, foot lift height, and gait cycle.
Referring to Figure 11, the gait design exhibited commendable stability and adaptability across both experimental terrains. On the smooth surface, the robot followed a stable motion trajectory, achieving a speed of 0.10 m/s, a foot lift height of 0.11 m, and a gait cycle of 1.0 s. On the rough surface, ground protrusions introduced instability in the movement direction and reduced the displacement per cycle, lowering the speed to 0.05 m/s. Despite this, the robot maintained its mobility while keeping a foot lift height of 0.11 m. These findings demonstrate the efficacy of the designed gait in ensuring the robot's mobility across different terrain environments.
Based on the experimental results, an in-depth analysis of gait performance was conducted, yielding detailed data on foot lift height, speed, and gait cycle. These findings serve as a basis for refining the robot's gait design; enhancements may include adjusting the speed or gait cycle to improve performance in specific environments. To ensure stability on rough terrain, the gait was adjusted to lower the foot lift height to 0.02 m, shorten the step length, reduce the gait cycle to 0.5 s, and increase the movement speed to 0.09 m/s.
To sum up, through the refinement of the robot’s gait, it attains stable and efficient movement across various complex environments. These experimental findings serve as valuable benchmarks for guiding future robot design and research.

4. Conclusions and Future Work

Taking cues from the distinctive locomotion and anatomical structure of the centipede, the robot marks a revolutionary addition to the field of rescue robotics. It boasts exceptional stability, optimized gait patterns, remarkable robustness, and cutting-edge intelligence. By blending centipede-inspired adaptability with state-of-the-art robotics technology, this innovation positions the robot as a pioneer in insect-inspired rescue technology, offering unprecedented capabilities and performance in emergency response scenarios. With its unique design, the robot paves the way for future advancements in rescue robotics, rendering it a critical asset in emergency response operations.
Through research and experimental verification, the design and functionality of the multi-segmented centipede robot were successfully validated. In our experiments, the robot exhibited distinct motion characteristics on smooth and rough surfaces. On smooth surfaces, it maintained a stable trajectory, reaching a speed of 0.10 m/s with a foot lift height of 0.11 m and a gait cycle of 1.0 s. On rough terrain, ground protrusions introduced instability, reducing its displacement and lowering its speed to 0.05 m/s. Nevertheless, despite these challenges, the robot retained its mobility and foot lift height.
These results underscore the robot’s adaptability across diverse terrain environments, attributable to its carefully designed gait. Detailed analysis of gait performance yielded valuable insights into foot lift height, speed, and gait cycle, laying the groundwork for future optimizations. Potential improvements could target adjustments in speed or gait cycle to optimize performance for specific environments. To ensure stability on rough terrain, adjustments to the gait were implemented, including lowering the foot lift height to 0.02 m, reducing the step length, shortening the gait cycle to 0.5 s, and increasing the movement speed to 0.09 m/s. This robot, characterized by its flat structure and multi-legged maneuverability, is additionally equipped with scene-understanding capabilities. Its applicability broadens its potential applications to terrain exploration and disaster rescue, and beyond.
The results demonstrate the robot’s adaptability to distinct terrain environments due to its gait design. The gait optimization strategies discussed, including adjusting speed, gait cycle, and foot lift height, offer potential improvements for enhancing performance in specific scenarios.
During the design and experimentation process, modern manufacturing and perception technologies, such as 3D printing and cameras, were fully utilized. These technologies were crucial to the manufacture and operation of the robot, significantly enhancing manufacturing efficiency and accuracy. Moreover, the robot was equipped with autonomous perception, enabling it to respond effectively to environmental challenges. In environmental comprehension tasks, the robot is capable of processing and analyzing environmental data in real time at approximately 15 frames per second, while maintaining a confidence level of 80%, an F1 score above 93%, and an average precision exceeding 98%. These metrics highlight the robot's reliability and efficiency in environmental understanding tasks.
The findings of this study have direct implications for the field of robotics, particularly for the development of mobile robots capable of navigating distinct terrains. The robot's performance characteristics and gait optimization strategies can inform future research on the design and development of similar systems.
However, the current study has several limitations that need to be addressed in future research. Firstly, the evaluation was conducted in controlled environments where predefined terrain types were considered; moreover, the robot was not exposed to a wide range of environmental variations.
Therefore, future work should evaluate the robot's performance in more diverse and unstructured environments, including varying terrain roughness, obstacles, and dynamic changes. Secondly, the study focused primarily on kinematic performance metrics, including speed, foot lift height, and gait cycle. While these metrics provide valuable insight into the robot's locomotor capabilities, they do not capture its sensorimotor control, decision-making, or autonomous navigation abilities. Hence, future research should evaluate these aspects to gain a more comprehensive understanding of the robot's performance.
Furthermore, the study did not consider the robot’s energy efficiency, durability, or maintenance requirements. These factors are crucial for practical applications, specifically in scenarios where the robot needs to operate continuously for extended periods. Thus, future work should incorporate these metrics into the evaluation framework to assess the robot’s performance more comprehensively.
Finally, the multi-segmented centipede robot, with its demonstrated adaptability and performance, has potential applications in various fields, including search and rescue operations, exploration of remote or inaccessible areas, and environmental monitoring tasks. Its ability to maintain mobility across different terrain environments makes it a promising candidate for such applications.

Author Contributions

Conceptualization, Q.C. and J.H.; methodology, J.H.; software, Y.L. and Z.X.; validation, Y.L. and Z.X.; formal analysis, J.H. and Y.S.; writing—original draft preparation, Y.Z. and Y.S.; writing—review and editing, J.H. and Q.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by the 2023 Innovation and Entrepreneurship Training Program for College Students at Beijing Information Science and Technology University (S202311232518), as well as the National Natural Science Foundation of China (62103056 and 62276028).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Chong, B.; Aydin, Y.O.; Rieser, J.M.; Sartoretti, G.; Wang, T.; Whitman, J.; Kaba, A.; Aydin, E.; McFarland, C.; Cruz, K.D.; et al. A general locomotion control framework for multi-legged locomotors. Bioinspiration Biomim. 2022, 17, 046015. [Google Scholar] [CrossRef] [PubMed]
  2. Kuroda, S.; Uchida, N.; Nakagaki, T. Gait switching with phase reversal of locomotory waves in the centipede Scolopocryptops rubiginosus. Bioinspir. Biomim. 2022, 17, 026005. [Google Scholar] [CrossRef] [PubMed]
  3. Bulichev, O.; Klimchik, A. Concept Development of Biomimetic Centipede Robot StriRus. In Proceedings of the 2018 23rd Conference of Open Innovations Association (FRUCT), Bologna, Italy, 13–16 November 2018; pp. 85–90. [Google Scholar]
  4. Lim, K.B.; Kim, S.J.; Yoon, Y.S. Deliberative Planner for UGV with Actively Articulated Suspension to Negotiate Geometric Obstacles by Using Centipede Locomotion Pattern. In Proceedings of the ICCAS 2010, Gyeonggi-do, Republic of Korea, 27–30 October 2010; pp. 1482–1486. [Google Scholar]
  5. Wright, M.; Xiao, Q.; Dai, S.; Post, M.; Yue, H.; Sarkar, B. Design and development of modular magnetic bio-inspired autonomous underwater robot–MMBAUV. Ocean Eng. 2023, 273, 113968. [Google Scholar] [CrossRef]
  6. Kashiwada, S.; Ito, K. Proposal of Semiautonomous Centipede-Like Robot for Rubbles. In Proceedings of The Seventeenth International Symposium on Artificial Life and Robotics, Oita, Japan, 19–21 January 2012; pp. 1127–1130. [Google Scholar]
  7. Homchanthanakul, J.; Manoonpong, P. Proactive body joint adaptation for energy-efficient locomotion of bio-inspired multi-segmented robots. IEEE Robot. Autom. Lett. 2023, 8, 904–911. [Google Scholar] [CrossRef]
  8. Jamisola, R.S.; Mastalli, C. Bio-Inspired Holistic Control through Modular Relative Jacobian for Combined Four-Arm Robots. In Proceedings of the 2017 18th International Conference on Advanced Robotics (ICAR), Hong Kong, China, 10–12 July 2017; pp. 346–352. [Google Scholar]
  9. Inagaki, S.; Niwa, T.; Suzuki, T. Navigation control and walking control on uneven terrain for centipede-like multi-legged robot based on fcp gait control. In Emerging Trends in Mobile Robotics; World Scientific Publishing: Hackensack, NJ, USA, 2010; pp. 656–663. [Google Scholar]
  10. Inagaki, S.; Niwa, T.; Suzuki, T. Follow-the-contact-point gait control of centipede-like multi-legged robot to navigate and walk on uneven terrain. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010; pp. 5341–5346. [Google Scholar]
  11. Wang, X.Q.; Chan, K.H.; Cheng, Y.; Ding, T.; Li, T.; Achavananthadith, S.; Ahmet, S.; Ho, J.S.; Ho, G.W. Somatosensory, light-driven, thin-film robots capable of integrated perception and motility. Adv. Mater. 2020, 32, 2000351. [Google Scholar] [CrossRef] [PubMed]
  12. Ozkan-Aydin, Y.; Chong, B.; Aydin, E.; Goldman, D.I. A Systematic Approach to Creating Terrain-Capable Hybrid Soft/Hard Myriapod Robots. In Proceedings of the 2020 3rd IEEE International Conference on Soft Robotics (RoboSoft), New Haven, CT, USA, 15 May–15 July 2020; pp. 156–163. [Google Scholar]
  13. Matthey, L.; Righetti, L.; Ijspeert, A.J. Experimental Study of Limit Cycle and Chaotic Controllers for the Locomotion of Centipede Robots. In Proceedings of the 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, Nice, France, 22–26 September 2008; pp. 1860–1865. [Google Scholar]
  14. Xu, R.; Xu, Q. Design of a Bio-Inspired Untethered Soft Octopodal Robot Driven by Magnetic Field. Biomimetics 2023, 8, 269. [Google Scholar] [CrossRef] [PubMed]
  15. Zhang, F.; Xiong, F.; Liu, Z. Learning Individual Features to Decompose State Space for Robotic Skill Learning. In Proceedings of the 2020 Chinese Control and Decision Conference (CCDC), Hefei, China, 22–24 August 2020; pp. 3169–3174. [Google Scholar]
  16. Coskuner-Weber, O.; Yuce-Erarslan, E.; Uversky, V.N. Paving the Way for Synthetic Intrinsically Disordered Polymers for Soft Robotics. Polymers 2023, 15, 763. [Google Scholar] [CrossRef]
  17. Chesnitskiy, A.V.; Gayduk, A.E.; Seleznev, V.A.; Prinz, V.Y. Bio-Inspired Micro-and Nanorobotics Driven by Magnetic Field. Materials 2022, 15, 7781. [Google Scholar] [CrossRef]
  18. Gomez-Tamm, A.E.; Ramon-Soria, P.; Arrue, B.; Ollero, A. Current State and Trends on Bioinspired Actuators for Aerial Manipulation. In Proceedings of the 2019 Workshop on Research, Education and Development of Unmanned Aerial Systems (RED UAS), Cranfield, UK, 25–27 November 2019; pp. 352–361. [Google Scholar]
  19. Liu, Y.; Li, J.; Deng, J.; Zhang, S.; Chen, W.; Xie, H.; Zhao, J. Arthropod-metamerism-inspired resonant piezoelectric millirobot. Adv. Intell. Syst. 2021, 3, 2100015. [Google Scholar] [CrossRef]
  20. Duan, S.; Shi, Q.; Wu, J. Multimodal Sensors and ML-Based Data Fusion for Advanced Robots. Adv. Intell. Syst. 2022, 4, 2200213. [Google Scholar] [CrossRef]
  21. Zazoum, B.; Batoo, K.M.; Khanm, M.A.A. Recent advances in flexible sensors and their applications. Sensors 2022, 22, 4653. [Google Scholar] [CrossRef]
  22. Úbeda, A.; Torres, F.; Puente, S.T. Assistance Robotics and Biosensors 2019. Sensors 2020, 20, 1335. [Google Scholar] [CrossRef]
  23. Wang, C.; Dong, L.; Peng, D.; Pan, C. Tactile sensors for advanced intelligent systems. Adv. Intell. Syst. 2019, 1, 1900090. [Google Scholar] [CrossRef]
  24. Torres, F.; Puente, S.T.; Úbeda, A. Assistance Robotics and Biosensors. Sensors 2018, 18, 3502. [Google Scholar] [CrossRef]
  25. Obute, S.O.; Kilby, P.; Dogar, M.R.; Boyle, J.H. Swarm foraging under communication and vision uncertainties. IEEE Trans. Autom. Sci. Eng. 2022, 19, 1446–1457. [Google Scholar] [CrossRef]
  26. Sayed, M.E.; Roberts, J.O.; Donaldson, K.; Mahon, S.T.; Iqbal, F.; Li, B.; Aixela, S.F.; Mastorakis, G.; Jonasson, E.T.; Nemitz, M.P.; et al. Modular robots for enabling operations in unstructured extreme environments. Adv. Intell. Syst. 2022, 4, 2000227. [Google Scholar] [CrossRef]
  27. Xiao, Z.; Wang, X.; Huang, J.; Hong, L. D-World: Decay Small-World for Optimizing Swarm Knowledge Synchronization. IEEE Access 2022, 10, 60060–60077. [Google Scholar] [CrossRef]
  28. Zhou, Y.; Chen, A.; He, X.; Bian, X. Multi-Target Coordinated Search Algorithm for Swarm Robotics Considering Practical Constraints. Front. Neurorobotics 2021, 15, 753052. [Google Scholar] [CrossRef] [PubMed]
  29. Du, Y. A novel approach for swarm robotic target searches based on the DPSO algorithm. IEEE Access 2020, 8, 226484–226505. [Google Scholar] [CrossRef]
  30. Qiao, Z.; Zhang, J.; Qu, X.; Xiong, J. Dynamic self-organizing leader-follower control in a swarm mobile robots system under limited communication. IEEE Access 2020, 8, 53850–53856. [Google Scholar] [CrossRef]
  31. Fujisawa, R.; Dobata, S.; Sugawara, K.; Matsuno, F. Designing pheromone communication in swarm robotics: Group foraging behavior mediated by chemical substance. Swarm Intell. 2014, 8, 227–246. [Google Scholar] [CrossRef]
  32. Aoi, S.; Yabuuchi, Y.; Morozumi, D.; Okamoto, K.; Adachi, M.; Senda, K.; Tsuchiya, K. Maneuverable and Efficient Locomotion of a Myriapod Robot with Variable Body-Axis Flexibility via Instability and Bifurcation. Soft Robot. 2023, 10, 1028–1040. [Google Scholar] [CrossRef] [PubMed]
  33. Yasui, K.; Takano, S.; Kano, T.; Ishiguro, A. Simple Reactive Head Motion Control Enhances Adaptability to Rough Terrain in Centipede Walking. In Proceedings of the Conference on Biomimetic and Biohybrid Systems, Virtual, 19–22 July 2022; Springer International Publishing: Cham, Switzerland, 2022; pp. 262–266. [Google Scholar]
  34. Ambe, Y.; Aoi, S.; Tsuchiya, K.; Matsuno, F. Generation of direct-, retrograde-, and source-wave gaits in multi-legged locomotion in a decentralized manner via embodied sensorimotor interaction. Front. Neural Circuits 2021, 15, 706064. [Google Scholar] [CrossRef] [PubMed]
  35. Bulichev, O.; Klimchik, A.; Mavridis, N. Optimization of Centipede Robot Body Designs through Evolutionary Algorithms and Multiple Rough Terrains Simulation. In Proceedings of the 2017 IEEE International Conference on Robotics and Biomimetics (ROBIO), Macau, Macao, 5–8 December 2017; pp. 290–295. [Google Scholar]
  36. Aoi, S.; Tanaka, T.; Fujiki, S.; Funato, T.; Senda, K.; Tsuchiya, K. Advantage of straight walk instability in turning maneuver of multilegged locomotion: A robotics approach. Sci. Rep. 2016, 6, 30199. [Google Scholar] [CrossRef] [PubMed]
  37. Koh, D.; Yang, J.; Kim, S. Centipede Robot for Uneven Terrain Exploration: Design and Experiment of the Flexible Biomimetic Robot Mechanism. In Proceedings of the 2010 3rd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics, Tokyo, Japan, 26–29 September 2010; pp. 877–881. [Google Scholar]
  38. Hoffman, K.L.; Wood, R.J. Robustness of Centipede-Inspired Millirobot Locomotion to Leg Failures. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 1472–1479. [Google Scholar]
  39. Mo, X.; Ge, W.; Ren, Y.; Zhao, D.; Wei, D.; Romano, D. Locust-inspired jumping mechanism design and improvement based on takeoff stability. J. Mech. Robot. 2024, 16, 061013. [Google Scholar] [CrossRef]
  40. Romano, D.; Benelli, G.; Kavallieratos, N.G.; Athanassiou, C.G.; Canale, A.; Stefanini, C. Beetle-robot hybrid interaction: Sex, lateralization and mating experience modulate behavioural responses to robotic cues in the larger grain borer Prostephanus truncatus (Horn). Biol. Cybern. 2020, 114, 473–483. [Google Scholar] [CrossRef]
  41. Wu, Y.; Yim, J.K.; Liang, J.; Shao, Z.; Qi, M.; Zhong, J.; Luo, Z.; Yan, X.; Zhang, M.; Wang, X.; et al. Insect-scale fast moving and ultrarobust soft robot. Sci. Robot. 2019, 4, eaax1594. [Google Scholar] [CrossRef]
  42. Li, T.; Zou, Z.; Mao, G.; Yang, X.; Liang, Y.; Li, C.; Qu, S.; Suo, Z.; Yang, W. Agile and resilient insect-scale robot. Soft Robot. 2019, 6, 133–141. [Google Scholar] [CrossRef]
  43. Vo-Doan, T.T.; Dung, V.T.; Sato, H. A cyborg insect reveals a function of a muscle in free flight. Cyborg Bionic Syst. 2022, 2022, 9780504. [Google Scholar] [CrossRef]
  44. Poon, K.C.; Tan, D.C.L.; Li, Y.; Cao, F.; Vo Doan, T.T.; Sato, H. Cyborg Insect: Insect Computer Hybrid Robot. In Electrochemical Society Meeting Abstracts 230; The Electrochemical Society, Inc.: Pennington, NJ, USA, 2016; p. 3221. [Google Scholar] [CrossRef]
  45. Romano, D.; Bloemberg, J.; Tannous, M.; Stefanini, C. Impact of aging and cognitive mechanisms on high-speed motor activation patterns: Evidence from an orthoptera-robot interaction. IEEE Trans. Med. Robot. Bionics 2020, 2, 292–296. [Google Scholar] [CrossRef]
  46. Romano, D.; Benelli, G.; Stefanini, C. How aggressive interactions with biomimetic agents optimize reproductive performances in mass-reared males of the Mediterranean fruit fly. Biol. Cybern. 2023, 117, 249–258. [Google Scholar] [CrossRef] [PubMed]
  47. Butail, S.; Abaid, N.; Macrì, S.; Porfiri, M. Fish–robot interactions: Robot fish in animal behavioral studies. Robot Fish Bio-Inspired Fishlike Underw. Robot. 2015, 359–377. [Google Scholar]
  48. Abaid, N.; Bartolini, T.; Macrì, S.; Porfiri, M. Zebrafish responds differentially to a robotic fish of varying aspect ratio, tail beat frequency, noise, and color. Behav. Brain Res. 2012, 233, 545–553. [Google Scholar] [CrossRef] [PubMed]
  49. de Croon, G.C.H.E.; Dupeyroux, J.J.G.; Fuller, S.B.; Marshall, J.A.R. Insect-inspired AI for autonomous robots. Sci. Robot. 2022, 7, eabl6334. [Google Scholar] [CrossRef] [PubMed]
  50. Arunkumar, V.; Rajasekar, D.; Aishwarya, N. A Review Paper on Mobile Robots Applications in Search and Rescue Operations. Adv. Sci. Technol. 2023, 130, 65–74. [Google Scholar]
  51. Guan, J.; Su, Y.; Su, L.; Sivaparthipan, C.B.; Muthu, B. Bio-inspired algorithms for industrial robot control using deep learning methods. Sustain. Energy Technol. Assess. 2021, 47, 101473. [Google Scholar] [CrossRef]
  52. Lewis, F.L.; Ge, S.S. (Eds.) Autonomous Mobile Robots: Sensing, Control, Decision Making and Applications; CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
  53. Iida, F.; Nurzaman, S.G. Adaptation of sensor morphology: An integrative view of perception from biologically inspired robotics perspective. Interface Focus 2016, 6, 20160016. [Google Scholar] [CrossRef]
  54. Yasui, K.; Sakai, K.; Kano, T.; Owaki, D.; Ishiguro, A. Decentralized control scheme for myriapod robot inspired by adaptive and resilient centipede locomotion. PLoS ONE 2017, 12, e0171421. [Google Scholar] [CrossRef]
  55. Sheynkman, D.; Wang, C.; Lawhorn, R.; Lu, L. Actuator Design and Whole-Body Dynamics of Micro Centipede Robots. In Proceedings of the Dynamic Systems and Control Conference, Minneapolis, MN, USA, 12–14 October 2016; American Society of Mechanical Engineers: New York, NY, USA, 2016; Volume 50695, p. V001T03A002. [Google Scholar]
  56. Lamperti, R.D.; de Arruda, L.V.R. Distributed strategy for communication between multiple robots during formation navigation task. Robot. Auton. Syst. 2023, 169, 104509. [Google Scholar] [CrossRef]
  57. Ito, K.; Ishigaki, Y. Semiautonomous centipede-like robot for rubble-development of an actual scale robot for rescue operation. Int. J. Adv. Mechatron. Syst. 2015, 6, 75–83. [Google Scholar] [CrossRef]
  58. Huang, X.; Arvin, F.; West, C.; Watson, S.; Lennox, B. Exploration in Extreme Environments with Swarm Robotic System. In Proceedings of the 2019 IEEE International Conference on Mechatronics (ICM), Ilmenau, Germany, 18–20 March 2019; Volume 1, pp. 193–198. [Google Scholar]
  59. Qin, Z.; Wu, Y.-T.; Eizad, A.; Lyu, S.-K.; Lee, C.-M. Advancement of mechanical engineering in extreme environments. Int. J. Precis. Eng. Manuf.-Green Technol. 2021, 8, 1767–1782. [Google Scholar] [CrossRef]
  60. Tan, N.; Sun, Z.; Mohan, R.E.; Brahmananthan, N.; Venkataraman, S.; Sosa, R.; Wood, K. A system-of-systems bio-inspired design process: Conceptual design and physical prototype of a reconfigurable robot capable of multi-modal locomotion. Front. Neurorobotics 2019, 13, 78. [Google Scholar] [CrossRef] [PubMed]
  61. Rastgar, H.; Naeimi, H.R.; Agheli, M. Characterization, validation, and stability analysis of maximized reachable workspace of radially symmetric hexapod machines. Mech. Mach. Theory 2019, 137, 315–335. [Google Scholar] [CrossRef]
  62. Howard, A.; Sandler, M.; Chen, B.; Wang, W.; Chen, L.-C.; Tan, M.; Chu, G.; Vasudevan, V.; Zhu, Y.; Pang, R.; et al. Searching for MobileNetV3. In Proceedings of the 2019 IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Republic of Korea, 27 October–2 November 2019; pp. 1314–1324. [Google Scholar] [CrossRef]
  63. Chong, B.; He, J.; Soto, D.; Wang, T.; Irvine, D.; Blekherman, G.; Goldman, D.I. Multilegged matter transport: A framework for locomotion on noisy landscapes. Science 2023, 380, 509–515. [Google Scholar] [CrossRef]
Figure 1. SOLIDWORKS simulation diagram of the four-legged robot frame.
Figure 2. Schematic diagram of the internal structure of a single-segment quadrupedal robot.
Figure 3. Display of the magnetic docking device: (a) without power, the magnets separate; (b) when energized, the magnets attract each other.
Figure 4. Overall structural design of the robot. When the robot encounters an obstacle to its right front, the camera transmits image information; after classification by the model on the Raspberry Pi 4B, the forward direction is judged as "left" and the signal is transmitted to the controller, enabling the robot to turn left and continue moving forward. If an obstacle is encountered on the left side, the robot turns right.
Figure 5. MobileNetV3 lightweight classification network.
Figure 6. The gait design process, with time annotated in the upper right corner of each image. There are three servos on a single leg.
Figure 7. Motion process records in different environments: (a) uneven ground and (b) smooth surface.
Figure 8. Experimental results of the multi-segmented robot, showing frame images extracted from videos with the walking path annotated in blue; the yellow box marks the docking device. (a) Frame images of two body sections walking along the end-to-end docking route; (b) frame images of four body sections walking along the end-to-end docking route; (c) video stills of the two-section body separating, with a magnified view at the top right corner.
Figure 9. Performance evaluation curves for binary classification, where (a–d) show, respectively, the Precision-Recall (P-R) and Receiver Operating Characteristic (ROC) curves for the 'left' and 'right' categories.
Figure 10. Scene recognition distinguishing left from right turns at 15 frames per second: (a) the left-turn score is 0.288 and the right-turn score is 0.712, so a right turn is judged; (b) the left-turn score is 0.723 and the right-turn score is 0.277, so a left turn is judged.
Figure 11. Screenshots of one gait cycle: (a) gait map on non-smooth ground; (b) gait map on smooth ground; (c) trajectory chart after gait adjustment. Because the adjusted gait period is short, changes within the gait-cycle chart are not obvious, so a 2.5 s motion capture is shown.
Table 1. Assessment of classification precision and recall (%).

Class | Recall | F1 Score | Average Precision | Precision
left  | 92.75  | 94.81    | 99.21             | 96.97
right | 96.15  | 93.46    | 98.20             | 90.91