Article

Design of a Mobile Assisting Robot for Blind and Elderly People

1 Department of Mechanical Engineering, Universidad Carlos III de Madrid, Avda. de la Universidad 30, 28911 Leganés, Spain
2 Department of Industrial Engineering, Università di Roma Tor Vergata, Via del Politecnico 1, 00133 Roma, Italy
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(19), 10474; https://doi.org/10.3390/app151910474
Submission received: 19 July 2025 / Revised: 20 September 2025 / Accepted: 25 September 2025 / Published: 27 September 2025
(This article belongs to the Special Issue Application of Computer Science in Mobile Robots, 3rd Edition)

Abstract

This paper presents the design, development, and experimental evaluation of a hybrid wheel–leg guide robot intended to assist blind and elderly people with mobility tasks indoors and outdoors. The design requirements are derived from an analysis of safety, usability, and affordability needs for assisting devices. The resulting design consists of a compact platform with two front leg–wheel assemblies and three additional wheels, two of which are motorized, arranged in a triangular configuration that provides stable support and reliable traction. The proposed locomotion system is innovative because existing guide robots typically rely exclusively on either wheels or legs. In contrast, this hybrid configuration combines the energy efficiency of wheeled locomotion with the capability of leg-assisted stepping, enabling improved terrain adaptability. Experiments with a prototype were carried out in indoor environments, including straight-line motion, turning, and obstacle-overcoming tests. The prototype, with a total weight of 1.9 kg and a material cost of 255 euros, maintained stable movement and achieved a 100% success rate for obstacles up to 30 mm, with partial success up to 40 mm. Additional test results indicate an average cruising speed of 0.1 m/s, and a practical endurance of 4.5–5 h. The proposed design aims to contribute to the development of more inclusive, efficient, and user-centered robotic solutions, promoting greater autonomy and quality of life for blind and elderly people.

1. Introduction

According to the World Health Organization (WHO), at least 2.2 billion people have some form of visual impairment, either near or far [1]. Population growth and aging are expected to increase the risk of visual impairment for more and more people. Between 2015 and 2050, the proportion of the world’s population over the age of 60 will nearly double from 12% to 22% [2]. Therefore, there is a great need to develop robotic systems that can help blind and elderly people to be independent and feel autonomous.
Eighty percent of the information we need for our daily lives comes from our eyes. This means that most of the skills we possess, the knowledge we acquire, and the activities we perform are based on visual information. Vision is fundamental to the autonomy and development of every human being. For blind people, the white cane is today the most effective and widely used mobility tool. This mechanical device detects obstacles on the ground, uneven surfaces, holes, steps, and other hazards. However, this device does not provide the information necessary to determine how to reach a destination from a given position and orientation. Similarly, guide dogs are a very valuable support to facilitate the mobility of blind people, as they not only help them to navigate, but also provide emotional support. However, their use is limited by economic factors, the availability of trained dogs, and the effort required to care for and train them. In response to these limitations, several technological solutions have been proposed, such as smart canes with ultrasonic sensors, smart glasses, haptic devices, and robotic guides. These tools can warn of obstacles at different heights and can improve awareness of the environment. However, many of these technologies still have limitations in terms of accessibility, cost, and ease of use. The three-wheeled RG-I robot proposed in [3] can navigate indoors efficiently thanks to its laser rangefinder and radio frequency identification (RFID) sensors. In [4], the GuideCane robot is presented as a lightweight design with which users can interact. The robot incorporates ultrasonic sensors that detect obstacles and provide a safe path. In [5], the robotic system is equipped with a visual sensor, a laser rangefinder, and a speaker to provide blind people with information about their environment. Clustering techniques analyze the laser data to detect obstacles, steps, and stairs. In [6], the iPath intelligent guidance robot is presented. It has the capability to follow a marked path on the floor and detect user movements and obstacles using infrared and ultrasonic sensors. In [7], the Co-Robotic Cane, a new robotic navigation aid that uses a three-dimensional camera to estimate position and recognize objects in indoor environments, is presented. In [8], a robot is designed to help blind people navigate unfamiliar terrain safely and independently. It uses ultrasonic, LiDAR, and infrared sensors, as well as a camera system, to perceive and analyze its dynamic environment. The robot is controlled by voice commands from a smartphone application. Other related designs are proposed in [9,10,11,12]. The use of legged robots to guide blind people has been limited due to the complexity of their control systems, resulting in a paucity of literature in this area. As described in [13], the HexGuide is a hexapod guide robot designed to help blind people navigate challenging environments, such as airports and intersections. It accomplishes this using an inertial measurement unit (IMU), a depth camera, a LiDAR system, and voice commands. The Mini Cheetah quadruped robot [14], used in [15], incorporates a LiDAR sensor for localization and a depth camera to detect obstacles and guide an assisted person using a leash through confined indoor spaces. Similarly, the Laikago quadruped robot system [16] is used in [17], incorporating a LiDAR, a camera, and a controllable traction device.
This device is capable of adjusting the length and strength of the interaction between the robot and the blind person. This ensures safe and comfortable assistance in confined indoor environments. Nippon Seiko Kabushiki-gaisha (NSK) developed a robotic guide dog with wheels at the end of each leg. It uses these wheels to move around and its mechanical legs to climb obstacles, such as stairs. A distance image sensor on its head allows it to detect steps, and proximity sensors on its legs provide additional information [18]. In [19], a mobile guide robot designed for outdoor environments is presented. The robot plans its path using data from ultrasonic radar and IMU sensors. Its vision system identifies obstacles and their distance using a deep learning and binocular ranging algorithm, then informs the blind person via voice. In [20], a wheeled hexapod robot with the capability to detect traffic lights and adjust its speed according to the distance between itself and the blind person is proposed. In terms of commercial solutions, Lysa [21] is the first robotic guide dog designed specifically to assist blind people. It has two motors, eight infrared sensors, a LiDAR sensor, and a camera to detect potholes, obstacles, and collision risks at different heights. In addition, Lysa provides warnings through recorded voice messages and can analyze the environment to find safe alternative routes. Other solutions that have been developed to assist blind people in different environments are wearable devices, such as those presented in [22,23,24].
Global demographic change is a reality. As a result, the number of older people is increasing in both developed and developing countries. As this population segment grows, so does the demand for health and social services, such as elderly care. This raises a number of social and economic issues. Many elderly people prefer to live alone rather than with family members who could care for them. Robots can improve elderly care by providing assistance, companionship, and monitoring services. Several types of service robots have been designed to assist the elderly in their daily lives, improve their quality of life, and ensure their autonomy. Companion and emotional support robots have been designed. For example, Paro is shaped like a seal and responds to touch and voice. Paro is designed to have three types of effects: psychological (e.g., relaxation and motivation), physiological (e.g., improved vital signs), and social (e.g., facilitating communication between patients and caregivers) [25]. There are also home assistance robots. For example, Roomba is an autonomous vacuum cleaner that helps keep homes clean without the intervention of elderly people [26]. Health monitoring and medication reminder robots are also available. One example is Mabu, an interactive robot that reminds users to take their medication and collects health information to share with caregivers or doctors [27]. Other solutions are mobility and physical support robots, such as LEA (Lean Empowering Assistant) [28]. LEA is specifically designed to help the elderly and people with reduced mobility move around and assist them in becoming more independent. Telepresence robots, such as the Beam robot, allow family members to interact with elderly people via videoconferencing, giving them a sense of companionship [29]. There are also cognitive assistance robots, such as Elli-Q [30]. Elli-Q remembers tasks, helps in routines, and suggests activities to promote a healthy lifestyle, such as reading or walking. Emergency assistance robots have also been proposed. For example, Temi is designed to detect falls and send alerts to family members or emergency services [31]. There are several rehabilitation robots available. One example is the Armeo, which is specifically designed to assist in the rehabilitation of the arms and hands [32]. It is particularly useful for elderly people who have suffered a stroke, had neurological injuries, or had a condition that affects the mobility of their upper extremities. Other robots designed to assist the elderly with movement include Kompaï Assist [33], Care-O-bot [34], and Zora [35]. The Kompaï robot provides multiple services, including day and night surveillance, mobility assistance, fall detection, shopping list management, agenda organization, social connectivity, cognitive stimulation, and health monitoring [36]. It features speech recognition and can navigate autonomously in unfamiliar environments while detecting obstacles and assessing risks. Its user interface combines touchscreen controls with voice commands, and it has a handle to make standing up easier. On the other hand, the Care-O-bot is a versatile robotic system that can perform various functions, such as picking up and delivering objects, supporting independent living, assisting in security and surveillance, and welcoming and accompanying people in public spaces, like shops and museums [37]. Similar to Kompaï, Care-O-bot allows interaction via touch screen and voice commands and has a handle for ease of use. 
Finally, Zora is a humanoid robot equipped with sensors for visual and auditory perception. It is mainly used in rehabilitation environments and for social and leisure activities for elderly people [38].
However, the use of current guide robots is limited by several factors. The high cost resulting from advanced technologies, such as sensors, artificial intelligence (AI), and specialized materials, limits their accessibility. In addition, human-robot interaction is a barrier, as many blind and elderly people may lack experience with advanced technological devices, which hinders their use and acceptance. The configuration and customization of these robots often require specialized technical assistance, which increases the difficulty of their implementation. On the other hand, the mobility and navigation capabilities of these robots are limited, especially in dynamic, unstructured, or crowded environments. These factors highlight the need to improve their accessibility, to reduce their cost, and to facilitate their interaction with users so that these robots can have a real impact on the daily lives of blind and elderly people once proper user-oriented operation modes are established.
This paper presents the development of a guide robot designed to assist blind and elderly people in their mobility. The need for assistive solutions for these user groups is analyzed, with key requirements including safe navigation, reliable human–robot interaction, adaptability to environmental variability, and affordability. Existing solutions often rely on either purely wheeled or purely legged locomotion, which can limit terrain adaptability or energy efficiency.
Based on an analysis of these requirements, an innovative guide robot was designed featuring a hybrid locomotion system. The initial prototype consisted of a hexagonal platform with four symmetrically arranged wheels (two motorized at the rear and two passive) and two front legs equipped with linear actuators and wheels. This configuration aims to combine the efficiency of wheeled locomotion with the adaptability of leg-assisted motion to handle minor unevenness and improve user safety.
Following the fabrication of the first prototype, preliminary tests were performed in indoor environments, including straight-line displacement and obstacle-overcoming trials. These tests revealed several limitations related to load distribution and stability, such as excessive loading of the central wheels, loss of platform horizontality, and insufficient rear wheel traction. To address these issues, the robot was structurally redesigned. The revised prototype adopts a triangular stability configuration with a single central wheel and dual rear wheels, resulting in improved static and dynamic stability and better overall load distribution.
This paper describes the design process and experimental evaluation of the prototype, as well as the insights gained from the redesign. The results demonstrate the feasibility of the hybrid locomotion approach and lay the groundwork for further development toward a fully functional assistive guide robot.
The manuscript is organized as follows. Section 2 discusses the problems and requirements for the design of a guide robot for blind and elderly people. It also presents the conceptual design and modeling using CAD tools. Section 3 presents the results of the design and evaluation of the prototype. Section 4 provides a critical discussion of these results, analyzing their significance and possible implications. Finally, Section 5 presents the conclusions of the work.

2. Materials and Methods

2.1. Requirements for Design and Operation

To develop an effective solution, it is necessary to analyze the issues and requirements. The design of a guide robot for blind and elderly people requires consideration of several aspects, as pointed out in [39]. One of the main issues is the integration of advanced sensors that allow the robot to detect obstacles, terrain changes, and dynamic situations in real time. These sensors must provide reliable data that allow the robot to react appropriately to different environmental scenarios, thus ensuring the user’s safety while moving. In addition, each user has different needs, preferences, and abilities, so the robot must be able to adapt and provide a personalized experience. Safety and autonomy are fundamental factors for blind and elderly people to be able to move independently and confidently in their environment. In addition, a guide robot must provide sufficient autonomy to travel long distances without the need for frequent battery recharging. The mechanical design of the robot is also important. A bulky or overly heavy robot could make it difficult to transport and use, becoming a barrier rather than a solution. In addition, it must be robust and functional in a variety of environmental conditions. Another basic feature is that the robot should be available at an affordable cost. Finally, social acceptance is a determining factor for the adoption of the robot. It is essential to consider how assisted individuals perceive their interactions with robots and how their confidence in the system influences their willingness to use it in their daily lives. Figure 1 summarizes the issues to consider when designing a guide robot for blind and elderly people.
In terms of requirements, the robot must be user-friendly, which implies a natural and accessible interaction with users. It must also be adaptable to the individual needs of each user. Its operation must be efficient and perform its primary function of assisting mobility with precision, even in changing environments. It must also ensure user safety and prevent accidents by protecting the user from potential hazards in the environment. It must also be reliable, ensuring consistent and robust operation at all times. Finally, it must be intuitive with a simple user-friendly interface. Figure 2 summarizes those requirements to be considered when designing a guide robot for the elderly and blind.
The design of a guide robot must also take into account key features such as mechanical structure, locomotion systems, drive and propulsion systems, payload, motion conditions, and gait. The robot must be equipped with advanced technologies to effectively and safely guide blind and elderly people. Sensors such as cameras, LiDAR, and ultrasonic and infrared detectors enable it to map the environment, locate the user, and detect obstacles in real time so that safe routes can be planned accurately. The interaction between a user and a robot must be intuitive, based on voice commands and clear auditory signals. The robot must be able to provide spoken navigation instructions, environmental descriptions, and safety warnings, and must understand voice commands. The robot must be stable, and its design must include strong, lightweight materials that make it portable. The battery must have sufficient autonomy for extended use. In addition, it is essential to implement an AI system that allows the robot to adapt to the specific needs of the assisted person and the changing conditions of the environment.
Figure 3 summarizes technical and functional requirements to be met by a guide robot for the blind and elderly. Table 1 shows a comparison of the main characteristics of available commercial robots for mobility assistance for the blind and elderly, highlighting limits and open issues referring to the above-mentioned considerations in Figure 1, Figure 2 and Figure 3.
Table 1 presents a comparative summary of selected commercially available robots designed to assist blind and elderly users. The robots were chosen based on three criteria: (i) commercial availability, (ii) relevance to mobility guidance and assistive applications, and (iii) availability of technical data in the literature or in manufacturer documentation. The comparison includes platform size, weight, autonomy, payload capacity, and cost range when reported. These parameters are fundamental for a user-centered design, directly influencing the usability, safety, and affordability of a final product. These values were taken from published specifications to provide a representative overview of the current state of the art.
To contextualize the contribution of this work, the main characteristics of the designed prototype are also included. This allows a direct comparison in terms of weight, cost, and autonomy, highlighting that the designed service robot is significantly lighter and more affordable than most existing solutions.

2.2. Conceptual Design

A conceptual architecture of the mechanical design is conceived, consisting of six main functional blocks, as shown in Figure 4.
The mechanical structure includes the robot body and the locomotion unit, which is specifically designed as a hybrid system of legs and wheels. This configuration is designed to allow the robot to better adapt to the environment and move efficiently over different types of surfaces and obstacles. The sensory system consists of an IMU that provides data on the robot’s orientation, acceleration, and relative position. This information is fundamental for maintaining stability and adjusting motion in real time. As for the actuators, the system includes a combination of rotary servomotors and linear actuators, which are responsible for generating the movement of the robot, either to move, to stabilize, or to perform actions related to interaction with the environment or with a user. The power supply system is based on a rechargeable battery that acts as a power source for all the modules of the system. The control module consists of a microcontroller and a wireless joystick that can also complement voice-based operation. Finally, the communication system integrates a microphone to receive and process voice commands from a user. Overall, this modular design allows for scalable and adaptable development to meet the specific needs of users. It also lays the foundation for future integration of more advanced systems. This includes the integration of advanced AI algorithms for route planning and autonomous decision making, autonomous navigation systems based on more complex sensors (such as cameras and LiDAR), and multisensory feedback mechanisms (haptic, visual) to enrich the user–robot interaction and improve the perception of the environment. The modularity of the design allows functional blocks to be added or modified with minimal impact on the overall architecture, promoting the continuous evolution of the guide robot.
The main requirements for the design of a new guide robot are summarized in Table 2. Overall, the components should not raise the cost to an inaccessible value.
The target values for the design of the proposed guide robot, as detailed in Table 2 with reference to Figure 4, were established based on a systematic review of the existing literature and a market analysis of mobility assistance devices, using the reported references and mainly [40]. While the parameters were not derived from formal, large-scale user studies, they represent a set of technical and functional criteria reflecting the operational needs of a service robot in real-world usage scenarios.
The following guidelines were considered when defining the requirements:
  • Economic and technical feasibility: a target cost of less than 500 euros is set to ensure the accessibility and competitiveness of the proposed design when compared to other non-robotic assisting devices, such as smart canes and walkers, on the market. Similarly, the weight and size of the proposed robot are defined to ensure ease of transport and maneuverability by users themselves, seeking a balance between robustness and portability.
  • User comfort and safety: the travel speed is adjusted to the average human walking range (0.5–1.5 m/s) to ensure smooth and natural interaction. The payload is set between 2 and 3 kg to allow the transport of small everyday objects, thus limiting the required power source and the risks of improper handling of the payload. The autonomy is set between 4 and 6 h to ensure device operation over a reasonable part of the day without recharging.

2.3. Components

The proposed design for the guide robot in Figure 4 and Figure 5 combines structural and functional features aimed at maximizing stability, maneuverability, and adaptability to different environments. It is a hybrid configuration that integrates wheels and legs, providing a balance between locomotion efficiency and obstacle negotiation capability. The locomotion system consists of two front legs with integrated wheels and four additional wheels distributed around a central platform of hexagonal geometry that forms the main body of the robot. The two leg wheels and the two rear wheels are driven by continuously rotating servomotors, providing active traction and directional control. The symmetrical arrangement of the wheels around the platform not only ensures good structural stability, but also promotes balanced and smooth motion, especially on flat surfaces. The front legs are equipped with ACTUONIX L12-100-50-6R linear actuators [41], which allow precise vertical movements. These can be used to lift the carriage, to adjust the height of the robot, or to overcome small obstacles in the environment. This combination of elements allows the robot not only to move efficiently, but also to adapt to uneven terrain or to respond to variations in the contact surface. The hexagonal structure of the platform not only provides rigidity and mechanical robustness but also optimizes the distribution of internal space for housing system components such as the battery, electronic board, and communication modules. As for the control system, the onboard hardware consists of an Arduino Nano [42] as the main processing unit, together with a PCA9685 servo controller module [43] responsible for efficiently managing multiple PWM outputs. The drive system is supported by six Tower Pro MG996R continuous rotation servo motors [44], which provide the necessary force for displacement and responsiveness to navigation commands. The system is powered by a 14.8 V, 2600 mAh lithium polymer battery. An XL4015 DC-DC converter [45] regulates the voltage for the various electronic components. An ACS712 current sensor [46] measures the system’s total current consumption to monitor power usage and provide real-time information. The housings for the servomotors and actuators are custom-designed structural components that are 3D-printed for optimized integration and reduced weight.
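As an indicative illustration of how these off-the-shelf components can be coordinated, the following Arduino-style sketch drives the rear wheel servos through the PCA9685 board and monitors the total current draw with the ACS712 sensor. The channel and pin assignments, the 20 A ACS712 variant, and the control values are assumptions made for illustration only and do not reproduce the prototype firmware.

```cpp
// Illustrative sketch (not the authors' firmware): rear-wheel drive via PCA9685
// and current monitoring via ACS712. Channel/pin assignments are assumed.
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm;          // PCA9685 at its default I2C address 0x40

const uint8_t CH_REAR_LEFT  = 0;      // assumed PCA9685 channels for the rear wheels
const uint8_t CH_REAR_RIGHT = 1;
const int     ACS712_PIN    = A0;     // assumed analog input for the current sensor
const float   ACS712_SENS   = 0.100;  // V per A (20 A module assumed)

// Convert a servo pulse width in microseconds to a PCA9685 tick count at 50 Hz
// (20 ms period divided into 4096 ticks, i.e. about 4.88 us per tick).
uint16_t usToTicks(uint16_t us) {
  return (uint16_t)(us / (20000.0 / 4096.0));
}

// Continuous-rotation servos: 1500 us = stop, >1500 us = forward, <1500 us = reverse.
void driveRearWheels(int16_t offsetUs) {
  pwm.setPWM(CH_REAR_LEFT,  0, usToTicks(1500 + offsetUs));
  pwm.setPWM(CH_REAR_RIGHT, 0, usToTicks(1500 - offsetUs)); // mirrored mounting
}

float readCurrentAmps() {
  float volts = analogRead(ACS712_PIN) * 5.0 / 1023.0;      // Nano 10-bit ADC, 5 V reference
  return (volts - 2.5) / ACS712_SENS;                       // sensor output centered at 2.5 V
}

void setup() {
  Serial.begin(9600);
  pwm.begin();
  pwm.setPWMFreq(50);                 // standard 50 Hz servo refresh rate
}

void loop() {
  driveRearWheels(100);               // slow forward motion
  Serial.println(readCurrentAmps());  // report total current draw for power monitoring
  delay(200);
}
```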
Table 3 lists the main characteristics of the components used in the built prototype design. The table details fundamental aspects, such as the device type, main function, operating voltages, communication interfaces, and load/current capabilities. These components cost a total of €197.39.
Figure 5. CAD model of the proposed guide robot for blind and elderly people in Figure 4.
Figure 6. Design parameters of the designed guide robot in Figure 5, Table 4: (a) Top view; (b) Front view (L: Linear actuator; M: Driving motor).

2.4. CAD Design

Figure 5 shows the 3D CAD model of the guide robot as designed in Inventor. Figure 6 shows the main dimensions, and Table 4 lists their values.
The mechanical configuration of the proposed guide robot was determined through an iterative CAD-based design process using Autodesk Inventor. The primary objective was to achieve a compact and lightweight structure that is safe for operation by blind and elderly users while ensuring mechanical stability on various surfaces.
During the design stage, the dimensions of the hexagonal base platform and the leg–wheel assemblies were progressively adjusted to balance usability, portability, and structural robustness. Instead of employing a formal optimization algorithm, design decisions were guided by standard mechanical design principles and were validated through preliminary prototyping. Particular emphasis was placed on reducing the total mass of the system, as weight strongly influences device handling, energy efficiency, and user acceptance.
Table 4. Main design dimensions of the guide robot in Figure 5 and Figure 6.
a (mm)  b (mm)  c (mm)  d (mm)  e (mm)  f (mm)  g (mm)  h (mm)
200     300     500     224     224     50      65      235.45

3. Results

To analyze the feasibility of the designed guide robot during rectilinear displacement, a numerical simulation was carried out using Autodesk Inventor Professional (Version 2025). A constant angular velocity input of 15.38 rad/s was applied to the rear drive wheels, which have a diameter of 65 mm. Gravity was considered in the −Z direction, and the solver was set to an accuracy of 1 × 10⁻⁶. Two reference points were selected for the kinematic and dynamic analyses: point A, located at the geometric center of the hexagonal platform constituting the robot’s main body, with an initial position of (−314.522, −20.414, 54.473) mm, and point B, located at the right front wheel support, with an initial position of (63.896, −63.023, 175.473) mm. The motion of point B enables direct evaluation of the active adjustment mechanism of the right front leg, whose initial length and angle were set to 12 mm and 173.31°, respectively. While the displacement of point A allows an evaluation of the overall displacement of the main platform, the trajectory of the center of mass (CoM) is analyzed as a global indicator of robot stability. The combination of points A, B, and CoM gives a global evaluation of the motion behavior of the leg–wheel assemblies and of the overall platform. To ensure realistic and stable behavior, the contact between the wheels and the ground was modeled using 3D contacts with a friction coefficient of 0.6, a stiffness of 1,000,000 N/mm, and a damping of 1000 N·s/mm. The friction and stiffness parameters of the wheel–ground contact are selected from standard values available in the literature, such as [47], with the aim of modeling realistic behavior on a hard surface, such as asphalt, without direct empirical validation. The stiffness value is chosen to model the wheels as rigid bodies, which is appropriate for the scope of the simulation. A robot weight of 1.9 kg and a payload of 2 kg of onboard equipment are considered.
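For reference, the commanded angular velocity and wheel diameter correspond to a nominal forward speed at the lower end of the target range in Table 2:

v = ω · r = 15.38 rad/s × (0.065 m / 2) ≈ 0.50 m/s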
Figure 7 shows a snapshot of the simulated rectilinear motion of the designed guide robot.
The feasibility of the proposed design in Figure 5 and Figure 6 and its operational performance were evaluated by considering the results of the simulated motion summarized in Table 5, on the basis of which the construction of a prototype, shown in Figure 8, was undertaken.
Figure 8 shows the prototype of the guide robot built at LARM2 (https://larm2.ing.uniroma2.it/, visited on 15 September 2025). The central platform structure, made of plastic with a rectangular geometry, has been adapted with square elements that give it a shape similar to the hexagonal geometry of the conceptual design. This structure houses the electronic components and the battery. The prototype’s layout shows the spatial distribution of the subsystems and the feasibility of the hybrid locomotion concept.
It should be noted that the physical prototype differs slightly from the initial CAD model. While the CAD design employed a hexagonal platform with six wheels, the built prototype adopted a five-wheel configuration with a central wheel and dual rear wheels arranged in a triangular stability layout. This modification was introduced to improve static and dynamic operation following testing feedback. In addition, minor adjustments in dimensions and mounting details were made during prototype assembly to accommodate available components and to simplify construction, without altering the fundamental locomotion principle of the proposed hybrid wheel–leg system.
After assembling the prototype shown in Figure 8, a critical defect was identified in the design of the front leg mechanism. The mechanism, which consists of a linear actuator, a leg motor housing, a front wheel motor mount, and a front wheel motor, generates a combined torque that exceeds the capacity of the MG996R servo motor that actuates it. This resulted in two issues: (1) an insufficient driving force, which prevented lifting movements and allowed the leg to drop back under gravity, and (2) a forward-biased weight distribution, which caused instability and a risk of rear wheel lift.
To mitigate these issues, the L12-100-50-6R actuator was replaced with the L16-50-63-6-R linear actuator [48]. This modification was crucial, as the shorter stroke of the L16 linear actuator (50 mm) meant a shorter length of the actuator and, therefore, of the leg mechanism, reducing the torque required for its movement. Changing the actuator solved the problem of excessive torque, ensuring stable and reliable leg-lifting movement.
After an experimental evaluation of the initial prototype shown in Figure 8 through straight-line displacement and obstacle-overcoming tests, stability issues related to the mechanical configuration were identified. The tests revealed that the system’s center of gravity was displaced towards the front legs relative to the geometric center of the platform. This misalignment resulted in three main operational limitations, namely the following:
  • Overloading of the central wheels: This significantly restricted the robot’s freedom of rotation.
  • Loss of platform horizontality: Relying solely on support from the center and rear wheels compromised the effectiveness of the front leg adjustment when overcoming obstacles.
  • Insufficient rear wheel traction: Poor load distribution limited the ability to propel forward.
To address these deficiencies, the prototype underwent a structural redesign, as shown in Figure 9b. Modifications included strategically repositioning the center wheels and adopting a triangular wheel configuration. This new configuration consists of a single center wheel paired with a set of dual rear wheels to improve the system’s static and dynamic stability. Table 6 shows the main characteristics of the designed five-wheeled guide robot prototype.
The prototype was experimentally characterized in terms of speed, autonomy, and terrain-handling capability. Straight-line tests over a measured distance revealed an average cruising speed of approximately 0.1 m/s. Although this value is below the target design range of 0.5 to 1.5 m/s, it confirms the feasibility of basic movement and highlights the stability of the hybrid wheel-and-leg configuration. The limited speed of the prototype is primarily due to conservative control settings and the limited power of the current servomotors. These aspects will be improved in future iterations of the design. In terms of energy autonomy, the platform uses a 14.8 V, 2.6 Ah lithium polymer battery that provides 38 Wh of available energy. Based on a measured average power consumption of 6 W, the expected autonomy is 6.4 h. Considering conversion inefficiencies and actual operating conditions, the practical autonomy can be estimated at 4.5 to 5 h.
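These endurance figures follow directly from the battery and consumption data; the derating factor of about 70–80% is an assumption introduced here to account for conversion losses and real operating conditions, consistent with the reported practical autonomy:

E = 14.8 V × 2.6 Ah ≈ 38 Wh
t_ideal = 38 Wh / 6 W ≈ 6.4 h
t_practical ≈ (0.7–0.8) × 6.4 h ≈ 4.5–5 h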
The total cost of prototype assembly, including mechanical and electronic components, is about 255 euros, well below the target threshold of 500 euros. This cost includes expenditures for actuators, sensors, structure materials, and control unit, demonstrating that the proposed design can be realized with affordable, readily available market components.
The designed test protocol includes repeated trials of straight-line motion, turning maneuvers, and overcoming obstacles. The robot prototype demonstrated proper and consistent performance in both linear displacement and rotational movements, confirming the efficiency of the locomotion and control strategy. For overcoming obstacles, the test results showed a 100% success rate when negotiating steps of up to 30 mm, while higher obstacles revealed limitations due to the maximum torque of the leg servomotors (about 1 Nm), which prevented reliable clearance of the obstacles.
Validation tests confirmed that the new design meets the considered functional requirements. The new design demonstrates improved obstacle negotiation and more balanced mass distribution. Figure 9b shows the redesigned prototype overcoming a 40-mm-high step.
The main improvements introduced by this redesign are the following:
  • Optimization of mass distribution through geometric reconfiguration of the chassis.
  • Implementation of a triangular support configuration that improves stability in motion and at rest.
  • Maintaining the horizontal orientation of the platform during front leg adjustment.
  • Reduced rear wheel slip due to improved traction distribution.
This redesign effectively addresses the shortcomings observed in the initial prototype in Figure 9a, while maintaining the guidance robot’s fundamental functional objectives, particularly its ability to navigate autonomously around obstacles.
Figure 10 shows a snapshot from an experimental test in which the redesigned prototype successfully overcomes a 20-mm-high step. The first image shows the robot approaching the step while keeping its platform horizontal. The second and third images depict the coordinated deployment of the front legs’ linear actuators, which raise the front wheels and position them over the obstacle. The fourth image shows the rest of the robot body ascending stably, with the platform preserving its horizontal orientation throughout the maneuver. This performance is attributed to optimal mass distribution, the triangular support configuration, and optimized wheel traction, which ensure continuous movement and robust stability when overcoming obstacles. Figure 9b shows the robot overcoming a 40-mm-high step, which corresponds to the kinematic limit of the leg–wheel assembly and the available actuator torque. Beyond this height, slippage and loss of stability were observed, preventing the obstacle from being overcome. In Figure 10, the prototype successfully overcomes a 20 mm obstacle with a success rate of 100% over 10 tests, while Figure 9b shows the overcoming of a 40 mm obstacle with a success rate of 40% over 10 tests.
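The climbing maneuver described above can also be summarized as a simple sequential routine. The following sketch is a dry-run outline only: the phase names, travel distances, and clearance margin are assumptions made for illustration and do not correspond to the prototype firmware.

```cpp
// Illustrative finite-state outline of the climbing maneuver in Figure 10.
// Sensor checks are replaced by simulated counters; names and thresholds are assumed.
#include <cstdio>

enum class Phase { Approach, LiftFront, AdvanceFront, LowerFront, AdvanceBody, Done };

int main() {
    const double stepHeightMm = 20.0;  // tested step height
    const double clearanceMm  = 5.0;   // assumed extra lift margin above the step
    double legExtensionMm = 0.0;       // current extension of the front-leg actuators
    double travelledMm    = 0.0;       // simulated forward travel
    Phase phase = Phase::Approach;

    while (phase != Phase::Done) {
        switch (phase) {
            case Phase::Approach:        // drive slowly until the front wheels meet the step
                travelledMm += 5.0;
                if (travelledMm >= 100.0) { std::puts("front wheels at step"); phase = Phase::LiftFront; }
                break;
            case Phase::LiftFront:       // extend both leg actuators to raise the front wheels
                legExtensionMm += 2.0;
                if (legExtensionMm >= stepHeightMm + clearanceMm) { std::puts("front wheels raised"); phase = Phase::AdvanceFront; }
                break;
            case Phase::AdvanceFront:    // move forward so the raised wheels land on the step
                travelledMm += 5.0;
                if (travelledMm >= 150.0) { std::puts("front wheels on step"); phase = Phase::LowerFront; }
                break;
            case Phase::LowerFront:      // retract the actuators, keeping the platform level
                legExtensionMm -= 2.0;
                if (legExtensionMm <= 0.0) { std::puts("platform level restored"); phase = Phase::AdvanceBody; }
                break;
            case Phase::AdvanceBody:     // drive the central and rear wheels up and over the step
                travelledMm += 5.0;
                if (travelledMm >= 300.0) { std::puts("body over the step"); phase = Phase::Done; }
                break;
            case Phase::Done:
                break;
        }
    }
    return 0;
}
```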
Test results show that the prototype meets the objectives that are defined in the design requirements regarding weight, cost, and autonomy, while speed optimization and improved obstacle-handling capacity will be addressed in future work.
The sensor equipment has been selected to meet the user’s main needs and requirements as in the conceptual design in Figure 11. It allows users to be aware of their environment while moving, and to move safely with adequate interaction with the guide robot through voice feedback. The guide robot’s autonomous navigation capability is based on proper path planning and obstacle avoidance algorithms, as well as appropriate sensors. The IMU will be installed in the central part of the robot’s body, close to its main processing unit. This position enables the robot to take precise measurements of its inclination, acceleration, and orientation in real time. The LiDAR sensor, mounted on the front of the robot, performs 360-degree scanning of the surrounding environment. It generates a three-dimensional map and detects obstacles at varying distances and heights, enabling precise localization and dynamic path planning. A front-facing camera complements the LiDAR by capturing visual information in the direction of travel. This allows the robot to recognize objects and visual landmarks, detect humans, and perform real-time scene analysis to enhance the user’s situational awareness. In addition to the primary sensors, a distributed array of infrared (IR) sensors is strategically placed around the platform. These sensors provide short-range obstacle detection and edge-following capabilities. These capabilities are especially useful for avoiding low-lying objects and detecting changes in terrain, such as curbs and stair edges. An ultrasonic sensor is installed near the front of the robot to enhance obstacle detection in areas where IR or LiDAR might struggle, such as with transparent surfaces or soft materials. This sensor improves the robot’s ability to perceive its surroundings under various environmental conditions. To support global localization and outdoor navigation, the robot is equipped with a GPS module, enabling geolocation and route tracking in open spaces. To support effective user interaction, the robot incorporates directional microphones and a voice command interface, placed at the top section of the robot. These components enable speech recognition for receiving verbal instructions and provide voice feedback for auditory guidance, supporting inclusive and adaptive human–robot interaction. Together, these sensors allow the guide robot to effectively operate in dynamic environments, maintain situational awareness, and provide intelligent, personalized assistance to blind and elderly users.
Figure 11 aims to illustrate the layout and physical integration of different sensors in the prototype design. However, details on accuracy data for the LiDAR and IMU are not included, since the scope of the paper focuses on the mechanical design and feasibility of the hybrid locomotion system.
In view of the intended use with blind and elderly users, particular attention is given to safety considerations. While the presented prototype focuses on validating design feasibility, a final design will incorporate several key safety aspects. These features will include a physical emergency stop button, pre-programmed speed limits to prevent sudden movements, and fail-safe modes that ensure the robot enters a safe state in the event of system failure. Additionally, sensor redundancy will be implemented for addressing critical situations such as obstacle detection and operation in complex environments. Finally, the design and operation will be subjected to a formal risk assessment according to the relevant safety standards for service robots.

4. Discussion

The current prototype has a total mass of 1.9 kg, which is slightly below the 2–3 kg target range defined in the design requirements. This limited weight ensures portability and ease of handling. However, future iterations will reinforce certain structural elements to align more closely with the specified range. With a cost of about 255 euros, the prototype is well below the established threshold of 500 euros, confirming the affordability of the proposed design. Experimental tests indicated a practical endurance of approximately 4.5–5 h, which is consistent with the design objective. Though the measured average speed of 0.1 m/s is below the initial specification range, it is sufficient for demonstrating the proper operation of the hybrid locomotion system. Future work will focus on improving the drive system and control strategy to increase cruising speed. Additionally, battery discharge tests will be performed under realistic load conditions to validate endurance in operational scenarios.
The current prototype integrates two inertial measurement units (IMUs) as its primary sensing elements. One IMU is installed near the geometric center of the platform, close to the main processing unit, where it provides real-time measurements of inclination, acceleration, and orientation for maintaining stable locomotion and correcting the robot’s trajectory. The second IMU is mounted directly on the right front wheel motor to monitor the movement of the corresponding leg. These measurements are used to analyze the kinematics of the leg–wheel assembly during obstacle-overcoming tests and to ensure stable operation.
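As a minimal sketch of how the central IMU data can be used to verify platform horizontality, the following Arduino-style example estimates pitch and roll from the accelerometer alone. The specific IMU model (an MPU-6050 read through the Adafruit_MPU6050 library) is an assumption, since the paper does not report the sensor part number, and a real implementation would add gyroscope fusion and filtering.

```cpp
// Illustrative inclination estimate from the central IMU (MPU-6050 assumed;
// the prototype's actual IMU model and filtering are not specified in the paper).
#include <Wire.h>
#include <Adafruit_MPU6050.h>
#include <Adafruit_Sensor.h>
#include <math.h>

Adafruit_MPU6050 imu;

void setup() {
  Serial.begin(9600);
  if (!imu.begin()) {                  // default I2C address 0x68
    Serial.println("IMU not found");
    while (true) delay(100);
  }
}

void loop() {
  sensors_event_t accel, gyro, temp;
  imu.getEvent(&accel, &gyro, &temp);  // accelerations in m/s^2

  // Static pitch/roll from the gravity direction; sufficient to check that the
  // platform stays level, e.g. during the obstacle-overcoming maneuver.
  float ax = accel.acceleration.x, ay = accel.acceleration.y, az = accel.acceleration.z;
  float pitchDeg = atan2(-ax, sqrt(ay * ay + az * az)) * 180.0 / PI;
  float rollDeg  = atan2(ay, az) * 180.0 / PI;

  Serial.print("pitch[deg]="); Serial.print(pitchDeg);
  Serial.print("  roll[deg]="); Serial.println(rollDeg);
  delay(200);
}
```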
Additional perception and interaction modules—including LiDAR for environmental mapping, a front-facing camera for object detection, infrared and ultrasonic sensors for short-range obstacle detection, and a GPS module for outdoor navigation—are not yet implemented but are planned for future iterations. Their integration will enable autonomous navigation, robust localization, and operation in more complex environments.
Likewise, the voice interaction interface, consisting of directional microphones and speech synthesis for feedback, is still at the design stage. Its implementation, together with structured human–robot interaction (HRI) trials using standard metrics such as guidance accuracy, perceived safety, and user workload, is planned for future work.
Although limited, the sensorization implemented in the prototype ensures that the robot’s expected operation and basic sensing capabilities have been validated first, providing a reliable foundation for the subsequent integration of navigation and interaction capabilities.
Simulation work was carried out both to define the design solution and to check its feasibility, whereas the performance characteristics were determined through experimental tests, including more demanding situations. For this reason, a direct comparison between simulation computations and test results was neither possible nor particularly useful for characterizing the prototype performance, while the validation of the design model can be considered achieved with respect to the design requirements.
The experimental results from the built prototype identified problems in the mechanical design affecting the mobility and stability of the proposed guide robot, which were addressed with the modifications discussed with reference to Figure 8 and Figure 9. The implementation of a new design configuration improved the robot’s ability to maintain its horizontal orientation during complex maneuvers and facilitated a more efficient action of the front legs, each equipped with a linear actuator. Experimental validation of the redesigned prototype demonstrated that the robot can overcome obstacles without compromising its stability. In addition, the platform was verified to remain level during the process, which is important to protect the electronics and ensure user safety. Wheel slippage was also eliminated, which improved trajectory control during travel.
The proposed mobile guide robot can be considered a mobile platform for housing sensorization and human–machine interaction units to provide an intelligent, responsive service robot for assisting blind and elderly people with the following features:
  • Guiding user’s motion;
  • Identifying surrounding scenario for user’s awareness;
  • Interacting with users as a companion.
The lightweight, compact design ensures portability, including the possibility for users to handle the mobile guide robot themselves for easy transportation. The user-oriented design solution based on market components provides a low-cost system with low-cost operation and maintenance. The modular design enables customization by selecting and adjusting the on-board equipment as a function of the specific needs and capabilities of a user.
The experimental protocol included repeated straight-line motion tests, turning maneuvers, and obstacle-crossing trials in indoor environments. These trials confirmed the design feasibility of the hybrid locomotion system and highlighted the robot’s ability to reliably overcome steps of up to 30 mm. However, the current evaluation does not cover outdoor surfaces, variable slopes, multiple obstacle heights beyond this threshold, or operation under different payload conditions. These aspects will be addressed in future testing campaigns to provide a broader and more structured validation of performance.
Quantitative experimental evidence is presented with the reported test results to validate the feasibility of the proposed design as a proof of concept and to characterize the performance of the prototype with respect to the planned targets for locomotion. Table 6, summarizing the characteristics of the prototype, shows that the proposed solution satisfactorily meets the requirements in Table 2.
Future developments can be planned for improvements in design and operation aspects, following the results of proper field testing of the built prototype in home and outdoor environments by potential users with their differentiated characteristics.

5. Conclusions

This paper provides a detailed analysis of the essential design issues and requirements for developing a service robot guide for blind and elderly people. The analysis highlights the need for a system that ensures safe and independent mobility by integrating key attributes such as usability, intuitive interaction, portability, and affordability. The main outcome of the analysis is a functional prototype of a guide robot with a novel hybrid wheel-and-leg configuration. The built prototype has demonstrated its ability to move stably in various conditions, representing a significant advance in mobility assistance solutions. The next steps of the research will include testing of the prototype in real operating environments to validate its performance and functionality in everyday situations. Future work will consider a more formal structural optimization using finite element analysis (FEA) to further refine the trade-off between weight and strength. Future work will focus on evaluating autonomous navigation and human-robot interaction (HRI) with real users. While the present work is focused on proof-of-concept validation of the mechanical design and operation of the proposed mobile assisting robot, comprehensive validation as an assistive technology requires structured trials with blind and elderly participants. Future experiments will employ standard HRI performance metrics, such as guidance accuracy, user comfort, perceived safety, and workload assessment.

Author Contributions

Conceptualization, M.G., M.C. and M.R.; methodology, M.G. and M.C.; software, M.G. and B.Y.; validation, M.G. and B.Y.; formal analysis, M.G., M.C. and M.R.; investigation, M.G., M.C. and M.R.; resources, M.G. and M.C.; data curation, M.G. and B.Y.; writing—original draft preparation, M.G., M.C. and M.R.; writing—review and editing, M.G., M.C. and M.R.; visualization, M.G. and B.Y.; supervision, M.G., M.C. and M.R.; project administration, M.G. and M.C.; funding acquisition, M.G. and M.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

The first author wishes to gratefully acknowledge Universidad Carlos III de Madrid “Ayudas para la movilidad de investigadores/as de la UC3M en centros de investigación nacionales y extranjeros en su modalidad: estancias de jóvenes doctores/as” for permitting her a period of research at the LARM2 of the University of Tor Vergata in 2024–2025.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. World Report on Vision. Available online: https://www.who.int/publications/i/item/world-report-on-vision (accessed on 21 January 2025).
  2. World Health Organization Ageing and Health. Available online: https://www.who.int/news-room/fact-sheets/detail/ageing-and-health (accessed on 21 January 2025).
  3. Kulyukin, V.; Gharpure, C.; Nicholson, J.; Osborne, G. Robot-assisted wayfinding for the visually impaired in structured indoor environments. Auton. Robot. 2006, 21, 29–41. [Google Scholar] [CrossRef]
  4. Ulrich, I.; Borenstein, J. The GuideCane-applying mobile robot technologies to assist the visually impaired. IEEE Trans. Syst. Man Cybern. Part. A Syst. Hum. 2001, 31, 131–136. [Google Scholar] [CrossRef]
  5. Capi, G. Assisting and guiding visually impaired in indoor environments. Int. J. Mech. Eng. Mechatron. 2012, 1, 9–14. [Google Scholar] [CrossRef]
  6. Toha, S.F.; Yusof, H.M.; Razali, M.F.; Halim, A.H.A. Intelligent path guidance robot for blind person assistance. In Proceedings of the 2015 International Conference on Informatics, Electronics & Vision (ICIEV), Fukuoka, Japan, 15–18 June 2015. [Google Scholar] [CrossRef]
  7. Ye, C.; Hong, S.; Qian, X.; Wu, W. Co-robotic cane: A new robotic navigation aid for the visually impaired. IEEE Syst. Man Cybern. Mag. 2016, 2, 33–42. [Google Scholar] [CrossRef]
  8. Bhaskar Nikhil, S.; Sharma, A.; Nair, N.S.; Sai Srikar, C.; Wutla, Y.; Rahul, B.; Jhavar, S.; Tambe, P. Design and Evaluation of a Multi-Sensor Assistive Robot for the Visually Impaired. In Advances in Mechanical Engineering and Material Science (ICAMEMS 2023); Lecture Notes in Mechanical Engineering; Tambe, P., Huang, P., Jhavar, S., Eds.; Springer: Singapore, 2023; pp. 119–131. [Google Scholar] [CrossRef]
  9. Kulkarni, A.; Wang, A.; Urbina, L.; Steinfeld, A.; Dias, B. Robotic assistance in indoor navigation for people who are blind. In Proceedings of the 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Christchurch, New Zealand, 7–10 March 2016. [Google Scholar] [CrossRef]
  10. Guerreiro, J.; Sato, D.; Asakawa, S.; Dong, H.; Kitani, K.M.; Asakawa, C. Cabot: Designing and evaluating an autonomous navigation robot for blind people. In Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility, Pittsburgh, PA, USA, 28–30 October 2019. [Google Scholar] [CrossRef]
  11. Albogamy, F.; Alotaibi, T.; Alhawdan, G.; Mohammed, F. SRAVIP: Smart robot assistant for visually impaired persons. Int. J. Adv. Comput. Sci. Appl. 2021, 12, 345–352. [Google Scholar] [CrossRef]
  12. Wachaja, A.; Agarwal, P.; Zink, M.; Adame, M.R.; Möller, K.; Burgard, W. Navigating blind people with walking impairments using a smart walker. Auton. Robot. 2017, 41, 555–573. [Google Scholar] [CrossRef]
  13. Wang, Z.; Yang, L.; Liu, X.; Wang, T.; Gao, F. HexGuide: A Hexapod Robot for Autonomous Blind Guidance in Challenging Environments. In Intelligent Robotics and Applications (ICIRA 2023); Lecture Notes in Computer Science; Yang, H., Liu, H., Zou, J., Yin, Z., Liu, L., Yang, G., Ouyang, X., Wang, Z., Eds.; Springer: Singapore, 2023; Volume 14271, pp. 494–505. [Google Scholar] [CrossRef]
  14. Katz, B.; Di Carlo, J.; Kim, S. Mini cheetah: A platform for pushing the limits of dynamic quadruped control. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019. [Google Scholar] [CrossRef]
  15. Xiao, A.; Tong, W.; Yang, L.; Zeng, J.; Li, Z.; Sreenath, K. Robotic guide dog: Leading a human with leash-guided hybrid physical interaction. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021. [Google Scholar] [CrossRef]
  16. Robots. Laikago. Available online: https://robotsguide.com/robots/laikago (accessed on 28 March 2025).
  17. Chen, Y.; Xu, Z.; Jian, Z.; Tang, G.; Yang, L.; Xiao, A.; Wang, X.; Liang, B. Quadruped guidance robot for the visually impaired: A comfort-based approach. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), London, UK, 29 May–2 June 2023. [Google Scholar] [CrossRef]
  18. NSK Develops Four-Legged Robot “Guide Dog”. Available online: https://newatlas.com/nsk-four-legged-robot-guide-dog/20559/ (accessed on 21 January 2025).
  19. Guo, Y.; Hao, L.; Wu, Y. Research on vision based outdoor blind guiding robot. In Proceedings of the IEEE 12th International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER), Baishan, China, 27–31 July 2022. [Google Scholar] [CrossRef]
  20. Zhang, L.; Jia, K.; Liu, J.; Wang, G.; Huang, W. Design of Blind Guiding Robot Based on Speed Adaptation and Visual Recognition. IEEE Access 2023, 11, 75971–75978. [Google Scholar] [CrossRef]
  21. Quem é Lysa? VixSystem. Available online: http://www.caoguiarobo.com.br/ (accessed on 28 March 2025).
  22. Bai, J.; Lian, S.; Liu, Z.; Wang, K.; Liu, D. Virtual-blind-road following-based wearable navigation device for blind people. IEEE Trans. Consum. Electron. 2018, 64, 136–143. [Google Scholar] [CrossRef]
  23. Hsieh, Y.Z.; Lin, S.S.; Xu, F.X. Development of a wearable guide device based on convolutional neural network for blind or visually impaired persons. Multimed. Tools Appl. 2020, 79, 29473–29491. [Google Scholar] [CrossRef]
  24. Rahman, M.A.; Siddika, S.; Al-Baky, M.A.; Mia, M.J. An automated navigation system for blind people. Bull. Electr. Eng. Inform. 2022, 11, 201–212. [Google Scholar] [CrossRef]
  25. Wangmo, T.; Duong, V.; Felber, N.A.; Tian, Y.J.; Mihailov, E. No playing around with robots? Ambivalent attitudes toward the use of Paro in elder care. Nurs. Inq. 2024, 31, e12645. [Google Scholar] [CrossRef] [PubMed]
  26. Jones, J.L. Robots at the tipping point: The road to iRobot Roomba. IEEE Robot. Autom. Mag. 2006, 13, 76–78. [Google Scholar] [CrossRef]
  27. Silvera-Tawil, D. Robotics in Healthcare: A Survey. SN Comput. Sci. 2024, 5, 189. [Google Scholar] [CrossRef]
  28. Graf, B.; Eckstein, J. Service Robots and Automation for the Disabled and Nursing Home Care. In Springer Handbook of Automation, 2nd ed.; Nof, S.Y., Ed.; Springer: Cham, Switzerland, 2023; pp. 1331–1347. [Google Scholar] [CrossRef]
  29. Isabet, B.; Pino, M.; Lewis, M.; Benveniste, S.; Rigaud, A.S. Social telepresence robots: A narrative review of experiments involving older adults before and during the COVID-19 pandemic. Int. J. Environ. Res. Public Health 2021, 18, 3597. [Google Scholar] [CrossRef]
  30. Broadbent, E.; Loveys, K.; Ilan, G.; Chen, G.; Chilukuri, M.M.; Boardman, S.G.; Doraiswamy, P.M.; Skuler, D. ElliQ, an AI Driven Social Robot to Alleviate Loneliness: Progress and Lessons Learned. J. Aging Res. Lifestyle 2024, 13, 22–28. [Google Scholar] [CrossRef]
  31. Follmann, A.; Schollemann, F.; Arnolds, A.; Weismann, P.; Laurentius, T.; Rossaint, R.; Czaplik, M. Reducing Loneliness in Stationary Geriatric Care with Robots and Virtual Encounters—A Contribution to the COVID-19 Pandemic. Int. J. Environ. Res. Public Health 2021, 18, 4846. [Google Scholar] [CrossRef]
  32. Adomavičienė, A.; Daunoravičienė, K.; Kubilius, R.; Varžaitytė, L.; Raistenskis, J. Influence of new technologies on post-stroke rehabilitation: A comparison of armeo spring to the kinect system. Medicina 2019, 55, 98. [Google Scholar] [CrossRef]
  33. Kompaï Assist. Available online: https://www.kompairobotics.com/kompai-assist (accessed on 21 January 2025).
  34. Care-o-bot. Available online: http://www.care-o-bot.de/ (accessed on 21 January 2025).
  35. Zorabots. Available online: https://www.zorarobotics.be/robots/nao (accessed on 21 January 2025).
  36. Asgharian, P.; Panchea, A.M.; Ferland, F.A. Review on the Use of Mobile Service Robots in Elderly Care. Robotics 2022, 11, 127. [Google Scholar] [CrossRef]
37. Kittmann, R.; Fröhlich, T.; Schäfer, J.; Reiser, U.; Weißhardt, F.; Haug, A. Let me introduce myself: I am Care-O-bot 4, a gentleman robot. In Mensch und Computer 2015 – Proceedings; De Gruyter: Berlin, Germany, 2015. [Google Scholar]
  38. Huisman, C.; Kort, H. Two-Year Use of Care Robot Zora in Dutch Nursing Homes: An Evaluation Study. Healthcare 2019, 7, 31. [Google Scholar] [CrossRef]
  39. Ceccarelli, M. Challenges in service robot devices for elderly motion assistance. Robotica 2024, 42, 4186–4199. [Google Scholar] [CrossRef]
  40. Khan, S.; Nazir, S.; Khan, H.U. Analysis of navigation assistants for blind and visually impaired people: A systematic review. IEEE Access 2021, 9, 26712–26734. [Google Scholar] [CrossRef]
  41. Actuonix Motion Devices. Available online: https://www.actuonix.com/l12-50-50-6-r (accessed on 30 May 2025).
  42. Arduino Documentation. Available online: https://docs.arduino.cc/hardware/nano/#tech-specs (accessed on 30 May 2025).
  43. Adafruit. Available online: https://cdn-shop.adafruit.com/datasheets/PCA9685.pdf (accessed on 30 May 2025).
  44. Tower Pro. Available online: https://towerpro.com.tw/product/mg995-robot-servo-360-180-rotation/ (accessed on 30 May 2025).
  45. XLSEMI. Available online: https://www.xlsemi.com/datasheet/XL4015-EN.pdf (accessed on 30 May 2025).
  46. Allegro MicroSystems. Available online: https://www.allegromicro.com/-/media/files/datasheets/acs712-datasheet.ashx (accessed on 30 May 2025).
  47. Tiwari, A.; Miyashita, N.; Espallargas, N.; Persson, B.N. Rubber friction: The contribution from the area of real contact. J. Chem. Phys. 2018, 148, 224701. [Google Scholar] [CrossRef] [PubMed]
  48. Actuonix Motion Devices. Available online: https://www.actuonix.com/l16-50-63-6-r (accessed on 25 June 2025).
Figure 1. Issues for designing a robot to assist blind and elderly people.
Figure 2. Requirements for designing a robot to assist blind and elderly people.
Figure 3. Design requirements for a robot to assist blind and elderly people.
Figure 4. Conceptual design with the main components of the guide robot.
Figure 7. Snapshot of the simulated motion of the designed guide robot along a straight line in a plane.
Figure 8. Prototype of the guide robot designed in Figure 5 and Figure 6, with the components listed in Table 4.
Figure 9. The built prototype for lab testing: (a) version with six wheels; (b) version with five wheels.
Figure 10. Snapshot of a test in which the five-wheel prototype overcomes a 20 mm high wooden obstacle at 0.1 m/s, with a 100% success rate over 10 trials.
Figure 11. Schematic layout of sensors and main components on the guide robot platform.
Figure 11. Schematic layout of sensors and main components on the guide robot platform.
Applsci 15 10474 g011
Table 1. Comparative analysis of selected commercial assistive robots and the proposed prototype.

Robot | Size (mm) | Weight (kg) | Autonomy (h) | Payload | Cost (€)
Mini Cheetah [14] | 480 × 270 × 300 | 9 | 0.5–2 | Yes | NA
Laikago [16] | 570 × 370 × 600 | 24 | 3–4 | Yes (5 kg) | 23,106–46,213
Lysa [21] | 550 × 350 × 200 | 2.5 | 8 | No | 2800–3800
Kompaï Assist [33] | 1800 × 930 × 1650 | 68 | 24 | Yes | 20,000 (minimum)
Care-O-bot [34] | 720 × 720 × 1580 | 140 | 5 | Yes (5 kg) | 65,927–231,743
Zora [35] | 275 × 311 × 574 | 5.5 | 1.5 | No | 16,606
Proposed prototype | 150 × 250 × 105 | 1.9 | 4.5–5 (estimated) | Yes | 255
NA means data not available.
Table 2. Main requirements for the proposed design of the guide robot.

Design Requirement | Reference Values
Cost (€) | <500
Payload (kg) | 2–3
Travel speed (m/s) | 0.5–1.5
Weight (kg) | 3–5
Size (mm) | 600–800 in length; 400–600 in width; 300–700 in height
Operational autonomy (h) | 4–6
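The ranges in Table 2 can also be read as an explicit design checklist. The sketch below merely encodes them and tests a candidate design against each range; it is an illustrative convenience only, not part of the design workflow described in the paper, and the candidate values in it are hypothetical placeholders rather than prototype data.

```cpp
// Illustrative checklist built from the Table 2 requirement ranges.
// The candidate values are hypothetical placeholders, not the prototype data.
#include <cstdio>

struct Requirement { const char* name; double lo, hi; };

int main() {
  const Requirement reqs[] = {
    {"Cost (EUR)",                 0.0, 500.0},
    {"Payload (kg)",               2.0,   3.0},
    {"Travel speed (m/s)",         0.5,   1.5},
    {"Weight (kg)",                3.0,   5.0},
    {"Length (mm)",              600.0, 800.0},
    {"Operational autonomy (h)",   4.0,   6.0},
  };
  const double candidate[] = {450.0, 2.5, 1.0, 4.0, 700.0, 5.0};  // hypothetical design

  for (int i = 0; i < 6; ++i) {
    const bool ok = candidate[i] >= reqs[i].lo && candidate[i] <= reqs[i].hi;
    std::printf("%-26s %8.2f  %s\n", reqs[i].name, candidate[i],
                ok ? "within range" : "outside range");
  }
  return 0;
}
```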
Table 3. Main characteristics of the electronic components for the prototype design in Figure 5 and Figure 6.

Main Feature | ACTUONIX L12-100-50-6R [41] | Arduino Nano [42] | PCA9685 [43] | Tower Pro MG996R [44] | XL4015 [45] | ACS712 [46]
Device Type | Linear Actuator | Microcontroller | 16-Channel PWM Controller | Servomotor | DC-DC Voltage Regulator (Buck Converter) | Current Sensor (Hall Effect)
Main Function | Precise linear movement | Processing and control | Control multiple servos | Angular positioning | Reduce input voltage to a stable output | Measure AC/DC electric current
Operating Voltage | 6 V | 5 V | 3.3 V–5 V | 4.8 V–7.2 V | 4 V–38 V (input), 1.25 V–36 V (output) | 5 V
Interface/Communication | PWM signal or voltage control | USB, UART, SPI, I2C, GPIO | I2C | PWM signal | DC input/output | Analog output
Load/Current Capacity | 45 N | Digital (20 mA per pin), Analog | 6 V (with external power), 25 mA per channel | High torque (10 kg/cm) | Up to 5 A | 5 A
Cost | €120.8 | €27.1 | €8.4 | €32.6 | €5.49 | €3
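The components in Table 3 outline a typical Arduino-centered control chain: the Nano communicates with the PCA9685 over I2C, the PCA9685 generates the PWM signals for the servos and actuators, and the ACS712 is read through an analog input. The sketch below is only an illustrative minimal example of that chain, not the firmware used on the prototype; the channel number, pulse range, and analog pin are assumptions, and it relies on the openly available Adafruit PWM Servo Driver library.

```cpp
// Illustrative Arduino Nano sketch for the control chain in Table 3.
// Assumptions (not from the paper): one servo on PCA9685 channel 0,
// ACS712 output on A0, 50 Hz servo refresh, 500-2500 us pulse range.
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver(0x40);  // default I2C address

const uint8_t SERVO_CH = 0;    // assumed PCA9685 channel for one servomotor
const int ACS712_PIN = A0;     // assumed analog pin for the current sensor

// Convert a target angle (0-180 deg) into a PCA9685 tick count at 50 Hz.
uint16_t angleToTicks(float deg) {
  float pulse_us = 500.0f + (deg / 180.0f) * 2000.0f;   // 500-2500 us span
  return (uint16_t)(pulse_us * 4096.0f / 20000.0f);     // 20 ms period, 12-bit counter
}

// Read the 5 A ACS712: about 2.5 V at zero current, ~185 mV per ampere.
float readCurrentA() {
  float volts = analogRead(ACS712_PIN) * 5.0f / 1023.0f;
  return (volts - 2.5f) / 0.185f;
}

void setup() {
  Serial.begin(115200);
  pwm.begin();
  pwm.setPWMFreq(50);   // standard servo refresh rate
}

void loop() {
  pwm.setPWM(SERVO_CH, 0, angleToTicks(90));  // move the servo to mid position
  Serial.print("Motor current [A]: ");
  Serial.println(readCurrentA());
  delay(500);
}
```

In such an arrangement the microcontroller only issues I2C commands, so several PWM-driven devices can share the PCA9685 while the analog pins remain free for current sensing.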
Table 5. Data computed during the straight-line motion simulated in Figure 7.

Wheels | Parameter | Output Data
Rear wheels | Driving torque | 46.77 N·mm
Rear wheels | Reaction axle force | 9.35 N
Rear wheels | Ground contact force | 5.61 N
Front wheels | Driving torque | 1.39 N·mm
Front wheels | Reaction axle force | 0.28 N
Front wheels | Ground contact force | 0.17 N
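As a rough plausibility check of the simulated values in Table 5, the required driving torque can be related to the traction force at the wheel contact and the wheel radius (half the 65 mm wheel diameter in Table 6). The snippet below is only a back-of-the-envelope estimate: the resistance coefficient is an assumed placeholder, and the dynamic simulation of Figure 7 accounts for effects that this simple static model ignores.

```cpp
// Rough static estimate of rear-wheel driving torque (not the simulation model).
// Assumption: torque = resistance force x wheel radius, with an assumed
// combined resistance coefficient; Table 5 values come from a dynamic simulation.
#include <cstdio>

int main() {
  const double wheel_radius_mm = 65.0 / 2.0;  // 65 mm wheel (Table 6)
  const double axle_load_N = 9.35;            // rear-wheel axle reaction (Table 5)
  const double c_res = 0.15;                  // assumed combined resistance coefficient

  double traction_N = c_res * axle_load_N;            // force needed at the contact
  double torque_Nmm = traction_N * wheel_radius_mm;   // required driving torque

  std::printf("Traction force: %.2f N\n", traction_N);    // ~1.4 N
  std::printf("Driving torque: %.1f N*mm\n", torque_Nmm); // ~45.6 N*mm, same order as Table 5
  return 0;
}
```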
Table 6. Main characteristics of the designed five-wheeled guide robot prototype in Figure 9b and Figure 10.

Parameter | Value
Platform size (mm) | 250 × 150
Wheel diameter (mm) | 65
Height (mm) | 105
Maximum height (mm) | 185
Leg length (mm) | 118–168
Weight (kg) | 1.9
Power (W) | 6
Total cost (€) | 255
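Table 6 lists a 6 W power draw, which together with the 4.5–5 h endurance reported in the tests implies an onboard energy budget of roughly 27–30 Wh. The short sketch below only restates that arithmetic; the actual battery chemistry, voltage, and capacity are not specified here, so the nominal pack voltage in the comment is an assumption for illustration.

```cpp
// Back-of-the-envelope endurance check from the Table 6 power figure.
// The reported endurance is 4.5-5 h at about 6 W; the pack voltage below is assumed.
#include <cstdio>

int main() {
  const double avg_power_W = 6.0;    // Table 6
  const double endurance_h = 4.75;   // midpoint of the reported 4.5-5 h
  double energy_Wh = avg_power_W * endurance_h;   // implied energy budget
  std::printf("Implied battery energy: %.1f Wh\n", energy_Wh);  // ~28.5 Wh
  // e.g., an assumed 7.4 V pack would need roughly energy_Wh / 7.4 ~ 3.9 Ah
  return 0;
}
```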
