Review

Mobile Robot for Security Applications in Remotely Operated Advanced Reactors

1 Department of Civil & Environmental Engineering, Idaho State University, Pocatello, ID 83209, USA
2 Department of Mechanical Engineering, Idaho State University, Pocatello, ID 83209, USA
3 Nuclear Safety and Regulatory Research Division, Idaho National Laboratory, Idaho Falls, ID 83415, USA
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(6), 2552; https://doi.org/10.3390/app14062552
Submission received: 25 October 2023 / Revised: 29 January 2024 / Accepted: 13 March 2024 / Published: 18 March 2024

Abstract
This review paper addresses the escalating operation and maintenance costs of nuclear power plants, primarily attributed to rising labor costs and intensified competition from renewable energy sources. The paper proposes a paradigm shift towards a technology-centric approach, leveraging mobile and automated robots for physical security, aiming to replace labor-intensive methods. Focusing on the human–robot interaction principle, the review conducts a state-of-the-art analysis of dog robots’ potential in infrastructure security and remote inspection within human–robot shared environments. Additionally, this paper surveys research on the capabilities of mobile robots, exploring their applications in various industries, including disaster response, exploration, surveillance, and environmental conservation. This study emphasizes the crucial role of autonomous mobility and manipulation in robots for diverse tasks, and discusses the formalization of problems, performance assessment criteria, and operational capabilities. It provides a comprehensive comparison of three prominent robotic platforms (SPOT, Ghost Robotics, and ANYmal Robotics) across various parameters, shedding light on their suitability for different applications. This review culminates in a research roadmap, delineating experiments and parameters for assessing dog robots’ performance in safeguarding nuclear power plants, offering a structured approach for future research endeavors.

1. Introduction

The report [1] published by the Idaho National Laboratory in 2016, in which a revenue gap analysis was performed for 79 nuclear power plants (NPPs), found that 63 of the 79 plants lost money in that year (Figure 1). The rising operation and maintenance cost of nuclear power plants was one of the major causes of this and may jeopardize the continued operation of the plants, as utilities struggle to compete with the low prices offered by natural gas and renewable energy sources. Utilities are working hard to implement modernization strategies that reduce the cost of generating electricity using nuclear power. The operation and maintenance (O&M) cost for NPPs accounts for almost 70% of the power generation cost (Figure 2), with 50% of that being labor cost. Recent research, based on discussions with utilities and other stakeholders, found that physical security forces make up almost 20% of the workforce at some NPPs. Thus, reducing labor costs is essential for lowering the cost of operating an NPP.
A good physical protection system (PPS) is designed on the basis of a vulnerability assessment of all its components, each of which has validated performance measures established for operation. Any PPS design should meet the protection objectives within the operational, safety, legal, and economic constraints of the facility. The primary functions of a PPS are detection of an adversary, delay of that adversary, and response by security personnel [3].
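These three functions can be tied together by the timely-detection principle: a detection point is only useful if the adversary delay remaining after that point exceeds the guard-force response time. The following minimal sketch illustrates this check; the task delays, response time, and function names are hypothetical illustration values, not figures from any cited source.

```python
# Minimal sketch of the timely-detection principle for a PPS: detection is
# only useful if the remaining adversary delay after the detection point
# exceeds the guard-force response time. All task and response times below
# are hypothetical illustration values.

def remaining_delay_after_detection(task_delays, detection_index):
    """Sum the adversary delay (seconds) remaining after the detection point."""
    return sum(task_delays[detection_index + 1:])

def detection_is_timely(task_delays, detection_index, response_time):
    """True if the response force can interrupt the adversary in time."""
    return remaining_delay_after_detection(task_delays, detection_index) > response_time

# Hypothetical adversary path: fence, door, vital-area barrier, sabotage task.
task_delays = [30.0, 120.0, 240.0, 300.0]   # seconds of delay per task
response_time = 480.0                        # seconds for the guard force to respond

for i, _ in enumerate(task_delays):
    print(f"detection at task {i}: timely = "
          f"{detection_is_timely(task_delays, i, response_time)}")
```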
This study presents a survey of research on the capabilities of dog robots for safeguarding nuclear power plants. According to the definition put forward by the International Organization for Standardization (ISO 8373) [4], a “robot is an automatically controlled, re-programmable, multipurpose manipulator with three or more axes” [5]. The history of robotics is quite fascinating. Once confined to movie fantasy, robots inspired by fiction now exist across many fields of application, including marine [6,7], space [8,9,10], medicine and surgery [11,12,13], household, and many more. The word robot was first introduced into the English language in 1921 by the playwright Karel Capek in his satirical drama “R.U.R. (Rossum’s Universal Robots)”, in which robots were introduced as machines that resemble people but work tirelessly. However, the early work leading to today’s industrial robots can be traced back to the late 1940s, just after World War II, when research programs were started at the Oak Ridge and Argonne National Laboratories to develop remotely controlled mechanical manipulators for handling radioactive materials [14].
The use of external sensing mechanisms allows robots to interact with the environment in a flexible manner. The functions of robot sensors can be divided into two major categories: internal-state and external-state [15]. Internal-state sensors detect internal variables such as arm joint position, which are used for robot control, whereas external-state sensors detect external quantities such as touch, range, and hazards in the surroundings. Although the use of robots for physical security at NPPs is a relatively new research area, robots have long been used in power plants in other areas. For instance, a robot developed by Hughes Aircraft in 1958 was used to reduce the human risk associated with the contaminated hazardous environment of a plant [15,16].
In this research, the focus is on enhancing the performance of a proposed physical security robot designed for patrolling the perimeter of a power plant. Legged robots encounter significant challenges when navigating through unstructured terrains, particularly in terms of gait flexibility and trajectory planning under heavy payload conditions. To address these issues, the researchers present an innovative gait transition hierarchical control framework for their hexapod wheel-legged robot, BIT-6NAZA (Beijing Institute of Technology, Beijing, China) [17,18]. The framework incorporates a flexible gait planner (FGP) and a gait feedback regulator (GFR) with behavior rules. The FGP optimally selects footholds and adjusts gait types based on considerations such as the secure foothold, stability margin, and kinematic margin of legs. Meanwhile, the GFR modifies the foot-end trajectory of the chosen gait using terrain feedback information, enabling the robot to adapt seamlessly to unstructured terrains during its patrol.
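As a rough illustration of this kind of criterion-weighted gait selection, the sketch below scores a few candidate gaits against foothold security, stability margin, and kinematic margin. The gait names, scores, and weights are illustrative assumptions and do not reproduce the published BIT-6NAZA planner.

```python
# Hypothetical sketch of criterion-weighted gait selection, loosely modeled on
# the idea of a flexible gait planner; the gait names, scores, and weights are
# illustrative assumptions, not the published BIT-6NAZA implementation.

GAITS = {
    # gait: (secure_foothold, stability_margin, kinematic_margin), each in [0, 1]
    "tripod":    (0.6, 0.5, 0.9),
    "quadruped": (0.8, 0.7, 0.6),
    "pentapod":  (0.9, 0.9, 0.3),
}

def select_gait(terrain_roughness, payload_ratio):
    """Pick the gait with the best weighted score for the current conditions."""
    # Rougher terrain and heavier payloads shift weight toward stability.
    w_foothold = 0.2 + 0.4 * terrain_roughness
    w_stability = 0.2 + 0.4 * payload_ratio
    w_kinematic = max(0.1, 1.0 - w_foothold - w_stability)

    def score(metrics):
        foothold, stability, kinematic = metrics
        return w_foothold * foothold + w_stability * stability + w_kinematic * kinematic

    return max(GAITS, key=lambda g: score(GAITS[g]))

print(select_gait(terrain_roughness=0.2, payload_ratio=0.1))  # flat terrain, light load
print(select_gait(terrain_roughness=0.9, payload_ratio=0.8))  # rough terrain, heavy load
```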
The BIT-6NAZA robot is also featured in a study on a novel hybrid gait obstacle avoidance control strategy using a perception system for navigating uneven terrain [18]. This research considers the robot’s motion state and gait unit, identifying the local terrain with a visual perception system. Wheel-legged hybrid gait types and parameters are selected based on terrain detection, with gait topology and planning matrices generated for each leg controller to enable obstacle avoidance. A feedback controller, incorporating posture and foot-end force sensors, is utilized for stability. Demonstrations with the BIT-6NAZA robot confirm the effectiveness and feasibility of the hybrid gait obstacle avoidance control strategy.
A novel hierarchical framework was developed [19] for the six wheel-legged robot (BIT-6NAZA) to achieve stable high-level movement with payload delivery. Experimental results demonstrated the effectiveness of the framework, showcasing satisfactory stable performance in real-world field environments. The framework includes a speed consensus algorithm for wheeled motion, three designed legged motion sequences, and foot-end trajectories based on Bezier functions. A comprehensive whole-body control architecture ensures horizontal stability during obstacle avoidance through attitude, impedance, and center height controllers.
Legged robots commonly employ sets of gait patterns to traverse diverse surfaces. Achieving stable transitions between these gaits is crucial for effective locomotion on changing terrains. In their work, Haynes and Rizzi [20] demonstrate the benefits of implementing gait transitions on two distinct robotic platforms to enhance their behavioral capabilities. Using the RHex robotic hexapod, they showcase how gait transitions facilitate a seamless switch from a tripod walking gait to a metachronal wave gait, which is particularly advantageous for climbing stairs. Additionally, the researchers introduce the RiSE platform, a hexapod robot designed for vertical climbing, emphasizing the pivotal role of gait transitions in enabling efficient vertical mobility for this robot.
Wermelinger et al. [21] present a path planning framework for legged robots in challenging terrains, emphasizing efficient and safe navigation. The approach computes terrain characteristics and utilizes a traversability map to assess the costs of robot footprints based on obstacle negotiation capabilities. Employing the RRT* algorithm, the planner optimizes path length and safety. Results from extensive simulations and real-world testing on the quadrupedal robot StarlETH (Swiss Federal Institute of Technology (ETH), Zurich, Switzerland) demonstrate successful integration of the proposed navigation framework. The hierarchical structure facilitates frequent path replanning, while the cascaded planning approach enables swift searches in simple environments and complex solutions for intricate terrains.
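The footprint-costing idea can be illustrated with a small sketch that derives a traversability map from slope and roughness on a grid elevation map and then scores a square robot footprint. The weights, thresholds, and grid are illustrative assumptions rather than the parameters used in [21].

```python
import numpy as np

# Hypothetical sketch of traversability scoring on a grid elevation map, in the
# spirit of the footprint-cost idea described above; the weights and thresholds
# are illustrative assumptions, not the parameters used in [21].

def traversability(height_map, cell_size=0.1, max_slope=0.4, max_rough=0.05):
    """Return a map in [0, 1]: 1 = easily traversable, 0 = untraversable."""
    gy, gx = np.gradient(height_map, cell_size)
    slope = np.sqrt(gx ** 2 + gy ** 2)                    # local gradient magnitude
    # Roughness: deviation from a 3x3 local mean (simple box filter).
    padded = np.pad(height_map, 1, mode="edge")
    local_mean = sum(padded[i:i + height_map.shape[0], j:j + height_map.shape[1]]
                     for i in range(3) for j in range(3)) / 9.0
    rough = np.abs(height_map - local_mean)
    t = 1.0 - 0.5 * np.clip(slope / max_slope, 0, 1) - 0.5 * np.clip(rough / max_rough, 0, 1)
    return np.clip(t, 0.0, 1.0)

def footprint_cost(traversability_map, center, half_size=2):
    """Cost of placing a square robot footprint at a grid cell (lower is safer)."""
    r, c = center
    patch = traversability_map[r - half_size:r + half_size + 1,
                               c - half_size:c + half_size + 1]
    return 1.0 - patch.min()   # dominated by the worst cell under the footprint

heights = np.random.default_rng(0).normal(0.0, 0.005, size=(40, 40))
heights[15:25, 15:25] += 0.5                     # a raised obstacle block
t_map = traversability(heights)
print("cost on flat ground:", footprint_cost(t_map, (5, 5)))
print("cost near obstacle: ", footprint_cost(t_map, (15, 15)))
```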
Moeller et al. [22] explore the development, deployment, and efficacy of an autonomous navigation system designed for an agricultural robot. The system, equipped with a precise onboard GPS, enables the robot to autonomously locate and eliminate potato plants previously identified as infected with Potato Virus Y (PVY). This technology aligns with the advancements in precision agriculture, showcasing its potential in targeted plant management. The navigation system relies on a Pixhawk microcontroller for user-friendly operation and cost-effectiveness. To ensure precision, especially when selectively removing infected plants, an RTK GPS module from Swift Navigation (San Francisco, CA, USA) is incorporated for enhanced accuracy in the robot’s navigation.
Deemyad et al. [23] examine the development of a prototype chassis for an autonomous ground vehicle (AGV) designed to identify and remove potatoes affected by Potato Virus Y (PVY) in challenging field conditions. The four-wheel powered vehicle encounters rough terrain and deep irrigation ruts in potato fields, making navigation a demanding task for the robot chassis. The researchers employed an optimization routine to determine the optimal size and material for the chassis. Seven distinct stress analyses were carried out to refine the design and material selection for the prototype, addressing the challenges posed by the complex and varied field terrain.
We structure this review paper as follows. In the first part of the literature review, we briefly discuss the performance requirements for physical security, followed by a general discussion of mobile robots and their types. The next section explains some basic functionalities and capabilities of robots, how robots formalize problems, and how they undergo qualitative and quantitative assessment; it also covers the parameterization of robot performance. We then discuss the performance of three commercially available dog robots; this comparison is also presented in tabulated form in Appendix A. Next, we present the research roadmap created for future research on using mobile dog robots for security at advanced reactors, and we conclude by discussing potential future research.

2. Research Methodology

We followed a systematic approach to compile relevant articles. For information on basic robotic theories and the historical background of robotics in nuclear power plants, research articles and books published in the 1980s, 1990s, and 2000s were consulted to some extent. For the assessment of robot performance parameterization and the review of mobile robotics, we focused on papers published after 2010 to ensure that we reviewed only the latest research on the use of robotics in our field of interest. Figure 3 illustrates the distribution of cited papers by year of publication. Using relevant keywords (Table 1), we used Google Scholar as the primary database for sourcing relevant research papers. Through Google Scholar, we accessed the websites of other journal publishers, including IEEE, Elsevier, ACM Digital Library, MDPI, and Springer.
Several publication sources were utilized in gaining access to all the relevant journal articles needed for the literature review. The lists of journal publishers and the number of articles included from each publisher are presented in Table 2 below.

3. Performance Requirements for Physical Security

Whether developing a new physical protection system (PPS) or upgrading an existing system, the designer must consider how to best integrate features such as fences, barriers, sensors, procedures, communication devices, and security personnel into a PPS capable of meeting the protection objectives. The final PPS design should achieve these goals while remaining within the facility’s operational, safety, regulatory, and economic restrictions. A PPS’s key functions are to detect an adversary, delay that adversary, and respond with security personnel (a guard force) [3]. For a typical nuclear power plant, the physical security posture is shown in Figure 4 [24]. For the better performance of physical security, the following basic requirements [25] must be met:
  • Establishment and maintenance of a security organization;
  • Usage of security equipment and technology;
  • Proper training and qualification of security personnel;
  • Implementation of predetermined response plans and strategies;
  • Protection of digital computer and communication systems and networks.
Besides this, a critical element of the security plan is the need to demonstrate the ability of armed and unarmed security personnel to perform assigned duties and responsibilities. In 10 CFR 73.45 [25] “Performance capabilities for fixed site physical protection systems”, it is mentioned that to meet the general performance requirements of 10 CFR 73.20, a fixed-site physical system should include the following performance capabilities:
  • Prevent unauthorized access of persons and vehicles to material access areas and vital areas;
  • Permit only authorized activities and conditions within protected areas, material access areas, and vital areas;
  • Permit only authorized placement and movement of strategic special nuclear material within material access areas;
  • Permit the removal of only authorized and confirmed forms and amounts of strategic special nuclear material from material access areas;
  • Provide authorized access and assure detection of and response to unauthorized penetrations of the protected area to satisfy the general performance requirements of 10 CFR 73.20 (a);
  • Ensure a prompt response by the response force under the physical protection programs.

4. Mobile Robots Categories

Numerous industries around the world have, in one way or another, considered using automated robots to increase their productivity. As our research focuses on using a mobile dog robot to improve physical security at nuclear power plants, it is important to discuss mobile robots and their types in detail. There are several categories of mobile robots. Arms/manipulators are the most common type of robot and have been used in these sectors for decades. Despite their successes, these commercial robots suffer from a fundamental disadvantage: a lack of mobility. Although robotic arms/manipulators have been considered a type of mobile robot, their limited range of motion prevents their use in a wider scope of applications such as patrolling, transporting, and cleaning. In contrast, mobile robots can travel easily and carry out flexible tasks within manufacturing plants, moving around to maximize their effectiveness. For instance, AGV (autonomous guided vehicle) robots transport parts between assembly stations using electrical guide wires or onboard lasers for navigation. Helpmate service robots deliver items in hospitals by tracking the positions of ceiling lights. Autonomous cleaning robots have also been developed for large buildings [26].
Mobile robots are also being used as automatic hotel assistant systems that use mobile platforms to interact with visitors and staff, assisting in activities such as delivering items, providing instructions on hotel services, guiding guests to their rooms, and providing general information [27].
Mobile robots are also extensively being used for disaster response applications. A huge earthquake and tsunami devastated eastern Japan in March 2011, causing significant destruction and loss of life. The Fukushima Daiichi nuclear power plants were badly damaged, causing power outages. Hydrogen explosions damaged three nuclear reactor buildings, with one reactor losing control. Initially, the focus was to analyze the accident site’s damage and radiation levels; however, this was too dangerous for humans due to the risk of significant radiation exposure. To respond to this situation, Nagatani et al. conducted research involving the use of a mobile robot, Quince, to carry out surveillance missions in such a dangerous environment [28].
An autonomous robot can determine and execute tasks on its own, with the assistance of a perception system. Locomotion, perception, cognition, and navigation are all fundamental features of mobile robotics. Understanding mechanics, kinematics, dynamics, and control theory is essential for understanding locomotion. Signal analysis, including computer vision and sensor technology, is involved in perception. Cognition is the control system that interprets sensor input and controls the robot’s activities. Navigation requires an understanding of planning algorithms, information theory, and artificial intelligence [29].
Based on their locomotion system, mobile robots can be classified [28] into the following major categories.

4.1. Stationary (Arm/Manipulator)

A manipulator is a mechanical component of a robot, and the terms ‘robot’ and ‘manipulator’ are frequently used interchangeably in the scientific literature. It is made up of links connected by rotational or translational joints, usually in a series configuration, and has three major parts: the arm, wrist, and hand (see Figure 5). The arm’s function is to position an object in 3D Cartesian space, while the wrist adjusts its orientation. The hand is used to manipulate items and is also known as an end-effector or tool. The remaining part, i.e., the body of the robot, is fixed on a surface. Robots are commonly used in industrial applications for operations such as welding, packaging, sorting, cutting, painting, moving, and sanding. These activities rely on manipulator-based robots, which fulfill their goals by controlling the manipulator’s joints in single or multiple directions according to specified patterns [30].
At present, industries are utilizing robotic arms to reduce human error and enhance the efficiency, productivity, and precision of their operations [32]. For instance, many factories use robotic arm manipulators to transfer or pack their products. To ensure that a robot is efficient, productive, and precise in its operation, it is necessary to optimize the robot’s trajectory, which can be achieved through proper modeling and design [33].
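The positioning role of the arm described above can be illustrated with the forward kinematics of a planar two-link manipulator, which maps joint angles to an end-effector position; the link lengths and joint angles below are arbitrary illustration values.

```python
import math

# Forward kinematics of a planar two-link arm: given joint angles, compute the
# end-effector position. A minimal sketch of the positioning role of the arm
# described above; link lengths and angles are arbitrary illustration values.

def forward_kinematics(theta1, theta2, l1=0.4, l2=0.3):
    """Return (x, y) of the end-effector for joint angles in radians."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

print(forward_kinematics(math.radians(30), math.radians(45)))
```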

4.2. Land-Based

As the name suggests, land-based robots operate on land. They can be further divided into different sub-categories, which are mentioned below.

4.2.1. Wheeled Mobile Robot (WMR)

WMRs, or wheeled mobile robots, are the most widely used robots among the many well-developed mature robots. The body, wheels, wheel support mechanism, and wheel drive system are the main components of a standard WMR chassis. The number of wheels on a chassis determines its configuration, with typical alternatives including two-wheel, three-wheel, and four-wheel layouts. The four-wheel chassis (see Figure 6) is the most common of these [34].
While studies on legged and caterpillar-based locomotion have been extensive, the majority of mobile robots in research, development, and evaluation rely on wheels for mobility. This preference for WMRs (wheeled mobile robots) is due to several significant advantages: (1) they are more energy efficient, because they generally move on smooth and firm surfaces; (2) they have multiple applications in industry; (3) they require fewer parts and are less complex, making them easier to build; (4) wheel control is less complex; (5) they cause less wear on the surface where they move compared with caterpillar tracks; and (6) balance is not a great difficulty since the wheels are always in contact with the surface. The growing popularity of WMRs is due to their expanding applications in planetary exploration, mining, inspection, and surveillance [36], the rescue of people, hazardous waste cleanup, and medical assistance, among others [37].
Wheeled mobile robots (WMRs) are becoming more common in both industrial and service robotics, particularly when adaptive motion capabilities on reasonably smooth and flat surfaces are required [38].
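A common WMR drive arrangement is the differential drive, whose pose can be propagated from the two wheel speeds using the standard unicycle model; the wheel radius, track width, and commands in the sketch below are illustrative values, not the parameters of any specific platform.

```python
import math

# Standard differential-drive (unicycle) kinematics: integrate the robot pose
# (x, y, heading) from left/right wheel speeds. Wheel radius, track width, and
# commands below are illustrative values, not those of any specific WMR.

def step(pose, w_left, w_right, dt, wheel_radius=0.05, track_width=0.30):
    x, y, theta = pose
    v = wheel_radius * (w_right + w_left) / 2.0               # forward speed (m/s)
    omega = wheel_radius * (w_right - w_left) / track_width   # yaw rate (rad/s)
    return (x + v * math.cos(theta) * dt,
            y + v * math.sin(theta) * dt,
            theta + omega * dt)

pose = (0.0, 0.0, 0.0)
for _ in range(100):                     # 1 s of motion at 10 ms steps
    pose = step(pose, w_left=9.0, w_right=11.0, dt=0.01)
print(pose)
```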

4.2.2. Walking (Legged) Mobile Robot

For decades, robotics developers have been inspired by animals’ varied terrestrial mobility skills, particularly those facilitated through limb structures. For more dynamic and effective legged machines, designers have taken inspiration from nature’s flexible designs. These newly developed robots are very different from the production robots used in factories, ushering in a new era of adaptable mobile robots for numerous social purposes. Engineers have been pursuing the design of legged mobility machines for over a century. Starting as far back as the mid-1800s, initial efforts focused on cleverly designed linkages to mechanically create predetermined leg movements [39].
In both natural environments and modern scenarios, there are situations that are either inaccessible or too hazardous for humans, including planetary surfaces, construction sites, mines, disaster relief operations, and anti-terrorism actions. To achieve advancements in the field of science, technology, and society, it is imperative to conduct in-depth research in these contexts to find practical solutions. A common challenge across these situations is the presence of uneven and irregular terrains, which limits the effectiveness of wheeled and tracked robots. Legged robots (see Figure 7), with their terrain adaptation capabilities, are better suited for addressing these challenges.
Legged robots offer several advantages. First, they exhibit superior adaptability on rugged terrains because their movement involves a series of discrete footprints, unlike the continuous rut left by wheeled or tracked robots. Second, legged vehicles possess enhanced mobility and flexibility due to the numerous degrees of freedom afforded by their multiple legs. These characteristics make legged robots a promising choice for a wide range of applications in challenging environments [40].
Figure 7. Typical quadrupedal mobile robot [41].
Different types of legged robots are available at present, ranging from one-legged to more-than-six-legged ones [29]. Among these legged mobile robots, the quadrupedal robot is superior to wheeled and tracked robots owing to its potential, like that of humans and animals, to traverse all kinds of terrain [42].

4.2.3. Tracked Slip/Skid Locomotion

In the rapidly expanding and dynamic domain of service robotics, tracked ground mobile robots (see Figure 8) have become a focal point for numerous researchers in both industrial and academic spheres. This is primarily because tracked locomotion proves highly effective in traversing soft and uneven terrains, thanks to its extensive ground contact surface [43].
Unlike typical robots, those designed for the nuclear industry operate in exceptionally harsh conditions, including high levels of radiation, extreme temperatures, high pressures, confined spaces, slopes, stairs, and complex pipelines. Thus, these nuclear environment robots must possess exceptional terrain adaptability and superior reliability [45].
The initial phase in preparing for maintenance and decommissioning tasks in facilities with radionuclides, particularly those housing high- and very-high-activity cells, involves conducting inspections. As a result, the strategic development of exploration robots designed for challenging environments is essential within the nuclear sector to fulfill these requirements [46]. This is where the use of tracked robots comes into play.

4.2.4. Hybrid

Hybrid robots, as the name suggests, are those that have a hybrid locomotion system built into them. For instance, a hybrid robot system might have a leg-wheel hybrid (Figure 9) locomotion system [47,48], a leg-track hybrid locomotion system [49], a wheel-track hybrid locomotion system [50], or a leg-wheel-track hybrid locomotion system [29,51].

4.3. Air-Based/Unmanned Aerial Vehicles

Unmanned aerial vehicles, also known as UAVs, have been in operational use for just a few decades, and they have become a crucial component of many nations’ air defense systems [53].
In recent years, aerial robots, particularly small UAVs and drones, have seen significant advancements in their design, functionality, flight capabilities, and navigation control. For this reason, they are extensively employed in various applications, including photography, path planning, search and rescue operations, intrusion detection (see Figure 10), and inspections of power lines and civil infrastructure. Unmanned aerial vehicles (UAVs), also known as pilotless aircraft, operate with advanced components, including a physical model, a ground control station (GCS), modern sensors, and a communication platform. UAVs are classified based on their flying principle, determined by their aerodynamic structure. Heavier UAVs rely on thrust for flight, with two main types: ‘rotor’ UAVs using multiple rotors and propellers for lift, and ‘wing’ UAVs using wings for aerodynamic lift. Wing-type UAVs can be further categorized as flapping-wing, fixed-wing, or flying-wing. Lighter UAVs, such as parachutes, balloons, and blimps, use buoyancy for flight [54].
A UAV is also referred to as dynamic remotely operated navigation equipment (DRONE) and is described as an aircraft that lacks an onboard pilot and that operates autonomously or under the control of a ground-based flight controller. Another common type of UAV is the quadcopter. A quadcopter, also called a quadrotor, is a four-rotor helicopter with rotors oriented upwards in a square arrangement equidistant from the center of mass. It is maneuvered by altering the angular speeds of its rotors, which are powered by electric motors. Quadcopters are a common design for small UAVs due to their straightforward structure. They find applications in surveillance, search and rescue missions, construction inspections, and various other tasks [56].

4.4. Water-Based/Unmanned Water Vehicles

From their origins in the 1950s to the present day, the field of unmanned underwater vehicles (UUVs) has significantly expanded. They have been employed in various tasks, ranging from bomb detection and recovery to underwater exploration. Initially, these robots were primarily used for military and scientific purposes, but they have now found widespread applications in civilian domains, including recreational use.
Underwater robotics has evolved alongside manned underwater vehicles, but recent decades have witnessed the development of more cost-effective robots or UUVs. These robots have been optimized and deployed for a wide range of applications, contributing to the sustainable development of our planet. Understanding the marine environment’s dynamics holds critical importance for humanity’s future survival. Consequently, researchers have been committed to creating increasingly efficient unmanned vehicles capable of performing underwater work, maintenance, and exploration of ever-deeper marine realms [57].
Likewise, research on autonomous boats, also known as unmanned surface vehicles (USVs) (Figure 11), in disaster management (DM) is widespread across multiple publications, often covering various unmanned vehicles under the category of unmanned marine vessels (UMVs). These vehicles are employed in disaster response and management, which encompass a range of actions. These actions are contingent on prior preparation and response capabilities, and they take place either in challenging emergency conditions or in more favorable preparation stages [58].
Plastic use is increasing without proper regulation in many countries, releasing toxins that harm the environment and human health and potentially leading to issues such as cancer, birth defects, and immune system problems. Most of this plastic is carried by rivers and ultimately ends up in the ocean, creating life-threatening situations for marine animals. In one study [60], the researchers designed a prototype of an unmanned floating waste-collecting mobile robot that was able to collect trash weighing up to 10 kg and clean an area of about 3000 square centimeters, drawing only 45 watts from the battery.

4.5. Other

In addition to the above-mentioned robot types, there are some other types of robots as well that are difficult to classify based on their motion. Some other types of robots include snake-like robots, worm-like robots, nanobots, etc. [29].
Snake-like robots (Figure 12) are highly articulated robots that possess the ability to synchronize their internal degrees of freedom (DOF) in order to execute a wide range of locomotion tasks. This extends their functionality beyond that of traditional legged or wheeled robots. The primary benefits of these robots lie in their versatility, allowing them to achieve a wide array of behaviors that go beyond just climbing, crawling, or swimming.
Worm-like robots use peristalsis, the same method of locomotion (see Figure 13) that earthworms use. Nanobots are highly miniature-sized robots that are nanometers in size. Progress in medical robotics holds the potential to enhance contemporary healthcare and people’s well-being. Shrinking these robotic systems has opened up numerous opportunities to apply precise medical techniques [62].

5. Basic Functionality and Capability of a Robot

For a robot to be considered ideal for novel applications in construction, disaster response, underwater work, and many other fields, it must possess certain basic capabilities and be able to carry out some basic functions. Since these robots need to interact with complex human environments, autonomous mobility and manipulation are must-have basic functionalities [64]. Similarly, high speed is another basic capability of a mobile robot [65]. Apart from these, other basic autonomous capabilities include integrated mobility and manipulation, cooperative skills between multiple robots, the ability to interact with humans (HRI), and efficient techniques for the real-time modification of collision-free paths [66].
In terms of movement, localization, map-building, and navigation are the three capabilities used to assess whether a mobile robot can move to its destination efficiently (along short trajectories) and safely (with no collisions). Localization is the ability of a robot to determine its own position and orientation within its environment. Map-building, as the name suggests, involves building maps (with the help of algorithms) of the routes (there may be several) to a destination. Navigation refers to traveling to a desired destination efficiently and safely [66].
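A minimal illustration of the navigation capability is grid-based path search over an occupancy map. The sketch below runs A* on a small hand-made grid; the map, start, and goal are arbitrary illustration values.

```python
import heapq

# Minimal A* path search on a hand-made occupancy grid, as an illustration of
# the navigation capability described above (0 = free, 1 = obstacle). The grid
# and start/goal cells are arbitrary illustration values.

GRID = [
    [0, 0, 0, 0, 0],
    [1, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 1],
    [0, 0, 0, 0, 0],
]

def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])   # Manhattan distance
    frontier = [(heuristic(start, goal), 0, start, [start])]
    visited = set()
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(frontier, (cost + 1 + heuristic((nr, nc), goal),
                                          cost + 1, (nr, nc), path + [(nr, nc)]))
    return None   # no collision-free path exists

print(astar(GRID, start=(0, 0), goal=(4, 4)))
```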
A mobile robot must also possess the ability to carry out other functions like the ability to interact with the environment, which includes activities like grabbing, lifting, pushing, and manipulating objects [64]. Given the types of mobile robots (manipulator, legged, wheeled, and many more), the functions that they carry out and the capabilities that they possess differ.

6. Robots’ Way of Formalizing Problems

Formalization is the act of giving something a fixed structure or form by introducing rules. There are several situations in which a robot needs to formalize problems, for example when it becomes stuck while navigating or when it must avoid obstacles, depending on the tasks the robot is designed to perform.
With a robot tasked with tidying up a household, although planning for the tasks seems easy, there is a problem for the robot since it is usually uncertain about the state of the household when planning [67]. For example, when the robot is ordered to clean the kitchen, it rarely knows what objects it needs to put away to execute the tidying process, or whether there are any objects at all that need to be displaced. Nebel et al. [67] try to identify conditions under which classical planning can be used in a re-planning loop to solve problems in a nondeterministic, partially observable open domain. For this purpose, they propose and use a continual planning approach, in which a plan is created to solve the problem at hand and re-planned if anything does not work as previously planned. To carry this out experimentally, they created a test scenario that contained two rooms separated by a door, tables, and a shelf at the back. To tidy up the room, the test subject was required to find objects, pick them up, and bring them to the tidy location by navigating through the rooms. In the end, the robot was required to clear the table onto which the objects had initially been moved.
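The plan–execute–observe–replan cycle underlying continual planning can be sketched as a simple loop. The tidy-up “world”, actions, and failure model below are hypothetical stand-ins for illustration and do not reproduce the planner or domain of [67].

```python
import random

# Hypothetical sketch of a continual-planning loop: plan with the current
# (possibly wrong) belief, execute, observe, and replan when reality diverges.
# The tidy-up "world", actions, and failure model are illustrative stand-ins,
# not the planner or domain of [67].

def plan(belief):
    """Trivial planner: one 'store(obj)' action per object believed to be out of place."""
    return [f"store({obj})" for obj in sorted(belief)]

def execute(action, world):
    obj = action[6:-1]
    if obj in world and random.random() > 0.2:   # 20% chance the grasp fails
        world.remove(obj)
        return True
    return False

def continual_planning(belief, world, max_rounds=10):
    for _ in range(max_rounds):
        if not world:
            return "kitchen tidy"
        for action in plan(belief):
            if not execute(action, world):
                belief = set(world)              # observe and update the belief
                break                            # replan with the corrected belief
        else:
            belief = set(world)
    return "gave up"

random.seed(1)
world = {"cup", "plate", "spoon"}                # objects actually out of place
belief = {"cup", "plate"}                        # the robot's incomplete belief
print(continual_planning(belief, world))
```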
From a learning perspective, a humanoid robot should be capable of improving its knowledge of recognizing objects with active perception [68]. Nguyen et al. have addressed the problem of learning to recognize objects via manipulation in a developmental robotics scenario. In their research, they have integrated a bottom–up vision system, a control system of a complex robot system, and a top–down interactive exploration method to collect data and determine if interacting with humans is beneficial. They set the experiment up to find this out. The main idea was for the robot to learn to associate a camera view with an object via episodes. In each episode, the robot had to decide which object it wanted to learn more about, which manipulation it should use as a strategy to generate new sample data, and then learn to differentiate between objects. Based on the experimental results, the humanoid robot iCub could learn to recognize 3D objects via manipulation and in interaction with teachers, choosing the adequate exploration strategy to enhance competence progress and focusing more of its efforts on the most complex tasks [68]. This improved the autonomous decision capacity of the iCub humanoid robot.
Detecting humans through machine vision in complex environments represents a significant research area. To derive meaningful insights from captured audio-visual data, algorithms need to be implemented, either onboard or remotely, in near real-time. Wadoro (a watchdog robot) [36] is an autonomous mobile surveillance robot designed to navigate the semi-outdoor environment around a house, actively seeking potential human intruders. Equipped with a night-vision camera, it can conduct surveillance around the clock. The primary goal of the robot is to promptly generate alerts in the event of any intrusion on the premises. Throughout its surveillance operation, Wadoro undergoes distinct phases, including human detection, tracking, recognition, and the subsequent generation of alerts. Additionally, the robot incorporates protective measures to safeguard itself against collisions and potential attempts at theft or removal from the ground. When idle, the robot remains still, monitoring people or faces via visual observation and PIR sensors. If a person is detected by the camera, it switches to tracking mode; if human motion is sensed by the PIR sensors, it turns and enters patrol mode after a maximum idle duration of 60 s.
In patrol mode, the robot moves at 0.5 m/s, capturing frames for human or face detection. It uses an ultrasonic sensor to avoid collisions, tilting its camera at 45° if an obstacle is within 100 cm. After 30 s of patrolling, it returns to idle mode to repeat the duty cycle.
In idle or patrol mode, if a person is visually identified, it enters track mode, following the subject to capture a facial image. After comparing it with familiar faces, if recognized, the robot returns to patrol mode; if an intruder is detected, it generates a phone call alert. In cases with no person detected, Wadoro’s duty cycle includes idle and patrol modes, lasting 90 s. Two-thirds of this time are spent in a low-power state, efficiently monitoring visually and reading PIR sensors.
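The idle–patrol–track duty cycle described above maps naturally onto a small mode controller. The sketch below follows the timing constants quoted in the text (60 s idle, 30 s patrol, 0.5 m/s patrol speed), while the observation dictionary is a hypothetical stand-in for the camera and PIR sensors.

```python
# Sketch of an idle/patrol/track mode controller like the duty cycle described
# above. The timing constants follow the text (60 s idle, 30 s patrol, 0.5 m/s
# patrol speed); the observation dictionary is a hypothetical sensor stub.

IDLE_TIMEOUT_S = 60
PATROL_DURATION_S = 30
PATROL_SPEED_MPS = 0.5

def next_mode(mode, seconds_in_mode, obs):
    """Return the next surveillance mode given stubbed sensor observations."""
    if obs["person_visible"]:
        return "track"
    if mode == "idle":
        if obs["pir_motion"] or seconds_in_mode > IDLE_TIMEOUT_S:
            return "patrol"
    elif mode == "patrol" and seconds_in_mode > PATROL_DURATION_S:
        return "idle"
    elif mode == "track":
        return "alert" if not obs["face_known"] else "patrol"
    return mode

obs = {"person_visible": False, "pir_motion": True, "face_known": False}
print(next_mode("idle", 10, obs))                                         # -> patrol
print(next_mode("track", 5, {"person_visible": False,
                             "pir_motion": False, "face_known": False}))  # -> alert
```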

7. Qualitative and Quantitative Performance Assessment of a Robot

Normally, two types of performance assessments can be carried out on a robot: qualitative and quantitative. Quantitative performance is easier to assess since it involves measuring performance parameters (like speed, time, agility, and others), which require less to no human judgment. On the other hand, qualitative performance assessment cannot be carried out directly since it usually involves intangibles that require more human judgment. One common method of qualitative assessment is to compare a robot’s performance with that of a fixed benchmark set.
In order to conduct a qualitative and quantitative performance assessment of a robot, the robot deployment categories and performance requirement categories need to be identified. The assessment criteria may vary depending on the type of robot. One such assessment was carried out to measure the performance of urban search and rescue (USAR) robots [69]. In this project, the stakeholders identified robot deployment categories and performance requirements, and then translated these requirements into tests and metrics with which the performance of robots would be measured. The robot deployment categories included ground and aerial. The ground robots were further classified into subcategories, including Peek robots, collapsed-structure and non-collapsed-structure wide-area survey robots, wall-climbing delivery robots, confined-space robots, temporary shore robots, confined-space shape shifters, and confined-space retrieval robots. Similarly, the aerial robot subcategories included high-altitude loiter robots, rooftop payload drop robots, ledge access robots, variable-depth sub robots, and bottom crawler robots. At the beginning of the project, the expectations of robot performance were defined for each robot. This definition process captured individual categories of requirements, along with how to measure robot performance against each requirement and objective value. These requirements continue to grow with the evolution and maturity of robot capabilities.
For the case of our study, the concept of human–robot interaction (HRI) needs to be incorporated. Security personnel must have some form of interaction with the security robot to maintain a good physical security posture. Using this concept, several studies have been conducted to determine the efficiency of human–robot interaction. One such experiment was performed by Singh et al. [70]. The researchers presented a practical human–robot interaction system based on a wearable glove (Figure 13) for object-grasp applications. Three different types of sensors were embedded to capture grasp information from human demonstrators, enabling the imitation of the posture and dynamics of the human hand. Calibration algorithms were implemented to mitigate errors caused by the sensors. After analyzing the collected data, the positions of the accelerometer, pressure, and flex sensors in the wearable glove were determined to capture grasp information for objects of different sizes and shapes. A three-layer neural network was trained to recognize grasp orientation and position based on the multi-sensor fusion method. The experimental results showed that an industrial robot could successfully grasp spherical, rectangular, and cylindrical objects (see Figure 14) with the proposed HRI based on learning from demonstration.
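The kind of three-layer network described above can be sketched as a small multilayer perceptron that maps fused glove readings to a grasp class. The layer sizes, random weights, and class labels below are illustrative assumptions, not the trained model of [70].

```python
import numpy as np

# Minimal sketch of a three-layer network of the kind described above: fused
# glove readings (flex, pressure, accelerometer) in, a grasp class out. The
# layer sizes, weights, and class labels are illustrative assumptions, not the
# trained model of [70].

rng = np.random.default_rng(42)
W1, b1 = rng.normal(size=(16, 12)), np.zeros(16)   # 12 fused sensor channels
W2, b2 = rng.normal(size=(8, 16)), np.zeros(8)
W3, b3 = rng.normal(size=(3, 8)), np.zeros(3)      # 3 grasp classes
CLASSES = ["sphere", "rectangle", "cylinder"]

def predict(sensor_vector):
    h1 = np.tanh(W1 @ sensor_vector + b1)
    h2 = np.tanh(W2 @ h1 + b2)
    logits = W3 @ h2 + b3
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return CLASSES[int(np.argmax(probs))], probs

grasp, probs = predict(rng.normal(size=12))        # stand-in glove reading
print(grasp, np.round(probs, 3))
```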
The performance of various techniques that have been developed and used, mainly computational algorithms, is subjected to quantitative assessment, but these techniques do not directly incorporate measures of robot behavior. They do not even provide a qualitative measure of how well the robot carries out its navigation task or how well it avoids obstacles, for example [71]. Smithers aims to provide standards for quantitative performance measures of robot behavior and proposes the use of correlation dimension estimation techniques to make quantitative estimates of overall robot behavior. Further, the research shows how quantitative measures of robot behavior can be developed and practically used with the addition of an appropriate framework for agent–environment interaction systems.

8. Parameterization of the Performance of a Robot

Formulating a mathematical expression and designing a comprehensive framework to quantitatively assess the performance of intelligent systems is a complex task. Rather than pursuing a theoretical framework lacking practical relevance, a more effective approach involves learning from the diverse requirements set by end users across multiple domains [72]. The decision to select a robot is always complex since robot performance depends on several parameters for which there are no industry-wide standards [73]. Measurement parameters are not globally the same because a robot can behave completely differently when exposed to a different environment while accomplishing the same task repeatedly [74]. The parameters of performance can also differ with the type of robot being considered. For industrial robots, repeatability, accuracy, load capacity, and velocity are the most important parameters [73], whereas the same parameters might not be considered for other mobile robots like unmanned vehicles, quadruped robots, humanoid robots, etc.
Performance measurement in a robot takes into consideration four major criteria: robot performance, system performance [75], autonomy [74,75,76], and human–robot interaction [75,76,77,78].

8.1. Robot Performance

8.1.1. Self-Awareness

Self-awareness, in the context of robotics, refers to a robot’s ability to accurately assess its own capabilities. The level of self-awareness directly impacts the need for human monitoring and intervention. Fong et al. [75] have proposed subjective measures for assessing self-awareness (see Table 3).

8.1.2. Human Awareness

Human awareness is a qualitative aspect that gauges a robot’s awareness of humans. It involves sensitivity to human presence and the capability to execute commands from humans, requiring knowledge of these commands [75]. This parameter facilitates the comparison of multiple robots in situations where quantitative performance measurement is challenging, aiding in the selection of the most suitable robot.

8.2. System Performance

System performance is the measure of how a robot and a human perform together as a team. It is closely associated with human–robot interaction (HRI) [75,77,78]. Here, the main aim is to measure the human–robot team’s performance rather than task-specific performance [75].

8.3. Autonomy

Researchers have proposed several techniques for the measurement of the autonomy of a robot. One such approach is to compare the autonomous performance of several robots experimentally (some competitions such as Robocup or DARPA’s Grand Challenge are examples). Considering that all the robots are of different builds, have different motion capacities, have different sensors, and run on different algorithms, they are assigned the same tasks under the same environmental conditions. Based on the tasks they carry out in relation to the benchmark set, ratings can be provided to them. To be autonomous, the robot must have sensing, decisional, and actuation capabilities [74].
Neglect tolerance is another technical metric used for measuring autonomy. It measures the efficiency of a robot when left unattended by humans, i.e., how long it takes for the robot’s performance to diminish when it is not attended to by a human [75].
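Neglect tolerance can be estimated from a logged performance trace as the time, after the operator stops attending, until performance first falls below an acceptable threshold. The trace, threshold, and sampling period in the sketch below are illustrative values.

```python
# Sketch of estimating neglect tolerance from a logged performance trace:
# the time after the operator stops attending until task performance first
# drops below an acceptable threshold. The trace, sampling period, and
# threshold are illustrative values, not data from any robot discussed here.

def neglect_tolerance(performance_trace, threshold, sample_period_s):
    """Seconds of neglect before performance first falls below the threshold."""
    for i, p in enumerate(performance_trace):
        if p < threshold:
            return i * sample_period_s
    return len(performance_trace) * sample_period_s   # never degraded in the log

trace = [0.95, 0.93, 0.90, 0.84, 0.76, 0.65, 0.52, 0.41]  # performance after neglect begins
print(neglect_tolerance(trace, threshold=0.7, sample_period_s=10.0))  # -> 50.0 s
```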
Huang et al. [76] have devised a form of detailed modeling that includes several sets of metrics to measure autonomy levels (see Table 4).

8.4. Human–Robot Interaction (HRI)

As robots begin to interact closely with humans, the need to build systems worthy of trust regarding the safety and quality of the interaction has increased [79]. Different experimental studies have been carried out to evaluate human–robot interactions in the form of metrics. For instance, Schaefer [77] introduced a subjective tool for the measurement of human–robot trust in terms of percentage. The 40-item Trust Perception Scale-HRI and the 14-item sub-scale are created to measure the trust specific to the human, robot, and environmental elements.
Olsen and Goodrich [78], on the other hand, propose six interrelated generic metrics to guide the design of human–robot interaction (Table 5).
Using these metrics together provides a framework for thinking about interaction design. For a human–robot system to accomplish any task, an increase in effectiveness is a must. To increase this effectiveness, neglect tolerance [75,78] needs to be increased and the interaction effort of the interface needs to be reduced.
For dealing with interactions involving multiple robots in a team, researchers have introduced distinct performance metrics [80]. In single-robot interactions, neglect efficiency (NE) and interaction efficiency (IE) serve as widely utilized metric classes. However, in the context of multi-robot interactions, where the success of the team depends on human attention allocation to the robots, additional metrics are required. These include the attention allocation efficiency (AAE) metric class. AAE can be evaluated through various means, such as determining the time needed for the operator to decide which robot to attend to after completing an interaction with another robot and assessing the quality of that decision [71].

9. Introduction to Mobile Dog Robots

Dog robots are quadrupedal mobile robots. Since their appearance is that of a dog, they are commonly referred to as dog robots. Among the different versions of mobile dog robots available, three popular dog robots (see Figure 15) are commercially available and are being used for marketing as well as research purposes: (1) Boston Dynamics’ Spot (Boston Dynamics, Waltham, MA, USA) [81], (2) Ghost Robotics’ Vision 60 (Ghost Robotics, Philadelphia, PA, USA) [82], and (3) ANYbotics’ ANYmal (ANYbotics, Zurich, Switzerland) [83].
This review paper explores the requirements a dog robot should satisfy to be suitable for future research on enhancing the present physical security posture of nuclear power plants. For this, we present a detailed comparison of the capabilities of the three dog robots across five different areas below. The tabulated comparison between the robots can be found in Table A1 in Appendix A.

9.1. Operational Capabilities

The operational capability of the robot is discussed in five subsections: payload and tracking, speed and distance, battery capabilities, safety and protection, and environmental conditions.

9.1.1. Payload and Tracking

The operational payload of the SPOT robot is high (up to 14 kg) compared with that of the Ghost Robotics robot (9 kg) and the ANYmal Robotics robot (10 kg). The location tracking capabilities of the SPOT and Ghost robots are almost the same. The SPOT robot uses high-resolution GPS, an Orion RTLS tag, and robot odometry for tracking; the Ghost robot uses integrated GPS waypoint and playback route automation from the OCU; and the ANYmal Robotics robot uses waypoints for autonomous navigation. Overall, all robots know their absolute location, although they use different technologies.

9.1.2. Speed and Distance

The SPOT robot can power itself for one and a half hours if it just stands or walks and for three hours if it lies down. It can walk 9 km in 1.5 h at 1.6 m/s. The Ghost robot has four different speed modes: standard walk, fast walk, run, and sprint. The robot’s speed is 1 m/s (3.3 ft/s) in standard walk mode, 1.6 m/s (5.2 ft/s) in fast walk mode, 2.2 m/s (7 ft/s) in run mode, and 3 m/s (10 ft/s) in sprint mode. It can carry out a fast walk for 10 km in three hours. The ANYmal Robotics robot has a maximum speed of 1.3 m/s and a safe and efficient operating speed of 0.75 m/s. It can walk up to 4 km at 1.3 m/s. The Ghost Robotics robot has better speed and distance compared with the SPOT robot and the ANYmal Robotics robot. However, the overall performance also depends on the amount of battery charge, which varies with the payload, terrain, and temperature of the surroundings.
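The walking-range figures follow directly from range = speed × time. The short calculation below reproduces two of the figures quoted above: SPOT’s roughly 9 km in 1.5 h at 1.6 m/s, and the time ANYmal needs to cover its 4 km range at 1.3 m/s.

```python
# Walking range and endurance follow from range = speed * time. The sketch
# below reproduces two figures quoted above: SPOT's ~9 km in 1.5 h at 1.6 m/s,
# and the time ANYmal needs to cover its 4 km range at 1.3 m/s.

def range_km(speed_mps, hours):
    return speed_mps * 3600.0 * hours / 1000.0

def hours_for(distance_km, speed_mps):
    return distance_km * 1000.0 / (speed_mps * 3600.0)

print(round(range_km(1.6, 1.5), 2))     # -> 8.64 km, i.e. roughly the quoted 9 km
print(round(hours_for(4.0, 1.3), 2))    # -> ~0.85 h to cover ANYmal's 4 km range
```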

9.1.3. Battery Capabilities

All the robots use a Li-ion battery. The range on a full charge at normal speed is 17 km for the SPOT robot, 12.8 km for the Ghost Robotics robot, and 4 km for the ANYmal Robotics robot. The standby time for the SPOT robot is one and a half hours, while that for the Ghost Robotics robot is 21 h. The recharge time of the ANYmal robot is 3 h for a full charge and 100 min for a 70% quick charge. The SPOT robot does not have ingress protection, whereas the Ghost Robotics and ANYmal Robotics robots do (IP67). Overall, the Ghost Robotics robot has better battery capabilities, and the SPOT robot has the longest range.

9.1.4. Safety and Protection

The safety and protection features include an emergency STOP on the robot and a remote-controlled STOP. The ANYmal Robotics robot has a push button on the robot to disconnect the complete power supply and a push button on the remote control to disable the actuators.

9.1.5. Environmental Conditions

The environmental conditions include temperature, day or night operation, and water and dust ingress protection. The operating temperature of the SPOT robot is −20 °C to 45 °C, that of the Ghost Robotics robot is −40 °C to 55 °C, and that of the ANYmal Robotics robot is −10 °C to 50 °C. No light is required for the Ghost Robotics and ANYmal Robotics robots to operate autonomously. The SPOT robot can operate in water up to knee height, whereas the Ghost Robotics and ANYmal Robotics robots are fully protected against water; the Ghost Robotics robot can also float on its belly. Overall, the Ghost Robotics robot has better environmental capabilities compared with the SPOT and ANYmal Robotics robots.

9.2. Sensory Capabilities

The sensory capabilities include the use of radiation sensors for dosimetry, CBRN detection, spectroscopy, and vision technologies.

9.2.1. Rad Sensors

The SPOT robot uses RDS-32 m, Accurad PRD, and SPIR Explorer internal GMs (Mirion Technologies, Atlanta, GA, USA) for dosimetry. The Ghost Robotics and ANYmal Robotics robots do not have any equipment for dosimetry. The SPOT robot has gas environment sensors, toxin-sensing chemical sensors, and Alpha/Beta CAM measurement sensors (Mirion Technologies), while the Ghost robot has CBRN sensors and the ANYmal robot does not have any environmental or toxin-measuring sensors. The SPOT robot has equipment to carry out spectroscopy measurements, i.e., SPIR-Explorer NAI/LaBr3, Osprey MCA, and MicroGe (Mirion Technologies), whereas the Ghost Robotics and ANYmal Robotics robots do not. Overall, the SPOT robot dominates in Rad sensor capabilities.

9.2.2. Navigation and Object Detection

The SPOT robot has an integrated radiometric thermal camera, a pan–tilt–zoom (PTZ) camera with 30× optical zoom, an infrared (IR) camera, a 360° camera, and audio with a microphone. The Ghost Robotics robot has a PTZ camera along with a LIDAR and a thermal (IR) surround sensor, and the ANYmal Robotics robot has a zoom camera, a depth camera along with a LIDAR sensor, and a thermal camera. Overall, the ANYmal Robotics robot has better navigation and object detection capabilities.

9.3. Technical Capabilities

The technical capabilities of the robot include computing power, communication, and integration.

9.3.1. Computing Power

Computing power is a critical technical capability for an autonomous robot. The SPOT robot uses an Intel i5 8th Gen processor, whereas the Ghost Robotics robot uses NVIDIA edge computing hardware (Xavier) and the ANYmal Robotics robot uses two 8th Gen Intel Core i7 six-core processors. The same NVIDIA Xavier is used in leading autonomous vehicles, so the Ghost Robotics robot has more potential to deploy demanding applications. The SPOT robot has 16 GB DDR4 2666 MHz RAM and 512 GB SSD storage, whereas the Ghost Robotics robot has an NVIDIA Xavier with 32 GB RAM and 2 TB NVMe SSD storage, and the ANYmal Robotics robot has two 8 GB RAM modules and two 240 GB SSDs. The Ghost Robotics robot has better storage and computing power compared with the SPOT robot.

9.3.2. Communication

The SPOT robot uses a gigabit connection for real-time communication and has an ethernet connector, a low-voltage communications connector (TTL Serial), a GPS antenna, and a USB. In contrast, the Ghost Robotics robot uses Wi-Fi-6, 4G, 5G, and FirstNet radios for communication. The ANYmal Robotics robot uses Wi-Fi and 4G LTE for communication. The SPOT robot dominates communication, as real-time communication is essential, especially for security and surveillance monitoring.

9.3.3. Integration

Integration APIs are critical for the robot to communicate and share data. Depending on the task type, either low-level or high-level APIs are used. The SPOT robot uses APIs (space control and end-effector control) to control its robotic arm and third-party APIs for the Robot Operating System (ROS); the Ghost Robotics robot can integrate any sensor, radio, or electronics device using the standard ROS/ROS2 framework; and the ANYmal Robotics robot uses the ANYmal API, a gRPC-based API for mission control data integration. The SPOT and ANYmal robot integrations are more customized and offer fewer options. In contrast, the Ghost Robotics robot runs on open-source software such as the ROS/ROS2 framework, to which any device can be connected. Overall, it is difficult to determine which has better integration capabilities, as there are drawbacks to both custom and open-source integrations.
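Since all three platforms expose some form of ROS/ROS2-style integration, a minimal rclpy node illustrates what attaching a third-party sensor feed can look like. The node name, topic, and payload below are illustrative and are not tied to any vendor SDK.

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import String

# Minimal ROS 2 (rclpy) node publishing a stubbed sensor reading, as an
# illustration of the kind of ROS/ROS2 integration described above. The node
# name, topic, and payload are illustrative; vendor SDKs (SPOT API, ANYmal
# API) expose their own interfaces and are not reproduced here.

class SensorBridge(Node):
    def __init__(self):
        super().__init__("security_sensor_bridge")
        self.publisher = self.create_publisher(String, "patrol/sensor_status", 10)
        self.timer = self.create_timer(1.0, self.publish_status)  # publish at 1 Hz

    def publish_status(self):
        msg = String()
        msg.data = "radiation_cps=12"   # hypothetical stubbed reading
        self.publisher.publish(msg)

def main():
    rclpy.init()
    node = SensorBridge()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()

if __name__ == "__main__":
    main()
```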

9.4. Dexterity Capabilities

Dexterity capabilities are skills in performing tasks such as picking up an object, identifying flying objects overhead, swimming, and blind locomotion. They are generally associated with all kinds of arm movement and locomotion.

9.4.1. Manipulator

The SPOT robot has an arm manipulator with a gripper attached to it; it can grasp, pick, and place objects, and the gripper also has a 4K camera. The Ghost robot has an HDT Global customizable manipulator; however, this arm is not as robust as the SPOT robot’s manipulator.

9.4.2. Blind and Inverted Locomotion

The SPOT robot does not have the capability of blind locomotion, whereas the Ghost Robotics robot can navigate with all its sensor capabilities switched off. The SPOT, ANYmal Robotics, and Ghost Robotics robots can all right themselves and stand back on their feet after being inverted or immobilized.

9.5. Software Capabilities

The SPOT robot has a Mirion backpack, which is an embedded computing system. It is connected to a controller interface with the Mirion SPIR-explorer overlay. The Ghost Robotics robot has an MPO Mobile control center. This is a rugged suitcase with AVERT-MPO servers, an outdoor viewable display, a keyboard, a mouse, and a standalone battery module for the 8 h runtime of the AVERT-MPO system.

Simulation

Neither the SPOT robot nor the ANYmal Robotics robot ships with a dedicated simulation platform, whereas the Ghost Robotics robot provides a simulation platform that can mimic real-time scenarios.

10. Research Roadmap

Building on the literature review of basic robot performance and the state-of-the-art comparison of the three popular dog robots, we propose a new direction for research on using mobile dog robots for security applications at remotely operated advanced reactors.
This research roadmap identifies the experiments that can be conducted, the data that can be collected, and the parameters that must be determined in each experiment to assess robot performance against six performance parameters: (a) autonomous navigation and patrolling, (b) runtime monitoring, (c) path planning in unstructured environments, (d) human–robot social planning, (e) autonomous inspection, and (f) real-time communication.
For instance, experiments to assess autonomous navigation and patrolling may include obstacle avoidance, mapping, localization, patrolling, and dynamic obstacle avoidance tests, among others. For each of these tests, data such as the time taken to navigate, the robot's speed during navigation, the map created, and a comparison of that map with a ground-truth map of the environment can be collected. Likewise, for real-time communication, tests include latency, bandwidth, reliability, and scalability tests, with data including packet loss, round-trip time, the size of the data transferred, and the time taken to transfer them; a minimal sketch of such a latency and packet-loss probe is given below.
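As one example of how the real-time communication tests could be instrumented, the sketch below (Python, standard library only) estimates round-trip time and packet loss by sending UDP probe packets to an echo endpoint. The address, port, packet count, and timeout are assumptions for illustration; a field test would target the robot's actual communication stack over Wi-Fi, 4G/5G, or a wired link.

```python
# Hedged sketch of a round-trip-time and packet-loss probe over UDP.
# Assumptions: the robot (or a test echo server) listens at ROBOT_ADDR and
# echoes datagrams back; real tests would use the vendor's own stack.
import socket
import time

ROBOT_ADDR = ("192.168.1.50", 9000)  # hypothetical robot endpoint
NUM_PACKETS = 100
TIMEOUT_S = 0.5


def measure_link():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(TIMEOUT_S)
    rtts, lost = [], 0
    for seq in range(NUM_PACKETS):
        payload = f"probe-{seq}".encode()
        start = time.perf_counter()
        sock.sendto(payload, ROBOT_ADDR)
        try:
            sock.recvfrom(1024)
            rtts.append((time.perf_counter() - start) * 1000.0)  # ms
        except socket.timeout:
            lost += 1  # count unanswered probes as packet loss
    sock.close()
    if rtts:
        print(f"mean RTT: {sum(rtts) / len(rtts):.2f} ms")
    print(f"packet loss: {100.0 * lost / NUM_PACKETS:.1f}%")


if __name__ == "__main__":
    measure_link()
```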
Following this roadmap, research can be conducted on a dog robot to exploit its existing technical capabilities for physical security applications. Additional sensors can also be introduced on the robot so that their behavior and performance can be analyzed in each operating environment.

11. Discussions

The findings presented in this literature review highlight the critical challenges faced by nuclear power plants (NPPs), particularly with respect to their economic viability. The revenue gap analysis conducted by the Idaho National Laboratory in 2016 underscored the financial struggles of a significant portion of NPPs, with 63 out of 79 power plants operating at a loss. Escalating operation and maintenance (O&M) costs, which constitute nearly 70% of the power generation cost, pose a substantial threat to the sustainability of nuclear power in the face of stiff competition from natural gas and renewable energy sources [1].
To address these economic constraints, this study explores the potential of robotic solutions, specifically dog robots, for enhancing the physical security of nuclear power plants. The rationale for integrating robots lies in the need to reduce O&M costs, with labor accounting for 50% of overall O&M expenses in NPPs. The historical context of using robots in hazardous environments, dating back to the late 1940s, serves as a testament to the long-standing interest in leveraging automation to mitigate risks associated with nuclear facilities [14].
The review further emphasizes the fundamental capabilities required for robots intended for various applications, such as construction, disaster response, underwater exploration, and more. For robots to excel in these novel environments, certain basic functionalities are deemed essential, with autonomous mobility and manipulation being key prerequisites. Additionally, high speed is identified as another critical capability for mobile robots [64,65]. In the context of movement potentiality, the ability to localize, build maps, and navigate efficiently and safely are crucial aspects of a mobile robot’s functionality. Localization involves the robot’s orientation within a specific locality, map-building focuses on creating route maps, and navigation refers to the effective and safe travel to a desired destination [66].
The versatility of mobile robots is highlighted, with their functions varying across types such as manipulator, legged, and wheeled robots. These robots must interact with their environment through activities like grabbing, lifting, pushing, and object manipulation, and the distinctive capabilities of each robot type dictate its specific functions [64]. A critical aspect of the literature review addresses the formalization of problems by robots. Robots encounter uncertainties, especially in complex tasks like household tidying. The study by Nebel et al. [67] demonstrates the application of continual planning to address uncertainties in a nondeterministic, partially observable open domain. Another study by Ivaldi et al. [68] explores the learning perspective, focusing on the iCub humanoid robot's ability to recognize objects through curiosity-driven manipulation and interactive exploration.
The review extends to the field of human–robot interaction (HRI), emphasizing the significance of a robot’s interaction with its environment and human counterparts. Wadoro, an autonomous mobile surveillance robot, is presented as an example, actively seeking potential human intruders in a semi-outdoor environment [36].
Performance assessment of robots is categorized into qualitative and quantitative measures. The literature emphasizes the importance of benchmarking to assess the qualitative performance of robots and presents a case study involving the Urban Search and Rescue (USAR) robot, where stakeholders defined deployment categories and performance requirements [69]. For the quantitative assessment of human–robot interaction, Singh et al. [70] conducted an experiment using a wearable glove for object grasp applications. This study involved the integration of sensors to capture grasp information from human demonstrators, demonstrating the effectiveness of a human–robot interaction system based on learning from demonstration.
The parameterization of robot performance is explored, highlighting the complexity in formulating a mathematical expression for quantitative assessment. The discussion touches upon industry-wide standards and the variation in measurement parameters based on the type of robot considered, such as industrial robots or mobile robots like unmanned vehicles, quadruped robots, and humanoid robots. Performance measurement criteria encompass self-awareness, human-awareness, system performance, autonomy, and human–robot interaction. Metrics for self-awareness, human-awareness, and system performance involve subjective and qualitative measures. Autonomy is measured through various techniques, including comparison competitions and neglect tolerance, while human–robot interaction is evaluated through metrics like trust perception, task effectiveness, and interaction effort [74,75,76,77,78].
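To make the interaction metrics concrete, the short sketch below computes robot attention demand, free time, and an approximate fan-out from assumed interaction-effort and neglect-time measurements. The formulas follow one common formulation of the metrics discussed in [78], and the timing values are purely illustrative rather than measured.

```python
# Illustrative computation of human-robot interaction metrics.
# The formulas reflect one common formulation of robot attention demand,
# free time, and fan out (cf. [78]); the timing values below are assumed.

interaction_effort_s = 12.0  # average time (s) an operator spends servicing the robot
neglect_time_s = 48.0        # average time (s) the robot operates unattended

# Robot attention demand: fraction of total time the operator must attend.
rad = interaction_effort_s / (interaction_effort_s + neglect_time_s)

# Free time: fraction of time the operator is free for other tasks.
free_time = 1.0 - rad

# Fan out: rough upper bound on how many robots one operator could supervise.
fan_out = 1.0 / rad

print(f"Robot attention demand: {rad:.2f}")
print(f"Free time fraction:     {free_time:.2f}")
print(f"Approximate fan out:    {fan_out:.1f} robots per operator")
```

With the assumed values (12 s of interaction per 48 s of neglect), the sketch yields an attention demand of 0.2, a free-time fraction of 0.8, and a fan-out of roughly five robots per operator, illustrating how such metrics could be reported in the proposed experiments.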
The comprehensive parameterization and measurement of robot performance contribute to a deeper understanding of the capabilities required for successful deployment across various applications. These findings provide valuable insights for the development and evaluation of future robotic systems, ensuring their effectiveness, autonomy, and seamless interaction in diverse environments.
Furthermore, the comparative analysis of dog robots, specifically Boston Dynamics' Spot [81], Ghost Robotics' Vision 60 [82], and ANYbotics' ANYmal [83], reveals crucial insights into their operational, sensory, technical, dexterity, and software capabilities. This detailed comparison informs the selection of an optimal dog robot for future research on enhancing the physical security of nuclear power plants (NPPs); the choice depends on specific requirements and priorities. For applications demanding robust radiation monitoring, the SPOT robot's superior rad sensors make it a preferred choice, whereas the Ghost Robotics robot, with its advanced computing power, communication flexibility, and simulation capabilities, presents a versatile option for diverse research applications. In each case, the operational, sensory, technical, dexterity, and software capabilities of the selected robot should be aligned with the unique challenges and goals of securing nuclear facilities.
Building upon the insights gained from the literature review on basic robot performance and the state-of-the-art comparison of popular dog robots, a new research direction is proposed. This roadmap outlines experiments, data collection methods, and key parameters for assessing the performance of mobile dog robots in security applications, specifically in the context of remotely operated advanced reactors.

12. Conclusions and Future Works

The comparative analysis offers practical insights for selecting dog robots tailored to specific needs in nuclear power plants, guiding researchers and practitioners. The identified strengths can drive advancements in security measures. The proposed research roadmap aligns with evolving challenges, providing a strategic guide for future investigations. The outlined experiments within each performance parameter will contribute essential data to optimize mobile dog robots, shaping innovative solutions for enhanced security in advanced reactor environments. The combined insights create a comprehensive framework for addressing the intricate demands of securing nuclear facilities.
The first phase of this research involved a literature review assessing robots across different categories and a review of baseline practices for mobile robots in other industries. Additionally, a detailed research roadmap has been defined that includes experiments for determining whether mobile dog robots can be utilized for physical security applications at power plants. Future work includes short-term and long-term tasks. The short-term tasks for the next phase include the following (a minimal object-detection sketch follows the list):
[a]
Utilizing the Robot Operating System (ROS) to create a simulation environment for object detection;
[b]
Deploying ROS on a robot through coding and using it for object detection;
[c]
Introducing different sensors on an existing robot to observe how each sensor behaves, and collecting and analyzing the resulting data.
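As a starting point for the object detection tasks listed above, the following sketch outlines a ROS 2 node that subscribes to an assumed camera topic and counts people in each frame using OpenCV's stock HOG pedestrian detector. The topic names are hypothetical, and the HOG detector stands in for whatever detection model would actually be deployed; the sketch is intended only to show the shape of the integration.

```python
# Hedged sketch of short-term task [b]: a ROS 2 node running a simple person
# detector on an assumed camera topic. Topic names are illustrative; the HOG
# detector is a lightweight placeholder for a production detection model.
import cv2
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from std_msgs.msg import Int32
from cv_bridge import CvBridge


class IntruderDetector(Node):
    def __init__(self):
        super().__init__('intruder_detector')
        self.bridge = CvBridge()
        # Stock OpenCV HOG pedestrian detector (placeholder model).
        self.hog = cv2.HOGDescriptor()
        self.hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
        self.sub = self.create_subscription(
            Image, '/robot/camera/image_raw', self.on_image, 10)
        self.pub = self.create_publisher(Int32, '/security/person_count', 10)

    def on_image(self, msg: Image) -> None:
        # Convert the ROS image to an OpenCV frame and run detection.
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        boxes, _ = self.hog.detectMultiScale(frame, winStride=(8, 8))
        count = Int32()
        count.data = len(boxes)
        self.pub.publish(count)


def main():
    rclpy.init()
    rclpy.spin(IntruderDetector())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```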
In addition to these short-term tasks, several long-term research activities are proposed. These include the following:
[a]
Setting up the experimental facility required to perform the research;
[b]
Performing a pilot demonstration at an existing or future nuclear power plant (NPP) site using a dog robot;
[c]
Engaging with physical security stakeholders at currently operating plants and with advanced reactor developers;
[d]
Developing a table-top model of a typical NPP-protected area indicating key elements of the target set and physical security posture.

Author Contributions

Conceptualization, M.M. and V.Y.; methodology, M.M., V.Y., T.D. and U.S.; software, U.S. and U.S.M.; validation, M.M., V.Y., T.D. and U.S.; formal analysis, M.M., V.Y., T.D., U.S. and U.S.M.; investigation, M.M., V.Y., T.D., U.S. and U.S.M.; resources, M.M., V.Y., T.D. and U.S.; data curation, V.Y., T.D., U.S. and U.S.M.; writing—original draft preparation, U.S.; writing—review and editing, M.M., V.Y., T.D., U.S. and U.S.M.; visualization, M.M., V.Y., T.D. and U.S.; supervision, M.M., V.Y. and T.D.; project administration, M.M.; funding acquisition, M.M. All authors have read and agreed to the published version of the manuscript.

Funding

The literature review for this research was supported by the Center for Advanced Energy Studies (CAES) business development funds as part of Idaho National Laboratory under the Department of Energy (DOE) Idaho Operations Office (an agency of the U.S. Government) Contract DE-AC07-05ID145142.

Data Availability Statement

Data are contained within the article or Appendix A.

Acknowledgments

The authors would like to express their gratitude to the Center for Advanced Energy Studies (CAES) for providing the necessary program development funds to carry out this literature review. Additionally, the authors would like to thank Jared Cantrell, Laboratory Manager at the Department of Civil and Environmental Engineering at Idaho State University, for his assistance in compiling the literature required for the article. Special thanks are also extended to Kathryn Hogarth for proofreading this article and giving it its final form. Furthermore, the authors acknowledge the support provided by Idaho State University for this project.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Table A1. State-of-the-art comparison between Boston Dynamics’ Spot Enterprise, Ghost Robotics’ Vision 60, and ANYbotics’ ANYmal robots [81,82,83].
CapabilitiesSpotGhostANYmal
Basic Information
Base PriceUSD 160,000USD 207,600USD 150,000
Maintenance/service LifeNo maintenance plan, only battery changes needed. Two-filter mesh, check those conditions. Foot pads, single-bolt
1 year warranty
Service plan, self-charging
Not available (maintenance is as per contract). V60 Q-year PREMIUM Maintenance + Preventive Care + Software Maintenance, version number 8.6.4Not available
Repairable and components available in the U.S.From Boston DynamicsAvailable to maximum extentNo
Operational Capabilities
Payload and Tracking
PayloadUp to 14 kgWeight 9 kg +/− (20 lbs +/−) w/base battery10 kg (nominal), 15 kg (reduced performance)
Location trackingHR-GPS, Orion RTLS tag, Robot OdometryIntegrated GPS waypoint navigation and playback from operator control unit (OCU).High-precision autonomous navigation with centimeter accuracy.
Speed and distance
Speed, agility1.6 m/s for 9 km in 1.5 h.Standard walk: 1 m/s (3.3 ft/s); Fast walk: 1.6 m/s (5.2 ft/s); Run:2.2 m/s (7 ft/s); Sprint: 3 m/s (10 ft/s).Max speed: 1.3 m/s
Safe and efficient operation: 0.75 m/s
Maximum distance17 km on full charge,
one hour and half if it just stands or walks, 3 h if laid down.
10 km, 3 h at a fast speed.4 km at 1.3 m/s
Battery Capabilities
Type and capacityCapacity 564 WHCapacity 1250 WH Li-IonSwappable Li-Ion battery, UN 38.3 certified 932.4 WH
Running time and rangeRange on full charge: 17 km; Operating time: 1.5 h (Standing/Walking), 3 h (laying down)Li-ion battery allows for 6.8–12.8 km (4.2–8 miles) per charge, 8–10 h of mixed use, and 21 h of standby.Battery life: 90–120 min
Range: 4 km full charge, up to 2 km for typical inspection
Speed may be reduced on rough or slippery terrain.
Recharge time60 minNot available3 h for full charge, 100 min for 70% quick charge
Dimensions380 × 315 × 178 mm (15 × 12.4 × 7 in) L × W × HNot availableIn mm: 466 × 136 × 78 mm (18.35 × 5.35 × 3.07 in)
Weight5.2 kg (11.5 lbs)7.4 kg (16.3 lbs)5.5 kg (12.13 lbs)
Ingress protectionIP54IP67IP67
Safety and Protection
Emergency stopNot AvailableNot availablePush button on robot to disconnect power supply
Remote-control emergency stopNot AvailableNot availablePush button on remote control to disable the actuators
ComplianceDesigned according to ISO 12100 [84] for risk assessment and reduction methodology and IEC 60204-1 [85] for electrical safetyNot AvailableCE marked complying with Machinery Directive 2006/42/EC (MD),
EMC Directive 2014/30/EU (EMCD) [86], Low Voltage Directive 2014/35/
EU, Radio Equipment Directive 2014/53/EU (RED) [87] and Restriction
of Hazardous Substances in Electrical and Electronic Equipment
(ROHS) 2011/65/EU [88].
Environmental Conditions
Temperature−20 °C to 45 °C−40 °C to 55 °C (−40° to 131 °F)Specified: 0–40 °C [32–104 °F]; Typical: −10–50 °C [14–122 °F]
Lighting conditionsAmbient light requiredNot availableNo light required for autonomous operation and
inspection. Low light (min. 20 lux) needed for automatic
docking and tele-operation.
Sensory Capabilities
Rad sensors
DosimetryRDS-32 m + Supported probes for RDS-32
AccuRad PRD
SPIR_Explorer Internal GMs
No dosimetry. Interface package. Open source. ROS/ROS2 APINot available
Extra sensoryMeasurement capabilities:
Alpha/beta CAM
Environmental sensors
Toxic chemical sensing
CBRN (Chemical, Biological, radiological, and nuclear)Ultrasonic microphone: sampling frequency of 0–384 kHz
Spotlight: 189 lm continuous, 3790 lm for a short time
Range of motion: pan ± 165°, tilt ± 90°/+180°, speed 340°/s
SpectroscopySPIR-Explorer NaI/LaBr3
Data Analyst with Detector
e.g., GR-1 CZT
H3DM100/M400
Osprey MCA
MicroGe
Ram spectroscopyNot available
Navigation and Detection
VisionHD video
FLIR/IR video
3D depth imaging and mapping
RGB camera
Thermal sensors
PTZ camera
Zoom camera; default: 1080 × 1920 px, 15 FPS
Maximum: 2160 × 3840 px (QFHD/4 k), 30 FPS
20× optical zoom
70.2° to 4.1° FOV (horizontal)
LIDAR3D LIDAR scanning station used for surveying and 3D digital reconstruction.
Drawback: requires an entire payload space.
Lidar: OusterLIDAR 16 channels, 300 k pts/s, 10 Hz sweep; Range: 0.4–100 m, 3 cm accuracy; FOV: 360° × 15.0° to −15.0° (H × V); Laser: 905 nm, Eye-safe Class 1
Depth cameraNot available87 × 580 FOV, 640 × 480 at 90 FPS (fore, aft, and belly)0.3–3 m range, 87.3 × 58.1 × 95.3° depth FOV
(horizontal/vertical/diagonal), class 1 laser product
under the EN/IEC 60825-1, Edition 3 (2014)
Thermal cameraPrice: USD 55 k
30× zoom; 360 view, lighting built into
Thermal sensor: Resolution 645–512, refresh 69/56°, −5 °C temp.
Thermal (IR) surround sensor: 4 × 34° FoV lens, 640 × 512 resolution, 30 Hz and NVIDIA GPU. WaterproofThermal camera: −40 to +550 °C (radiometry), 336 × 256 px, 46° FOV (horizontal)
Technical Capabilities
Computing powerI5 Intel 8th Gen (Whiskey Lake-U) Core CCG LifecyclePower NVIDIA Edge Computing: Xavier 32 GB RAM used in autonomous vehicles, deploy AI and automation applications.2 × 8th Gen Intel Core i7 (6-core) CPU with 2 × 8 GB memory (RAM)
RAM and data storage16 GB DDR4 2666 MHz, 512 GB SSDNVIDIA Xavier 32 GB RAM, 2 TB NVMe SSDwith 2 × 8 GB Memory (RAM),
2 × 240 GB SSD
Communication and Integration
Communication (connect with the capability)Gigabit connection to the robot for real-time communication,
Ethernet Connector, low-voltage communications connector (TTL Serial),
GPS antenna, USB.
Wi-Fi-6, 4G, 5G, and FirstNet radiosWi-Fi: built-in module 2.4/5 Ghz, 802.11 ac wave2
The access point or client mode
4G LTE: add-on module, LTE Cat.12
Integration (low-level API/high-level API)Integrate manipulator and API options (Joint space control and end-effector control), 3rd party available for ROSIntegrate any sensor, radio or electronics sign industry-standard ROS/ROS2 framework. C/C++|ROS, ROS2, MAVLink-compatible w/1:1 Gazebo SIM support.ANYMAL API: gRPC-based API for
mission control
data integration
Dexterity Capabilities
Manipulator
Arm/manipulatorYes. Separate purchase −USD 80 k, does have constraints. Gripper has a 4 k camera 28 inches in front of the robot.
Grasping, picking, placing
constrained Manipulation
Door opening
HDT global customizable armNot available
Blind and Inverted Locomotion
Blind locomotion (in dark, snow or tall grass)Dark: manual control, turn off ground floor detection
Snow: can navigate.
Blind-mode operation in difficult terrain, at high speed, in tall grass, and in confined spaces.Can operate autonomously under no-light conditions.
Operates in inverted motion/immobilizationYes.Yes.Yes.
Software Capabilities
SimulatorCAD model. Used in ROS simulatorYes.No

References

  1. Yadav, V.; Biersdorf, J. Utilizing FLEX Equipment for Operations and Maintenance Cost Reduction in Nuclear Power Plants; INL/EXT-19-55445; Idaho National Laboratory: Idaho Falls, ID, USA, 2019.
  2. Gallery—World Nuclear Association. Available online: https://world-nuclear.org/gallery/nuclear-power-economics-and-project-structuring-re/breakdown-of-operating-costs-for-nuclear,-coal-and.aspx (accessed on 2 March 2024).
  3. Garcia, M.L. Vulnerability Assessment of Physical Protection Systems; Elsevier: Amsterdam, The Netherlands, 2005. [Google Scholar]
  4. ISO 8373:2012; Robots and Robotic Devices—Vocabulary. International Organization for Standardization: Geneva, Switzerland, 2012.
  5. Kurfess, T.R. (Ed.) Robotics and Automation Handbook; CRC Press: Boca Raton, FL, USA, 2005; Volume 414. [Google Scholar]
  6. Zereik, E.; Bibuli, M.; Mišković, N.; Ridao, P.; Pascoal, A. Challenges and future trends in marine robotics. Annu. Rev. Control 2018, 46, 350–368. [Google Scholar] [CrossRef]
  7. Zhang, F.; Marani, G.; Smith, R.N.; Choi, H.T. Future trends in marine robotics [tc spotlight]. IEEE Robot. Autom. Mag. 2015, 22, 14–122. [Google Scholar] [CrossRef]
  8. Flores-Abad, A.; Ma, O.; Pham, K.; Ulrich, S. A review of space robotics technologies for on-orbit servicing. Prog. Aerosp. Sci. 2014, 68, 1–26. [Google Scholar] [CrossRef]
  9. Yoshida, K. Achievements in space robotics. IEEE Robot. Autom. Mag. 2009, 16, 20–28. [Google Scholar] [CrossRef]
  10. Hirzinger, G.; Brunner, B.; Dietrich, J.; Heindl, J. Sensor-based space robotics-ROTEX and its telerobotic features. IEEE Trans. Robot. Autom. 1993, 9, 649–663. [Google Scholar] [CrossRef]
  11. Dario, P.; Guglielmelli, E.; Allotta, B.; Carrozza, M.C. Robotics for medical applications. IEEE Robot. Autom. Mag. 1996, 3, 44–56. [Google Scholar] [CrossRef]
  12. Davies, B. A review of robotics in surgery. Proc. Inst. Mech. Eng. Part H J. Eng. Med. 2000, 214, 129–140. [Google Scholar] [CrossRef]
  13. Taylor, R.H.; Menciassi, A.; Fichtinger, G.; Fiorini, P.; Dario, P. Medical robotics and computer-integrated surgery. In Handbook of Robotics; Springer: Berlin/Heidelberg, Germany, 2016; pp. 1657–1684. [Google Scholar]
  14. Fu, K.S.; Gonzalez, R.C.; Lee, C.G.; Freeman, H. Robotics: Control, Sensing, Vision, and Intelligence; McGraw-Hill: New York, NY, USA, 1987; Volume 1. [Google Scholar]
  15. Moore, T. Robots for nuclear power plants. IAEA Bull. 1985, 27, 31–38. [Google Scholar]
  16. Iqbal, J.; Tahir, A.M.; ul Islam, R. Robotics for nuclear power plants—Challenges and future perspectives. In Proceedings of the 2012 2nd International Conference on Applied Robotics for the Power Industry (CARPI), Zurich, Switzerland, 11–13 September 2012; pp. 151–156. [Google Scholar]
  17. Chen, Z.; Li, J.; Wang, S.; Wang, J.; Ma, L. Flexible gait transition for six wheel-legged robot with unstructured terrains. Robot. Auton. Syst. 2022, 150, 103989. [Google Scholar] [CrossRef]
  18. Chen, Z.; Li, J.; Wang, J.; Wang, S.; Zhao, J.; Li, J. Towards hybrid gait obstacle avoidance for a six wheel-legged robot with payload transportation. J. Intell. Robot. Syst. 2021, 102, 60. [Google Scholar] [CrossRef]
  19. Wang, S.; Chen, Z.; Li, J.; Wang, J.; Li, J.; Zhao, J. Flexible motion framework of the six wheel-legged robot: Experimental results. IEEE/ASME Trans. Mechatron. 2021, 27, 2246–2257. [Google Scholar] [CrossRef]
  20. Haynes, G.C.; Rizzi, A.A. Gaits and gait transitions for legged robots. In Proceedings of the 2006 IEEE International Conference on Robotics and Automation ICRA 2006, Orlando, FL, USA, 15–19 May 2006; pp. 1117–1122. [Google Scholar]
  21. Wermelinger, M.; Fankhauser, P.; Diethelm, R.; Krüsi, P.; Siegwart, R.; Hutter, M. Navigation planning for legged robots in challenging terrain. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Republic of Korea, 9–14 October 2016; pp. 1184–1189. [Google Scholar]
  22. Moeller, R.; Deemyad, T.; Sebastian, A. Autonomous navigation of an agricultural robot using RTK GPS and Pixhawk. In Proceedings of the 2020 Intermountain Engineering, Technology and Computing (IETC), Orem, UT, USA, 2–3 October 2020; pp. 1–6. [Google Scholar]
  23. Deemyad, T.; Moeller, R.; Sebastian, A. Chassis design and analysis of an autonomous ground vehicle (AGV) using genetic algorithm. In Proceedings of the 2020 Intermountain Engineering, Technology and Computing (IETC), Orem, UT, USA, 2–3 October 2020; pp. 1–6. [Google Scholar]
  24. Moderator, A. Security and Nuclear Power Plants: Robust and Significant. In U.S. NRC Blog; 23 August 2013. Available online: https://public-blog.nrc-gateway.gov/2013/08/23/security-and-nuclear-power-plants-robust-and-significant/ (accessed on 12 March 2024).
  25. U.S. Nuclear Regulatory Commission. Code of Federal Regulations, Title 10, Part 73, Section 45. Performance Capabilities for Fixed Site Physical Protection Systems. Available online: https://www.nrc.gov/reading-rm/doc-collections/cfr/part073/part073-0045.html (accessed on 12 March 2024).
  26. Siegwart, R.; Nourbakhsh, I.R.; Scaramuzza, D. Introduction to Autonomous Mobile Robots, 2nd ed.; The MIT Press: Cambridge, MA, USA; London, UK, 2011. [Google Scholar]
  27. López, J.; Pérez, D.; Zalama, E.; Gómez-García-Bermejo, J. Bellbot—A hotel assistant system using mobile robots. Int. J. Adv. Robot. Syst. 2013, 10, 40. [Google Scholar] [CrossRef]
  28. Nagatani, K.; Kiribayashi, S.; Okada, Y.; Otake, K.; Yoshida, K.; Tadokoro, S.; Nishimura, T.; Yoshida, T.; Koyanagi, E.; Fukushima, M.; et al. Emergency response to the nuclear accident at the Fukushima Daiichi Nuclear Power Plants using mobile rescue robots. J. Field Robot. 2013, 30, 44–63. [Google Scholar] [CrossRef]
  29. Rubio, F.; Valero, F.; Llopis-Albert, C. A review of mobile robots: Concepts, methods, theoretical framework, and applications. Int. J. Adv. Robot. Syst. 2019, 16, 1729881419839596. [Google Scholar] [CrossRef]
  30. Ajwad, S.A.; Iqbal, J.; Ullah, M.I.; Mehmood, A. A systematic review of current and emergent manipulator control approaches. Front. Mech. Eng. 2015, 10, 198–210. [Google Scholar] [CrossRef]
  31. Zomerdijk, M.J.J.; van der Wijk, V. Structural Design and Experiments of a Dynamically Balanced Inverted Four-Bar Linkage as Manipulator Arm for High Acceleration Applications. Actuators 2022, 11, 131. [Google Scholar] [CrossRef]
  32. Surati, S.; Hedaoo, S.; Rotti, T.; Ahuja, V.; Patel, N. Pick and place robotic arm: A review paper. Int. Res. J. Eng. Technol. 2021, 8, 2121–2129. [Google Scholar]
  33. Yudha, H.M.; Dewi, T.; Risma, P.; Oktarina, Y. Arm robot manipulator design and control for trajectory tracking; a review. In Proceedings of the 2018 5th International Conference on Electrical Engineering, Computer Science and Informatics (EECSI), Malang, Indonesia, 16–18 October 2018; pp. 304–309. [Google Scholar]
  34. Gao, X.; Li, J.; Fan, L.; Zhou, Q.; Yin, K.; Wang, J.; Song, C.; Huang, L.; Wang, Z. Review of Wheeled Mobile Robots’ Navigation Problems and Application Prospects in Agriculture. IEEE Access 2018, 6, 49248–49268. [Google Scholar] [CrossRef]
  35. Bae, B.; Lee, D.H. Design of a Four-Wheel Steering Mobile Robot Platform and Adaptive Steering Control for Manual Operation. Electronics 2023, 12, 3511. [Google Scholar] [CrossRef]
  36. Mittal, S.; Rai, J.K. Wadoro: An autonomous mobile robot for surveillance. In Proceedings of the 2016 IEEE 1st International Conference on Power Electronics, Intelligent Control and Energy Systems (ICPEICES), Delhi, India, 4–6 July 2016; pp. 1–6. [Google Scholar]
  37. Ortigoza, R.S.; Marcelino-Aranda, M.; Ortigoza, G.S.; Guzman, V.M.H.; Molina-Vilchis, M.A.; Saldana-Gonzalez, G.; Herrera-Lozada, J.C.; Olguin-Carbajal, M. Wheeled Mobile Robots: A review. IEEE Lat. Am. Trans. 2012, 10, 2209–2217. [Google Scholar] [CrossRef]
  38. De Luca, A.; Oriolo, G.; Vendittelli, M. Control of Wheeled Mobile Robots: An Experimental Overview. In Ramsete; Lecture Notes in Control and Information Sciences; Nicosia, S., Siciliano, B., Bicchi, A., Valigi, P., Eds.; Springer: Berlin/Heidelberg, Germany, 2001; Volume 270. [Google Scholar] [CrossRef]
  39. Kim, S.; Wensing, P.M. Design of Dynamic Legged Robots. Found. Trends® Robot. 2017, 5, 117–190. [Google Scholar] [CrossRef]
  40. Liu, J.; Tan, M.; Zhao, X. Legged robots—An overview. Trans. Inst. Meas. Control 2007, 29, 185–202. [Google Scholar] [CrossRef]
  41. Roscia, F.; Cumerlotti, A.; Del Prete, A.; Semini, C.; Focchi, M. Orientation control system: Enhancing aerial maneuvers for quadruped robots. Sensors 2023, 23, 1234. [Google Scholar] [CrossRef] [PubMed]
  42. Biswal, P.; Mohanty, P.K. Development of quadruped walking robots: A review. Ain Shams Eng. J. 2021, 12, 2017–2031. [Google Scholar] [CrossRef]
  43. Bruzzone, L.; Nodehi, S.E.; Fanghella, P. Tracked locomotion systems for ground mobile robots: A review. Machines 2022, 10, 648. [Google Scholar] [CrossRef]
  44. Wang, C.; Zhang, H.; Ma, H.; Wang, S.; Xue, X.; Tian, H.; Liu, P. Simulation and Validation of a Steering Control Strategy for Tracked Robots. Appl. Sci. 2023, 13, 11054. [Google Scholar] [CrossRef]
  45. Dong, P.; Wang, X.; Xing, H.; Liu, Y.; Zhang, M. Design and control of a tracked robot for search and rescue in nuclear power plant. In Proceedings of the 2016 International Conference on Advanced Robotics and Mechatronics (ICARM), Macau, China, 18–20 August 2016; pp. 330–335. [Google Scholar]
  46. Ducros, C.; Hauser, G.; Mahjoubi, N.; Girones, P.; Boisset, L.; Sorin, A.; Jonquet, E.; Falciola, J.M.; Benhamou, A. RICA: A tracked robot for sampling and radiological characterization in the nuclear field. J. Field Robot. 2017, 34, 583–599. [Google Scholar] [CrossRef]
  47. Tadakuma, K.; Tadakuma, R.; Maruyama, A.; Rohmer, E.; Nagatani, K.; Yoshida, K.; Ming, A.; Shimojo, M.; Higashimori, M.; Kaneko, M. Mechanical design of the wheel-leg hybrid mobile robot to realize a large wheel diameter. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010; pp. 3358–3365. [Google Scholar] [CrossRef]
  48. Vargas, G.A.; Gómez, D.J.; Mur, O.; Castillo, R.A. Simulation of a wheelleg hybrid robot in Webots. In Proceedings of the IEEE Colombian Conference on Robotics and Automation (CCRA), Bogota, Colombia, 29–30 September 2016; pp. 1–5. [Google Scholar] [CrossRef]
  49. Yokota, S.; Kawabata, K.; Blazevic, P.; Kobayashi, H. Control law for rough terrain robot with leg-type crawler. In Proceedings of the IEEE International Conference on Mechatronics and Automation, Luoyang, China, 25–28 June 2006; pp. 417–422. [Google Scholar]
  50. Kim, J.; Kim, Y.G.; Kwak, J.H.; Hong, D.H.; An, J. An wheel &track hybrid robot platform for optimal navigation in an urban environment. In Proceedings of the SICE Annual Conference 2010, Taipei, Taiwan, 18–21 August; 2010; pp. 881–884. [Google Scholar]
  51. Michaud, F.; Letourneau, D.; Arsenault, M.; Bergeron, Y.; Cadrin, R.; Gagnon, F.; Legault, M.A.; Millette, M.; Paré, J.F.; Tremblay, M.C.; et al. Multi-modal locomotion robotic platform using leg-track-wheel articulations. Auton. Robot. 2005, 18, 137–156. [Google Scholar] [CrossRef]
  52. Ordoñez-Avila, J.L.; Moreno, H.A.; Perdomo, M.E.; Calderón, I.G.C. Designing Legged Wheels for Stair Climbing. Symmetry 2023, 15, 2071. [Google Scholar] [CrossRef]
  53. Arjomandi, M.; Agostino, S.; Mammone, M.; Nelson, M.; Zhou, T. Classification of Unmanned Aerial Vehicles; Report for Mechanical Engineering Class; University of Adelaide: Adelaide, Australia, 2006; pp. 1–48. [Google Scholar]
  54. Ahmed, F.; Mohanta, J.C.; Keshari, A.; Yadav, P.S. Recent advances in unmanned aerial vehicles: A review. Arab. J. Sci. Eng. 2022, 47, 7963–7984. [Google Scholar] [CrossRef]
  55. Fitrikananda, B.P.; Jenie, Y.I.; Sasongko, R.A.; Muhammad, H. Risk Assessment Method for UAV’s Sense and Avoid System Based on Multi-Parameter Quantification and Monte Carlo Simulation. Aerospace 2023, 10, 781. [Google Scholar] [CrossRef]
  56. Luukkonen, T. Modelling and control of quadcopter. Independent research project in applied mathematics. Espoo 2011, 22, 22. [Google Scholar]
  57. Neira, J.; Sequeiros, C.; Huamani, R.; Machaca, E.; Fonseca, P.; Nina, W. Review on unmanned underwater robotics, structure designs, materials, sensors, actuators, and navigation control. J. Robot. 2021, 2021, 5542920. [Google Scholar] [CrossRef]
  58. Jorge, V.A.; Granada, R.; Maidana, R.G.; Jurak, D.A.; Heck, G.; Negreiros, A.P.; Dos Santos, D.H.; Gonçalves, L.M.; Amory, A.M. A survey on unmanned surface vehicles for disaster robotics: Main challenges and directions. Sensors 2019, 19, 702. [Google Scholar] [CrossRef]
  59. Liu, G.; Zhang, S.; Ma, G.; Pan, Y. Path Planning of Unmanned Surface Vehicle Based on Improved Sparrow Search Algorithm. J. Mar. Sci. Eng. 2023, 11, 2292. [Google Scholar] [CrossRef]
  60. Akib, A.; Tasnim, F.; Biswas, D.; Hashem, M.B.; Rahman, K.; Bhattacharjee, A.; Fattah, S.A. Unmanned floating waste collecting robot. In Proceedings of the TENCON 2019–2019 IEEE Region 10 Conference (TENCON), Kochi, India, 17–20 October 2019; pp. 2645–2650. [Google Scholar]
  61. Xu, X.; Wang, C.; Xie, H.; Wang, C.; Yang, H. Dual-Loop Control of Cable-Driven Snake-like Robots. Robotics 2023, 12, 126. [Google Scholar] [CrossRef]
  62. Soto, F.; Wang, J.; Ahmed, R.; Demirci, U. Medical micro/nanorobots in precision medicine. Adv. Sci. 2020, 7, 2002203. [Google Scholar] [CrossRef] [PubMed]
  63. Martinez-Sanchez, D.E.; Sandoval-Castro, X.Y.; Cruz-Santos, N.; Castillo-Castaneda, E.; Ruiz-Torres, M.F.; Laribi, M.A. Soft Robot for Inspection Tasks Inspired on Annelids to Obtain Peristaltic Locomotion. Machines 2023, 11, 779. [Google Scholar] [CrossRef]
  64. Khatib, O.; Yokoi, K.; Brock, O.; Chang, K.; Casal, A. Robots in human environments: Basic autonomous capabilities. Int. J. Robot. Res. 1999, 18, 684–696. [Google Scholar] [CrossRef]
  65. Khatib, O.; Roth, B. New robot mechanisms for new robot capabilities. In Proceedings of the IROS, Osaka, Japan, 3–5 November 1991; pp. 44–49. [Google Scholar]
  66. Arbib, M.A. (Ed.) The Handbook of Brain Theory and Neural Networks, 2nd ed.; MIT Press: Cambridge, MA, USA, 2002. [Google Scholar]
  67. Nebel, B.; Dornhege, C.; Hertle, A. How much does a household robot need to know in order to tidy up. In Proceedings of the AAAI Workshop on Intelligent Robotic Systems, Bellevue, DC, USA, 14–15 July 2013. [Google Scholar]
  68. Ivaldi, S.; Lyubova, N.; Droniou, A.; Gerardeaux-Viret, D.; Filliat, D.; Padois, V.; Sigaud, O.; Oudeyer, P.Y. Learning to recognize objects through curiosity-driven manipulation with the iCub humanoid robot. In Proceedings of the 2013 IEEE Third Joint International Conference on Development and Learning and Epigenetic Robotics (ICDL), Osaka, Japan, 18–22 August 2013; pp. 1–8. [Google Scholar]
  69. Messina, E.R.; Jacoff, A.S. Measuring the performance of urban search and rescue robots. In Proceedings of the 2007 IEEE Conference on Technologies for Homeland Security, Woburn, MA, USA, 16–17 May 2007; pp. 28–33. [Google Scholar]
  70. Singh, R.; Mozaffari, S.; Akhshik, M.; Ahamed, M.J.; Rondeau-Gagné, S.; Alirezaee, S. Human–Robot Interaction Using Learning from Demonstrations and a Wearable Glove with Multiple Sensors. Sensors 2023, 23, 9780. [Google Scholar] [CrossRef] [PubMed]
  71. Smithers, T. On quantitative performance measures of robot behaviour. Robot. Auton. Syst. 1995, 15, 107–133. [Google Scholar] [CrossRef]
  72. Madhavan, R.; Lakaemper, R.; Kalmár-Nagy, T. Benchmarking and standardization of intelligent robotic systems. In Proceedings of the 2009 International Conference on Advanced Robotics, Munich, Germany, 22–26 June 2009; pp. 1–7. [Google Scholar]
  73. Khouja, M.J. An options view of robot performance parameters in a dynamic environment. Int. J. Prod. Res. 1999, 37, 1243–1257. [Google Scholar] [CrossRef]
  74. Lampe, A.; Chatila, R. Performance measure for the evaluation of mobile robot autonomy. In Proceedings of the 2006 IEEE International Conference on Robotics and Automation ICRA 2006, Orlando, FL, USA, 15–19 May 2006; pp. 4057–4062. [Google Scholar]
  75. Huang, H.M.; Messina, E.; Wade, R.; English, R.; Novak, B.; Albus, J. Autonomy measures for robots. In Proceedings of the ASME International Mechanical Engineering Congress and Exposition, Anaheim, CA, USA, 13–19 November 2004; Volume 47063, pp. 1241–1247. [Google Scholar]
  76. Steinfeld, A.; Fong, T.; Kaber, D.; Lewis, M.; Scholtz, J.; Schultz, A.; Goodrich, M. Common metrics for human-robot interaction. In Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction, Salt Lake City, UT, USA, 2–3 March 2006; pp. 33–40. [Google Scholar]
  77. Schaefer, K.E. Measuring trust in human robot interactions: Development of the “trust perception scale-HRI”. In Robust Intelligence and Trust in Autonomous Systems; Springer: Boston, MA, USA, 2016; pp. 191–218. [Google Scholar]
  78. Olsen, D.R.; Goodrich, M.A. Metrics for evaluating human-robot interactions. In Proceedings of the PERMIS, Gaithersburg, MD, USA, 16–18 September 2003; Volume 2003, p. 4. [Google Scholar]
  79. Kress-Gazit, H.; Eder, K.; Hoffman, G.; Admoni, H.; Argall, B.; Ehlers, R.; Heckman, C.; Jansen, N.; Knepper, R.; Křetínský, J.; et al. Formalizing and guaranteeing human-robot interaction. Commun. ACM 2021, 64, 78–84. [Google Scholar] [CrossRef]
  80. Crandall, J.W.; Cummings, M.L. Developing performance metrics for the supervisory control of multiple robots. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction, Arlington, VA, USA, 8–11 March 2007; pp. 33–40. [Google Scholar]
  81. Boston Dynamics. Spot Specifications. Boston Dynamics Support Center. 26 October 2022. Available online: https://support.bostondynamics.com/s/article/Robot-specifications (accessed on 26 October 2022).
  82. Ghost Robotics. Vision 60: Ghost Robotics. Home. Available online: https://www.ghostrobotics.io/vision-60 (accessed on 12 March 2024).
  83. ANYbotics. Anymal. ANYbotics. 1 September 2023. Available online: https://www.anybotics.com/robotics/anymal/#:~:text=ANYmal%20technical%20specifications%201%20360%C2%B0%20vision%20ANYmal%20navigates,ruggedized%20tablet%20for%20remote%20control%2C%20transportation%20box%20 (accessed on 12 March 2024).
  84. ISO 12100:2010; Safety of Machinery—General Principles for Design—Risk Assessment and Risk Reduction. International Organization for Standardization: Geneva, Switzerland, 2010.
  85. IEC 60204-1:2016; Safety of Machinery—Electrical Equipment of Machines—Part 1: General Requirements. International Electrotechnical Commission: Geneva, Switzerland, 2016.
  86. European Commission. Directive 2014/30/EU of the European Parliament and of the Council of 26 February 2014 on the harmonization of the laws of the Member States relating to electromagnetic compatibility (recast). Off. J. Eur. Union 2014, L 96, 79–107. [Google Scholar]
  87. European Commission. Directive 2014/53/EU of the European Parliament and of the Council of 16 April 2014 on the harmonisation of the laws of the Member States relating to the making available on the market of radio equipment and repealing Directive 1999/5/EC. Off. J. Eur. Union 2014, L 153, 62–112. [Google Scholar]
  88. European Commission. Directive 2011/65/EU of the European Parliament and of the Council of 8 June 2011 on the restriction of the use of certain hazardous substances in electrical and electronic equipment (recast). Off. J. Eur. Union 2011, L 174, 88–110. [Google Scholar]
Figure 1. Revenue gap of NPPs in USA [1].
Figure 2. Contribution of O&M costs and fuel costs in coal, gas, and nuclear power plants [2].
Figure 3. Distribution of cited papers based on publication year.
Figure 4. Physical security components of a typical NPP (U.S. NRC [24]).
Figure 5. Experimental setup of the balanced manipulator arm [31].
Figure 6. Typical four-wheeled mobile robot [35].
Figure 8. Three-dimensional design of a tracked mobile robot [44].
Figure 9. Representation of leg-wheel hybrid locomotion system [52].
Figure 10. Representation of a UAV and an intrusion outside of the detection area [55].
Figure 11. Physical picture of YL-2500 unmanned ship [59].
Figure 12. Motion trajectory of a snake-like robot—(a) first step, (b) second step, (c) third step, and (d) fourth step [61].
Figure 13. Worm-like robot—(A) schematic of muscles in earthworm, and (B) concept design of worm-like robot [63].
Figure 14. HRI using data glove. (a) Schematic diagram of data glove; (b) different prototypes [70].
Figure 15. Mobile dog robots: (a) Ghost Robotics’ Vision 60; (b) Boston Dynamics’ Spot dog robot.
Table 1. List of major keywords used in Google Scholar to track journal papers.
Tracking Source | Keywords
Google Scholar | “mobile robot”
“basic functionality of robot”
“basic capability of robot”
“quantitative performance of robot”
“wadoro”
“human-robot interaction”
“dog robot”
“manipulator robot arm”
“types of mobile robot”
“history of robotics”
“robot formalization of problem”
“security robot formalization of problem”
“unmanned aerial vehicles”
“DRONE”
“unmanned water vehicle”
Table 2. Database showing the number of papers based on publishers.
Publisher | No. of Papers
Idaho National Laboratory | 2
Elsevier | 1
CRC press | 1
Annual Reviews in Control | 1
IEEE Robotics & Automation Magazine | 3
Progress in aerospace sciences | 1
IEEE Transactions on robotics and automation | 1
Journal of Engineering in Medicine | 1
McGraw-Hill | 1
2nd International Conference on Applied Robotics for the Power Industry (CARPI) | 1
U.S. NRC | 2
The MIT Press | 1
International Journal of Advanced Robotic Systems | 2
Journal of Field Robotics | 2
Frontiers of mechanical engineering | 1
Int. Res. J. Eng. Technol | 1
5th International Conference on Electrical Engineering, Computer Science, and Informatics (EECSI), IEEE | 1
IEEE Access | 1
IEEE 1st International Conference on Power Electronics, Intelligent Control and Energy Systems (ICPEICES) | 1
IEEE Latin America Transactions | 1
Springer | 3
Foundations and Trends® in Robotics | 1
Transactions of the Institute of Measurement and Control | 1
Ain Shams Engineering Journal | 1
MDPI—Machines | 2
International Conference on Advanced Robotics and Mechatronics (ICARM), IEEE | 1
Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems | 1
IEEE Colombian conference on robotics and automation (CCRA) | 1
Proceedings of the IEEE International conference on mechatronics and automation | 1
Proceedings of the SICE Annual Conference | 1
Auton Robot | 1
University of Adelaide | 1
Arabian Journal for Science and Engineering | 1
Independent Research Project in Applied Mathematics | 1
Journal of Robotics | 2
MDPI—Sensors | 5
TENCON 2019—2019 IEEE Region 10 Conference (TENCON) | 1
Advanced Science | 1
The International Journal of Robotics Research | 1
IROS | 2
Proceedings of the AAAI workshop on intelligent robotic systems | 1
2013 IEEE Third Joint International Conference on Development and Learning and Epigenetic Robotics (ICDL), IEEE | 1
IEEE Conference on Technologies for Homeland Security | 1
International Conference on Advanced Robotics | 1
International Journal of Production Research | 1
Proceedings 2006 IEEE International Conference on Robotics and Automation | 2
ASME International Mechanical Engineering Congress and Exposition | 1
Proceedings of the 1st ACM SIGCHI/SIGART conference on Human-Robot Interaction | 1
Communications of the ACM | 1
Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction | 1
Boston Dynamics | 1
Ghost Robotics | 1
ANYbotics | 1
MDPI—Actuators | 1
MDPI—Applied Sciences | 1
MDPI—Symmetry | 1
MDPI—Aerospace | 1
Journal of Marine Science and Engineering | 1
MDPI—Robotics | 1
Robotics and Autonomous Systems | 2
Journal of Intelligent & Robotic Systems | 1
Intermountain Engineering, Technology and Computing (IETC). IEEE | 2
International Conference on Intelligent Robots and Systems (IROS). IEEE | 1
IEEE/ASME Transactions on Mechatronics | 1
Table 3. Subjective measures for assessing self-awareness [75].
S.N. | Measures | Description
(a) | Understanding of intrinsic limitation | Recognize inherent function limitations.
(b) | Effectiveness | Detecting, isolating, and recovering from faults.
(c) | Self-monitoring capacity | Ability to monitor health, state, task progress, etc., and detect deviations from the normal.
Table 4. Sets of metrics to measure autonomy [76].
S.N. | Primary Groupings | Sets of Metrics (in Each Grouping)
(a) | Mission complexity | Tactical behavior; coordination and collaboration; performance; sensory processing/world modeling.
(b) | Environmental difficulty | Static environment; dynamic environment; electronic/electromagnetic environment; mobility; mapping and navigation; urban environment; rural environment; operational environment.
(c) | Level of HRI [60,61,62] | Human intervention; operator workload; operator skill level; operator to unmanned system (UMS) ratio.
Table 5. Six interrelated generic metrics to guide the design of human–robot interaction [78].
S.N. | Metrics for Design
(a) | Task effectiveness (TE)
(b) | Neglect tolerance (NT) [75,78]
(c) | Robot attention demand (RAD)
(d) | Free time (FT)
(e) | Fan out (FO)
(f) | Interaction effort (IE)
