Review

Robotics: Five Senses plus One—An Overview

by Rand N. Albustanji 1,*,†, Shorouq Elmanaseer 1,† and Ahmad A. A. Alkhatib 2,†
1 Department of Software Engineering, Al-Zaytoonah University of Jordan, Amman 11931, Jordan
2 Department of Computer Information Systems, Al-Zaytoonah University of Jordan, Amman 11931, Jordan
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Robotics 2023, 12(3), 68; https://doi.org/10.3390/robotics12030068
Submission received: 21 February 2023 / Revised: 22 April 2023 / Accepted: 26 April 2023 / Published: 4 May 2023
(This article belongs to the Section Sensors and Control in Robotics)

Abstract:
Robots can be equipped with a range of senses to allow them to perceive and interact with the world in a more natural and intuitive way. These senses can include vision, hearing, touch, smell, and taste. Vision allows the robot to see and recognize objects and navigate its environment. Hearing enables the robot to recognize sounds and respond to vocal commands. Touch allows the robot to perceive the texture, shape, and temperature of objects. Smell enables the robot to recognize and classify different odors. Taste enables the robot to identify the chemical composition of materials. The specific senses used in a robot will depend on the needs of the application, and many robots use a combination of different senses to perceive and interact with the environment. This paper reviews the five senses used in robots, their types, how they work, and other related information, while also discussing the possibility of a Sixth Sense.

1. Introduction

Robotics is an interdisciplinary field of computer science and engineering that is rapidly advancing and transforming the world. The field of robotics aims to design machines that can help and assist humans in various tasks. This field integrates knowledge and expertise in mechanical engineering, electrical engineering, information engineering, mechatronics, electronics, bioengineering, computer engineering, control engineering, software engineering, mathematics, and more. Our senses give us the power to explore the world around us! With our five senses—sight, hearing, touch, smell, and taste—we can perceive the world and its changes. Sensors are the devices that help robots do the same. To make robots even more effective, engineers have been exploring ways to give them sensory abilities, such as odor-sensing, vision, tactile sensing, hearing, and taste. In addition to the traditional five senses, some researchers are exploring the idea of a “Sixth Sense” for robots. Have you ever wondered how robots can see, hear, smell, taste, and touch?
Robotics aims to design machines that can assist humans in various tasks. A robotic system consists of several components that work together to perform a specific task or set of tasks. These components can vary depending on the type of robot and its intended purpose, but common components found in many robotic systems include actuators, control systems, power supplies, and sensors.
The integration of multiple sensory systems in robots has enabled them to perceive, interact with, and navigate their environment in a way similar to humans. These sensory systems include vision, touch, hearing, smell, and taste, as well as the idea of a Sixth Sense. However, developing effective sensory systems for robots is not without its challenges and limitations, such as the integration of multiple sensory inputs and the reliability of sensors in different environments. Despite these challenges, researchers are making significant progress in developing innovative solutions, such as advanced algorithms and artificial intelligence, enabling robots to perform complex tasks and interact with humans in new and exciting ways.
Considerable advancements have been achieved in the field of sensory systems for robots. However, there remain numerous challenges and limitations that must be addressed. One of the most significant challenges is the integration of multiple sensory inputs, which can be complex and require advanced algorithms to process and interpret the data [1]. Another challenge is the development of sensors that can operate reliably in a range of environments and conditions. For instance, sensors that rely on visual data may struggle in low-light or high-glare environments, while sensors that detect tactile information may struggle to distinguish between different textures [2]. There are also limitations related to the size, weight, and power requirements of sensors, which can limit their use in certain applications. Sensors that are too large or heavy may not be suitable for use in small robots or drones [3].
To address these challenges and limitations, researchers are exploring a range of solutions. These include the development of more advanced algorithms for processing sensory data, using machine learning and artificial intelligence to enable robots to adapt and learn from their environment, and developing new sensor technologies that are more robust and reliable [4]. There is also a growing focus on integrating sensors with other technologies, especially in robotics, to enable more advanced capabilities. The use of sensors in conjunction with robotic prosthetics can enable individuals with disabilities to regain some level of mobility and independence [5].
Overall, while there are many challenges and limitations in the development of sensory systems for robots, researchers are making significant progress in developing innovative solutions to these challenges. The integration of multiple sensory inputs (Figure 1 displays several types of sensors) has the potential to transform the field of robotics, enabling robots to perform a wide range of complex tasks and interact with humans in new and exciting ways.
This article discusses the different types of sensory systems used in robotics, their components, functionalities, types of sensors used, and the most popular trends and challenges associated with each system.

2. Literature Review

Robots can sense, plan, and act. They are equipped with sensors that go beyond human capabilities! From exploring the surface of Mars to lightning-fast global deliveries, robots can do things humans can only dream of. When designing and building robots, engineers often use fascinating animal and human models to help decide which sensors they need. For instance, bats can serve as a model for sound-detecting robots, ants as a model for smell, and bees as a model for how pheromones are used to call for help.
Human touch helps us to sense various features of our environment, such as texture, temperature, and pressure. Similarly, tactile sensors in robots can detect these qualities and more. For instance, the robot vacuum cleaner (Roomba) uses sensors to detect objects through contact [7]. However, similar to sight and sound, a robot may not always know the precise content of what it picks up (a bag, a soft cake, or a hug from a friend); it just knows that there is an obstacle to be avoided or found.
Tactile sensing is a crucial element of intelligent robotic manipulation as it allows robots to interact with physical objects in ways that other sensors cannot [8]. This article provides a comprehensive overview of tactile sensing in intelligent robotic manipulation, including its history, common issues, applications, advantages, and disadvantages. It also includes a review of sensor hardware and delves into the major topics related to understanding and manipulation.
Robots are increasingly being used in various applications, including industrial, military, and healthcare. One of the most important features of robots is their ability to detect and respond to environmental changes. Odor-sensing technology is a key component of this capability. In a survey presented by [9], the current status of chemical sensing as a sensory modality for mobile robots was reviewed. The article evaluates various techniques that are available for detecting chemicals and how they can be used to control the motion of a robot. Additionally, it discusses the importance of controlling and measuring airflow close to the sensor to infer useful information from readings of chemical concentration.
Robot vision is an emerging technology that uses cameras and sensors to allow robots to interpret and respond to their environment, with numerous applications in the medical, industrial, and entertainment fields. It requires artificial intelligence (AI) techniques to produce devices that can interact with the physical world, and the accuracy of these devices depends on the vision techniques used. A survey by [10] presents a summary of data processing and domain-based data processing, evaluating various robot vision techniques, tools, and methodologies.
Robot sound sensors act as ears. The sound waves heard by human ears can also be detected by some robot sensors, such as microphones, while other robot sensors can detect waves beyond our capabilities, such as ultrasound. Cloud-based speech recognition systems use AI to interpret a user’s voice and convert it into text or commands; they enable robots to interact with humans in a more natural way, automate certain tasks, and are hosted on the cloud for increased reliability and cost-effectiveness [11]. We examined the potential of utilizing smart speakers to facilitate communication in human–robot interaction (HRI) scenarios.
For the past decade, robotics research has focused on developing robots with cognitive skills and the ability to act and interact with people in complex and unconstrained environments. To achieve this, robots must be capable of safely navigating and manipulating objects, as well as understanding human speech. However, in typical real-world scenarios, individuals who are speaking are often located at a distance, posing challenges for the robot’s microphone signals to capture the speech [12]. Researchers have addressed this challenge by working on enabling humanoid robots to accurately detect and locate both visible and audible people. Their focus has been on combining vision and hearing to recognize human activity.
The sense of taste is the most challenging sense to replicate in the structure of robots. A lot of research has been conducted on this subject, but a definitive solution has not yet been reached. The human tongue, despite its small size, is highly complex, with different parts responsible for perceiving different flavors—bitter, sour, and salty—which adds to the difficulty of electronically reproducing the tongue. However, robots can now have a sense of taste. They can be programmed to detect flavors and distinguish between different tastes. This is used in the food industry to ensure that food products meet the required quality standards [13]. The study presented a review of an e-tongue, a powerful tool for detecting and discriminating among tastes and flavors. It consists of a sensor array composed of several types of sensors, each sensitive to a different taste. By analyzing the output of these sensors, the electronic tongue can detect and differentiate between various tastes and flavors. Additionally, the electronic tongue can measure the concentration of a specific substance in a sample, as well as its bitterness and sweetness.
The Sixth Sense is a revolutionary new technology that can help to bridge the gap between humans and machines. It uses advanced artificial intelligence to recognize and respond to the user’s environment and surroundings. This technology can be used to create a more personal and interactive experience with machines, making them more human-like and helping to improve the overall user experience. The potential applications of this technology are endless, and it is sure to revolutionize how humans interact with machines and technology [14]. The researchers developed a gesture-controlled robot with an Arduino microcontroller and a smartphone. It uses a combination of hand gestures and voice commands to allow for a more intuitive way of controlling robots. With this technology, robots can be given complex commands with a few simple gestures.
We all know that the field of robotics is increasingly being applied across various domains with different attributes. To continue evolving in this field, adaptable methods and sensors must be found to be incorporated into underwater or flying robots. Flying robots are critical pieces of technology that use sensors and algorithms to detect and avoid obstacles or potential hazards in their path. They are designed to be lightweight and scalable, providing a high level of safety while allowing for efficient and effective operation. In their work, the authors of [15] developed a new sense-and-avoid system using active stereo vision, which is more effective than fixed-camera systems. The system only requires one stereo camera, which can save costs and make it more accessible.
Underwater exploration is essential to advancing our understanding of ocean resources and the environment. With the development of underwater robot technologies, such as autonomous navigation, remote manipulation, and underwater sensing capabilities, exploration of the underwater world has become much easier. Despite these advances, the complicated underwater environment poses many challenges to the development of state-of-the-art sensing technologies [16]. The authors discussed the current state of underwater sensing technologies by focusing on underwater acoustic sensing, underwater optical sensing, underwater magnetic sensing, underwater bionic sensing, the challenges of underwater sensing technology, and possible solutions.
Imagine this: All of the sensors used in robotics are combined into one real, social robot capable of human communication. One real application that combines the aforementioned electronic senses is Pepper, the world’s first social humanoid robot able to recognize faces and basic human emotions [17]. Researchers presented an extended assessment of Pepper’s capacity for human–robot social interaction through the new version of its speech recognition system.

3. Vision

Robot vision—a game-changer for automation processes. By giving robots the ability to see, we have unlocked a whole new level of precision and accuracy in smart automated processes. Robotic vision works similar to human vision—it captures valuable data from 3D images and applies them to the robot’s programming algorithm. With this, robots can identify colors, find parts, detect people, check the quality, process information, read text, and much more [18].
Robot vision also helps simplify complex and expensive fixtures—giving robots the power to find objects in their working envelope and adapt to variations in part size, shape, and location. All of this ultimately reduces costs and improves system efficiency [19].
Robot vision and machine vision are two related but distinct fields. Robot vision is a subset of machine vision that focuses specifically on the use of computer vision techniques for robotic applications. Machine vision, on the other hand, is a broader field that encompasses a range of technologies and techniques for extracting information from visual inputs, including still images and videos [20].

3.1. Components of a Robot Vision System

The major components of a machine vision system include lighting, lens, image sensor, vision processing, and communications. Lighting illuminates the part to be inspected, allowing its features to stand out so they can be clearly seen by the camera. The lens captures the image and presents it to the sensor in the form of light. The sensor in a machine vision camera converts this light into a digital image, which is then sent to the processor for analysis [21].
  • Lighting: Proper lighting is vital to the success of robotic vision systems. Poor lighting can cause an image to be undetectable to a robot, resulting in inefficient operation and loss of information.
  • Lenses: The lens of a vision system directs the light in order to capture an image.
  • Image sensor: Image sensors are responsible for converting the light that is captured by the lens into a digital image for the robot to later analyze.
  • Vision processing: Vision processing is how robotic vision systems obtain data from an image that is used by robots for analysis in order to determine the best course of action for operation.
  • Communications: Communications connect and coordinate all robotic vision components. This allows all vision components to effectively and quickly interact and communicate for a successful system.
  • Data collection: The sensor data are collected by the control system and stored in a buffer or memory [22].
Vision processing consists of algorithms that review the image and extract required information, run the necessary inspection, and make a decision. Finally, communication is typically accomplished by either a discrete I/O signal or data sent over a serial connection to a device that is logging information or using it [21].
  • Sensing and digitizing: This process yields a visual image of sufficient contrast that is typically digitized and stored in the computer’s memory.
  • Image processing and analysis: The digitized image is subjected to image processing and analysis for data reduction and interpretation of the image. This function may be further subdivided into:
    • Preprocessing: This deals with techniques such as noise reduction and detail enhancement.
    • Segmentation: It partitions an image into objects of interest.
    • Description: It computes various features, such as size, shape, etc., suitable for differentiating one object from another.
    • Recognition: It defines the object.
    • Interpretation: It assigns meaning to recognized objects in the scene.
  • Application: The current applications of robot vision include inspection, part identification, location, and orientation [23].
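The processing stages listed above (preprocessing, segmentation, description, recognition) can be sketched in a few lines of Python. This is a minimal illustration on a toy grayscale image, not an implementation from the cited literature; the function names, the mean filter, and the size-based classification rule are all assumptions made for the example.

```python
# Hypothetical sketch of the vision-processing stages on a tiny
# grayscale image stored as a list of pixel rows (values 0-255).

def preprocess(image, kernel=3):
    """Noise reduction: a simple horizontal mean filter."""
    half = kernel // 2
    out = []
    for row in image:
        filtered = []
        for i in range(len(row)):
            window = row[max(0, i - half): i + half + 1]
            filtered.append(sum(window) / len(window))
        out.append(filtered)
    return out

def segment(image, threshold=128):
    """Partition pixels into object (1) versus background (0)."""
    return [[1 if p >= threshold else 0 for p in row] for row in image]

def describe(mask):
    """Compute a simple differentiating feature: object area in pixels."""
    return sum(sum(row) for row in mask)

def recognize(area, small_max=4):
    """Classify ('define') the object by its area."""
    return "small part" if area <= small_max else "large part"

image = [
    [10,  12, 200, 210, 11],
    [ 9, 198, 205, 201, 14],
    [13,  11, 199,  10, 12],
]
mask = segment(preprocess(image))
area = describe(mask)
print(recognize(area), area)
```

In a real system, each stage would of course be far more elaborate (e.g., 2D filtering, connected-component labeling, and learned classifiers), but the data flow from raw pixels to a decision is the same.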

3.2. Types of Vision Sensors Used in Robotics

Robotic vision sensors are multi-component devices with many moving parts, and there are constant advancements in this area. Smart cameras, frequently applied in vehicle recognition systems, are among the most common vision sensors, while vision sensors are also widely used in industry to track operations and ensure product safety.
There are two kinds of robotic vision sensors, each of which can be modified for various purposes:
  • Orthographic projection-type sensors: The rectangular field of view of orthographic projection-type robotic vision sensors is the most common. They are ideal for infrared sensors with short-range or laser-range finders.
  • Perspective projection-type sensors: The field of view of robotic vision sensors that use perspective projection has a trapezoidal shape. They are ideal for sensors that are used in cameras [18].

3.3. The Most Popular Trends and Challenges for Vision Sensing in Robotics

Robotic vision is a rapidly evolving field that has seen significant advances in recent years. However, there are still several research challenges that need to be addressed to further improve the capabilities of robotic vision systems.
  • Multi-sensor perception: Multi-sensor perception was identified as a popular trend in vision sensing in robotics in 2022. This field involves combining data from multiple sensors, such as cameras, LIDAR, and RADAR, to improve the accuracy and robustness of perception in robotics [24].
  • Explainable artificial intelligence (XAI): Explainable artificial intelligence (XAI) has emerged as a popular trend in vision sensing in robotics, where robots can explain their perceptions and decision-making processes to humans in a transparent and understandable manner [25].
  • Edge computing for real-time perception: Edge computing was identified as a popular trend in vision sensing in robotics in 2022 [26]; processing is moved from the cloud to the edge of the network, allowing for real-time perception and decision-making [27,28].
  • Human–robot collaboration and interaction: Human–robot collaboration/interaction continues to be a popular trend in vision sensing in robotics, with a focus on improving the ability of robots to perceive and respond to human gestures, expressions, and speech [29].
  • Robustness to lighting conditions: One of the key challenges in robotic vision is developing systems that can work effectively under varying lighting conditions. Possible solutions include developing algorithms that can adapt to different lighting conditions, using high dynamic range (HDR) cameras, and incorporating machine learning techniques to learn and adapt to different lighting conditions [30].
Overall, the field of robotic vision is rapidly advancing, and there are many exciting research directions and opportunities to further improve the capabilities of robotic vision systems.

4. Hearing Sense

There are several ways that robots can “hear”. One common method is to use microphones or other sensors that are able to detect sound waves and convert them into electrical signals. These signals can then be processed by the robot’s computer system and used to understand spoken commands or other sounds in the environment. Another method that some robots use to “hear” is to use lasers or other types of sensors to detect vibrations in the environment. These sensors can be used to detect the vibrations caused by sound waves, which can then be used to understand spoken commands or other sounds in the environment. Overall, the ability of robots to “hear” depends on the specific sensors and technology used, as well as the programming and algorithms that are in place to interpret and understand the signals that are being received [31].

4.1. Components of a Robotic Hearing System

A robotic hearing system, also known as an auditory system, is a type of sensor that allows a robot to detect and interpret sound waves. The main components of a robotic hearing system include [32]:
  • Microphones or other sound sensors: These are the devices that detect sound waves and convert them into electrical signals. There are many different types of microphones and sound sensors that can be used, including those that use diaphragms, piezoelectric crystals, or lasers to detect vibrations.
  • Amplifiers: These are electronic devices that are used to amplify the electrical signals that are generated by microphones or sound sensors. They can help to improve the sensitivity and accuracy of hearing sensors.
  • Analog-to-digital converters (ADCs): These are devices that are used to convert the analog electrical signals from the microphones or sound sensors into digital data that can be processed by the robot’s computer system.
  • Computer system: This is the central processing unit of the robot, which is responsible for controlling the various functions and sensors of the robot. The computer system is used to process digital data from ADCs and interpret and understand spoken commands or other sounds in the environment.
  • Algorithms and software: These are the instructions and programs that are used by the computer system to analyze and interpret digital data from microphones or sound sensors. The algorithms and software may be designed to recognize specific words or sounds or to understand and respond to more complex spoken commands.
  • Data collection: The sensor data are collected by the control system and stored in a buffer or memory [22].

4.2. Functionality of the Hearing System

The process for a robot to “hear” depends on the specific sensors and technology that are being used to detect sound waves. Here is a general overview of the process that may be used by a robot with microphones or other sound sensors [33]:
  • Sound waves enter the microphone or sensor.
  • The microphone or sensor converts the sound waves into electrical signals.
  • The electrical signals are sent to the robot’s computer system.
  • The computer system processes the signals and converts them into digital data.
  • The digital data are analyzed using algorithms and software designed to understand and interpret spoken language or other sounds.
  • Based on the analysis, the robot can take appropriate actions or respond to the sounds it has heard.
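The six steps above can be sketched end to end. The following is a hypothetical illustration only: a simulated microphone waveform is quantized (the ADC stage), its energy is analyzed, and the robot "decides" whether a sound was heard. The sampling rate, bit depth, threshold, and function names are illustrative assumptions, not part of any real system described in the text.

```python
import math

def sample_waveform(freq_hz, duration_s, rate_hz=8000, amplitude=0.8):
    """Simulate the analog electrical signal coming from a microphone."""
    n = int(duration_s * rate_hz)
    return [amplitude * math.sin(2 * math.pi * freq_hz * t / rate_hz)
            for t in range(n)]

def adc_quantize(samples, bits=8):
    """ADC stage: map analog values in [-1, 1] to signed integers."""
    scale = 2 ** (bits - 1) - 1
    return [round(s * scale) for s in samples]

def rms_energy(digital, bits=8):
    """Analysis stage: root-mean-square level, normalized to [0, 1]."""
    scale = 2 ** (bits - 1) - 1
    return math.sqrt(sum(d * d for d in digital) / len(digital)) / scale

def react(energy, threshold=0.1):
    """Decision stage: act only if the sound is loud enough."""
    return "respond" if energy >= threshold else "ignore"

signal = sample_waveform(freq_hz=440, duration_s=0.05)
digital = adc_quantize(signal)
print(react(rms_energy(digital)))
```

Real speech understanding replaces the energy threshold with far more sophisticated analysis (spectral features, language models), but the signal path from sound wave to action follows these same steps.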

4.3. Types of Hearing Sensors Used in Robotics

There are many different types of hearing sensors that can be used in robots and other devices, including microphones that use a diaphragm to detect sound waves, piezoelectric sensors that use a crystal to detect vibrations, and laser Doppler vibrometers that use lasers to detect vibrations. The specific type of hearing sensor that is used may depend on the specific application and the requirements of the system in which it is being used [34].
  • Sound sensor: This is a simple, easy-to-use, and low-cost device used to detect sound waves in the air. It can measure the intensity of sound and convert it into an electrical signal that can be read through a microcontroller [35].
    • Pressure microphone: This is a microphone in which only one side of the diaphragm is exposed to the sound, as the back is a closed chamber. The diaphragm responds solely to pressure, which has no direction; therefore, pressure microphones are omnidirectional.
    • High amplitude pressure microphone: Designed for very high amplitude measurements. It is used in small–closed couplers, confined spaces, or flush-mounted applications.
    • Probe microphone: Used for measurements in difficult or inaccessible situations, such as high temperatures or conditions of airflow. Its right-angled design makes it well-suited for measurements in exhaust systems, machinery, and scanning surfaces, such as loudspeakers and cabinets.
    • Acoustic pressure sensor: Consists of a stack of one or more acoustically responsive elements enclosed in a housing. The external acoustic pressure to be sensed is transmitted to the stack through a diaphragm located in an end cap that closes the top portion of the housing.
  • Piezoelectric sensor: Uses a crystal to detect vibrations, including those caused by sound waves. Piezoelectric sensors are often used in robots because they are relatively small and can be easily integrated into the robot’s design [35].
  • Laser Doppler vibrometer: Uses lasers to detect vibrations, including those caused by sound waves. These vibrometers are highly sensitive and can detect very small vibrations. However, they are also more expensive and complex compared to some other types of hearing sensors [36].
  • Ultrasonic sensor: Uses high-frequency sound waves to detect objects and measure distances. Ultrasonic sensors are often used in robots to help them navigate and avoid obstacles, but they can also be used to detect and interpret certain types of sounds [37].
  • Infrared sensor: Uses infrared light to detect objects and measure distances. Infrared sensors are often used in robots to help them navigate and avoid obstacles, but they can also be used to detect certain types of sounds by sensing the vibrations that they cause in the environment [38].
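As a small worked example of ultrasonic ranging, distance can be estimated from the echo's round-trip time and the speed of sound. The sketch below is purely illustrative; the constant assumes dry air at roughly 20 °C, and the function name is an invention for this example.

```python
# Time-of-flight distance estimate for an ultrasonic sensor: the
# sensor emits a high-frequency pulse and times the echo; distance
# is half the round trip multiplied by the speed of sound.

SPEED_OF_SOUND_M_S = 343.0  # dry air at ~20 degrees Celsius

def echo_distance_m(round_trip_s):
    """Distance to an obstacle from a measured echo round-trip time."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# A 10 ms round trip corresponds to about 1.7 m.
print(round(echo_distance_m(0.010), 3))
```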

4.4. The Most Popular Trends and Challenges in the Field of Robotic Hearing

  • Sound localization and separation: These were identified as popular trends in the field of robotic hearing in 2022 [39]. They involve enhancing the capability of robots to accurately locate and separate different sound sources in noisy environments [40].
  • Speech recognition and synthesis: They continue to be popular trends in the field of robotic hearing, with a focus on improving the ability of robots to understand and produce human speech [41].
  • Auditory scene analysis: This has emerged as a popular trend in the field of robotic hearing, where robots can analyze complex sound scenes and identify individual sound sources [42].
  • Cross-modal perception: This was a popular trend in the field of robotic hearing in 2022 [43]; information from different sensory modalities, such as vision and hearing, is combined to improve the accuracy and robustness of perception in robotics [44].

5. Tactile Sense

Tactile sense, also known as the sense of touch, allows humans and animals to perceive and interpret information about the texture, shape, and temperature of objects through the sense of touch. In robotics, the tactile sense is often simulated using sensors that are placed on the surface of a robot’s skin or limbs [45].
These sensors can detect pressure, temperature, and other physical sensations, and send this information to the robot’s control system. The control system can then use this information to make decisions about how to interact with the environment and how to move the robot’s body. Some robots also use haptic feedback, which allows them to transmit a sense of touch to the user by vibrating or applying pressure to the skin [46].

5.1. Components of Robotic Tactile Sensing

  • Sensing: The robot uses sensors to detect physical sensations, such as pressure, temperature, and force. These sensors may be mounted on the surface of the robot’s skin or limbs, and they may be connected to the control system through wires or wireless signals.
  • Data collection: The sensor data are collected by the control system and stored in a buffer or memory [22].
  • Data processing: The control system processes the sensor data using algorithms that interpret the data and provide the robot with a sense of touch. This may involve filtering the data to remove noise or errors and applying algorithms to extract information about the shape, size, and texture of objects in the environment.
  • Decision-making: The control system uses the processed sensor data to make decisions about how to interact with the environment and how to move the robot’s body. This may involve adjusting the robot’s grip on an object, avoiding collisions, or navigating around obstacles.
  • Actuation: The control system sends commands to the robot’s actuators, which are responsible for moving the robot’s body. The actuators may be motors, servos, or other types of mechanical devices, and they use the commands from the control system to move the robot’s limbs and other body parts [47].
Overall, the process of a robotic tactile sense involves using sensors to gather information about the environment, processing this information to extract meaningful data, and using these data to make decisions about how to interact with the environment and move the robot’s body.
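A minimal sketch of this pipeline, assuming hypothetical pressure readings in arbitrary units: noisy samples are collected, a median filter removes spikes (the data-processing stage), and the filtered values drive a grip decision. The thresholds, units, and function names are illustrative assumptions, not taken from the cited literature.

```python
# Hypothetical tactile pipeline: collect -> filter -> decide.

def median_filter(readings, window=3):
    """Data processing: suppress noise spikes with a sliding median."""
    half = window // 2
    out = []
    for i in range(len(readings)):
        chunk = sorted(readings[max(0, i - half): i + half + 1])
        out.append(chunk[len(chunk) // 2])
    return out

def decide_grip(pressures, target=5.0, tolerance=1.0):
    """Decision-making: compare average contact pressure to a target."""
    avg = sum(pressures) / len(pressures)
    if avg < target - tolerance:
        return "tighten"
    if avg > target + tolerance:
        return "loosen"
    return "hold"

raw = [4.9, 5.1, 40.0, 5.0, 4.8, 5.2]   # one spike from sensor noise
print(decide_grip(median_filter(raw)))
```

Without the filtering stage, the single 40.0 spike would pull the average far above the target and wrongly trigger a "loosen" command, which is exactly the kind of error the data-processing stage exists to prevent.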

5.2. Functionality of a Robotic Tactile System

The primary function of a robotic tactile system is to allow a robot to perceive and interpret information about the environment through the sense of touch. This can be useful in a number of different applications, including object recognition, grasping and manipulation, and navigation. In object recognition, a tactile sensor can be used to detect the shape, size, and texture of an object, allowing the robot to identify and classify the object. In grasping and manipulation tasks, a tactile sensor can be used to detect the force being applied to the object, allowing the robot to adjust its grip and apply the appropriate amount of force. In navigation, a tactile sensor can be used to detect obstacles and other features of the environment, allowing the robot to avoid collisions and navigate around obstacles.
Overall, the tactile sense is an important aspect of a robot’s ability to interact with and understand the environment, and it is a key component of many modern robotics systems [48].

5.3. Types of Tactile Sensors Used in Robotics

  • Pressure sensors: These sensors detect the amount of pressure applied to a surface. They can be used to measure the weight of an object or to detect when an object comes into contact with the robot.
  • Temperature sensors: These sensors are used to measure the temperature of an object or surface. They can be used to detect changes in temperature or to monitor the temperature of an object over time [49].
  • Force sensors: These sensors are used to measure the force or strength of an object or surface. They can be used to detect the amount of force being applied to the robot or to measure the strength of an object [50].
  • Strain sensors: These sensors are used to measure the deformation of an object or surface. They can be used to detect changes in an object’s shape or size or measure the amount of strain being applied to the robot [51].
  • Tactile array sensors: These sensors are made up of a large number of individual sensors that are arranged in a grid or matrix. They can be used to detect the texture, shape, and size of an object or to detect the movement of an object across the surface of the sensor [52]. An example application of this type of tactile sensor is shown in Figure 2.
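As a minimal illustration of how a tactile array's grid of readings can be turned into shape information, the following Python sketch finds the bounding box of the cells currently in contact; the grid values and the 0.2 pressure threshold are hypothetical, chosen only for the example.

```python
def contact_footprint(grid, threshold=0.2):
    """Return the bounding box (row0, col0, row1, col1) of the cells in a
    tactile array whose pressure exceeds the threshold, or None if
    nothing is touching the sensor."""
    active = [(r, c) for r, row in enumerate(grid)
                     for c, p in enumerate(row) if p > threshold]
    if not active:
        return None
    rows = [r for r, _ in active]
    cols = [c for _, c in active]
    return (min(rows), min(cols), max(rows), max(cols))
```

A real tactile-array pipeline would go further, estimating texture or slip from how this footprint changes over time.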

5.4. The Most Popular Trends and Challenges in Robotic Tactile Sensing

Tactile sensing is an important aspect of robotics that allows robots to perceive and interpret the physical world around them through touch. In recent years, there have been several trends and challenges in the field of robotic tactile sensing, including:
  • Soft robotics: Soft robots are robots that are made from flexible materials, such as silicone or rubber, and are designed to be able to deform and adapt to their environment. Soft robots have the potential to be more dexterous and capable of delicate manipulation; tactile sensing is a critical component of their ability to interact with the world around them [53].
  • Artificial skin: Researchers are working on developing artificial skin for robots that is capable of detecting and interpreting tactile information, including pressure, temperature, and texture, which can help robots interact more effectively with their environments and with humans [55]. For example, Reference [54] proposed using tactile sensing to improve a robot’s ability to manipulate cloth.
  • Grasping and manipulation: Tactile sensing is essential for robots to be able to grasp and manipulate objects, especially those that are delicate or irregularly shaped. Researchers are working on developing new algorithms and sensors to improve the ability of robots to sense and manipulate objects, even in complex environments [56].
  • Prosthetics and rehabilitation: Tactile sensing is also important in the development of prosthetics and rehabilitation devices. By incorporating tactile sensors into these devices, it is possible to provide users with a more natural and intuitive experience, which can help improve their quality of life [57].
  • Perception and learning: Tactile sensing can be used to help robots learn about their environment and develop more sophisticated perception capabilities. By using tactile feedback, robots can better understand the physical properties of objects and surfaces, which can help them make more informed decisions and improve their ability to interact with the world around them [58].
Figure 2. Tactile sensors in the first robot by Sony, AIBO [59].

6. Electrical Nose

An electronic nose is a technology that emulates the architecture and function of the biological olfactory system in order to recognize complex, volatile molecules [60].
Olfactory sensing is gaining significant importance in domains such as the food industry, environmental monitoring, and medical treatment. Electronic noses have considerable application prospects in the identification of scents, such as those of wines, vegetables, and cigarettes. The technology is broadly employed in odor sensing, raw material inspection, quality marking, and process administration, and it is one of the essential tools for quality assurance and quality control [61].
The electronic nose is mainly employed in fruit and vegetable testing to evaluate quality, detect maturity, and identify species. Additionally, in terms of medical diagnostics, the electronic nose is cutting-edge for the early detection of diseases. By collecting only a small volume of human breath, the electronic nose can capture the various scents within it by using a bioreceptor, which then forms a processed chemical map [62].
Electronic noses built on a variety of chemical detection principles have been used in clinical disease diagnosis.

6.1. Components of an Electrical Nose

An electronic nose typically consists of the following components:
  • Sensors: These are the components that detect the chemical compounds present in the sample being analyzed [63].
  • Data acquisition system: This component is responsible for collecting and storing the data from the sensors [64].
  • Data collection: The sensor data are collected by the control system and stored in a buffer or memory [22].
  • Data analysis system: This component is responsible for analyzing the data from the sensors and determining the specific chemicals present in the sample [65].
  • Display or output device: This component is used to present the results of the analysis to the user [66].
  • Sample introduction system: This component is responsible for introducing the sample to be analyzed into the electronic nose [67].
  • Power supply: This component provides the electrical power required to operate the electronic nose [66].
  • Housing or enclosure: This component encloses and protects the other components of the electronic nose [66].

6.2. Functionality of Electrical Nose

The smells are composed of molecules, which have specific sizes and shapes. Each of these molecules has a correspondingly sized/shaped receptor in the human nose. When a specific receptor receives a molecule, it sends a signal to the brain and the brain identifies the smell associated with the particular molecule [68].
The electronic nose works by pulling an air sample through a tube into a chamber housing the electronic sensor array. As the sample-handling unit exposes the sensors to the odorant, the VOCs interact with the surface and bulk of the sensor’s active material, reaching a steady-state condition within a few seconds to a few minutes. The response is then recorded and delivered to the signal-processing unit before a washing gas is applied to the array to remove the odorant mixture. Finally, a reference gas is applied to the array to prepare it for a new measurement cycle. These phases are known as the response and recovery times of the sensor array [69].
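The response-and-recovery cycle described above can be approximated in code by watching for the steady-state condition in the sensor signal. In the sketch below, the settling tolerance (`epsilon`) and the hold count are assumed values chosen for illustration, not parameters of any real instrument.

```python
def steady_state_index(readings, epsilon=0.01, hold=3):
    """Return the index at which the sensor response has settled: the
    first point after `hold` consecutive deltas stay below epsilon,
    or None if the signal never settles."""
    run = 0
    for i in range(1, len(readings)):
        if abs(readings[i] - readings[i - 1]) < epsilon:
            run += 1
            if run >= hold:
                return i
        else:
            run = 0
    return None

def measurement_cycle(expose, wash, reference):
    """One e-nose cycle: record the settled response during exposure,
    then (after washing) take the reference gas's final value as the
    restored baseline for the next cycle."""
    idx = steady_state_index(expose)
    response = expose[idx] if idx is not None else None
    baseline = reference[-1]
    return response, baseline
```

The `wash` trace is accepted but unused here; in a real device the washing phase is what drives the signal back toward the reference baseline.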

6.3. Types of Electrical Nose Sensors

Several different types of sensors can be used in an electronic nose, including metal oxide semiconductor (MOS) sensors, quartz crystal microbalance (QCM) sensors, and surface acoustic wave (SAW) sensors. These sensors work by detecting changes in electrical or mechanical properties that occur when an odorant is present. A comparison between the natural human nose and the electronic nose is shown in Figure 3. Electronic noses have a wide range of applications, including the detection of harmful gases and chemicals, the quality control of food and beverages, and the diagnosis of diseases such as cancer and diabetes. They can also be used in environmental monitoring and the detection of illegal drugs and explosives [62].
There are several types of sensors that can be used in an electronic nose, including [62]:
  • Metal oxide semiconductor (MOS) sensors: These sensors are based on the principle of detecting changes in electrical conductivity in response to chemical exposure. They are often used in electronic noses because they are relatively inexpensive and have a fast response time.
  • Quartz crystal microbalance (QCM) sensors: These sensors are based on the principle of detecting changes in the resonant frequency of a quartz crystal in response to chemical exposure. They are highly sensitive and can detect very small amounts of chemicals.
  • Surface acoustic wave (SAW) sensors: These sensors are based on the principle of detecting changes in the velocity of an acoustic wave propagating on the surface of a piezoelectric material in response to chemical exposure. They are highly sensitive and can detect very small amounts of chemicals.
  • Gas sensors: Devices that detect the presence and concentration of gases in the air or environment. They work by converting the interaction between gas molecules and the sensing material into an electrical signal. There are different types of gas sensors, such as electrochemical, optical, and semiconductor sensors, each with specific principles and applications. These sensors are used in a variety of industries and applications, including air quality monitoring, industrial safety, environmental monitoring, and medical diagnosis [70].
  • Mass spectrometry sensors: These sensors use mass spectrometry to identify and quantify individual chemical compounds. They are highly sensitive and can identify a wide range of chemicals, but they are also relatively expensive and require a long analysis time.
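Regardless of the sensor type, an e-nose ultimately matches a vector of sensor responses against a library of known odor fingerprints. The following minimal nearest-neighbour sketch in Python illustrates that matching step; the odor names and three-sensor fingerprints are entirely hypothetical.

```python
import math

def classify_odor(reading, library):
    """Match a sensor-array response vector to the closest reference
    fingerprint, using Euclidean distance as the similarity measure."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(library, key=lambda name: dist(reading, library[name]))

# Hypothetical three-sensor fingerprints for two odors
ODOR_LIBRARY = {
    "ethanol": [0.9, 0.2, 0.4],
    "acetone": [0.3, 0.8, 0.6],
}
```

Deployed systems replace this nearest-neighbour lookup with trained pattern-recognition models, as discussed in the machine learning trend below, but the input (a response vector) and output (an odor label) are the same.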

6.4. The Most Popular Trends and Challenges for Robotic Electrical Noses

An electrical nose, also known as an e-nose, is a type of sensor array that is used to detect and identify different odors and volatile organic compounds. The use of electrical noses in robotics is a rapidly growing field, with many new trends and challenges emerging. Some of the most popular trends and challenges for robotic electrical noses include:
  • Miniaturization: One of the biggest challenges in the development of electrical noses for robotics is the need for miniaturization; one reported e-nose, for example, measures 164 × 143 × 65 mm overall [71]. In order for robots to be able to use an electrical nose, the sensor array must be small and lightweight enough to be integrated into the robot’s design. Researchers are working on developing new materials and fabrication techniques to create smaller and more efficient electrical noses [72].
  • Sensitivity and selectivity: Achieving high sensitivity and selectivity poses significant challenges in the development of electrical noses for robotics. The sensor array must be able to detect a wide range of different odors and distinguish between them, even in complex environments. Researchers are working on developing new sensor materials and signal-processing algorithms to improve the sensitivity and selectivity of electrical noses [73].
  • Real-time response: In many applications, such as environmental monitoring or industrial process control, it is important for the electrical nose to provide a real-time response. This requires fast data acquisition and processing, as well as a robust control system to interpret and respond to the sensor data [74].
  • Machine learning: With the growing volume of data generated by electrical noses, machine learning algorithms are playing an increasingly vital role in the development of robotic applications. Researchers are working on developing new machine learning techniques to improve the accuracy and reliability of electrical noses in detecting and identifying different odors [75].
  • Integration with other sensors: In order to provide a comprehensive understanding of the environment, it is often necessary to integrate electrical noses with other sensors, such as cameras or microphones. This requires the development of new algorithms and control systems to integrate and interpret data from multiple sources [76].
Figure 3. Schematic diagram of an e-nose device versus a biological olfactory system [77].

7. Electronic Tongue

An electronic tongue is a device that can mimic the ability of a human tongue to taste and distinguish different flavors. It typically consists of a series of sensors that can detect and measure various chemical properties. Electronic tongues are used in a variety of applications, including quality control in the food and beverage industry, monitoring the purity of water and other liquids, and the development of new flavors for products [78].

7.1. Components of an Electronic Tongue

There are a wide variety of materials that can be used for sensing in electronic tongues. The specific materials used depend on the type of flavor or chemical property being detected, as well as the specific design and requirements of the electronic tongue [79].
  • Conductive polymers: These are special polymers that are highly conductive and can be used to detect changes in conductivity, which can be indicative of certain flavors or chemical properties [80].
  • Ion-selective electrodes: These are electrodes that are selectively sensitive to particular types of ions, such as sodium or potassium. They can be used to detect changes in the concentration of these ions, which can be indicative of certain flavors or chemical properties [81].
  • Optical fibers: These are fibers made of special materials that can transmit light over long distances. They can be used to detect changes in the refractive index or other optical properties of a substance, which can be indicative of certain flavors or chemical properties [82].
  • Piezoelectric materials: These are materials that produce an electrical charge when subjected to mechanical stress or strain. They can be used to detect changes in the mechanical properties of a substance, which can be indicative of certain flavors or chemical properties [83].
  • Surface acoustic wave devices: These are devices that use sound waves to detect changes in the properties of a substance. They can be used to detect changes in the viscosity, density, or other properties of a substance, which can be indicative of certain flavors or chemical properties [84].

7.2. Functionality of Electronic Tongue

The specific methods and technologies used to build an electronic tongue vary with the application and the type of substance being tasted. Some electronic tongues use sensors based on chemical reactions, while others rely on physical or optical properties. Some are designed to detect a specific flavor or chemical property, while others are more general-purpose and can detect a wider range of flavors and properties. Once the electronic tongue has collected data from the sensors, it processes the data and determines the flavor profile of the substance being tasted. This is typically done using machine learning algorithms trained on a large dataset of known flavors and chemical properties: the electronic tongue compares the sensor data to the known flavors in its dataset and determines the most likely match. The results of the tasting are then displayed to the user, either on a screen or through some other output mechanism [85].
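As a toy version of the matching step described above, the sketch below compares a five-element taste vector (sweet, sour, salty, bitter, umami) against known flavor profiles using cosine similarity. A deployed system would use trained machine learning models rather than a fixed lookup, and the profile values here are invented for illustration.

```python
import math

TASTES = ("sweet", "sour", "salty", "bitter", "umami")

def cosine(a, b):
    """Cosine similarity between two taste vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def match_flavor(sensor_vector, known_profiles):
    """Return the known flavor whose profile is most similar to the
    measured sensor vector, together with the similarity score."""
    best = max(known_profiles,
               key=lambda name: cosine(sensor_vector, known_profiles[name]))
    return best, round(cosine(sensor_vector, known_profiles[best]), 3)
```

The similarity score gives a crude confidence measure: a low best-match score signals that the sample resembles none of the stored profiles.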

7.3. Types of Electronic Tongue Sensors

Tasting sensors for robots typically work by measuring the chemical composition of a substance using a variety of techniques, such as spectroscopy, chromatography, or electrochemistry. The sensors may also use specialized sensors or probes to detect specific compounds or elements in a substance [86].
Here is a list of common types of tasting sensors that may be used in robots:
  • pH sensors: These sensors are used to measure the acidity or basicity of a substance, and are often used in food and beverage production to monitor the pH of products.
  • Conductivity sensors: These sensors measure the ability of a substance to conduct electricity, and can be used to detect the presence of certain ions or compounds in a sample.
  • Temperature sensors: These sensors measure the temperature of a substance and can be used to monitor the temperature of food or other products during processing.
  • Spectroscopy sensors: These sensors use light or other electromagnetic radiation to analyze the chemical composition of a substance, and can be used to detect specific compounds or elements.
  • Chromatography sensors: These sensors use techniques such as gas chromatography or liquid chromatography to separate and analyze the components of a substance; they can be used to identify specific compounds or measure the concentrations of different substances.
  • Electrochemical sensors: These sensors use electrical currents to detect the presence of certain ions or compounds in a sample, and can be used to detect the presence of contaminants or other substances of interest.

7.4. The Most Popular Trends and Challenges for Robotic Electronic Tongues

An electronic tongue is a type of sensor array that is used to analyze and identify different taste properties, such as sweet, sour, salty, bitter, and umami. The use of electronic tongues in robotics is a rapidly growing field, with many new trends and challenges emerging. Some of the most popular trends and challenges for robotic electronic tongues include:
  • Miniaturization: Similar to electrical noses, one of the biggest challenges in the development of electronic tongues for robotics is the need for miniaturization. The sensor array must be small and lightweight enough to be integrated into the robot’s design [72].
  • Sensitivity and selectivity: As with electrical noses, achieving high sensitivity and selectivity is an important challenge for electronic tongues. The sensor array must be able to detect and distinguish between different taste properties in a wide range of environments and conditions [73].
  • Real-time response: Similar to electrical noses, real-time response is important in many applications for electronic tongues, such as food and beverage quality control. This requires fast data acquisition and processing, as well as a robust control system to interpret and respond to the sensor data [74].
  • Machine learning: Machine learning algorithms are becoming increasingly important in the development of robotics applications, including electronic tongues. Researchers are working on developing new machine learning techniques to improve the accuracy and reliability of electronic tongues in detecting and identifying different taste properties [75].
  • Integration with other sensors: To provide a comprehensive understanding of food and beverage products, it is often necessary to integrate electronic tongues with other sensors, such as color sensors or pH sensors. This requires the development of new algorithms and control systems to integrate and interpret data from multiple sources [76].

8. Sixth Sense

A Sixth Sense could refer to a range of different things, depending on the context in which it is used. Some people might use the term to describe a heightened intuition or a sense of awareness that goes beyond the five physical senses of sight, hearing, taste, touch, and smell. In the context of robotics, a Sixth Sense might refer to a system or ability that allows a robot to perceive and interact with its environment in a way that goes beyond its basic sensors and actuators. This could include the ability to sense and respond to temperature, pressure, or other physical phenomena, or the ability to process and understand complex visual or auditory information [87].

8.1. The Difference between the Sixth Sense and the Other Five Senses in Robots

The idea of a Sixth Sense is often used colloquially to refer to a supposed extra sensory ability beyond the five traditional senses (sight, hearing, taste, touch, and smell). While there is no scientific evidence to support the existence of a distinct Sixth Sense, there are certainly other senses and abilities that go beyond the traditional five, such as proprioception (the ability to sense the position and movement of one’s own body) and interoception (the ability to perceive internal bodily sensations). It is true that many of these senses and abilities involve similar sensory components as the traditional senses (such as receptors in the skin for touch or the inner ear for balance) [88].
However, the ultimate difference lies in the way the brain processes and integrates these various sensory inputs, as well as the conscious experiences and interpretations that result. For example, while vision and hearing rely on distinct sensory organs and pathways in the brain, they work together to create a cohesive and multi-dimensional perception of the world [89]. Similarly, proprioception and interoception involve a complex interplay of sensory inputs and motor outputs, allowing us to navigate and interact with our environment in a seamless and coordinated way. Ultimately, the idea of a Sixth Sense may be more of a metaphorical concept than a literal one, representing the idea that our sensory experiences and abilities go beyond a simple sum of their parts, and are shaped by complex interactions between our brains, bodies, and the world around us.

8.2. Components of a Sixth Sense System

A Sixth Sense robot would likely include the following components [90]:
  • Sensors: These would include visual sensors (cameras), auditory sensors (microphones), tactile sensors, gustatory sensors for detecting taste, and olfactory sensors for detecting smell.
  • Manipulators: These would include arms, hands, or other devices that allow the robot to interact with its environment, such as picking up objects or manipulating tools.
  • Processor: This would be the central "brain" of the robot, responsible for processing the data from the sensors, executing commands, and controlling the manipulators.
  • Power supply: This would provide the electrical power required to operate the robot.
  • Housing or enclosure: This would enclose and protect the other components of the robot.
  • Machine learning system: This would allow the robot to learn and adapt to new situations and environments, using techniques such as artificial neural networks and other machine learning algorithms.

8.3. Functionality of a Sixth Sense System

In terms of functionality, a Sixth Sense robot might be able to [91]:
  • Navigate its environment using visual and/or other sensors to avoid obstacles and locate objects or destinations.
  • Identify and classify objects and other beings using visual and/or other sensors, and possibly use machine learning algorithms to improve its ability to recognize and classify new objects and beings.
  • Interact with objects and other beings using its manipulators, and possibly using force sensors and other sensors to gauge the appropriate amount of force to apply.
  • Communicate with other beings using various modalities such as speech, gestures, and facial expressions.
  • Learn and adapt to new situations and environments using machine learning algorithms and other techniques to improve its performance over time.

8.4. Types of Sixth Sense Sensors

There are many different types of sensors that could potentially be used to enable a Sixth Sense in a robot. Some examples of sensors that might be used for this purpose include:
  • Temperature sensors: These sensors can detect changes in temperature and could be used to enable a robot to sense and respond to changes in its environment [92].
  • Pressure sensors: These sensors can detect changes in pressure and could be used to enable a robot to sense and respond to changes in its environment, such as changes in the amount of force being applied to it [93].
  • Humidity sensors: These sensors can detect changes in humidity and could be used to enable a robot to sense and respond to changes in its environment [92].
  • Cameras: These sensors can capture images and video, and could be used to enable a robot to perceive and understand its environment in a more sophisticated way [94].
  • Microphones: These sensors can detect and record sound waves, and could be used to enable a robot to perceive and understand its environment through hearing [94].
  • LiDAR sensors: These sensors use lasers to measure distance and can be used to enable a robot to build up a detailed 3D map of its environment [95].

8.5. Sixth Sense Techniques

There are many different ways that a Sixth Sense could work in a robot, depending on the specific capabilities and goals of the system.
  • Using sensors to detect and interpret physical phenomena that are not directly visible to the robot, such as temperature, pressure, or humidity. For example, a robot might have a Sixth Sense for temperature that allows it to detect changes in the ambient temperature and adjust its behavior accordingly [96].
  • Using machine learning algorithms to process and interpret complex visual or auditory information in real-time. For example, a robot might be equipped with cameras and machine learning algorithms that allow it to recognize and classify objects in its environment, or to understand and respond to spoken commands [97].
  • Using neural networks or other artificial intelligence techniques to enable the robot to make decisions and take actions based on its environment and its goals. For example, a robot might be programmed to navigate through a crowded environment by using its Sixth Sense to avoid obstacles and find the optimal path [98].
Overall, the specific capabilities and implementation of a robot’s Sixth Sense will depend on the goals of the system and the technological resources available.
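The temperature example above can be sketched as a simple mapping from an ambient reading to a behavior mode. The thresholds and mode names below are illustrative assumptions; a real platform would derive them from its actuator and battery specifications.

```python
def adjust_behavior(ambient_c, profile=None):
    """Map an ambient-temperature reading (in degrees Celsius) to a
    behavior mode; thresholds are hypothetical defaults."""
    profile = profile or {"cold": 5.0, "hot": 35.0}
    if ambient_c <= profile["cold"]:
        return "reduce_speed"      # e.g. cautious motion in cold conditions
    if ambient_c >= profile["hot"]:
        return "throttle_motors"   # e.g. avoid overheating the actuators
    return "normal"
```

Passing a different `profile` lets the same mapping serve platforms with different operating envelopes.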

8.6. The Most Popular Trends and Challenges for a Robot’s Sixth Sense

The concept of the Sixth Sense in robotics typically refers to the ability of robots to perceive and interact with the world in ways that go beyond the five traditional senses of sight, hearing, touch, taste, and smell. Here are some of the most popular trends and challenges in the development of Sixth Sense capabilities for robots:
  • Multi-sensor fusion: One of the main challenges in developing Sixth Sense capabilities for robots is the need to integrate data from multiple sensors and sources. This involves developing sophisticated algorithms for data fusion and interpretation that can combine information from a wide range of sensors, such as cameras, microphones, pressure sensors, and temperature sensors [99].
  • Machine learning: Machine learning and artificial intelligence (AI) are key technologies that can help robots develop a Sixth Sense. By analyzing and interpreting data from multiple sensors, robots can learn to recognize patterns and make predictions about their environment. This can help robots navigate complex environments, detect and avoid obstacles, and interact more intelligently with their surroundings [100].
  • Haptic feedback: Haptic feedback, which involves providing robots with the ability to feel and respond to physical stimuli, is a key part of developing a Sixth Sense for robots. This involves developing sensors and actuators that can provide feedback to robots about their environment, such as changes in pressure or temperature [101].
  • Augmented reality (AR): Augmented reality technology can be used to enhance a robot’s perception of the world by providing additional visual or auditory information. This can help robots recognize and interact with objects more effectively, even in complex and changing environments [102].
  • Human–robot interaction: Developing a Sixth Sense for robots also requires the ability to interact with humans in a natural and intuitive way. This involves developing sensors and algorithms that can recognize human gestures and expressions, as well as natural language processing capabilities that enable robots to understand and respond to human speech [103].
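As one concrete flavor of the multi-sensor fusion mentioned above, the sketch below combines several sensors' estimates of the same quantity by inverse-variance weighting, a standard fusion rule in which more certain sensors count for more. The numeric inputs in the usage example are hypothetical.

```python
def fuse_estimates(estimates):
    """Fuse a list of (value, variance) estimates of one quantity from
    several sensors using inverse-variance weighting. Returns the fused
    value and the (reduced) fused variance."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total
```

For example, fusing a camera-based distance estimate of 10.0 (variance 1.0) with a noisier ultrasonic estimate of 20.0 (variance 4.0) yields 12.0, pulled toward the more reliable sensor; the fused variance is smaller than either input's.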

9. Summary

Robots can be equipped with various sensors to allow them to perceive their environment and interact with it in a way that is similar to how humans use their senses. These sensors can be used to give robots the ability to see, hear, touch, taste, and smell.
This article focuses on six sensory systems used in robotics, namely vision, hearing, tactile, electrical nose, electronic tongue, and the Sixth Sense. Each sensory system is discussed in detail, starting with the components, functionalities, types of sensors used, and the most popular trends and challenges.
The vision system is essential in robotics and enables robots to capture and interpret visual information from their surroundings. The article discusses the different types of vision sensors used, including CCD and CMOS sensors, and the challenges associated with vision sensing, such as low-light conditions.
The hearing system enables robots to perceive and interpret sounds from their environment. The article discusses the different types of hearing sensors, including MEMS microphones and ultrasonic sensors, and the challenges associated with hearing sensing, such as noise interference.
The tactile sensing system enables robots to sense and interpret touch and pressure information from their environment. The article discusses the different types of tactile sensors, including capacitive and resistive sensors, and the challenges associated with tactile sensing, such as sensor placement and calibration.
The electrical nose and electronic tongue systems enable robots to sense and interpret odor and taste information from their environment. The article discusses the different types of sensors used in each system, including metal oxide sensors and biosensors, and the challenges associated with these systems, such as sensor drift.
Finally, the article discusses the Sixth Sense system, which enables robots to perceive and interpret information beyond the five senses. The article discusses the different types of Sixth Sense sensors, including magnetic and electric field sensors, and the challenges associated with Sixth Sense sensing, such as interference from other sensors. The outcomes are introduced in Table 1.

10. Conclusions

This review highlighted the significance of sensory systems in robotics, allowing robots to perceive and interact with their environments in a manner similar to humans. The findings of this research provide valuable insights for researchers and developers in the field, as robots equipped with sensory systems can collect data, perform experiments, and complete tasks more efficiently. As technology continues to advance, we can expect to see even more applications of robotics in various industries.
The integration of sensory systems into robotics has improved their effectiveness in completing tasks. Each sensory system discussed in this article has unique components, functionalities, and limitations associated with them. Despite the challenges, the advancement of technology has helped in addressing these limitations, making robots more efficient in their operations.
The field of sensing for robots is a critical area of research with numerous challenges and limitations to overcome, such as the integration of multiple sensory inputs. However, researchers are continuously developing innovative solutions to these challenges, such as machine learning and artificial intelligence, and the integration of multiple sensing modalities within robotic platforms. The potential applications of these developments are enormous, ranging from robotic prosthetics to autonomous vehicles and environmental monitoring systems. With continued research and development, the field of sensing for robotics has the potential to revolutionize many industries and improve the quality of life for people worldwide.
Overall, the integration of sensory systems in robotics is an exciting and rapidly evolving field that holds great promise. Continued innovation in this area will enable robots to perceive and interact with their environments ever more naturally.

Author Contributions

All authors contributed equally to all aspects of this paper. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Not applicable.

Acknowledgments

The authors appreciate the contributions of the previous researchers whose works are referenced in this paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Wang, J.; Gao, Q.; Pan, M.; Fang, Y. Device-Free Wireless Sensing: Challenges, Opportunities, and Applications. IEEE Netw. 2018, 32, 132–137. [Google Scholar] [CrossRef]
  2. Zhu, Z.; Hu, H. Robot Learning from Demonstration in Robotic Assembly: A Survey. Robotics 2018, 7, 17. [Google Scholar] [CrossRef]
  3. Yousef, H.; Boukallel, M.; Althoefer, K. Tactile sensing for dexterous in-hand manipulation in robotics—A review. Sens. Actuators A Phys. 2011, 167, 171–187. [Google Scholar] [CrossRef]
  4. Liakos, K.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine Learning in Agriculture: A Review. Sensors 2018, 18, 2674. [Google Scholar] [CrossRef]
  5. Ishida, H.; Wada, Y.; Matsukura, H. Chemical Sensing in Robotic Applications: A Review. IEEE Sens. J. 2012, 12, 3163–3173. [Google Scholar] [CrossRef]
  6. Cherubini, A.; Navarro-Alarcon, D. Sensor-Based Control for Collaborative Robots: Fundamentals, Challenges, and Opportunities. Front. Neurorobot. 2021, 15, 576846. [Google Scholar] [CrossRef]
  7. Coggins, T.N. More work for Roomba? Domestic robots, housework and the production of privacy. Prometheus 2022, 38, 98–112. [Google Scholar] [CrossRef]
  8. Blanes, C.; Ortiz, C.; Mellado, M.; Beltrán, P. Assessment of eggplant firmness with accelerometers on a pneumatic robot gripper. Comput. Electron. Agric. 2015, 113, 44–50. [Google Scholar] [CrossRef]
  9. Russell, R.A. Survey of Robotic Applications for Odor-Sensing Technology. Int. J. Robot. Res. 2001, 20, 144–162. [Google Scholar] [CrossRef]
  10. Deshmukh, A. Survey Paper on Stereo-Vision Based Object Finding Robot. Int. J. Res. Appl. Sci. Eng. Technol. 2017, 5, 2100–2103. [Google Scholar] [CrossRef]
  11. Deuerlein, C.; Langer, M.; Seßner, J.; Heß, P.; Franke, J. Human-robot-interaction using cloud-based speech recognition systems. Procedia Cirp 2021, 97, 130–135. [Google Scholar] [CrossRef]
  12. Alameda-Pineda, X.; Horaud, R. Vision-guided robot hearing. Int. J. Robot. Res. 2014, 34, 437–456. [Google Scholar] [CrossRef]
  13. Tan, J.; Xu, J. Applications of electronic nose (e-nose) and electronic tongue (e-tongue) in food quality-related properties determination: A review. Artif. Intell. Agric. 2020, 4, 104–115. [Google Scholar] [CrossRef]
  14. Chanda, P.; Mukherjee, P.K.; Modak, S.; Nath, A. Gesture controlled robot using Arduino and android. Int. J. 2016, 6, 227–234. [Google Scholar]
  15. Chen, G.; Dong, W.; Sheng, X.; Zhu, X.; Ding, H. An Active Sense and Avoid System for Flying Robots in Dynamic Environments. IEEE/ASME Trans. Mechatron. 2021, 26, 668–678. [Google Scholar] [CrossRef]
  16. Cong, Y.; Gu, C.; Zhang, T.; Gao, Y. Underwater robot sensing technology: A survey. Fundam. Res. 2021, 1, 337–345. [Google Scholar] [CrossRef]
  17. De Jong, M.; Zhang, K.; Roth, A.M.; Rhodes, T.; Schmucker, R.; Zhou, C.; Ferreira, S.; Cartucho, J.; Veloso, M. Towards a robust interactive and learning social robot. In Proceedings of the 17th International Conference on Autonomous Agents and MultiAgent Systems, Stockholm, Sweden, 10–15 July 2018; pp. 883–891. [Google Scholar]
  18. Chakraborty, E. What Is Robotic Vision?|5+ Important Applications. Lambda Geeks. Available online: https://lambdageeks.com/robotic-vision-important-features/ (accessed on 4 February 2023).
  19. YouTube. Robotic Vision System AEE Robotics Part 9. 2021. Available online: https://www.youtube.com/watch?v=7csTyRjKAeE (accessed on 4 February 2023).
  20. Tao, S.; Cao, J. Research on Machine Vision System Design Based on Deep Learning Neural Network. Wirel. Commun. Mob. Comput. 2022, 2022, 4808652. [Google Scholar] [CrossRef]
  21. LTCC, PCB and Reticle Inspection Solutions—Stratus Vision AOI. Stratus Vision AOI. Available online: https://stratusvision.com/ (accessed on 4 February 2023).
  22. Pan, L.; Yang, S.X. An Electronic Nose Network System for Online Monitoring of Livestock Farm Odors. IEEE/ASME Trans. Mechatron. 2009, 14, 371–376. [Google Scholar] [CrossRef]
  23. Understanding What Is a Robot Vision System|Techman Robot. Techman Robot. 2021. Available online: https://www.tm-robot.com/en/robot-vision-system/ (accessed on 4 February 2023).
  24. Yeong, D.J.; Velasco-Hernandez, G.; Barry, J.; Walsh, J. Sensor and Sensor Fusion Technology in Autonomous Vehicles: A Review. Sensors 2021, 21, 2140. [Google Scholar] [CrossRef]
  25. Ahmed, I.; Jeon, G.; Piccialli, F. From Artificial Intelligence to Explainable Artificial Intelligence in Industry 4.0: A Survey on What, How, and Where. IEEE Trans. Ind. Inform. 2022, 18, 5031–5042. [Google Scholar] [CrossRef]
  26. Nikravan, M.; Kashani, M.H. A review on trust management in fog/edge computing: Techniques, trends, and challenges. J. Netw. Comput. Appl. 2022, 204, 103402. [Google Scholar] [CrossRef]
  27. Huang, P.; Zeng, L.; Chen, X.; Huang, L.; Zhou, Z.; Yu, S. Edge Robotics: Edge-Computing-Accelerated Multirobot Simultaneous Localization and Mapping. IEEE Internet Things J. 2022, 9, 14087–14102. [Google Scholar] [CrossRef]
  28. Wang, S.-T.; Li, I.-H.; Wang, W.-Y. Human Action Recognition of Autonomous Mobile Robot Using Edge-AI. IEEE Sens. J. 2023, 23, 1671–1682. [Google Scholar] [CrossRef]
  29. Matarese, M.; Rea, F.; Sciutti, A. Perception is Only Real When Shared: A Mathematical Model for Collaborative Shared Perception in Human-Robot Interaction. Front. Robot. AI 2022, 9, 733954. [Google Scholar] [CrossRef] [PubMed]
  30. Billah, M.A.; Faruque, I.A. Robustness in bio-inspired visually guided multi-agent flight and the gain modulation hypothesis. Int. J. Robust Nonlinear Control. 2022, 33, 1316–1334. [Google Scholar] [CrossRef]
  31. Attanayake, A.M.N.C.; Hansamali, W.G.R.U.; Hirshan, R.; Haleem, M.A.L.A.; Hinas, M.N.A. Amigo (A Social Robot): Development of a robot hearing system. In Proceedings of the IET 28th Annual Technical Conference, Virtual, 28 August 2021. [Google Scholar]
  32. ElGibreen, H.; Al Ali, G.; AlMegren, R.; AlEid, R.; AlQahtani, S. Telepresence Robot System for People with Speech or Mobility Disabilities. Sensors 2022, 22, 8746. [Google Scholar] [CrossRef]
  33. Karimian, P. Audio Communication for Multi-Robot Systems. Master’s Thesis, Simon Fraser University, Burnaby, BC, Canada, 2007. [Google Scholar]
  34. Robotics 101: Sensors That Allow Robots to See, Hear, Touch, and Move|Possibility|Teledyne Imaging. Available online: https://possibility.teledyneimaging.com/robotics-101-sensors-that-allow-robots-to-see-hear-touch-and-move/ (accessed on 4 February 2023).
  35. Shimada, K. Morphological Fabrication of Equilibrium and Auditory Sensors through Electrolytic Polymerization on Hybrid Fluid Rubber (HF Rubber) for Smart Materials of Robotics. Sensors 2022, 22, 5447. [Google Scholar] [CrossRef]
  36. Darwish, A.; Halkon, B.; Oberst, S. Non-Contact Vibro-Acoustic Object Recognition Using Laser Doppler Vibrometry and Convolutional Neural Networks. Sensors 2022, 22, 9360. [Google Scholar] [CrossRef]
  37. Alkhatib, A.A.; Elbes, M.W.; Abu Maria, E.M. Improving accuracy of wireless sensor networks localisation based on communication ranging. IET Commun. 2020, 14, 3184–3193. [Google Scholar] [CrossRef]
  38. Masoud, M.; Jaradat, Y.; Manasrah, A.; Jannoud, I. Sensors of smart devices in the internet of everything (IoE) era: Big opportunities and massive doubts. J. Sens. 2019, 2019, 6514520. [Google Scholar] [CrossRef]
  39. Senocak, A.; Ryu, H.; Kim, J.; Kweon, I.S. Learning sound localization better from semantically similar samples. In Proceedings of the ICASSP 2022-2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Singapore, 23–27 May 2022; pp. 4863–4867. [Google Scholar]
  40. Qiu, Y.; Li, B.; Huang, J.; Jiang, Y.; Wang, B.; Huang, Z. An Analytical Method for 3-D Sound Source Localization Based on a Five-Element Microphone Array. IEEE Trans. Instrum. Meas. 2022, 71, 7504314. [Google Scholar] [CrossRef]
  41. Kumar, T.; Mahrishi, M.; Meena, G. A comprehensive review of recent automatic speech summarization and keyword identification techniques. In Artificial Intelligence in Industrial Applications: Approaches to Solve the Intrinsic Industrial Optimization Problems; Springer: Berlin/Heidelberg, Germany, 2022; pp. 111–126. [Google Scholar]
  42. Nakadai, K.; Okuno, H.G. Robot Audition and Computational Auditory Scene Analysis. Adv. Intell. Syst. 2020, 2, 2000050. [Google Scholar] [CrossRef]
  43. Bruck, J.N.; Walmsley, S.F.; Janik, V.M. Cross-modal perception of identity by sound and taste in bottlenose dolphins. Sci. Adv. 2022, 8, eabm7684. [Google Scholar] [CrossRef]
  44. Zhou, W.; Yue, Y.; Fang, M.; Qian, X.; Yang, R.; Yu, L. BCINet: Bilateral cross-modal interaction network for indoor scene understanding in RGB-D images. Inf. Fusion 2023, 94, 32–42. [Google Scholar] [CrossRef]
  45. Dahiya, R.S.; Metta, G.; Valle, M.; Sandini, G. Tactile Sensing—From Humans to Humanoids. IEEE Trans. Robot. 2009, 26, 1–20. [Google Scholar] [CrossRef]
  46. Thai, M.T.; Phan, P.T.; Hoang, T.T.; Wong, S.; Lovell, N.H.; Do, T.N. Advanced intelligent systems for surgical robotics. Adv. Intell. Syst. 2020, 2, 1900138. [Google Scholar] [CrossRef]
  47. Liu, Y.; Aleksandrov, M.; Hu, Z.; Meng, Y.; Zhang, L.; Zlatanova, S.; Ai, H.; Tao, P. Accurate light field depth estimation under occlusion. Pattern Recognit. 2023, 138, 109415. [Google Scholar] [CrossRef]
  48. Lepora, N.F. Soft Biomimetic Optical Tactile Sensing With the TacTip: A Review. IEEE Sensors J. 2021, 21, 21131–21143. [Google Scholar] [CrossRef]
  49. Li, S.; Zhang, Y.; Wang, Y.; Xia, K.; Yin, Z.; Wang, H.; Zhang, M.; Liang, X.; Lu, H.; Zhu, M.; et al. Physical sensors for skin-inspired electronics. InfoMat 2020, 2, 184–211. [Google Scholar] [CrossRef]
  50. Templeman, J.O.; Sheil, B.B.; Sun, T. Multi-axis force sensors: A state-of-the-art review. Sens. Actuators A Phys. 2020, 304, 111772. [Google Scholar] [CrossRef]
  51. Seyedin, S.; Zhang, P.; Naebe, M.; Qin, S.; Chen, J.; Wang, X.; Razal, J.M. Textile strain sensors: A review of the fabrication technologies, performance evaluation and applications. Mater. Horiz. 2019, 6, 219–249. [Google Scholar] [CrossRef]
  52. Scimeca, L.; Hughes, J.; Maiolino, P.; Iida, F. Model-Free Soft-Structure Reconstruction for Proprioception Using Tactile Arrays. IEEE Robot. Autom. Lett. 2019, 4, 2479–2484. [Google Scholar] [CrossRef]
  53. Whitesides, G.M. Soft Robotics. Angew. Chem. Int. Ed. 2018, 57, 4258–4273. [Google Scholar] [CrossRef] [PubMed]
  54. Tirumala, S.; Weng, T.; Seita, D.; Kroemer, O.; Temel, Z.; Held, D. Learning to Singulate Layers of Cloth using Tactile Feedback. In Proceedings of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, Japan, 23–27 October 2022; pp. 7773–7780. [Google Scholar]
  55. Pang, Y.; Xu, X.; Chen, S.; Fang, Y.; Shi, X.; Deng, Y.; Wang, Z.-L.; Cao, C. Skin-Inspired Textile-Based Tactile Sensors Enable Multifunctional Sensing of Wearables and Soft Robots. SSRN Electron. J. 2022, 96, 107137. [Google Scholar] [CrossRef]
  56. Babin, V.; Gosselin, C. Mechanisms for robotic grasping and manipulation. Annu. Rev. Control Robot. Auton. Syst. 2021, 4, 573–593. [Google Scholar] [CrossRef]
  57. Liu, H.; Guo, D.; Sun, F.; Yang, W.; Furber, S.; Sun, T. Embodied tactile perception and learning. Brain Sci. Adv. 2020, 6, 132–158. [Google Scholar] [CrossRef]
  58. Beckerle, P.; Salvietti, G.; Unal, R.; Prattichizzo, D.; Rossi, S.; Castellini, C.; Hirche, S.; Endo, S.; Amor, H.B.; Ciocarlie, M.; et al. A human–robot interaction perspective on assistive and rehabilitation robotics. Front. Neurorobot. 2017, 11, 24. [Google Scholar] [CrossRef]
  59. Bianco, A. The Sony AIBO—The World’s First Robotic Dog. Available online: https://sabukaru.online/articles/the-sony-aibo-the-worlds-first-robotic-dog (accessed on 13 April 2023).
  60. Göpel, W. Chemical imaging: I. Concepts and visions for electronic and bioelectronic noses. Sens. Actuators B Chem. 1998, 52, 125–142. [Google Scholar] [CrossRef]
  61. Fitzgerald, J.E.; Bui, E.T.H.; Simon, N.M.; Fenniri, H. Artificial Nose Technology: Status and Prospects in Diagnostics. Trends Biotechnol. 2017, 35, 33–42. [Google Scholar] [CrossRef]
  62. Kim, S.; Chen, J.; Cheng, T.; Gindulyte, A.; He, J.; He, S.; Li, Q.; Shoemaker, B.A.; Thiessen, P.A.; Yu, B.; et al. PubChem in 2021: New data content and improved web interfaces. Nucleic Acids Res. 2021, 49, D1388–D1395. [Google Scholar] [CrossRef]
  63. James, D.; Scott, S.M.; Ali, Z.; O’Hare, W.T. Chemical Sensors for Electronic Nose Systems. Microchim. Acta 2005, 149, 1–17. [Google Scholar] [CrossRef]
  64. Chueh, H.-T.; Hatfield, J.V. A real-time data acquisition system for a hand-held electronic nose (H2EN). Sens. Actuators B Chem. 2002, 83, 262–269. [Google Scholar] [CrossRef]
  65. Pan, L.; Yang, S.X. A new intelligent electronic nose system for measuring and analysing livestock and poultry farm odours. Environ. Monit. Assess. 2007, 135, 399–408. [Google Scholar] [CrossRef] [PubMed]
  66. Ampuero, S.; Bosset, J.O. The electronic nose applied to dairy products: A review. Sens. Actuators B Chem. 2003, 94, 1–12. [Google Scholar] [CrossRef]
  67. Simpkins, A. Robotic Tactile Sensing: Technologies and System (Dahiya, R.S. and Valle, M.; 2013) (On the Shelf). IEEE Robot. Autom. Mag. 2013, 20, 107. [Google Scholar] [CrossRef]
  68. Shepherd, G.M. Smell images and the flavour system in the human brain. Nature 2006, 444, 316–321. [Google Scholar] [CrossRef]
  69. Nagle, H.T.; Gutierrez-Osuna, R.; Schiffman, S.S. The how and why of electronic noses. IEEE Spectrum 1998, 35, 22–31. [Google Scholar] [CrossRef]
  70. Tladi, B.C.; Kroon, R.E.; Swart, H.C.; Motaung, D.E. A holistic review on the recent trends, advances, and challenges for high-precision room temperature liquefied petroleum gas sensors. Anal. Chim. Acta 2023, 1253, 341033. [Google Scholar] [CrossRef]
  71. Sun, Z.H.; Liu, K.X.; Xu, X.H.; Meng, Q.H. Odor evaluation of vehicle interior materials based on portable E-nose. In Proceedings of the 2020 39th Chinese Control Conference (CCC), Shenyang, China, 27–29 July 2020; pp. 2998–3003. [Google Scholar]
  72. Viejo, C.G.; Fuentes, S.; Godbole, A.; Widdicombe, B.; Unnithan, R.R. Development of a low-cost e-nose to assess aroma profiles: An artificial intelligence application to assess beer quality. Sens. Actuators B Chem. 2020, 308, 127688. [Google Scholar] [CrossRef]
  73. Szulczyński, B.; Wasilewski, T.; Wojnowski, W.; Majchrzak, T.; Dymerski, T.; Namieśnik, J.; Gębicki, J. Different ways to apply a measurement instrument of E-nose type to evaluate ambient air quality with respect to odour nuisance in a vicinity of municipal processing plants. Sensors 2017, 17, 2671. [Google Scholar] [CrossRef]
  74. Trivino, R.; Gaibor, D.; Mediavilla, J.; Guarnan, A.V. Challenges to embed an electronic nose on a mobile robot. In Proceedings of the 2016 IEEE ANDESCON, Arequipa, Peru, 19–21 October 2016; pp. 1–4. [Google Scholar]
  75. Ye, Z.; Liu, Y.; Li, Q. Recent progress in smart electronic nose technologies enabled with machine learning methods. Sensors 2021, 21, 7620. [Google Scholar] [CrossRef] [PubMed]
  76. Chiu, S.W.; Tang, K.T. Towards a chemiresistive sensor-integrated electronic nose: A review. Sensors 2013, 13, 14214–14247. [Google Scholar] [CrossRef] [PubMed]
  77. Seesaard, T.; Wongchoosuk, C. Recent Progress in Electronic Noses for Fermented Foods and Beverages Applications. Fermentation 2022, 8, 302. [Google Scholar] [CrossRef]
  78. Rodríguez-Méndez, M.L.; Apetrei, C.; de Saja, J.A. Evaluation of the polyphenolic content of extra virgin olive oils using an array of voltammetric sensors. Electrochim. Acta 2010, 53, 5867–5872. [Google Scholar] [CrossRef]
  79. Ribeiro, C.M.G.; Strunkis, C.D.M.; Campos, P.V.S.; Salles, M.O. Electronic nose and tongue materials for Sensing. In Reference Module in Biomedical Sciences; Elsevier: Amsterdam, The Netherlands, 2021. [Google Scholar]
  80. Sierra-Padilla, A.; García-Guzmán, J.J.; López-Iglesias, D.; Palacios-Santander, J.M.; Cubillana-Aguilera, L. E-Tongues/noses based on conducting polymers and composite materials: Expanding the possibilities in complex analytical sensing. Sensors 2021, 21, 4976. [Google Scholar] [CrossRef]
  81. Yan, R.; Qiu, S.; Tong, L.; Qian, Y. Review of progresses on clinical applications of ion selective electrodes for electrolytic ion tests: From conventional ISEs to graphene-based ISEs. Chem. Speciat. Bioavailab. 2016, 28, 72–77. [Google Scholar] [CrossRef]
  82. Floris, I.; Sales, S.; Calderón, P.A.; Adam, J.M. Measurement uncertainty of multicore optical fiber sensors used to sense curvature and bending direction. Measurement 2019, 132, 35–46. [Google Scholar] [CrossRef]
  83. Kiran, E.; Kaur, K.; Aggarwal, P. Artificial senses and their fusion as a booming technique in food quality assessment—A review. Qual. Assur. Saf. Crop. Foods 2022, 14, 9–18. [Google Scholar]
  84. Zhou, B. Construction and simulation of online English reading model in wireless surface acoustic wave sensor environment optimized by particle swarm optimization. Discret. Dyn. Nat. Soc. 2022, 2022, 1633781. [Google Scholar] [CrossRef]
  85. Mohamed, Z.; Shareef, H. An Adjustable Machine Learning Gradient Boosting-Based Controller for Pv Applications. SSRN Electron. J. 2022. [Google Scholar] [CrossRef]
  86. Shimada, K. Artificial Tongue Embedded with Conceptual Receptor for Rubber Gustatory Sensor by Electrolytic Polymerization Technique with Utilizing Hybrid Fluid (HF). Sensors 2022, 22, 6979. [Google Scholar] [CrossRef] [PubMed]
  87. Cominelli, L.; Carbonaro, N.; Mazzei, D.; Garofalo, R.; Tognetti, A.; De Rossi, D. A Multimodal Perception Framework for Users Emotional State Assessment in Social Robotics. Future Internet 2017, 9, 42. [Google Scholar] [CrossRef]
  88. Grall, C.; Finn, E.S. Leveraging the power of media to drive cognition: A media-informed approach to naturalistic neuroscience. Soc. Cogn. Affect. Neurosci. 2022, 17, 598–608. [Google Scholar] [CrossRef]
  89. Bonci, A.; Cen Cheng, P.D.; Indri, M.; Nabissi, G.; Sibona, F. Human-robot perception in industrial environments: A survey. Sensors 2021, 21, 1571. [Google Scholar] [CrossRef] [PubMed]
  90. Laut, C.L.; Leasure, C.S.; Pi, H.; Carlin, S.M.; Chu, M.L.; Hillebr, G.H.; Lin, H.K.; Yi, X.I.; Stauff, D.L.; Skaar, E.P. DnaJ and ClpX Are Required for HitRS and HssRS Two-Component System Signaling in Bacillus anthracis. Infect. Immun. 2022, 90, e00560-21. [Google Scholar] [CrossRef]
  91. Bari, R.; Gupta, A.K.; Mathur, P. An Overview of the Emerging Technology: Sixth Sense Technology: A Review. In Proceedings of the Second International Conference on Information Management and Machine Intelligence: ICIMMI 2020, Jaipur, India, 24–25 July 2020; Springer: Singapore, 2021; pp. 245–254. [Google Scholar]
  92. Wikelski, M.; Ponsford, M. Collective behaviour is what gives animals their “Sixth Sense”. New Sci. 2022, 254, 43–45. [Google Scholar] [CrossRef]
  93. Xu, G.; Wan, Q.; Deng, W.; Guo, T.; Cheng, J. Smart-Sleeve: A Wearable Textile Pressure Sensor Array for Human Activity Recognition. Sensors 2022, 22, 1702. [Google Scholar] [CrossRef]
  94. Hui, T.K.L.; Sherratt, R.S. Towards disappearing user interfaces for ubiquitous computing: Human enhancement from Sixth Sense to super senses. J. Ambient. Intell. Humaniz. Comput. 2017, 8, 449–465. [Google Scholar] [CrossRef]
  95. Li, N.; Ho, C.P.; Xue, J.; Lim, L.W.; Chen, G.; Fu, Y.H.; Lee, L.Y.T. A Progress Review on Solid-State LiDAR and Nanophotonics-Based LiDAR Sensors. Laser Photonics Rev. 2022, 16, 2100511. [Google Scholar] [CrossRef]
  96. Randall, N.; Bennett, C.C.; Šabanović, S.; Nagata, S.; Eldridge, L.; Collins, S.; Piatt, J.A. More than just friends: In-home use and design recommendations for sensing socially assistive robots (SARs) by older adults with depression. Paladyn J. Behav. Robot. 2019, 10, 237–255. [Google Scholar] [CrossRef]
  97. Shih, B.; Shah, D.; Li, J.; Thuruthel, T.G.; Park, Y.-L.; Iida, F.; Bao, Z.; Kramer-Bottiglio, R.; Tolley, M.T. Electronic skins and machine learning for intelligent soft robots. Sci. Robot. 2020, 5, eaaz9239. [Google Scholar] [CrossRef]
  98. Anagnostis, A.; Benos, L.; Tsaopoulos, D.; Tagarakis, A.; Tsolakis, N.; Bochtis, D. Human Activity Recognition through Recurrent Neural Networks for Human–Robot Interaction in Agriculture. Appl. Sci. 2021, 11, 2188. [Google Scholar] [CrossRef]
  99. Tsanousa, A.; Bektsis, E.; Kyriakopoulos, C.; González, A.G.; Leturiondo, U.; Gialampoukidis, I.; Karakostas, A.; Vrochidis, S.; Kompatsiaris, I. A review of multisensor data fusion solutions in smart manufacturing: Systems and trends. Sensors 2022, 22, 1734. [Google Scholar] [CrossRef] [PubMed]
  100. Kumar S.N., N.; Zahid, M.; Khan, S.M. Sixth Sense Robot For The Collection of Basic Land Survey Data. Int. Res. J. Eng. Technol. 2021, 8, 4484–4489. [Google Scholar]
  101. Saracino, A.; Deguet, A.; Staderini, F.; Boushaki, M.N.; Cianchi, F.; Menciassi, A.; Sinibaldi, E. Haptic feedback in the da Vinci Research Kit (dVRK): A user study based on grasping, palpation, and incision tasks. Int. J. Med. Robot. Comput. Assist. Surg. 2019, 15, E1999. [Google Scholar] [CrossRef]
  102. García, A.; Solanes, J.E.; Muñoz, A.; Gracia, L.; Tornero, J. Augmented Reality-Based Interface for Bimanual Robot Teleoperation. Appl. Sci. 2022, 12, 4379. [Google Scholar] [CrossRef]
  103. Akalin, N.; Kristoffersson, A.; Loutfi, A. Evaluating the sense of safety and security in human–robot interaction with older people. In Social Robots: Technological, Societal and Ethical Aspects of Human-Robot Interaction; Springer: Berlin/Heidelberg, Germany, 2019; pp. 237–264. [Google Scholar]
Figure 1. Robotic senses [6].
Table 1. Robotic Sensing: Review.
Vision
  Components of sensing system: Lighting; lenses; image sensor; vision processing; communications
  Functionality: Data collection; allows the robot to perceive its environment and gather information about it
  Sensor types: Orthographic projection; perspective projection
  Applications: Navigation; localization

Hearing Sense
  Components of sensing system: Microphones or other sound sensors; amplifiers; analog-to-digital converters (ADCs); computer system; algorithms and software
  Functionality: Data collection; perceives sounds in the environment and gathers information about them
  Sensor types: Sound sensors; piezoelectric sensors; ultrasonic sensors; infrared sensors
  Applications: Sound localization and tracking; sound classification; noise canceling

Tactile Sense
  Components of sensing system: Sensing; data collection; data processing; decision-making; actuation
  Functionality: Perceives physical contact and gathers information about it
  Sensor types: Pressure sensors; temperature sensors; force sensors; strain sensors; tactile array sensors
  Applications: Object identification; force feedback; surface sensing

Electronic Nose
  Components of sensing system: Sensors; data acquisition system; data analysis system; display or output device; sample introduction system; power supply; housing or enclosure
  Functionality: Data collection; detects and identifies different chemical compounds
  Sensor types: Metal oxide semiconductor (MOS) sensors; quartz crystal microbalance (QCM) sensors; surface acoustic wave (SAW) sensors; gas sensors (solid-state, infrared, catalytic, electrochemical, nanowire); mass spectrometry sensors
  Applications: Chemical analysis; odor tracking; environmental monitoring

Electronic Tongue
  Components of sensing system: Conductive polymers; ion-selective electrodes; optical fibers; surface acoustic wave devices
  Functionality: Data collection; detects and identifies different chemical compounds
  Sensor types: pH sensors; conductivity sensors; temperature sensors; chromatography sensors; electrochemical sensors; spectroscopy sensors
  Applications: Chemical analysis; flavor detection

Sixth Sense
  Components of sensing system: Sensors; manipulators; processor; power supply; housing or enclosure; communication system; machine learning system
  Functionality: Data collection; allows the robot to perceive its environment and interact with it in various ways
  Sensor types: Temperature sensors; pressure sensors; humidity sensors; cameras; microphones; LiDAR sensors
  Applications: Viewing maps; creating a multimedia reading experience; making calls; interacting with physical objects
