Editorial

Sensors for Robots

1 National Key Laboratory of Intelligent Tracking and Forecasting for Infectious Diseases, The Engineering Research Center of Trusted Behavior Intelligence, Ministry of Education, The Tianjin Key Laboratory of Intelligent Robotics (tjKLIR), Institute of Robotics and Automatic Information System (IRAIS), Nankai University, Tianjin 300350, China
2 The Institute of Intelligence Technology and Robotic Systems, Shenzhen Research Institute of Nankai University, Shenzhen 518083, China
* Author to whom correspondence should be addressed.
Sensors 2024, 24(6), 1854; https://doi.org/10.3390/s24061854
Submission received: 27 February 2024 / Accepted: 5 March 2024 / Published: 14 March 2024
(This article belongs to the Special Issue Sensors for Robots II)

1. Introduction

Currently, robots play significant roles in industry [1], life sciences [2], education [3], medicine [4], social services [5], and the military [6], and they will undoubtedly play increasingly important roles in human life in the future [7]. Sensors are key components of robots, enabling them to perceive environmental information, much like the scanners of the famous R2-D2 robot in the Star Wars movies [8]. They form the basis of robots' self-adaptive and automatic control abilities. Therefore, novel sensing techniques and advanced sensor applications for robots have received increasing attention worldwide [9].
Novel sensor technologies for robotics have been developed to detect new environmental parameters or to improve the measurement accuracy of existing ones [10]. To this end, novel design [11], fabrication [12], and calibration methods [13] have been invented to improve detection sensitivity. Beyond these sensor-level innovations, advances in the integration [14], diffusion [15], and processing methods [16] for the detected signals have improved the signal-to-noise ratio (SNR) and helped extract useful information from raw signals. Based on these new technologies, novel force [17], vision [18], tactile [11], and auditory [19] sensors have been reported in recent years.
With the rapid development of swarm robots and multi-agent systems, multi-sensor setups [20], reconfigurable sensors [21], and cyber–physical sensing systems [22] have been developed to detect the parameters of each robot or agent. Because the individual bodies of such systems are usually far apart, wireless sensor networks [23], multi-sensing intelligent systems [20], and sensor systems for human–robot interaction [24] have been developed to facilitate communication and the processing of each robotic body's signals.
With the widespread application of robots, sensors for novel robotic applications have made significant progress. For example, vision sensors and vision algorithms for moving-object tracking and robot navigation have been widely applied in mobile robots [25], unmanned aerial vehicles [26], and underwater vehicles [27]. Beyond these traditional research areas, robot sensing technologies have also been widely applied in interdisciplinary fields, including the life sciences [2], material sciences [28], physical sciences [29], and micro-nano manipulation [30].

2. Overview of Contributions

In this context, this topical collection includes ten papers on the latest advancements in sensors for robots and robotic sensing applications. Each of the ten original contributions accepted for publication in this Special Issue underwent a rigorous review process involving at least two expert reviewers across at least two rounds of revision. The featured studies are summarized as follows.
In Contribution 1, the authors proposed a multi-path scattering model for virtual vegetation. On this basis, a passive microwave imaging algorithm was introduced to reconstruct vegetation images from the multi-path scattering data received by an unmanned aerial vehicle (UAV) robot. Combining the electromagnetic scattering model with the image processing method aids in understanding the imaging results and the multi-path scattering mechanisms of vegetation, providing a reference for the research and development of microwave imaging systems for UAV robot networks that use satellite transmitting signals.
In Contribution 2, the authors proposed a novel reinforcement learning-based hierarchical control framework that enables a snake robot with an onboard camera to perform autonomous self-localization and path following. A series of simulations and hardware experiments validated that the proposed method achieves precise and fast convergence in path-following tasks for a snake robot with autonomous visual perception.
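"Hierarchical" here typically means an upper layer deciding where to go and a lower layer generating the gait. The following is only an illustrative sketch of such a split, using the standard serpenoid gait with a steering offset; it is not the authors' controller, and all parameters are assumptions:

```python
import math

def serpenoid_joints(t, steer, n_joints=8, alpha=0.6, omega=2.0, beta=0.8):
    """Standard serpenoid gait: joint i follows a phase-shifted sine wave.
    A constant offset `steer` (e.g., produced by a high-level learned policy
    from camera-based self-localization) bends the body to turn."""
    return [alpha * math.sin(omega * t + i * beta) + steer
            for i in range(n_joints)]
```

In such a scheme, the learned policy only needs to output a low-dimensional steering command, which keeps the reinforcement learning problem tractable compared with learning joint angles directly.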
Contribution 3 by Yang et al. discussed the effective coverage achieved by practical sensing models with radially attenuating intensities. The study analyzed the effective coverage of multi-node teams consisting of a given number of nodes, with numerical analyses and simulations conducted separately for 3-node and n-node (n ≥ 4) teams. For the 3-node cases, the authors examined the relationship between the side length of an equilateral triangle formation and the effective coverage of a team equipped with two different types of models. For the n-node cases (n ≥ 4), they investigated the effective coverage of a team in three formations: regular polygon, regular star, and equilateral triangular tessellation (for n = 6). The study provides useful insights and guidance for developing practical devices for coverage applications.
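To make the notion of effective coverage concrete, here is a minimal Monte Carlo sketch. It assumes a simple linear attenuation profile f(r) = max(0, 1 − r/R), a summed-intensity combination rule, and a fixed threshold; all three are illustrative stand-ins for the models analyzed in the paper:

```python
import numpy as np

def effective_coverage(nodes, R=1.0, thresh=0.5, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the area where the summed node intensity
    exceeds `thresh`, assuming each node radiates f(r) = max(0, 1 - r/R)."""
    rng = np.random.default_rng(seed)
    nodes = np.asarray(nodes, dtype=float)          # shape (k, 2)
    lo, hi = nodes.min(axis=0) - R, nodes.max(axis=0) + R  # bounding box
    pts = rng.uniform(lo, hi, size=(n_samples, 2))
    d = np.linalg.norm(pts[:, None, :] - nodes[None, :, :], axis=2)  # (n, k)
    intensity = np.clip(1.0 - d / R, 0.0, None).sum(axis=1)
    return np.prod(hi - lo) * np.mean(intensity >= thresh)

# 3-node equilateral triangle formation with side length s.
s = 1.2
tri = [(0.0, 0.0), (s, 0.0), (s / 2, s * np.sqrt(3) / 2)]
print(f"effective coverage ~ {effective_coverage(tri):.3f}")
```

Sweeping the side length s in such a sketch mirrors the paper's analysis of how formation geometry trades overlap between nodes against overall reach.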
In Contribution 4 by Imtiaz et al., the goal was to have an agent learn both prehensile and non-prehensile robotic manipulation to improve the efficiency and throughput of pick-and-place tasks. To achieve this, the study formulated the problem as a Markov decision process (MDP) and deployed a model-free, temporal-difference deep reinforcement learning (RL) algorithm, the deep Q-network (DQN). Experiments carried out on a range of complex test scenarios show promising results compared with a baseline deep learning approach and a ResNet architecture-based approach.
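As a reminder of what the DQN update looks like in code, here is a minimal, generic sketch. A flattened state vector stands in for the paper's visual input; the network sizes, replay-batch layout, and loss choice are assumptions, not the authors' implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class QNet(nn.Module):
    """Small MLP Q-network: state vector -> one Q-value per discrete action
    (e.g., push vs. grasp primitives at candidate locations)."""
    def __init__(self, state_dim, n_actions):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, n_actions),
        )

    def forward(self, x):
        return self.net(x)

def dqn_update(q, q_target, optimizer, batch, gamma=0.99):
    """One temporal-difference update on a replay batch.
    batch = (s, a, r, s2, done): float states/rewards, long actions,
    done as 0/1 floats. Periodically sync the frozen target network with
    q_target.load_state_dict(q.state_dict())."""
    s, a, r, s2, done = batch
    q_sa = q(s).gather(1, a.unsqueeze(1)).squeeze(1)        # Q(s, a)
    with torch.no_grad():
        target = r + gamma * (1 - done) * q_target(s2).max(dim=1).values
    loss = F.smooth_l1_loss(q_sa, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```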
In Contribution 5, Shi et al. presented the design and control algorithm of a robot for grinding the surfaces of large, curved workpieces with unknown parameters, such as wind turbine blades. First, the structure and motion mode of the grinding robot were determined. Then, exploiting fuzzy PID's variable parameters and strong adaptability, the study proposed a force/position hybrid control strategy based on fuzzy PID. Robotic grinding experiments verified the feasibility and effectiveness of the proposed control strategy compared with manual grinding.
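The idea behind fuzzy PID is that membership functions on the error (and its rate) continuously reschedule the PID gains. Below is a minimal sketch of that gain-scheduling step; the triangular membership functions, rules, and gain ranges are illustrative assumptions, not the controller reported in Contribution 5. In a force/position hybrid scheme, such a loop would regulate contact force along the surface normal while a position controller tracks the tangential grinding path.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def fuzzy_pid_gains(e, de, kp0=5.0, ki0=0.5, kd0=0.1):
    """Rescale nominal PID gains from fuzzy memberships of the force
    error e and its rate de (both assumed pre-normalized to [-1, 1])."""
    small = tri(abs(e), -0.1, 0.0, 0.5)
    large = tri(abs(e), 0.3, 1.0, 1.7)
    fast = tri(abs(de), 0.3, 1.0, 1.7)
    # Illustrative rules: large error -> raise Kp, cut Ki (anti-windup);
    # fast error change -> raise Kd for extra damping.
    kp = kp0 * (1.0 + 0.8 * large)
    ki = ki0 * (1.0 - 0.7 * large + 0.3 * small)
    kd = kd0 * (1.0 + 1.5 * fast)
    return kp, ki, kd
```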
Contribution 6 by Li et al. proposed a robotic intracellular pressure measurement method using a traditional micropipette electrode setup. Intracellular pressure, a key physical parameter of the intracellular environment, has been found to regulate multiple cell physiological activities and to affect cell micromanipulation results. The proposed method requires no specialized, expensive equipment and causes limited damage to cell viability. The intracellular pressure measured for porcine oocytes is consistent with values reported in the relevant literature, verifying the method's effectiveness; moreover, a 90% survival rate of the operated oocytes after measurement indicates limited damage to cell viability.
In Contribution 7, Alguacil-Diego et al. presented a novel active device for shoulder rotation based on force control. Designed for shoulder rehabilitation, the device can perform three different exercises and uses a force-control architecture with an adaptive-speed PI controller. A study conducted over 12 shoulder rehabilitation sessions showed a significant improvement in shoulder rotation, and patients reported high levels of device acceptance. The device could be a valuable tool for shoulder rehabilitation and could improve the efficiency of physiotherapy services.
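A force-control loop of this kind can be sketched as a PI controller on the force error whose output speed command is adaptively limited. Everything below (gains, limits, the adaptation rule) is an illustrative assumption rather than the device's actual controller:

```python
class AdaptiveSpeedPI:
    """PI on force error -> rotation speed command, with a speed cap that
    adapts to how hard the patient resists (illustrative sketch only)."""
    def __init__(self, kp=0.8, ki=0.2, v_max=30.0):
        self.kp, self.ki, self.v_max = kp, ki, v_max
        self.integral = 0.0

    def step(self, f_ref, f_meas, dt):
        e = f_ref - f_meas
        self.integral += e * dt
        v = self.kp * e + self.ki * self.integral
        # Adapt the speed cap: strong resistance -> slower, gentler motion.
        cap = self.v_max / (1.0 + 0.1 * abs(f_meas))
        v_clamped = max(-cap, min(cap, v))
        if v_clamped != v:                 # simple anti-windup on saturation
            self.integral -= e * dt
        return v_clamped                   # speed command (e.g., deg/s)
```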
In Contribution 8, Shu et al. proposed a method for accurate 3D posture sensing of soft actuators using miniaturized sponge-resistive materials and an LSTM neural network that accounts for hysteresis and nonlinear sensing signals. The method was demonstrated on a 3-DOF pneumatic bellow-shaped actuator instrumented with nine sensors, which exhibited a low error in predicting the position of the actuator tip.
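A minimal sketch of the sequence-to-position mapping such a method implies, assuming nine resistive channels in and a 3D tip position out (the layer sizes and window length are assumptions for illustration):

```python
import torch
import torch.nn as nn

class PostureLSTM(nn.Module):
    """Maps a window of 9-channel resistance readings to the actuator's
    3D tip position; the recurrence lets the model absorb hysteresis
    that a memoryless regressor would miss."""
    def __init__(self, n_sensors=9, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_sensors, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, 3)   # (x, y, z) of the tip

    def forward(self, x):                  # x: (batch, time, 9)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])       # position at the last time step

model = PostureLSTM()
window = torch.randn(8, 50, 9)             # batch of 50-step sensor windows
print(model(window).shape)                  # torch.Size([8, 3])
```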
In Contribution 9, Rostkowska et al. focused on appearance-based localization using catadioptric cameras. The authors proposed an efficient neural network architecture for recognizing indoor scenes from distorted images, optimized for real-time inference on edge devices. Transfer learning followed by fine-tuning on a limited number of catadioptric images yielded the best results. Tested on small-scale datasets against state-of-the-art systems, the proposed system performed well in terms of accuracy and inference time, providing a cost- and energy-efficient solution for appearance-based localization.
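The transfer-learning recipe the authors found best, reusing a pretrained backbone and fine-tuning on a small set of catadioptric images, looks roughly like the sketch below. The choice of ResNet-18, the frozen layers, and the class count are assumptions for illustration (torchvision ≥ 0.13 is assumed for the weights enum):

```python
import torch.nn as nn
from torchvision import models

def build_place_classifier(n_places, freeze_backbone=True):
    """Pretrained ImageNet backbone plus a fresh head for indoor-scene
    recognition; with few catadioptric training images, fine-tuning only
    the head limits overfitting."""
    net = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
    if freeze_backbone:
        for p in net.parameters():
            p.requires_grad = False
    net.fc = nn.Linear(net.fc.in_features, n_places)  # new head, trainable
    return net
```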
In Contribution 10, Weidenbach et al. addressed the challenge of estimating the 6D poses of transparent objects for robotic manipulation, which requires accurate knowledge of object poses. Pose estimation for transparent objects is difficult because of unreliable depth data and the varied effects of clutter. The authors proposed a transparency-aware multi-segmentation approach together with transparency-aware bounding boxes, which improves pose estimation for transparent objects without modifying the pose estimator itself. The study demonstrated a 4.3% increase in the ADD-S AUC value using transparency-aware segmentation, which can be applied to any dataset that provides a 3D model of an object and its ground-truth pose.
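The segmentation-side idea can be illustrated independently of any particular pose estimator: combine the pixels where the glass object is directly visible with those where the scene is merely seen through it, then derive an enlarged bounding box for the crop fed to the estimator. The mask layout and margin below are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def transparency_aware_bbox(opaque_mask, transparent_mask, margin=4):
    """Bounding box over the union of directly visible (opaque) pixels
    and pixels labeled as seen-through glass, padded by a small margin
    (illustrative sketch of the idea in Contribution 10)."""
    union = opaque_mask | transparent_mask          # boolean HxW arrays
    ys, xs = np.nonzero(union)
    if ys.size == 0:
        return None                                  # object not visible
    h, w = union.shape
    y0, y1 = max(ys.min() - margin, 0), min(ys.max() + margin, h - 1)
    x0, x1 = max(xs.min() - margin, 0), min(xs.max() + margin, w - 1)
    return x0, y0, x1, y1
```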

3. Conclusions

In summary, this topical collection includes ten novel and interesting works in the area of sensors for robots and robotic sensing applications. We would like to express our deepest gratitude to the Managing Team of Sensors for their continuous support throughout the preparation of this collection. We greatly appreciate all the contributing authors and the anonymous expert reviewers for their invaluable efforts in selecting submissions of the utmost quality.

Author Contributions

Conceptualization and supervision, X.Z.; writing—review and editing, M.S.; writing—original draft preparation, Q.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was jointly supported by the National Natural Science Foundation of China (62027812, 62273186) and the Beijing Advanced Innovation Center for Intelligent Robots and Systems under Grant No. 2019IRS05.

Conflicts of Interest

The authors declare no conflicts of interest.

List of Contributions

  • Chen, Z.; Qiao, X.; Wu, P.; Zhang, T.; Hong, T.; Fang, L. Unmanned Aerial Vehicle (UAV) Robot Microwave Imaging Based on Multi-Path Scattering Model. Sensors 2022, 22, 8736.
  • Liu, L.; Guo, X.; Fang, Y. A Reinforcement Learning-Based Strategy of Path Following for Snake Robots with an Onboard Camera. Sensors 2022, 22, 9867.
  • Yang, Y.-R.; Kang, Q.; She, R. The Effective Coverage of Homogeneous Teams with Radial Attenuation Models. Sensors 2023, 23, 350.
  • Imtiaz, M.B.; Qiao, Y.; Lee, B. Prehensile and Non-Prehensile Robotic Pick-and-Place of Objects in Clutter Using Deep Reinforcement Learning. Sensors 2023, 23, 1513.
  • Shi, X.; Li, M.; Dong, Y.; Feng, S. Research on Surface Tracking and Constant Force Control of a Grinding Robot. Sensors 2023, 23, 4702.
  • Li, M.; Qiu, J.; Li, R.; Liu, Y.; Du, Y.; Liu, Y.; Sun, M.; Zhao, X.; Zhao, Q. Robotic Intracellular Pressure Measurement Using Micropipette Electrode. Sensors 2023, 23, 4973.
  • Alguacil-Diego, I.M.; Cuesta-Gómez, A.; Pont, D.; Carrillo, J.; Espinosa, P.; Sánchez-Urán, M.A.; Ferre, M. A Novel Active Device for Shoulder Rotation Based on Force Control. Sensors 2023, 23, 6158.
  • Shu, J.; Wang, J.; Cheng, K.C.-C.; Yeung, L.-F.; Li, Z.; Tong, R.K.-y. An End-to-End Dynamic Posture Perception Method for Soft Actuators Based on Distributed Thin Flexible Porous Piezoresistive Sensors. Sensors 2023, 23, 6189.
  • Rostkowska, M.; Skrzypczyński, P. Optimizing Appearance-Based Localization with Catadioptric Cameras: Small-Footprint Models for Real-Time Inference on Edge Devices. Sensors 2023, 23, 6485.
  • Weidenbach, M.; Laue, T.; Frese, U. Transparency-Aware Segmentation of Glass Objects to Train RGB-Based Pose Estimators. Sensors 2024, 24, 432.

References

  1. Dzedzickis, A.; Subačiūtė-Žemaitienė, J.; Šutinys, E.; Samukaitė-Bubnienė, U.; Bučinskas, V. Advanced applications of industrial robotics: New trends and possibilities. Appl. Sci. 2021, 12, 135.
  2. Liu, H.; Stoll, N.; Junginger, S.; Thurow, K. Mobile robot for life science automation. Int. J. Adv. Robot. Syst. 2013, 10, 288.
  3. Schina, D.; Esteve-González, V.; Usart, M. An overview of teacher training programs in educational robotics: Characteristics, best practices and recommendations. Educ. Inf. Technol. 2021, 26, 2831–2852.
  4. Dupont, P.; Nelson, B.; Goldfarb, M.; Hannaford, B.; Menciassi, A.; Yang, G. A decade retrospective of medical robotics research from 2010 to 2020. Sci. Robot. 2021, 6, eabi8017.
  5. Sheridan, T. A review of recent research in social robotics. Curr. Opin. Psychol. 2020, 36, 7–12.
  6. Gans, N.; Rogers, J. Cooperative multirobot systems for military applications. Curr. Robot. Rep. 2021, 2, 105–111.
  7. Pagliarini, L.; Lund, H. The future of Robotics Technology. J. Robotics Netw. Artif. Life 2017, 3, 270–273.
  8. Karreman, D.; Ludden, G.; Evers, V. Beyond R2D2: Designing multimodal interaction behavior for robot-specific morphology. ACM Trans. Hum.-Robot Interact. (THRI) 2019, 8, 1–32.
  9. Regtien, P. Sensors for applications in robotics. Sens. Actuators 1986, 10, 195–218.
  10. Zhao, Y.; Xing, H.; Guo, S.; Wang, Y.; Cui, J.; Ma, Y.; Liu, Y.; Liu, X.; Feng, J.; Li, Y. A novel noncontact detection method of surgeon’s operation for a master-slave endovascular surgery robot. Med. Biol. Eng. Comput. 2020, 58, 871–885.
  11. Lambeta, M.; Chou, P.; Tian, S.; Yang, B.; Maloon, B.; Most, V.; Calandra, R. Digit: A novel design for a low-cost compact high-resolution tactile sensor with application to in-hand manipulation. IEEE Robot. Autom. Lett. 2020, 5, 3838–3845.
  12. Raril, C.; Manjunatha, J. Fabrication of novel polymer-modified graphene-based electrochemical sensor for the determination of mercury and lead ions in water and biological samples. J. Anal. Sci. Technol. 2020, 11, 3.
  13. Li, Z.; Li, S.; Luo, X. An overview of calibration technology of industrial robots. IEEE/CAA J. Autom. Sin. 2021, 8, 23–36.
  14. Cui, Y.; Liu, F.; Mu, J. Integrating sensing and communications for ubiquitous IoT: Applications, trends, and challenges. IEEE Netw. 2021, 35, 158–167.
  15. Zhu, W.; Wang, J.; Sun, W.; Zhou, S.; He, M. Preparation of gradient hydrogel for pressure sensing by combining freezing and directional diffusion processes. Chem. Eng. J. 2023, 451, 138335.
  16. Krishnamurthi, R.; Kumar, A.; Gopinathan, D.; Nayyar, A.; Qureshi, B. An overview of IoT sensor data processing, fusion, and analysis techniques. Sensors 2020, 20, 6076.
  17. Cheng, M.; Zhu, G.; Zhang, F.; Tang, W.; Yang, J.; Zhu, L. A review of flexible force sensors for human health monitoring. J. Adv. Res. 2020, 26, 53–68.
  18. Halwani, M.; Ayyad, A.; AbuAssi, L.; Abdulrahman, Y.; Almaskari, F.; Hassanin, H.; Zweiri, Y. A Novel Vision-Based Multi-Functional Sensor for Normality and Position Measurements in Precise Robotic Manufacturing. Precis. Eng. 2024, 88, 367–381.
  19. Netzer, O.; Heimler, B.; Shur, A.; Behor, T.; Amedi, A. Backward spatial perception can be augmented through a novel visual-to-auditory sensory substitution algorithm. Sci. Rep. 2021, 11, 11944.
  20. Ali, A.; Abouelghar, M.; Belal, A.; Saleh, N.; Yones, M.; Selim, A.; Savin, I. Crop yield prediction using multi sensors remote sensing. Egypt. J. Remote Sens. Space Sci. 2022, 25, 711–716.
  21. Alamzadeh, I.; Alexandropoulos, G.; Shlezinger, N.; Imani, M. A reconfigurable intelligent surface with integrated sensing capability. Sci. Rep. 2021, 11, 20737.
  22. Andronie, M.; Lăzăroiu, G.; Ștefănescu, R.; Uță, C.; Dijmărescu, I. Sustainable, smart, and sensing technologies for cyber-physical manufacturing systems: A systematic literature review. Sustainability 2021, 13, 5495.
  23. Landaluce, H.; Arjona, L.; Perallos, A.; Falcone, F.; Angulo, I.; Muralter, F. A review of IoT sensing applications and challenges using RFID and wireless sensor networks. Sensors 2020, 20, 2495.
  24. Lin, K.; Li, Y.; Sun, J.; Zhou, D.; Zhang, Q. Multi-sensor fusion for body sensor network in medical human–robot interaction scenario. Inf. Fusion 2020, 57, 15–26.
  25. Sanchez-Ibanez, J.; Perez-del-Pulgar, C.; García-Cerezo, A. Path planning for autonomous mobile robots: A review. Sensors 2021, 21, 7898.
  26. Petráček, P.; Krátký, V.; Petrlík, M.; Báča, T.; Kratochvíl, R.; Saska, M. Large-scale exploration of cave environments by unmanned aerial vehicles. IEEE Robot. Autom. Lett. 2021, 6, 7596–7603.
  27. Sánchez, P.; Papaelias, M.; Márquez, F. Autonomous underwater vehicles: Instrumentation and measurements. IEEE Instrum. Meas. Mag. 2020, 23, 105–114.
  28. Xue, L.; Yamazaki, H.; Ren, R.; Wanunu, M.; Ivanov, A.; Edel, J. Solid-state nanopore sensors. Nat. Rev. Mater. 2020, 5, 931–951.
  29. Li, S.; Zhang, Y.; Wang, Y.; Yin, Z.; Wang, H.; Zhang, Y. Physical sensors for skin-inspired electronics. InfoMat 2020, 2, 184–211.
  30. Qin, J.; Jiang, S.; Wang, Z.; Cheng, X.; Li, B.; Shi, Y.; Zhu, W. Metasurface micro/nano-optical sensors: Principles and applications. ACS Nano 2022, 16, 11598–11618.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
