Preliminary Work on a Virtual Reality Interface for the Guidance of Underwater Robots
Abstract
1. Introduction
2. HRI
2.1. Teleoperation
2.2. Virtual Reality
2.3. VR in HRI for Underwater Intervention Systems
2.4. Usability
- Product effect (output, effectiveness and satisfaction at the time of use).
- Product attributes (interface and interaction).
- Processes used to develop the product.
- Organizational capability.
- Product-oriented standards (ISO 9126, 2001; ISO 14598, 2001).
- Process-oriented standards (ISO 9241, 1992/2001; ISO 13407, 1999).
- Effectiveness: How well do the users achieve their goals using the system?
- Efficiency: What resources are consumed in order to achieve their goals?
- Satisfaction: How do the users feel about their experience of the system?
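These three questions can be made operational on the data logged in our tests (completion times, number of tries and self-reported scores; see the tables in Section 4.4). The C# sketch below, using hypothetical type and field names rather than those of our logging code, shows one straightforward aggregation:

```csharp
// Hypothetical sketch: aggregating the three ISO 9241-11 usability
// measures from logged trials. Field names are illustrative only.
using System;
using System.Linq;

record Trial(double Seconds, int Tries, bool Completed, double Satisfaction0to10);

static class UsabilityMetrics
{
    // Effectiveness: how well users achieve the goal -> completion rate.
    public static double Effectiveness(Trial[] trials) =>
        trials.Count(t => t.Completed) / (double)trials.Length;

    // Efficiency: resources consumed -> mean time and tries over completed trials.
    public static (double MeanSeconds, double MeanTries) Efficiency(Trial[] trials)
    {
        var done = trials.Where(t => t.Completed).ToArray();
        return (done.Average(t => t.Seconds), done.Average(t => t.Tries));
    }

    // Satisfaction: how users feel -> mean self-reported score.
    public static double Satisfaction(Trial[] trials) =>
        trials.Average(t => t.Satisfaction0to10);
}
```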
3. Experimental Setup and VR Developments
3.1. HTC Vive vs. Oculus Rift
3.2. Unity (Game Engine) vs. UWSim
4. Results
4.1. Suitability of HTC Glasses for Immersive VR 3D
4.2. VR Functionalities
4.3. VR Interface
- The movement section, which in the first version shows the vehicle's movement speed and rotation.
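As an illustration of how such a panel can be driven, the following minimal Unity C# sketch (component and field names are ours, not the project's) updates a text element every frame with the vehicle's speed, rotation and, as in the second interface version, its depth:

```csharp
// Hypothetical sketch of the movement panel: reads the simulated ROV's
// Rigidbody each frame and prints speed, heading and depth to a UI text.
using UnityEngine;
using UnityEngine.UI;

public class MovementPanel : MonoBehaviour
{
    public Rigidbody vehicle;   // the simulated ROV body (assigned in the Inspector)
    public Text panel;          // text element on the VR panel

    void Update()
    {
        float speed = vehicle.velocity.magnitude;      // m/s
        Vector3 rot = vehicle.transform.eulerAngles;   // degrees
        float depth = -vehicle.transform.position.y;   // assumes y = 0 at the surface
        panel.text = $"Speed: {speed:F2} m/s\nHeading: {rot.y:F0}°\nDepth: {depth:F1} m";
    }
}
```

Attaching this script to the panel object and assigning the vehicle's Rigidbody is enough to keep the readout current.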
4.4. Usability Tests
- In the first test, the camera is in third person and the user must take the black box and bring it to the white container (Figure 11). Users were told only about the camera-change button and that the elements are moved with the touchpad.
- In the second test, the problem to solve is the same, but the camera is mounted on the arm of the robot (Figure 12).
- In the third test, the camera is placed on the body of the robot (Figure 13).
- In the last test, there is an obstacle between the black box and the robot (Figure 14). The obstacle can be a cylinder, a horizontal wall or a vertical wall, chosen at random (see the sketch after this list).
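The tests share two mechanics: cycling between the three camera placements and, in the last test, choosing one of the three obstacles at random. A minimal Unity C# sketch of both, with all object names hypothetical, is:

```csharp
// Hypothetical sketch of the test setup: cycle the camera between the
// third-person, arm-mounted and body-mounted viewpoints, and spawn one
// of three obstacle prefabs at random for the last test.
using UnityEngine;

public class TestSetup : MonoBehaviour
{
    public Camera[] cameras;          // third-person, arm-mounted, body-mounted
    public GameObject[] obstacles;    // cylinder, horizontal wall, vertical wall prefabs
    public Transform obstacleAnchor;  // point between the black box and the robot
    int active;

    void Start()
    {
        // Random factor of the last test: pick one of the three obstacles.
        int pick = Random.Range(0, obstacles.Length);
        Instantiate(obstacles[pick], obstacleAnchor.position, Quaternion.identity);
    }

    public void OnCameraButton()      // wired to the controller's camera-change button
    {
        cameras[active].enabled = false;
        active = (active + 1) % cameras.Length;
        cameras[active].enabled = true;
    }
}
```

After the tests, each user answered a short questionnaire: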
- Are the controls easy to learn?
- Is the environment realistic?
- Do you think that the interface is really useful?
- Would you add something to the interface?
4.5. Efficiency
4.6. Comparison with Previous Work
4.7. The Training and Integration Server
5. Discussion and Future Work
- Integrate another robot into the simulation (later to be a real robot), with the aim that the robots cooperate in solving problems while the user controls only one of them at a given time. Although a priori it would be interesting to have an interface that allows two or more users to guide their own robots, previous experience has shown that the umbilical cables become entangled in that case. The latest version of the simulator already includes two robots, but we have yet to incorporate a mechanism into the interface that lets the operator switch control from one to the other. As the simulation becomes more complex, it may become necessary to create a master-slave model to define the relationship between the robots and the human user.
- Connect the interface to a server simulator (discussed in Section 4.7) as a first step towards a connection with a real robot. We plan to develop a connection layer [3] between the real robot and the simulator. To do so, it will be necessary to translate the button signals (detected in C#) into movement commands for the robot, which runs ROS (a minimal sketch follows this list).
- Transform our VR interface into an AR one, with information provided by the robot sensors (simulated or real).
- To further develop the learning process, we plan to improve the interface manual and the VR videos to make the hardware easier to understand. A more complex sequence of problems is being developed, and there are plans to use it in a master's degree in underwater robotics (http://www.master-mir.eu/).
- As the use of a VR system can cause fatigue when used for long periods [46], we would like to explore techniques and design concepts that help to reduce it.
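Returning to the connection layer mentioned above, the following C# sketch shows one plausible way to forward button-derived commands to the ROS side using the rosbridge JSON protocol over a WebSocket. The topic name, message mapping and scaling are assumptions made for the example, not the project's settled design:

```csharp
// Hypothetical sketch of the connection layer: button input handled in C#
// is mapped to a geometry_msgs/Twist command and published to the ROS side
// through the rosbridge JSON protocol over a WebSocket.
using System;
using System.Globalization;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

class RosBridgeLink
{
    readonly ClientWebSocket ws = new ClientWebSocket();

    public async Task ConnectAsync(string uri = "ws://localhost:9090")
    {
        await ws.ConnectAsync(new Uri(uri), CancellationToken.None);
        // Tell rosbridge which topic/type we will publish (topic name assumed).
        await SendAsync("{\"op\":\"advertise\",\"topic\":\"/vehicle/cmd_vel\","
            + "\"type\":\"geometry_msgs/Twist\"}");
    }

    // Map touchpad/button state (e.g., forward thrust and yaw) to a Twist message.
    public Task PublishVelocityAsync(double forward, double yaw)
    {
        // Invariant culture keeps '.' as the decimal separator in the JSON.
        string fx = forward.ToString(CultureInfo.InvariantCulture);
        string az = yaw.ToString(CultureInfo.InvariantCulture);
        return SendAsync("{\"op\":\"publish\",\"topic\":\"/vehicle/cmd_vel\",\"msg\":"
            + "{\"linear\":{\"x\":" + fx + ",\"y\":0,\"z\":0},"
            + "\"angular\":{\"x\":0,\"y\":0,\"z\":" + az + "}}}");
    }

    Task SendAsync(string json) =>
        ws.SendAsync(new ArraySegment<byte>(Encoding.UTF8.GetBytes(json)),
                     WebSocketMessageType.Text, true, CancellationToken.None);
}
```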
Supplementary Materials
Author Contributions
Funding
Conflicts of Interest
References
- Lin, Q.; Kuo, C. On Applying Virtual Reality to Underwater Robot Tele-Operation and Pilot Training. Int. J. Virtual Real. 2001, 5, 71–91.
- Ridao, P.; Carreras, M.; Ribas, D.; Sanz, P.J.; Oliver, G. Intervention AUVs: The next challenge. Annu. Rev. Control 2015, 40, 227–241.
- Yuh, J.; Choi, S.K.; Ikehara, C.; Kim, G.H.; McMurty, G.; Ghasemi-Nejhad, M.; Sarkar, N.; Sugihara, K. Design of a semi-autonomous underwater vehicle for intervention missions (SAUVIM). In Proceedings of the 1998 International Symposium on Underwater Technology, Tokyo, Japan, 15–17 April 1998; IEEE Press: Hoboken, NJ, USA, 1998; pp. 63–68.
- Sanz, P.J.; Ridao, P.; Oliver, G.; Melchiorri, C.; Casalino, G.; Silvestre, C.; Petillot, Y.; Turetta, A. TRIDENT: A Framework for Autonomous Underwater Intervention Missions with Dexterous Manipulation Capabilities. IFAC Proc. Vol. 2010, 43, 187–192.
- Khatib, O.; Yeh, X.; Brantner, G.; Soe, B.; Kim, B.; Ganguly, S.; Stuart, H.; Wang, S.; Cutkosky, M.; Edsinger, A.; et al. Ocean One: A Robotic Avatar for Oceanic Discovery. IEEE Robot. Autom. Mag. 2016, 23, 20–29.
- Prats, M.; Perez, J.; Fernandez, J.J.; Sanz, P.J. An open source tool for simulation and supervision of underwater intervention missions. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2012, Algarve, Portugal, 7–12 October 2012; Curran Associates, Inc.: Red Hook, NY, USA, 2012; pp. 2577–2582.
- Miller, G. The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychol. Rev. 1956, 63, 81–97.
- Sanz, P.J.; de la Cruz, M.; Lunghi, G.; Veiga, C.; Marín, R.; Di Castro, M. The Role of HRI within COMOMUIS Research Project. In Proceedings of the Jornadas Nacionales de Robótica, Alicante, Spain, 13–14 June 2019; Fernando, T.M., Óscar, R.G., Eds.; Universidad de Alicante: Alicante, Spain, 2019; pp. 141–147.
- Preece, J.; Rogers, Y.; Sharp, H.; Benyon, D.; Holland, S.; Carey, T. Human-Computer Interaction; Addison-Wesley Longman Ltd.: Essex, UK, 1994; ISBN 0201627698.
- Sheridan, T.B.; Verplank, W.L. Human and Computer Control of Undersea Teleoperators; Technical Report; Massachusetts Inst. of Tech. Man-Machine Systems Lab: Cambridge, MA, USA, 1978. Available online: https://apps.dtic.mil/dtic/tr/fulltext/u2/a057655.pdf (accessed on 3 May 2019).
- Peshkova, E.; Hitz, M.; Kaufmann, B. Survey on Natural Interaction Techniques for an Unmanned Aerial Vehicle System. IEEE Pervasive Comput. 2017, 16, 34–42.
- Chen, J.Y.C.; Haas, E.C.; Barnes, M.J. Human Performance Issues and User Interface Design for Teleoperated Robots. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2007, 37, 1231–1245.
- Dicianno, B.E.; Sibenaller, S.; Kimmich, C.; Cooper, R.A.; Pyo, J. Joystick Use for Virtual Power Wheelchair Driving in Individuals with Tremor: Pilot Study. J. Rehabil. Res. Dev. 2009, 46, 269–275.
- Huang, C.M.; Mutlu, B. Anticipatory robot control for efficient human–robot collaboration. In Proceedings of the 2016 11th ACM/IEEE International Conference on Human–Robot Interaction (HRI), Christchurch, New Zealand, 7–10 March 2016; IEEE Press: Hoboken, NJ, USA, 2016; pp. 7–10.
- Shim, H.; Jun, B.; Lee, P.; Baek, H.; Lee, J. Workspace control system of underwater tele-operated manipulators on an ROV. Ocean Eng. 2010, 37, 1036–1047.
- Sutherland, I.E. The ultimate display. In Proceedings of the IFIP Congress, New York, NY, USA, 24–29 May 1965; Volume 2, pp. 506–508.
- Rheingold, H. Virtual Reality; Summit Books: New York, NY, USA, 1991.
- Chin, C.; Lin, W.; Lin, J. Experimental validation of open-frame ROV model for virtual reality simulation and control. J. Mar. Sci. Technol. 2018, 23, 267–287.
- Anthes, C.; García Hernandez, R.; Wiedemann, M.; Kranzlmüller, D. State of the Art of Virtual Reality Technologies. In Proceedings of the IEEE Aerospace Conference, Big Sky, MT, USA, 5–12 March 2016.
- Burdea, G.; Coiffet, P. Virtual Reality Technology. Presence 2003, 12, 663–664.
- Bowman, D.; North, C.; Chen, J.; Polys, N.; Pyla, P.; Yilmaz, U. Information-rich virtual environments: Theory, tools, and research agenda. In Proceedings of the VRST '03 ACM Symposium on Virtual Reality Software and Technology, Osaka, Japan, 1–3 October 2003; pp. 81–90.
- Sherman, W.; Craig, A. Understanding Virtual Reality: Interface, Application, and Design; Elsevier Science: New York, NY, USA, 2003.
- Lin, M.C.; Otaduy, M.A.; Boulic, R. Virtual reality software and technology. IEEE Comput. Graph. Appl. 2008, 28, 18–19.
- Kot, T.; Novak, P. Utilization of the Oculus Rift HMD in Mobile Robot Teleoperation. Appl. Mech. Mater. 2014, 555, 199–208.
- García, J.C.; Patrão, B.; Almeida, L.; Pérez, J.; Menezes, P.; Dias, J.; Sanz, P.J. A Natural Interface for Remote Operation of Underwater Robots. IEEE Comput. Graph. Appl. 2017, 37, 34–43.
- Kot, T.; Novak, P. Application of virtual reality in teleoperation of the military mobile robotic system TAROS. Int. J. Adv. Robot. Syst. 2018, 15, 1729881417751545.
- Schiza, E.; Matsangidou, M.; Neokleous, K.; Pattichis, C. Virtual Reality Applications for Neurological Disease: A Review. Front. Robot. AI 2019, 6, 100.
- Rizzo, A.S.; Koenig, S.T. Is clinical virtual reality ready for primetime? Neuropsychology 2017, 31, 877–899.
- Tan, Y.; Niu, C.; Zhang, J. Head-Mounted Display-Based Immersive Virtual Reality Marine-Engine Training System. IEEE Syst. Man Cybern. Mag. 2020, 6, 46–51.
- Hsu, E.B.; Li, Y.; Bayram, J.D.; Levinson, D.; Yang, S.; Monahan, C. State of virtual reality based disaster preparedness and response training. PLoS Curr. 2013, 5.
- Engelbrecht, H.; Lindeman, R.; Hoermann, S. A SWOT Analysis of the Field of Virtual Reality for Firefighter Training. Front. Robot. AI 2019, 6, 101.
- Haydar, M.; Maidi, M.; Roussel, D.; Mallem, M.; Drap, P.; Bale, K.; Chapman, P. Virtual Exploration of Underwater Archaeological Sites: Visualization and Interaction in Mixed Reality Environments. In Proceedings of the 9th International Symposium on Virtual Reality, Archaeology and Intelligent Cultural Heritage, Braga, Portugal, 2–5 December 2008; Eurographics Association: Geneva, Switzerland, 2008.
- Bekele, M.; Champion, E. A Comparison of Immersive Realities and Interaction Methods: Cultural Learning in Virtual Heritage. Front. Robot. AI 2019, 6, 91.
- Azuma, R.T. A Survey of Augmented Reality. Presence Teleoper. Virtual Environ. 1997, 6, 355–385.
- Brantner, G.; Khatib, O. Controlling Ocean One. In Field and Service Robotics; Springer International Publishing: Cham, Switzerland, 2018; pp. 3–17.
- Gancet, J.; Weiss, P.; Antonelli, G.; Pfingsthorn, M.F.; Calinon, S.; Turetta, A.; Walen, C.; Urbina, D.; Govindaraj, S.; Letier, P.; et al. Dexterous Undersea Interventions with Far Distance Onshore Supervision: The DexROV Project. IFAC-PapersOnLine 2016, 49, 414–419.
- Abran, A.; Khelifi, A.; Suryn, W.; Seffah, A. Usability Meanings and Interpretations in ISO Standards. Softw. Qual. J. 2003, 11, 325–338.
- Seffah, A.; Donyaee, M.; Kline, R.B.; Padda, H.K. Usability measurement and metrics: A consolidated model. Softw. Qual. J. 2006, 14, 159–178.
- Brooke, J. SUS: A quick and dirty usability scale. In Usability Evaluation in Industry; Jordan, P.W., Thomas, B., McClelland, I.L., Weerdmeester, B., Eds.; Taylor & Francis: Abingdon-on-Thames, UK, 1996; pp. 189–194.
- Marsh, T. Evaluation of Virtual Reality Systems for Usability. In Proceedings of CHI '99, Pittsburgh, PA, USA, 15–20 May 1999; ACM: New York, NY, USA, 1999; pp. 61–62.
- Unity User Manual. Available online: https://docs.unity3d.com/Manual/index.html (accessed on 3 March 2019).
- Fernández, J.J.; Prats, M.; Sanz, P.J.; García, J.C.; Marín, R.; Robinson, M.; Ribas, D.; Ridao, P. Grasping for the Seabed: Developing a New Underwater Robot Arm for Shallow-Water Intervention. IEEE Robot. Autom. Mag. 2013, 20, 121–130.
- Feiner, S.; MacIntyre, B.; Haupt, M.; Solomon, E. Windows on the world: 2D windows for 3D augmented reality. In Proceedings of the 6th Annual ACM Symposium on User Interface Software and Technology, Atlanta, GA, USA, 3–8 December 1993; pp. 145–155.
- McCauley, M.; Sharkey, T. Cybersickness: Perception of Self-motion in Virtual Environments. Presence Teleoper. Virtual Environ. 1992, 1, 311–318.
- LaViola, J.J., Jr. A discussion of cybersickness in virtual environments. ACM SIGCHI Bull. 2000, 32, 47–56.
- Lambooij, M.; Ijsselsteijn, W.; Fortuin, M.; Heynderickx, I. Visual Discomfort and Visual Fatigue of Stereoscopic Displays: A Review. J. Imaging Sci. Technol. 2009, 53, 30201-1.
Project | New Characteristics |
---|---|
SAUVIM [3] | Multiple displays, keyboards and joysticks; several expert users per robot.
TRIDENT [4] | GUI, one human controller, contextual GUIs.
MERBOTS [24] | VR cockpit with tracking and estimation of human poses; one (non-expert) human controller.
VENUS [25] | 3D models built from sensor data, AR interface.
Ocean One [5,35] | Bimanual haptic devices, stereoscopic vision, GUI, a world display; constraints that can override human actions.
DexROV [36] | Real-time simulation environment, haptic devices (arm and hand exoskeletons), cognitive engine to translate user instructions.
Usability Definitions | Standards |
---|---|
The capability of the software product to be understood, learned, used and attractive to the user, when used under specified conditions. | ISO/IEC 9126-1, 2000 |
The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use. | ISO 9241-11, 1998 |
The ease with which a user can learn to operate, prepare inputs for, and interpret outputs of a system or component. | IEEE Std. 610.12-1990 |
First VR Interface | Second VR Interface |
---|---|
Information about the vehicle movement speed and rotation. | Information about the vehicle movement speed and rotation, plus the depth of the vehicle.
Solid letters panel. | Transparent letters panel.
No sound. | Water, motor and impact sounds.
No added help. | An interface manual and several VR videos to make the hardware easier to understand (all linked in the Supplementary Materials section).
Two cameras. | Vehicle camera in a different position.
Two work modes: vehicle and arm. | Three work modes: vehicle, arm, and vehicle camera with arm control.
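The impact sound listed above can be realized with a few lines of Unity C#; the sketch below (our own naming, not the project's code) plays a clip on collision, scaled by how hard the vehicle hit:

```csharp
// Hypothetical sketch for the second interface's impact sound: an
// AudioSource on the vehicle plays whenever it collides with something,
// with volume scaled by the impact speed.
using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class ImpactSound : MonoBehaviour
{
    public float maxImpactSpeed = 2f;   // impact speed mapped to full volume
    AudioSource source;

    void Awake() => source = GetComponent<AudioSource>();

    void OnCollisionEnter(Collision collision)
    {
        // relativeVelocity measures how hard the two bodies met.
        float strength = Mathf.Clamp01(collision.relativeVelocity.magnitude / maxImpactSpeed);
        source.PlayOneShot(source.clip, strength);
    }
}
```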
Age | Sex | T1 (s) | Tries | T2 (s) | Tries | T3 (s) | Tries | T4 (s) | Tries | Af | Pl |
---|---|---|---|---|---|---|---|---|---|---|---|
23 | Male | 352 | 1 | 199 | 1 | 1182 | 7 | 183 | 1 | 0.5 | 7 |
43 | Male | 265 | 1 | 385 | 1 | 395 | 2 | 296 | 2 | 0.2 | 8 |
27 | Male | 430 | 1 | 650 | 3 | 201 | 1 | 318 | 1 | 0.7 | 9 |
22 | Male | 543 | 3 | 138 | 1 | 156 | 1 | 236 | 1 | 1 | 9 |
28 | Male | 463 | 1 | 286 | 1 | 274 | 1 | 421 | 1 | 0.4 | 8.5 |
45 | Male | 647 | 3 | 360 | 1 | 558 | 2 | 352 | 1 | 0.1 | 7.5 |
28 | Male | 250 | 2 | 89 | 1 | 432 | 3 | 367 | 2 | 0.5 | 8 |
30 | Female | 965 | 3 | 337 | 1 | 451 | 2 | 724 | 3 | 0 | 8 |
24 | Male | 125 | 1 | 122 | 1 | 139 | 1 | 393 | 4 | 0.5 | 9 |
29 | Male | 381 | 4 | 491 | 2 | 676 | 2 | 919 | 4 | 0.1 | 8 |
38 | Male | 115 | 1 | 103 | 1 | 325 | 2 | 193 | 1 | 1 | 8.5 |
24 | Male | 313 | 2 | 97 | 1 | 256 | 1 | 453 | 4 | 1 | 9 |
54 | Female | 343 | 3 | 167 | 1 | 780 | 3 | 344 | 1 | 0.2 | 7.5 |
46 | Male | 352 | 1 | 196 | 1 | 744 | 4 | 240 | 1 | 0.4 | 8.25 |
22 | Male | 192 | 2 | 75 | 1 | 163 | 1 | 141 | 2 | 1 | 9.5 |
44 | Male | 430 | 2 | 198 | 1 | 711 | 3 | 811 | 3 | 0.2 | 9 |
46 | Male | 197 | 3 | 115 | 1 | 250 | 1 | 167 | 1 | 0.4 | 7 |
22 | Male | 144 | 1 | 155 | 1 | 167 | 1 | 153 | 1 | 1 | 9 |
22 | Male | 197 | 3 | 80 | 1 | 167 | 1 | 249 | 2 | 1 | 7.75 |
23 | Male | 163 | 1 | 135 | 1 | 150 | 1 | 176 | 1 | 1 | 8 |
24 | Male | 158 | 1 | 161 | 1 | 190 | 1 | 135 | 1 | 0.8 | 8 |
44 | Female | 238 | 1 | 174 | 1 | 745 | 5 | 311 | 1 | 0.2 | 7.75 |
33 | Female | 207 | 1 | 298 | 2 | 300 | 1 | 335 | 1 | 0.4 | 7 |
31 | Male | 207 | 3 | 80 | 2 | 324 | 3 | 159 | 2 | 1 | 7 |
48 | Male | 186 | 1 | 238 | 1 | 462 | 1 | 711 | 4 | 0.4 | 8.5 |
22 | Male | 51 | 1 | 46 | 1 | 46 | 1 | 55 | 1 | - | - |
Age | Sex | T1 (s) | Tries | T2 (s) | Tries | T3 (s) | Tries | T4 (s) | Tries | Af | Pl | R |
---|---|---|---|---|---|---|---|---|---|---|---|---|
22 | Female | 207 | 2 | 190 | 1 | 266 | 1 | 180 | 2 | 0.1 | 8.5 | No |
23 | Female | 126 | 1 | 197 | 1 | 446 | 3 | 336 | 1 | 0.1 | 8 | No |
27 | Male | 135 | 1 | 98 | 1 | 149 | 1 | 263 | 2 | 0.7 | 9.5 | Yes |
21 | Male | 75 | 1 | 73 | 1 | 98 | 1 | 226 | 3 | 0.8 | 7 | No |
25 | Male | 138 | 2 | 68 | 1 | 98 | 1 | 113 | 1 | 0.8 | 8.85 | No |
20 | Male | 82 | 1 | 198 | 2 | 103 | 1 | 336 | 3 | 1 | 7.5 | No |
28 | Male | 71 | 1 | 83 | 1 | 112 | 1 | 101 | 1 | 0.5 | 9 | Yes |
25 | Female | 133 | 1 | 162 | 1 | 100 | 1 | 173 | 1 | 1 | 6.75 | No |
22 | Male | 75 | 1 | 101 | 1 | 102 | 1 | 109 | 1 | 0.8 | 7 | No |
24 | Male | 83 | 1 | 86 | 1 | 89 | 1 | 134 | 1 | 0.8 | 9 | Yes |
46 | Male | 285 | 2 | 241 | 1 | 275 | 1 | 257 | 1 | 0.4 | 9 | Yes |
38 | Male | 165 | 2 | 226 | 1 | 201 | 1 | 278 | 1 | 1 | 9.5 | Yes |
45 | Male | 250 | 1 | 193 | 1 | 233 | 1 | 260 | 1 | 0.1 | 8.5 | Yes |
43 | Male | 91 | 1 | 169 | 1 | 290 | 1 | 275 | 1 | 0.2 | 9 | Yes |
24 | Male | 111 | 1 | 76 | 1 | 112 | 1 | 152 | 1 | 0.8 | 9 | Yes |
48 | Male | 186 | 1 | 238 | 1 | 462 | 1 | 711 | 4 | 0 | 8.5 | No |
30 | Female | 159 | 1 | 145 | 1 | 149 | 1 | 292 | 1 | 0.1 | 9 | Yes |
22 | Male | 25 | 1 | 31 | 1 | 43 | 1 | 47 | 1 | - | - | -
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).