
Sustainability 2018, 10(4), 1102;

A Low-Cost Immersive Virtual Reality System for Teaching Robotic Manipulators Programming
Computer Technology and Computation Department, University of Alicante, 03690 Alicante, Spain
Developmental and Educational Psychology Department, University of Alicante, 03690 Alicante, Spain
Author to whom correspondence should be addressed.
Received: 30 January 2018 / Accepted: 5 April 2018 / Published: 7 April 2018


Laboratory tasks are a powerful pedagogical strategy for developing competences in science and engineering degrees, helping students understand in a practical way the theoretical topics explained in the classroom. However, performing experiments in real conditions is usually expensive in terms of time, money and energy, as it requires costly infrastructures that are generally difficult to maintain in good condition. To overcome this problem, virtual reality has proven to be a powerful tool for achieving sustainability, making it easy to update laboratories without acquiring new equipment. Moreover, because it brings practical knowledge into the classroom, a virtual laboratory can simulate both typical operating environments and extreme situations in the operation of different devices. A typical subject in which students can benefit from virtual laboratories is robotics. In this work, we develop an immersive virtual reality (VR) pedagogical simulator of industrial robotic arms for engineering students. With the proposed system, students can observe the effects of their own designed trajectories on several different robotic arms and cell environments without having to buy all of them and without the risk of damaging the cell components. The simulation checks for collisions between the elements in the scene and alerts the student when they happen. This could be achieved with a conventional robotic simulator, but the integration with immersive VR is intended to help students understand robotics better. Moreover, even when a real robotic arm is available, the proposed VR method gives every student the opportunity to manage and learn with their own version of the robotic cell, without the waiting times caused by having fewer robotic arms than students in the classroom.
robotics; simulation; virtual reality; pedagogy; education

1. Introduction

Virtual reality technologies have long been available to support the teaching-learning process, as reflected in the educational literature [1]. With new, cheaper and more efficient technologies, their use has become widespread at all educational levels. Specifically, Merchant et al. [2] present a meta-analysis on the effectiveness of virtual reality-based learning tools on the performance of high school and university students: a set of 69 studies including more than 8000 students was meta-analyzed. The conclusions of this research show an improvement in learning, with individual game-based resources providing the best results.
Although the advantages of this technology are numerous, there are also difficulties in its use. Petrakou [3] notes that to obtain greater benefits students should be more familiar with virtual world environments and improve their technical skills; in addition, it would be desirable to overcome the technical problems associated with these computer-generated environments. The results of Dalgarno et al. [4] agree with Petrakou that the use of immersive 3D virtual worlds requires levels of instructional and technical support that make them difficult to use.
The usefulness of VR in design and engineering is beyond doubt. The use of these tools fosters collaborative and interdisciplinary work, helps in the understanding of complex concepts and allows working in environments and conditions that, for various reasons, are not easy to recreate in reality [5,6,7,8,9,10,11,12,13,14].
In the same way, the usefulness that AR/VR environments are showing in the world of engineering has been transferred to university engineering education, where diverse works on virtual reality exist. Miyata [15] proposes the formation of interdisciplinary groups to promote creativity when developing VR applications. Divergent thinking is encouraged as one of the essential stages of the process. The teaching-learning process is evaluated by means of a 13-dimension questionnaire assessing the collaborative process, together with a second, 5-dimension questionnaire oriented to the evaluation of the individual process. The creative side is addressed in other, more recent works [16,17]. Despite its relevance to autonomous learning and to the capacities of future engineers and designers, the enhancement of this quality by virtual learning environments has not been sufficiently verified. In fact, most works focus on assessing the quality of immersion and interaction with the environment, as well as the convenience of its use [18]. Aspects such as the promotion of imagination and creativity usually remain in the background, especially since they are difficult to measure objectively [19].
Several research projects have been carried out to develop methodological proposals that can be applied in different areas of higher education, including those related to engineering education. Thus, Bell and Fogler [20] set out a series of recommendations to adapt educational methods to students based on the teaching-learning styles of the instructor and the student, which often do not coincide. Following this scheme, a course for Chemical Engineering was designed, supported by the Vicher virtual reality environment (Virtual Chemical Reaction Module). The objective of the course is to guide the teaching of chemical reactions. After the experience, the quality and effectiveness of these technologies in the teaching-learning process were evaluated. The results of the students who used the VR module were significantly better, showing a better acquisition of the evaluated competences. In addition, much of the benefit of the technology was motivational, since these students stated that the VR tool had enhanced their learning. Similarly, Hashemipour [21] presented a module-based virtual reality environment oriented to the field of mechanical engineering and industry. The evaluation of the usability of the system followed the SUMI method of Kirakowski [22], to which five categories specific to virtual reality were added: clear entry and exit points, sense of presence, navigation and orientation, faithful viewpoints, and realistic feedback. In the same line, Sutcliffe and Gault [23] proposed a series of criteria for evaluating the usability of a VR application. The twelve criteria, which must be scored by the users of the environment, are fundamentally oriented to each aspect of the user interface.
Other experiences in the university world have been based on interdisciplinary projects [24]. In particular, Häfner [25] proposes the design of a course to learn how to create virtual worlds by proposing industrial projects to students of different degrees and promoting the development of diverse skills. The results, drawn from the analysis of the projects carried out and a series of questionnaires, showed an improvement in the quality of learning.
The main objective of this study was to design, build and test a low-cost immersive virtual reality system [26,27] to enhance the teaching of robotics in engineering. The work focuses specifically on cells containing robotic arms. Immersion, real-time operation, low cost, the absence of delays and the educational approach were the factors taken into account in the proposed design.
This article is organized as follows: Section 2 gives a background on the software used in the literature and a brief insight into the issues present when teaching robotics in engineering courses. Section 3 describes the design and implementation of the proposed simulation software and how the connection between this program and the VR glasses is performed. In Section 4, a prototype of the proposed method is created and described, together with the experiments performed to ensure the correctness of the method. Conclusions are drawn in Section 5.

2. Background

With the rise of Industry 4.0, more students choose robotics to be prepared for the near future. Robotics is a branch of engineering with a high degree of difficulty, which requires that learning be appropriate and conscientious. From the educational point of view, several problems arise when dealing with courses on this topic:
  • Cost of robotic arms.
  • Alternate use among students.
  • Space required by robots.
  • Courses not requiring attendance.
  • Different robot models.
  • Difficulty evaluating students’ robot programs.
  • Safety problems due to inexperienced students.
Robotic arms are expensive pieces of hardware, as are other related components such as tools, safety barriers or controllers. Having a working and maintained robotic arm in an educational institution may require excessive funds. This cost limits the number of robots available to students. Another limiting factor is the large space required by these robots. Even if the institution has one or two robotic arms, the number of students will be higher, making it impossible for each student to use a robot exclusively. If the course does not require attendance, the impact of this problem increases, as no student has an industrial robotic arm available at home.
There are different robotic arm manufacturers, and each one offers several models. In teaching, it is interesting to cover as many different options as possible, but this is very difficult to achieve, as it requires having different robotic arms available. This also makes it difficult to evaluate the robot programs generated by the students, as the teacher needs to test them on the corresponding robot.
Students are in the middle of a learning process, and this can lead to safety problems. Robotic arms are heavy machines with fast-moving parts [28], and inexperienced students may harm objects and people in the surroundings.
Simulation helps to minimize the impact of the problems described above. There are robotic toolboxes widely used in education [29,30]. MATLAB is a common tool used to develop such toolboxes and other custom educational resources for engineering courses [31,32,33,34], but these are non-immersive and lack real-time rendering, as MATLAB is an interpreted language not designed to provide fast 3D rendering capabilities, and they tend to be used to teach only fundamental concepts of robotics. There is also simulation software that can be used for teaching [35,36,37,38,39]. The toolboxes and simulators found in the literature can be used to teach robotics, but they are very limited compared to industrial robotic simulation software with advanced capabilities such as collision detection, an easy-to-use cell design interface, trajectory visualisation and checking, connection to a physical robot, and generation of robot programs that work on the real robot. With such capabilities, students not only understand robotics better but are also more prepared for work outside the academic world. They also lack the immersive capabilities that may be useful in educational environments.
However, with the rise of immersive virtual reality in recent years [40,41,42], it is worth considering its use to enhance the quality of teaching in the field of robotics. This technology enhances the 3D spatial vision of the student and creates an immersive experience that helps the user to better understand robotics. Given the lack of studies demonstrating the benefits of robotic simulators in immersive virtual reality scenarios for education, this research presents the implementation and testing of a robotic simulator that offers immersion and requires only low-cost devices. This industrial robotics simulation software has advanced features that, together with the immersive experience, will allow students to better assimilate the knowledge of robotics. In addition, they will be better prepared to work outside the academic world [43], where they will find similar simulation software and robotic arms.

3. Implementation

The proposed method consists of custom simulator software specific to robotic arms and a cheap virtual reality cardboard device attached to a mobile phone, which receives 3D-rendered video streamed from the computer and sends head-orientation information to the simulator. The interconnection of these parts is shown in Figure 1. The following subsections explain each of these parts in detail.

3.1. Simulation Software

Simulation in robotics allows users to design, visualize, monitor, and perform safety checks and path planning, among other things. It is also a powerful tool in areas such as research and development [44]. For this reason, custom simulation software for robotic arms has been developed [45] to allow users to design and test robotic cell environments in virtual scenarios. The simulation software has been designed to be as close as possible in capabilities to real industrial robotic simulation software, so that students get a better idea of how robotics works in real-life scenarios.
The simulator uses OpenGL [46] to render the models and the robotic arm of the scene in a three-dimensional environment.
In order to simulate the movement of the robotic arm, forward kinematics [47] was used. It allows the simulator to know the world position and orientation of each joint when rotations are applied to them locally.
The standard Denavit-Hartenberg (DH) convention [48] has been used to describe the kinematic chain of the robotic arms in the simulator. The parameters of the DH table vary for each model and manufacturer. Each DH matrix $^{i-1}T_i$ converts from the coordinate system $i-1$ of a joint into system $i$. The forward kinematics $T$ of the robotic arm is obtained by the successive multiplication of those matrices, as shown in Equation (1), allowing the transformation from an arbitrary joint’s coordinate system into another.

$$T = \prod_{i=1}^{n} {}^{i-1}T_i \qquad (1)$$
The software uses the joint angle values of the robot to reproduce the actual pose of the robotic arm, applying the kinematic equations described above. This allows users to apply all permitted transformations to the models and camera, freely modifying and inspecting the scene from any viewpoint.
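As an illustration, the chained DH multiplication of Equation (1) can be sketched in Python. The DH parameters below describe a hypothetical 2-DOF planar arm for demonstration only, not any robot model in the simulator.

```python
import math

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg transform from frame i-1 to frame i."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def mat_mul(A, B):
    """4x4 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(dh_table, joint_angles):
    """Chain the per-joint DH matrices: T = prod_{i=1..n} (i-1)T_i."""
    T = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
    for (d, a, alpha), theta in zip(dh_table, joint_angles):
        T = mat_mul(T, dh_matrix(theta, d, a, alpha))
    return T

# Hypothetical planar arm: two revolute joints, link lengths 1.0 and 0.5
dh_table = [(0.0, 1.0, 0.0), (0.0, 0.5, 0.0)]  # (d, a, alpha) per joint
T = forward_kinematics(dh_table, [math.pi / 2, -math.pi / 2])
x, y = T[0][3], T[1][3]  # end-effector position in the base frame
```

Rotating the first joint 90° and the second back by 90° places this toy end-effector at (0.5, 1.0), which is easy to verify by sketching the two links.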
Inverse kinematics is used to obtain the joint angles that allow the robotic arm to reach a specific position and orientation [49]. This is needed for students to simulate trajectories or to check whether a position and orientation is reachable by the selected robotic arm.
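A closed-form sketch of such a reachability check, for a hypothetical 2-link planar arm (the solver for a full 6-DOF industrial arm is considerably more involved; link lengths and the elbow-down choice below are illustrative):

```python
import math

def planar_2link_ik(x, y, l1, l2):
    """Closed-form IK for a 2-link planar arm with link lengths l1, l2.
    Returns the elbow-down solution (theta1, theta2) in radians,
    or None when the target point is outside the reachable workspace."""
    r2 = x * x + y * y
    # law of cosines for the elbow angle
    c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if c2 < -1.0 or c2 > 1.0:
        return None  # unreachable: trajectory point must be rejected
    theta2 = math.acos(c2)
    k1 = l1 + l2 * math.cos(theta2)
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2

# Reachability check as it might be used to validate a taught trajectory point
sol = planar_2link_ik(0.5, 1.0, 1.0, 0.5)
```

A simulator can run this kind of test on every point of a student's trajectory and flag the unreachable ones before any motion is attempted.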
Collisions between simulated entities inside the virtual environment are detected with the ODE (Open Dynamics Engine) library [50]. There are several methods for rigid-body collision checking; however, due to the real-time requirements of the proposed VR simulation model, a combination of AABB (Axis-Aligned Bounding Box) and OBB (Oriented Bounding Box) has been selected [51]. The selected method performs a fast check with AABBs and, only if a collision inside the axis-aligned bounding box is possible, a finer check is performed with OBBs to enhance the precision of detection [52]. This two-step collision detection is intended to speed up detection and fulfil the real-time requirement. Other methods can be used depending on the requirements [53,54]. Figure 2a shows an example of the two-step collision detection method: the 3D bolt models are surrounded by an AABB, shown in blue, and, if needed, the algorithm performs a collision check against the OBB, marked in green. Figure 2b shows the collision detection system working in a real scenario, checking collisions of robot joints against environment objects. To improve visibility, this figure shows only OBBs, although the preceding AABB checks are still performed. Red boxes show the objects involved in the collision, in this case both the tool of the robotic arm and the controller.
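The broad-phase/narrow-phase idea can be sketched in 2D. The simulator uses ODE in 3D; this simplified separating-axis version is only illustrative of the two-step structure, not of the actual ODE implementation:

```python
import math

def aabb_overlap(a_min, a_max, b_min, b_max):
    """Broad phase: axis-aligned boxes overlap iff they overlap on every axis."""
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(2))

def obb_corners(center, half, angle):
    """Corners of a rectangle given its center, half-extents and rotation."""
    cx, cy = center
    hx, hy = half
    c, s = math.cos(angle), math.sin(angle)
    return [(cx + c * sx * hx - s * sy * hy, cy + s * sx * hx + c * sy * hy)
            for sx, sy in ((1, 1), (1, -1), (-1, -1), (-1, 1))]

def obb_overlap(ca, cb):
    """Narrow phase: 2D separating-axis test over both rectangles' edge normals."""
    for corners in (ca, cb):
        for i in range(2):  # two unique edge directions per rectangle
            (x1, y1), (x2, y2) = corners[i], corners[i + 1]
            ax, ay = y1 - y2, x2 - x1  # normal of the edge
            pa = [px * ax + py * ay for px, py in ca]
            pb = [px * ax + py * ay for px, py in cb]
            if max(pa) < min(pb) or max(pb) < min(pa):
                return False  # separating axis found: no collision
    return True

def collide(box_a, box_b):
    """Two-step check: cheap AABB test first, exact OBB test only if needed."""
    ca, cb = obb_corners(*box_a), obb_corners(*box_b)
    a_min = [min(p[i] for p in ca) for i in range(2)]
    a_max = [max(p[i] for p in ca) for i in range(2)]
    b_min = [min(p[i] for p in cb) for i in range(2)]
    b_max = [max(p[i] for p in cb) for i in range(2)]
    if not aabb_overlap(a_min, a_max, b_min, b_max):
        return False  # broad phase rules the pair out cheaply
    return obb_overlap(ca, cb)
```

The payoff is the early exit: a thin rotated bar near a small box may have overlapping AABBs but is correctly rejected by the OBB test, while clearly distant pairs never reach the expensive phase at all.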
As a result, a simulation software with integrated GUI (Graphical User Interface) is obtained as shown in Figure 3.
This GUI allows users to design and test virtual robotic environments in 3D without real physical hardware. As the method is intended for educational purposes, it allows institutions to teach robotics without purchasing expensive robotic arms. It also increases safety, as it prevents inexperienced students from handling expensive and potentially dangerous machines.

3.2. Virtual Reality

Virtual reality creates an immersive experience, allowing users to navigate and explore a computer-generated world without physically being in it [55].
It has been an interesting topic in different sectors such as footwear [56], automotive [57] or healthcare [58]. Virtual reality combined with robotics has already been studied by other authors to interactively program robotic arms [59,60].
There are various ways to achieve immersive virtual reality. One especially low-cost method is to use cardboard glasses, as shown in Figure 4a. Figure 4b shows a detail of the bay used to insert a smartphone and use its screen as the VR display. More expensive options, such as the Oculus Rift or HTC Vive glasses, do not need an additional phone, as they include extra hardware and a pair of small LCD screens. As almost everyone now has a smartphone, the first option is the cheapest one.
The simulator originally used a symmetric frustum to calculate the perspective; however, in order to achieve the stereo effect, asymmetric frustums [61] are used. Figure 5 shows the different frustum used for each eye. Vertices between the near/far planes that are closer than the green convergence line appear with negative parallax, while farther ones appear with positive parallax. The simulator updates the position and orientation of every object in the virtual scene once per frame, and then renders the same scene to two horizontally arranged viewports, each corresponding to a separate eye and using a different frustum.
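The per-eye asymmetric frustum bounds can be sketched as an off-axis projection in the style of glFrustum(left, right, bottom, top, ...) parameters. The function name and the convergence model below are illustrative, not the simulator's actual code:

```python
import math

def stereo_frustums(fov_y_deg, aspect, near, convergence, eye_sep):
    """Off-axis (asymmetric) near-plane frustum bounds for left/right eyes.
    Objects on the plane at distance `convergence` render with zero parallax."""
    top = near * math.tan(math.radians(fov_y_deg) / 2.0)
    bottom = -top
    half_w = top * aspect  # symmetric half-width of the near-plane window
    # horizontal shift of the near-plane window, scaled from half the eye
    # separation down to the near plane, so both view volumes converge
    shift = (eye_sep / 2.0) * near / convergence
    left_eye = (-half_w + shift, half_w + shift, bottom, top)
    right_eye = (-half_w - shift, half_w - shift, bottom, top)
    return left_eye, right_eye

# e.g. 60 degree vertical FOV, 16:9 viewport, 6 cm eye separation
left_eye, right_eye = stereo_frustums(60.0, 16.0 / 9.0, 0.1, 2.0, 0.06)
```

Each tuple would then feed one of the two viewports; the windows have identical size and mirror-image offsets, which is what produces the stereo parallax.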
In order to achieve a correct virtual reality experience, the orientation of the student’s head needs to be sent to the simulator software. An application on the phone constantly sends IMU (Inertial Measurement Unit) data to the simulator through a UDP network socket over Wi-Fi. This information is then used by the software on the computer to obtain an approximation of the orientation of the user’s head [62]. The obtained approximation is applied to the 3D camera of the scene, changing the rendered viewpoint.
The jitter generated by the noise of the incoming IMU data stream received from the phone has been reduced to obtain smooth movements of the camera controlling the scene inside the simulation software. A complementary filter is used to fuse the sensor data and obtain a smoothed representation of the noisy input [63]. Figure 6a,b show a comparison of raw and filtered roll and pitch values obtained over a short period of time. The results show that a smooth capture of head movements and its translation to the virtual camera can be achieved with the proposed method, an important step towards giving students an immersive experience not present in other educational software for robotics.
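A minimal sketch of such a complementary filter follows; the coefficient value and the per-axis update model are illustrative assumptions, not the exact filter used in the simulator:

```python
# Weight of the integrated gyro term; an assumed value, tuned empirically
ALPHA = 0.98

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=ALPHA):
    """Fuse fast-but-drifting gyro integration with noisy-but-absolute
    accelerometer angles into one smooth orientation estimate (radians)."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Hypothetical per-packet update of the camera pitch from streamed IMU samples
pitch = 0.0
for gyro_rate, accel_pitch in [(0.1, 0.02), (0.0, 0.03), (-0.1, 0.01)]:
    pitch = complementary_filter(pitch, gyro_rate, accel_pitch, dt=0.01)
```

The gyro term tracks fast head motion within a frame, while the small accelerometer weight slowly pulls the estimate back toward the absolute reading, cancelling drift without passing the accelerometer noise through to the camera.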
The proposed simulation software has been adapted to include VR support. The output of asymmetric frustums applied to two parallel viewports in the rendering process can be seen in Figure 7.
The last step of the process is to send the rendered 3D output of the simulator software [64], with both asymmetric frustums, to the phone attached to the virtual reality glasses. This is solved by using the streaming software present on NVIDIA devices and a client on the phone, both using the NVIDIA Gamestream protocol [65]. The frame rate needs to be constant and as high as the rendering device can achieve, allowing users a smooth motion experience without stuttering or lag.

4. Experiment

4.1. Prototype Preparation

A scene has been created with the simulator to recreate a real robotic cell inside the virtual environment. Figure 8 shows the real cell on the left side and the simulated one on the right side. The real cell is used to apply glue to shoe soles fed by a rotary table. This process is used in the footwear industry to join the sole to the rest of the shoe and is normally performed by hand using brushes. This prototype is a test bench, but the proposed method can be used for other real-life scenarios.
The proposed simulation software runs on a desktop computer with an Intel Core i5-3470 CPU, 8 GB of DDR3 RAM and a GeForce GTX 1060 6 GB GPU. The cardboard glasses used for VR are the Iggual IGG313527. The phone placed inside the cardboard glasses is a LeEco Le 2 Max running Android 6.0, with an 8-core Snapdragon 820 CPU, 6 GB of RAM and a 5.7” display with a native resolution of 2560 × 1440 px at a refresh rate of 60 Hz.

4.2. Measuring Real-Time Capabilities

The experiment measured the total time elapsed between the moment the sensors on the phone capture a movement of the user’s head and the moment the processed and rendered 3D image with dual viewports is sent back to the VR cardboard glasses. This is needed to ensure that no noticeable delay or stuttering reaches the glasses, minimizing motion sickness [66].
Lag may come from any of the parts that compose the proposed method. Although Wi-Fi is used to connect the smartphone in the cardboard glasses to the computer running the simulator, the average measured network latency is 4 ms. To this value, the collision checking process and the rendering of all objects within the simulated scene for two viewports must be added. With the rendering engine running at 60 frames per second, the maximum delay added by the processing on the computer is 16.66 ms. Finally, sending the rendered image back to the smartphone adds further delay.
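This latency budget can be summarized with a small sketch. The network value matches the measured average above, while the streaming cost is an assumed placeholder, since it was not reported separately:

```python
# Hypothetical motion-to-photon budget for one phone-to-simulator-to-phone cycle
NETWORK_MS = 4.0                  # measured average Wi-Fi latency
RENDER_FPS = 60.0
RENDER_MS = 1000.0 / RENDER_FPS   # worst-case wait for the next rendered frame
STREAM_MS = 3.0                   # assumed encode/stream overhead (placeholder)

def worst_case_latency_ms():
    """Upper bound on the full cycle time under the assumptions above."""
    return NETWORK_MS + RENDER_MS + STREAM_MS

# Commonly cited comfort thresholds for VR motion-to-photon latency
# are on the order of 20-50 ms, so the budget should stay below that
assert worst_case_latency_ms() < 50.0
```

Even this pessimistic sum stays well under typical comfort thresholds, which is consistent with the 11 ms average actually measured in the experiment below.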
Figure 9 shows the total amount of milliseconds spent from the IMU data transfer start to the received rendered frame back to the smartphone over a period of time, taking into account network, streaming and simulation processing time.
The experiment captured a sample of 200 steps to obtain the data shown in the previous graph, where the value of each step was calculated as the time to complete a full communication cycle from the phone to the simulator and back to the phone. The average value of 11 ms for the whole test scenario confirms that the proposed method can achieve the desired immersive experience without noticeable lag. This ensures that key contributions of the paper, such as real-time operation and immersion, are fulfilled and can be used to enhance the teaching of robotics. If the quality of the network is high, the delay of IMU and stream messages should remain stable and low, maintaining the real-time capabilities of the method. Performance problems on the computer can negatively affect the number of frames per second that the simulator can process, leading to greater lag and a poor VR experience. The small variance of the resulting data is explained by the different amounts of time needed to calculate collisions and element positions in the scene, depending on the complexity of the robot pose. This is directly affected by each point of the trajectory; computation times are lower when a collision is detected at an early stage of the detection process, as the system does not continue checking for further collisions.

5. Conclusions

The positive impact of virtual reality on the teaching-learning process has been reviewed in this paper, especially in areas where practical teaching is an important focus, such as engineering courses.
To take advantage of this technology, custom simulation software has been expanded and adapted to not only design, visualize, monitor, perform safety checks and path planning, but also to use immersive virtual reality as a tool to improve the quality of teaching in robotics technology courses. This simulator has been designed to be used in educational institutions such as high schools or universities. The proposed method makes use of currently available and easily accessible technologies, such as simulation, immersion, virtual reality, Wi-Fi technology or streaming.
Other approaches in the literature regarding VR in education are designed as non-realistic VR systems or are based on interpreted programming tools like MATLAB, which do not scale in speed to provide immersive real-time VR experiences. Others focus on emulating the movements of a single robot without taking into account other cell elements, or lack advanced features such as collision detection or reachability checks, as they are used to teach only fundamental concepts of robotics. These drawbacks justify the design, construction and testing of the proposed method.
The impact of this method focuses on increasing quality and reducing costs in education of robotics. This has been achieved through the use of cardboard glasses, which are inexpensive and allow students to use their own smartphone to interact with the simulation from their computer. It also prevents educational institutions from spending large amounts of money and space to buy, maintain and store robotic arms and other expensive materials.
Our approach aims to provide not only an immersive experience for students to improve the quality of teaching, but also the ability to interact with an industrial robotics simulator with advanced capabilities such as collision detection, an easy-to-use cell design interface, path visualization and checking, connection to the physical robot and the generation of robot programs that work on the real robot. In this way, students will not only better understand the concepts associated with robotics, but will also be better prepared to work outside the academic world.
Safety is another key point of the method, as the simulation reduces the chance of accidents caused by inexperienced students experimenting with heavy and dangerous robotic arms. The collision detection system included in the simulator is intended to detect and prevent such accidents. Students can design cells by trial and error without causing collisions with real-life cell elements.
Experimentation showed that the proposed method is able to meet the real-time requirements of an immersive virtual reality system, creating an enhanced, realistic experience for the students while avoiding stuttering and lag.
Future work will be aimed at testing the proposed virtual reality robotic simulator in educational institutions with real students to measure its impact on their learning. The user studies required to analyse and confirm that the proposed method is applicable and useful in real-world educational environments are planned to start during the academic year 2018/19 in the Degree in Robotic Engineering of the University of Alicante.


This work was funded by the Ministry of Economy and Competitiveness of Spain (Ministerio de Economía y Competitividad de España), through Reference TIN2017-89266-R Project.

Author Contributions

María Luisa Pertegal-Felices and Antonio Jimeno-Morenilla analyzed the literature background; Vicente Roman-Ibañez and Higinio Mora-Mora conceived and designed the experiments; Vicente Roman-Ibañez performed the experiments; Francisco Pujol-López analyzed the data; Vicente Roman-Ibañez and Antonio Jimeno-Morenilla wrote the paper.

Conflicts of Interest

The authors declare no conflicts of interest.

References and Note

  1. Psotka, J. Immersive training systems: Virtual reality and education and training. Instr. Sci. 1995, 23, 405–431. [Google Scholar] [CrossRef]
  2. Merchant, Z.; Goetz, E.T.; Cifuentes, L.; Keeney-Kennicutt, W.; Davis, T.J. Effectiveness of virtual reality-based instruction on students’ learning outcomes in K-12 and higher education: A meta-analysis. Comput. Educ. 2014, 70, 29–40. [Google Scholar] [CrossRef]
  3. Petrakou, A. Interacting through avatars: Virtual worlds as a context for online education. Comput. Educ. 2010, 54, 1020–1027. [Google Scholar] [CrossRef]
  4. Dalgarno, B.; Lee, M.J.W.; Carlson, L.; Gregory, S.; Tynan, B. An Australian and New Zealand scoping study on the use of 3D immersive virtual worlds in higher education. Australas. J. Educ. Technol. 2011, 27, 1–15. [Google Scholar] [CrossRef][Green Version]
  5. Vergara, D.; Rubio, M.P.; Lorenzo, M. New Approach for the Teaching of Concrete Compression Tests in Large Groups of Engineering Students. J. Prof. Issues Eng. Educ. Pract. 2017, 143, 5016009. [Google Scholar] [CrossRef]
  6. Zwolinski, P.; Tichkiewitch, S.; Sghaier, A. The Use of Virtual Reality Techniques during the Design Process: from the Functional Definition of the Product to the Design of its Structure. CIRP Ann. 2007, 56, 135–138. [Google Scholar] [CrossRef]
  7. Nomura, J.; Sawada, K. Virtual reality technology and its industrial applications. Annu. Rev. Control 2001, 25, 99–109. [Google Scholar] [CrossRef]
  8. Bruno, F.; Muzzupappa, M. Product interface design: A participatory approach based on virtual reality. Int. J. Hum. Comput. Stud. 2010, 68, 254–269. [Google Scholar] [CrossRef]
  9. Shen, Y.; Ong, S.K.; Nee, A.Y.C. Augmented reality for collaborative product design and development. Des. Stud. 2010, 31, 118–145. [Google Scholar] [CrossRef]
  10. Ong, S.; Mannan, M. Virtual reality simulations and animations in a web-based interactive manufacturing engineering module. Comput. Educ. 2004, 43, 361–382. [Google Scholar] [CrossRef]
  11. Impelluso, T.; Metoyer-Guidry, T. Virtual Reality and Learning by Design: Tools for Integrating Mechanical Engineering Concepts. J. Eng. Educ. 2001, 90, 527–534. [Google Scholar] [CrossRef]
  12. Stone, R. Virtual reality for interactive training: An industrial practitioner’s viewpoint. Int. J. Hum. Comput. Stud. 2001, 55, 699–711. [Google Scholar] [CrossRef]
  13. Gamo, J. A Contribution to Virtual Experimentation in Optics. In Advanced Holography- Metrology and Imaging; InTech: Rijeka, Croatia, 2011; Chapter 16; pp. 357–374. [Google Scholar]
  14. Casas, S.; Portalés, C.; García-Pereira, I.; Fernández, M. On a First Evaluation of ROMOT—A RObotic 3D MOvie Theatre—For Driving Safety Awareness. Multimodal Technol. Interact. 2017, 1, 6. [Google Scholar] [CrossRef]
  15. Miyata, K.; Umemoto, K.; Higuchi, T. An educational framework for creating VR application through groupwork. Comput. Graph. 2010, 34, 811–819. [Google Scholar] [CrossRef]
  16. Abulrub, A.H.G.; Attridge, A.; Williams, M.A. Virtual Reality in Engineering Education: The Future of Creative Learning. Int. J. Emerg. Technol. Learn. (iJET) 2011, 6, 751–757. [Google Scholar]
  17. Jimeno-Morenilla, A.; Sánchez-Romero, J.L.; Mora-Mora, H.; Coll-Miralles, R. Using virtual reality for industrial design learning: A methodological proposal. Behav. Inf. Technol. 2016, 35, 897–906. [Google Scholar] [CrossRef]
  18. Saleeb, N.; Dafoulas, G.A. Effects of Virtual World Environments in Student Satisfaction. Int. J. Knowl. Soc. Res. 2011, 2, 29–48. [Google Scholar] [CrossRef]
  19. Thorsteinsson, G.; Page, T. Creativity in technology education facilitated through virtual reality learning environments: A case study. J. Educ. Technol. 2007, 3, 74–87. [Google Scholar]
  20. Bell, J.T.; Fogler, H.S. Investigation and application of virtual reality as an educational tool. Available online: (accessed on 29 March 2018).
  21. Hashemipour, M.; Manesh, H.F.; Bal, M. A modular virtual reality system for engineering laboratory education. Comput. Appl. Eng. Educ. 2011, 19, 305–314. [Google Scholar] [CrossRef]
  22. Kirakowski, J. The use of questionnaire methods for usability assessment. Assessment 1994, 2008, 1–17. [Google Scholar]
  23. Sutcliffe, A.; Gault, B. Heuristic evaluation of virtual reality applications. Interact. Comput. 2004, 16, 831–849. [Google Scholar] [CrossRef]
24. Saunier, J.; Barange, M.; Blandin, B.; Querrec, R.; Taoum, J. Designing adaptable virtual reality learning environments. In Proceedings of the 2016 Virtual Reality International Conference, Laval, France, 23–25 March 2016. [Google Scholar]
  25. Häfner, P.; Häfner, V.; Ovtcharova, J. Teaching Methodology for Virtual Reality Practical Course in Engineering Education. Procedia Comput. Sci. 2013, 25, 251–260. [Google Scholar] [CrossRef]
  26. Brown, A.; Green, T. Virtual Reality: Low-Cost Tools and Resources for the Classroom. TechTrends 2016, 60, 517–519. [Google Scholar] [CrossRef]
  27. Vergara, D.; Rubio, M.; Lorenzo, M. On the Design of Virtual Reality Learning Environments in Engineering. Multimodal Technol. Interact. 2017, 1, 11. [Google Scholar] [CrossRef]
  28. Haddadin, S.; Albu-Schäffer, A.; Hirzinger, G. Safe Physical Human-Robot Interaction: Measurements, Analysis & New Insights. Robot. Res. 2011, 66, 395–407. [Google Scholar]
  29. Corke, P.I. Robotics, Vision and Control; Springer: Berlin, Germany, 2017; p. 580. [Google Scholar]
  30. Gil, A.; Reinoso, O.; Marin, J.M.; Paya, L.; Ruiz, J. Development and deployment of a new robotics toolbox for education. Comput. Appl. Eng. Educ. 2015, 23, 443–454. [Google Scholar] [CrossRef]
  31. Tijani, I.B. Teaching fundamental concepts in robotics technology using MATLAB toolboxes. In Proceedings of the IEEE Global Engineering Education Conference (EDUCON), Abu Dhabi, United Arab Emirates, 10–13 April 2016; pp. 403–408. [Google Scholar]
  32. Cocota, J.A.N.; D’Angelo, T.; Monteiro, P.M.d.B.; Magalhães, P.H.V. Design and implementation of an educational robot manipulator. In Proceedings of the XI Technologies Applied to Electronics Teaching (TAEE), Bilbao, Spain, 11–13 June 2014; pp. 1–6. [Google Scholar]
  33. Chinello, F.; Scheggi, S.; Morbidi, F.; Prattichizzo, D. KUKA Control Toolbox. IEEE Robot. Autom. Mag. 2011, 18, 69–79. [Google Scholar] [CrossRef]
  34. Flanders, M.; Kavanagh, R.C. Build-A-Robot: Using virtual reality to visualize the Denavit-Hartenberg parameters. Comput. Appl. Eng. Educ. 2015, 23, 846–853. [Google Scholar] [CrossRef]
  35. Freese, M.; Singh, S.; Ozaki, F.; Matsuhira, N. Virtual Robot Experimentation Platform V-REP: A Versatile 3D Robot Simulator. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 2010; Volume 6472, pp. 51–62. [Google Scholar]
  36. Hurtado, C.V.; Valerio, A.R.; Sanchez, L.R. Virtual Reality Robotics System for Education and Training. In Proceedings of the IEEE Electronics, Robotics and Automotive Mechanics Conference, Morelos, Mexico, 28 September–1 October 2010; pp. 162–167. [Google Scholar]
  37. Candelas, F.A.; Puente, S.T.; Torres, F.; Gil, P.; Ortiz, F.G.; Pomares, J. A Virtual Laboratory for Teaching Robotics. Int. J. Eng. Educ. 2003, 19, 363–370. [Google Scholar]
  38. Mehta, I.; Bimbraw, K.; Chittawadigi, R.G.; Saha, S.K. A teach pendant to control virtual robots in Roboanalyzer. In Proceedings of the International Conference on Robotics and Automation for Humanitarian Applications (RAHA), Kerala, India, 18–20 December 2016; pp. 1–6. [Google Scholar]
  39. Arnay, R.; Hernández-Aceituno, J.; González, E.; Acosta, L. Teaching kinematics with interactive schematics and 3D models. Comput. Appl. Eng. Educ. 2017, 25, 420–429. [Google Scholar] [CrossRef]
  40. Dede, C. Immersive Interfaces for Engagement and Learning. Science 2009, 323, 66–69. [Google Scholar] [CrossRef] [PubMed]
  41. Weidlich, D.; Cser, L.; Polzin, T.; Cristiano, D.; Zickner, H. Virtual reality approaches for immersive design. CIRP Ann. Manuf. Technol. 2007, 56, 139–142. [Google Scholar] [CrossRef]
42. Mujber, T.; Szecsi, T.; Hashmi, M. Virtual reality applications in manufacturing process simulation. J. Mater. Process. Technol. 2004, 155–156, 1834–1838. [Google Scholar] [CrossRef]
  43. Berg, L.P.; Vance, J.M. Industry use of virtual reality in product design and manufacturing: A survey. Virtual Real. 2017, 21, 1–17. [Google Scholar] [CrossRef]
  44. Zlajpah, L. Simulation in robotics. Math. Comput. Simul. 2008, 79, 879–897. [Google Scholar] [CrossRef]
  45. Román-Ibáñez, V.; Jimeno-Morenilla, A.; Pujol-López, F.; Salas-Pérez, F. Online simulation as a collision prevention layer in automated shoe sole adhesive spraying. Int. J. Adv. Manuf. Technol. 2017, 95, 1243–1253. [Google Scholar] [CrossRef]
  46. Shreiner, D.; Sellers, G.; Kessenich, J.M.; Licea-Kane, B. OpenGL Programming Guide: The Official Guide to Learning OpenGL, Version 4.3; Graphics Programming, Addison-Wesley: Boston, MA, USA, 2013. [Google Scholar]
  47. Waldron, K.; Schmiedeler, J. Kinematics. In Springer Handbook of Robotics; Springer: Berlin/Heidelberg, Germany, 2008; pp. 9–33. [Google Scholar]
  48. Radavelli, L.; Simoni, R.; Pieri, E.D.; Martins, D. A Comparative Study of the Kinematics of Robots Manipulators by Denavit-Hartenberg and Dual Quaternion. Mec. Comput. 2012, XXXI, 13–16. [Google Scholar]
  49. Manocha, D.; Canny, J.F. Efficient inverse kinematics for general 6R manipulators. IEEE Trans. Robot. Autom. 1994, 10, 648–657. [Google Scholar] [CrossRef]
  50. Smith, R. Open Dynamics Engine ODE, Multibody Dynamics Simulation Software. 2004.
  51. Lin, M.; Manocha, D.; Cohen, J.; Gottschalk, S. Collision detection: Algorithms and applications. In Algorithms for Robotics Motion and Manipulation: 1996 Workshop on the Algorithmic Foundations of Robotics; A K Peters: Wellesley, MA, USA, 1996; pp. 129–142. [Google Scholar]
  52. Gottschalk, S.; Lin, M.C.; Manocha, D. OBBTree. In Proceedings of the 23rd Annual Conference on Computer Graphics and Interactive Techniques - SIGGRAPH ’96; ACM: New York, NY, USA, 1996; pp. 171–180. [Google Scholar]
53. Fares, C.; Hamam, Y. Collision Detection for Rigid Bodies: A State of the Art Review. In Proceedings of the International Conference Graphicon, Novosibirsk Akademgorodok, Russia, 20–24 June 2005. [Google Scholar]
  54. Reggiani, M.; Mazzoli, M.; Caselli, S. An experimental evaluation of collision detection packages for robot motion planning. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and System, Lausanne, Switzerland, 30 September–4 October 2002; Volume 3, pp. 2329–2334. [Google Scholar]
  55. Mihelj, M.; Novak, D.; Begus, S. Introduction to Virtual Reality. In Intelligent Systems, Control and Automation: Science and Engineering; Springer Nature: Berlin, Germany, 2014; Volume 68, pp. 1–16. [Google Scholar]
  56. Jimeno-Morenilla, A.; Sánchez-Romero, J.L.; Salas-Pérez, F. Augmented and Virtual Reality techniques for footwear. Comput. Ind. 2013, 64, 1371–1382. [Google Scholar] [CrossRef][Green Version]
  57. Lányi, C.S. Virtual reality in healthcare. Stud. Comput. Intell. 2006, 19, 87–116. [Google Scholar]
  58. Lawson, G.; Salanitri, D.; Waterfield, B. Future directions for the development of virtual reality within an automotive manufacturer. Appl. Ergon. 2016, 53, 323–330. [Google Scholar] [CrossRef] [PubMed]
  59. Michas, S.; Matsas, E.; Vosniakos, G.C. Interactive programming of industrial robots for edge tracing using a virtual reality gaming environment. Int. J. Mechatron. Manuf. Syst. 2017, 10, 237–259. [Google Scholar]
  60. Jen, Y.; Taha, Z.; Vui, L. VR-Based Robot Programming and Simulation System for an Industrial Robot. Int. J. Ind. Eng. Theory Appl. Pract. 2008, 15, 314–322. [Google Scholar]
  61. Fleck, M.M. Perspective projection: The wrong imaging model. Available online: (accessed on 29 March 2018).
  62. Starlino Electronics. A Guide To using IMU (Accelerometer and Gyroscope Devices) in Embedded Applications. Available online: (accessed on 29 March 2018).
63. Alam, F.; Zhaihe, Z.; Jiajia, H. A Comparative Analysis of Orientation Estimation Filters using MEMS based IMU. In Proceedings of the 2nd International Conference on Research in Science, Engineering and Technology (ICRSET’2014), Dubai, United Arab Emirates, 21–22 March 2014; pp. 86–91. [Google Scholar]
  64. Cai, W.; Shea, R.; Huang, C.Y.; Chen, K.T.; Liu, J.; Leung, V.C.M.; Hsu, C.H. A Survey on Cloud Gaming: Future of Computer Games. IEEE Access 2016, 4, 7605–7620. [Google Scholar] [CrossRef]
  65. Gutman, C.; Waxemberg, D.; Neyer, A.; Bergeron, M.; Hennessy, A. Moonlight an open source NVIDIA Gamestream Client. Available online: (accessed on 29 March 2018).
  66. Lackner, J.R. Simulator sickness. J. Acoust. Soc. Am. 1992, 92, 2458. [Google Scholar] [CrossRef]
Figure 1. Schema of the proposed method.
Figure 2. Collision detection method samples. (a) Detail of Axis Aligned Bounding Box (AABB) and Oriented Bounding Box (OBB) interaction; (b) Collision detection on a simulated cell.
Figure 3. Robotic simulator running.
Figure 4. Virtual reality glasses. (a) Cheap cardboard virtual reality (VR) glasses; (b) Detail of the smartphone insertion bay.
Figure 5. Explanation of the asymmetric frustums.
Figure 6. Comparison of filtered and raw input Inertial Measurement Unit (IMU) streams. (a) Raw and filtered roll values; (b) Raw and filtered pitch values.
Figure 7. Simulator rendering two viewports with asymmetric frustums.
Figure 8. Real and simulated robotic cell for shoe-sole gluing.
Figure 9. Milliseconds spent for single VR steps over time.

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.