Article

Virtual Reality for Career Development and Exploration: The CareProfSys Profiler System Case

by Maria-Iuliana Dascalu 1,*, Iulia-Cristina Stanica 1, Ioan-Alexandru Bratosin 1, Beatrice-Iuliana Uta 1 and Constanta-Nicoleta Bodea 2,3

1 Department of Engineering in Foreign Languages, Faculty of Engineering in Foreign Languages, National University of Science and Technology POLITEHNICA Bucharest, 060042 Bucharest, Romania
2 Department of Economic Informatics and Cybernetics, Bucharest University of Economic Studies, 010552 Bucharest, Romania
3 “COSTIN C. KIRITESCU” National Institute for Economic Research, Romanian Academy, 050711 Bucharest, Romania
* Author to whom correspondence should be addressed.
Electronics 2024, 13(13), 2629; https://doi.org/10.3390/electronics13132629
Submission received: 8 May 2024 / Revised: 28 June 2024 / Accepted: 2 July 2024 / Published: 4 July 2024

Abstract

This paper presents an innovative use case of virtual reality (VR) for career development and exploration, within the context of the CareProfSys recommendation system for professions. Users of the recommender receive recommendations not only in textual format but also as gamified WebVR scenarios, giving them the opportunity to try activities specific to the suggested professions and decide whether those professions suit them. This paper describes, from a functional and technical point of view, scenarios for six different jobs: computer network specialists, civil engineers, web and multimedia developers, chemical engineers, project managers, and university professors. Extensive experiments were performed, using an internal protocol, with 47 students enrolled in engineering studies. The results of the experiments were measured with the aid of four instruments: two questionnaires, one unstructured interview, and the VR simulation performance recording module. Positive results were obtained: the users found such a tool both useful for choosing a career and entertaining. Most of the students also regarded the VR scenarios as learning or testing experiences. Thus, we claim that providing job recommendations in VR form is more appealing to young people and brings value to career development initiatives.

1. Introduction

Career development requires the continuous improvement of professional skills. Several learning approaches, such as experiential learning, simulations, and serious gaming [1,2,3], promote practice in work environments as one of the most effective ways to develop skills.
Acting in real work contexts can in some cases be extremely expensive, pose high risks to trainees or other parties involved (for example, patients in medical education), or be almost impossible due to restricted access to the work context (for example, in maritime or space professional skill development). For this reason, virtual environments (VEs) that reproduce the relevant characteristics of working environments often replace real work environments, allowing people to perform activities to gain work experience and increase their level of professional skills.
Virtual reality (VR) technologies started to be widely applied in professional education and training, mainly because they create VEs that overcome the constraints associated with real-world situations and allow trainees to easily execute tasks and acquire relevant work experiences. In 2018, a survey mentioned in [4] showed that two-thirds of the higher education institutions in the US participating in that study had either partially or fully deployed AR/VR solutions and organized dedicated space and labs for students to access or develop AR/VR solutions. In 2020, another survey [5] showed that the respondents considered education to be the second most likely sector to be disrupted by immersive technologies. AlGerafi et al. [6] conducted a comprehensive literature review on AR/VR technology implementation in education. A total of 789 studies published during the period 2014–2023 focusing on AR/VR in education were identified. The research revealed that most AR/VR platforms were implemented in higher education.
Many benefits of AR/VR environments on students’ academic performance have been reported by different studies [7,8,9] by considering not only the cognitive but also the emotional aspects. The students’ active engagement in learning increased, as well as the understanding of theories and abstract concepts that are the basis of teaching. To achieve the expected benefit from AR/VR implementation, several activity design decisions must be made, and many technical components must be made available.
From the point of view of educational design, scenarios are defined and applied to shape users’ behavior for more efficient skill development. These scenarios are designed based on several learning theories, such as social learning theory, constructivism, and collaborative learning. For example, the constructivist paradigm defines learning as experimental and experiential: trainees may discover knowledge through their own experiences and are encouraged to apply this knowledge [10] in problem-solving situations. Different learning methods, such as student-centered learning, personalized learning, problem-solving learning, activity-based learning, multimodal learning, methods for creating a sense of presence, etc., are applicable in educational settings, including virtual worlds. For example, it has been proven that the use of serious games leads to high levels of user interactivity, involvement, and motivation [1,11,12]. All these educational approaches contribute to the increasing popularity of VR solutions and their degree of usage in professional education and training [13].
Considering the technical aspects, the key components of VR systems are the devices that allow users to access and act in VEs. The popularity of VR solutions depends on the availability of these devices, their affordability, ease of use, and the quality of the experiences they make possible. Initially, VR solutions relied on mobile devices (smartphones, tablets, etc.) and on headsets that balanced quality, ease of use, and affordable cost [14,15]. Currently, the most widely used devices are head-mounted displays (HMDs), such as the HTC VIVE, released in 2016, the HTC VIVE Pro, released in 2018, and the HTC VIVE Cosmos, released in 2019. In 2021, the HTC VIVE Focus 3 and HTC VIVE Pro 2 were released, offering high performance and quick configuration. The Oculus Rift and Meta Quest (1 and 2) are also popular. Goggles adapted for myopic users may improve the comfort of visually impaired users. Users of VR devices increasingly demand functionalities related to 3D display, ultra-high resolution, large live views, and somatosensory interaction [16].
VR systems evolved over time, from augmented reality (AR)/VR towards immersive VR (IVR) and the metaverse [1]. The mixture of AR and VR, also named mixed reality (MR), enables users to experience both physical and virtual spaces [17]. This type of solution is mainly applied in cases when users need to interact with virtual entities (objects) but still have awareness of their physical environment. This is achieved in AR/VR solutions by allowing users to perceive virtual objects within a real/physical space.
Immersive virtual reality (IVR) allows users to achieve a mental sense of presence in VEs, meaning that users immerse themselves in VEs and can interact with and imagine non-existent things or changes in the VEs. Immersion makes participants feel like they are “really there” [17]. IVR solutions provide more personalized learning experiences for different learning styles, speeds, and levels of pre-existing ability, and they also ensure a high level of active participation due to an increased sense of presence within the learning activities [18].
The metaverse is a space created for people to live in according to specific rules. In the metaverse, people can perform different social activities, such as debating ideas, solving problems, playing games, executing projects, etc., in a collaborative way. The metaverse space may be fully virtual, like a VR system, or partially virtual, as in the case of AR/VR solutions. The difference between the metaverse and AR/VR or VR systems lies in the metaverse’s characteristics of being “shared” (a multi-user environment), “persistent” (users have a persistent identity, allowing them “to live” there), and “decentralized” [13]. AR/VR and/or VR systems may be parts of a metaverse, together with other elements, such as artificial intelligence components, that ensure compliance with the predefined rules. Several technologies, such as blockchains, are used to ensure decentralization.
The main objective of the current study is to assess the effectiveness of a VR platform integrated into an advanced career counseling system, assisting users who are interested not only in receiving profession recommendations but also in visualizing activities connected to the recommended professions. Users can try activities specific to the suggested professions by accessing the WebVR scenarios at this address: https://careprof.github.io/, accessed on 3 May 2024. Since developing such scenarios is not easy, we have chosen to develop scenarios for only six professions within the project: computer network specialists, civil engineers, web and multimedia developers, chemical engineers, project managers, and university professors. The experiments’ results are very encouraging and motivate us to extend this VR component to additional scenarios and professions.
This paper contains several sections: Section 1 is introductory; Section 2 highlights related work concerning AR/VR systems used in career development; Section 3 presents in detail the VR module of our proposed career recommender system CareProfSys, from both functional and technical points of view; Section 4 outlines the materials and methods used in the two rounds of CareProfSys experiments; Section 5 describes the experimental results and their analysis; Section 6 discusses the obtained results and presents the limitations of this study; and Section 7 draws conclusions related to the use of VR in a recommender system for professions. This paper also has Appendix A, Appendix B, Appendix C, Appendix D and Supplementary Materials.

2. Related Work

VR technologies are increasingly used in career development to build the professional capabilities required by jobs in different industries. Diverse applications of VR systems for educating and training the workforce exist across many industries, as shown below.

2.1. AR/VR Systems for Career Development in Architecture, Engineering, and Construction Industry (AEC)

Tan et al. [16] presented the main findings from the literature research conducted on the adoption of AR/VR technologies in professional education and training in the AEC industry. Special focus was placed on quantifying the impact of AR/VR on the improvement of real-time interactions, based on the development of tracking and positioning capabilities. The following four traditional application groups were identified for AEC: immersive AR/VR learning, which provides a risk-free construction environment for simulations; AR/VR for structure analysis, which aims to help users visualize and understand complex spatial arrangements; visual-aided design tools, which enhance users’ abilities for interior and exterior design; and AR/VR-based teaching aids, for teaching users to make appropriate design decisions. Other AEC application areas that were identified and discussed are safety education and equipment operation (how to safely manipulate complex equipment) and hazard recognition (hazard identification and accident prevention).
A career in civil engineering requires high spatial abilities and a good understanding of construction sequence. In [19,20,21], there are several examples of systems for the visualization of construction sites and the sequence of construction phases, to better understand the dynamic and complex space constraints. Considering that construction sites are very complex environments with high risks of accidents, in [21,22], VR systems for the safety education of college students are presented.
Structural analysis involves concepts and algorithms for imagining and assessing complex spatial arrangements [23]. AR/VR systems may help users to better visualize implicit structural knowledge [24] and use their knowledge in model development. In [25,26], VR systems are presented that allow users to develop models and easily modify, update, and expand them. Dib et al. [27] proposed an interactive virtual steel structure model that allows trainees to analyze the structure from multiple angles. Ayer et al. [28] proposed an AR/VR system based on an educational game that allows students to design, visualize, and assess exterior curtain walls. Chang et al. [29] developed an AR mobile solution for interior design. By using this solution, the trainees can position several virtual objects on a design plane and interact with these objects. The spatial imagination skills of trainees can be improved by integrating AR with a global positioning system (GPS) [30]. Su et al. [31] presented a VR system that can be used to improve the control skills of the operators who are using construction excavators. Jeelani et al. [32] developed an immersive environment to conduct experiments for hazard identification. The experiments’ results proved that the use of VR improved the hazard identification and management skills of construction professionals.

2.2. AR/VR Systems for Career Development in Medicine

Medical skill development requires significant learning time to prepare professionals to complete complex medical procedures and interact with patients [17]. AR/VR solutions may provide low-cost interactive alternatives to traditional learning approaches. Oxford Medical Simulation is a VR system that allows trainees to simulate different patient care scenarios, from discovering and documenting the medical history to the treatment administration. After completing the simulation, the trainee receives feedback and can repeat the scenario to improve their performance.
There are several AR/VR systems for dentistry professionals’ education [33]. The DentSim solution incorporates VR for training ergonomic postures, providing instant feedback, and simulating exams [34,35]. The solution may improve hand–eye coordination, an important skill in dentistry. Other relevant VR systems in dentistry are the CDS-100 (a computerized dental simulator designed by EPED Inc.), the Objective Structured Clinical Examination (OSCE), and the Moog Simodont Dental Trainer.
AR/VR technologies may provide highly rated learning experiences in intensive care medicine (ICM) due to their ability to assure a high degree of realism and rich data by tracking every user input and interaction [36,37]. In ICM, VR has been used in bronchoscopy-guided intubation training to develop procedural skills, reducing the time and increasing the precision of medical intubations [38,39]. VR systems have also been used in cardiopulmonary resuscitation training, making it more effective and accurate, and in ICM doctor–patient communication.
VR systems are also used in radiography education [40]. The Shaderware system allows 3D interactive simulations for improving radiographic equipment handling skills, receptor placement, collimation, side marker placement, exposure factor selection, control of scatter, and image quality assessment [41]. Monash University developed a VR simulation system to train medical radiation science students. The usage of this system led to higher student perception scores for clinical and technical skills [42].

2.3. AR/VR Systems for Career Development in Science

A growing number of universities are developing AR/VR systems to teach complex and abstract concepts, such as the impact of climate change on polar environments [13]. To assess this impact, the AR/VR solutions allow students to perform interactive virtual field trips to the Arctic. Another example is related to astronomy students, who have the chance to explore interactive, 3D models of astronomical objects in a virtual, collaborative environment.
VR solutions are also applied in geosciences [43]. Geology students can experience virtual field trips by visiting complex virtual geological sites. This is the alternative to physical site trips that have many logistics constraints. VEs improve the understanding of geological concepts, allowing students to develop their skills, such as the identification of rock and surface features, as well as data recording and reporting. Data visualization and analysis may develop students’ spatial awareness, which is one of the core competencies in geosciences. Virtual field trips have been performed as experiments for visualizing the evolution of geologic structures and understanding three-dimensional geological structures. The use of AR maps improved understanding and the development of spatial orientation skills.

2.4. VR in Career Development in Arts, Humanities, and Other Domains

AR/VR systems are used in language training, allowing users to practice more realistic interactions and to contextualize grammar and vocabulary in real- and virtual-world settings [13].
Strong communication, teamwork, and other social skills are necessary for all categories of professionals, including functional managers, project managers, marketing and accounting specialists, and other business administration professionals. Dong et al. [44] proposed an AR solution (ARVita) that allows users wearing HMDs to sit around a table and interact with their peers during dynamic visual simulations.

3. CareProfSys Web Virtual Reality Scenarios for Career Development and Exploration

CareProfSys is a recommendation system developed by the authors which aims to provide career counseling using advanced user profile analysis, automatically extracted from various data sources (forms, CVs, social media profiles). Based on these data, CareProfSys users receive recommendations of professional occupations, using ontological inferences from the “Classification of Occupations in Romania” (COR) ontology [45], aligned with the European list of qualifications, together with machine learning classification algorithms [46]. A conversational agent provides personalized advice on recommended occupations and the necessary next steps [47], while web virtual reality (WebVR) scenes help users visualize activities connected to a recommended profession. Once users receive the recommended professions through the recommendation mechanism, they can try activities specific to the suggested professions by accessing the WebVR scenarios at this address: https://careprof.github.io/. Since developing such scenarios is not easy, we have chosen to develop scenarios for only six professions within the project: computer network specialists, civil engineers, web and multimedia developers, chemical engineers, project managers, and university professors. We believe this is sufficient to exemplify the concept of integrating VR into a job recommendation system which offers answers not only in textual form but also in VR form. We claim that this way of providing recommendations is more appealing to young people, who can explore activities specific to a certain profession and decide whether that profession is suitable for them or not.

3.1. Functionalities of Web VR Scenarios

Motivation and fun can be increased in VR by including gamification elements. Therefore, our scenarios include various difficulty levels, scores, and badges for achievements.

3.1.1. Computer Network Specialists

The networking scenario is simulated in an office scene where the user must reproduce and configure various network layouts using elements such as PCs, servers, switches, and routers. The virtual reality training process contains three levels of difficulty (easy, medium, and hard), each featuring a different network that needs to be reproduced (see Figure 1).
Each level begins with a whiteboard (Figure 2) from which the user can select the desired difficulty. Then, the necessary buttons are pressed to generate the correct number and type of elements in accordance with the network layout displayed on the board. The elements will automatically appear in the room, positioned as shown in the layout. The tasks for each level are described below.
The easy level should be executed in a maximum of 10 min and contains the following tasks:
  • Select easy difficulty from the board;
  • Generate a PC, a server, and a switch;
  • Connect the elements with cables according to the layout on the board;
  • Click the start button on a computer; the screen shows a display, and the IP must be configured (from settings, with the IP address already displayed on the board); the IPv4 field is highlighted as a clue, so the user does not fill in other fields (MAC address, network mask, local address); the program also allows configuration of the equivalent IPv6 address;
  • The same IP configuration process is repeated for the server;
  • Both connections must be checked on the corresponding device’s console, using the “ping” command followed by the previously configured correct IP address (see Figure 3).
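The two checks that close the easy level, validating the typed IPv4 address and confirming connectivity with "ping", can be sketched compactly. The scenario itself runs as WebVR, so the snippet below is only an illustrative Python sketch; the function names (`check_ip_config`, `ping`) and the device/cable data structures are hypothetical, not the system's actual code.

```python
import ipaddress

def check_ip_config(entered: str, expected: str) -> bool:
    """Validate the IPv4 address typed into the device settings panel."""
    try:
        return ipaddress.IPv4Address(entered) == ipaddress.IPv4Address(expected)
    except ValueError:
        return False  # not a well-formed IPv4 address

def ping(devices: dict, cables: set, src: str, dst_ip: str) -> bool:
    """A 'ping' succeeds when the target IP exists and a cable path connects
    the two devices (a direct link or one hop via a shared switch)."""
    targets = [name for name, ip in devices.items() if ip == dst_ip]
    if not targets:
        return False
    dst = targets[0]
    if (src, dst) in cables or (dst, src) in cables:
        return True
    neighbours = ({b for a, b in cables if a == src}
                  | {a for a, b in cables if b == src})
    return any((mid, dst) in cables or (dst, mid) in cables
               for mid in neighbours)
```

For instance, with `devices = {"pc1": "192.168.0.2", "server1": "192.168.0.3"}` and `cables = {("pc1", "switch1"), ("switch1", "server1")}`, a ping from `pc1` to `192.168.0.3` succeeds through the switch.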
The medium level should be executed in a maximum of 15 min and the below tasks are performed:
  • Select medium difficulty from the board;
  • Generate four PCs, two switches, and one router;
  • Connect the elements with cables according to the given layout;
  • Click the start button on all computers, the screen lights up, and IPs need to be configured; the IPv4 is no longer highlighted as a clue; the program also allows configuration of the equivalent IPv6 address;
  • All connections must be checked in the device’s corresponding console, using the “ping” command followed by the previously configured correct IP address;
  • Configure the router connecting two distinct local networks; the IP addresses of its two interfaces (eth0 and eth1) must be correctly configured according to the layout and then activated.
The hard level should be executed in no more than 20 min:
  • Select hard difficulty from the board;
  • Generate six PCs, three switches, and three routers;
  • Connect the elements with cables according to the given layout;
  • Click the start button on all computers, the screen lights up, and IPs need to be configured; there are no hints at all, and IPs are no longer present on the board; the user must configure valid IP addresses based on the network addresses mentioned;
  • All connections must be checked in the device’s corresponding console, using the “ping” command followed by the previously configured correct IP address;
  • Configure the three routers connecting distinct local networks; this includes a more complicated configuration of the middle router, which is connected to the other two routers. The routers must have their Routing Information Protocol (RIP) IP lists configured; this involves inserting all known IPs of the other two routers into the RIP settings tab and must be repeated for all routers. RIP allows routers to choose the best network route when redirecting packets. Another task users must complete before having a functional network is configuring the gateway address of each computer: the gateway address is the IP of the router interface (gigabit Ethernet) on the same side of the LAN where the computer is located.
The increase in difficulty is reflected in several parameters: a larger number of elements in the network, more complex connections (e.g., router-to-router links), more tasks to perform, fewer clues included on the initial layout that needs to be reproduced, and fewer restrictions and less help in the IP configuration process.
Each level has a different maximum time to complete all tasks, and the score is calculated based on the time of completion. Additionally, completing the larger number of tasks within the allotted time requires greater familiarity with the VR application and stronger networking skills as one reaches a higher level of difficulty.
If the user performs exceptionally well and achieves certain special actions, they will unlock a series of achievements: picture perfect—connecting the cables of all elements of a network without any mistakes; beginner’s luck—successfully completing the first level; speedy IP—configuring the IP of a PC in the hard level in under 30 s; network master—completing all three levels of difficulty.
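The time-based scoring and the badge logic described above can be expressed as a small rule set. The sketch below is a hedged Python illustration: the linear time-to-score mapping, the `stats` record, and its keys are our assumptions, only the four badge names and their conditions come from the scenario description.

```python
def level_score(time_limit_s: int, elapsed_s: float, max_score: int = 100) -> int:
    """Score scales linearly with the time remaining; zero once the limit is hit."""
    remaining = max(0.0, time_limit_s - elapsed_s)
    return round(max_score * remaining / time_limit_s)

def unlocked_achievements(stats: dict) -> set:
    """stats is a hypothetical per-session record kept by the scenario."""
    badges = set()
    if stats.get("cable_mistakes") == 0:
        badges.add("picture perfect")        # flawless cabling
    if "easy" in stats.get("levels_completed", ()):
        badges.add("beginner's luck")        # first level done
    if stats.get("hard_ip_config_s", float("inf")) < 30:
        badges.add("speedy IP")              # hard-level IP set in under 30 s
    if {"easy", "medium", "hard"} <= set(stats.get("levels_completed", ())):
        badges.add("network master")         # all three levels done
    return badges
```

With a 10 min (600 s) easy-level limit, finishing in 5 min would yield half the maximum score.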
The necessary prior knowledge is described in Table 1.

3.1.2. Civil Engineers

The following scenario replicated in VR aims to provide an idea about some of the duties of a civil engineer. The scenario focuses on workplace safety within an active construction site (see Figure 4).
The player’s goal is to interact with as many workers as possible, triggering interaction options that change the appearance of characters who are not properly equipped. These actions include properly equipping workers with safety helmets/goggles/hearing protection; sending them back to work; enforcing the mandatory wearing of identification badges; extinguishing cigarettes; and stopping other unsafe actions. This scenario is open-ended, allowing the player to interact with the workers in any order. The difference from one level to another consists of an increased number of actions that need to be performed in the same time interval. The scenario includes nine types of characters (see Figure 4), allocated to different positions to provide diversity. Some workers patrol a predefined area, while others stay at their workstations.
Each correct action increases the player’s score by one point. If the player reaches 33 points (at the easiest difficulty) within the allotted time, a level completion message is displayed; otherwise, they can restart the current level. For the higher difficulties, the required score increases (50 for medium and 65 for hard), but the time remains the same. Player performance can be calculated from the time remaining after the objective has been achieved. The activity diagram in Figure 5 represents the workflow within this VR scenario. If the user performs exceptionally well and achieves certain special actions, they will unlock a series of achievements: civil expert—successfully completing the highest difficulty level; beginner’s luck—successfully completing the first level; thunder speed—completing the easy level in under 1 min; penthouse constructed—completing all three levels of difficulty.
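The completion rule, one point per corrected worker, a fixed time window, and a per-difficulty score threshold, can be sketched as below. The thresholds (33/50/65) come from the text; the concrete time limit of 600 s is a placeholder assumption, as is the performance metric's exact form.

```python
# Point thresholds per difficulty, as described in the scenario.
REQUIRED = {"easy": 33, "medium": 50, "hard": 65}

def level_completed(difficulty: str, correct_actions: int, elapsed_s: float,
                    time_limit_s: float = 600) -> bool:
    """One point per corrected worker; the time limit is the same on every
    difficulty, only the required score grows."""
    return elapsed_s <= time_limit_s and correct_actions >= REQUIRED[difficulty]

def performance(elapsed_s: float, time_limit_s: float = 600) -> float:
    """Performance measured as the fraction of time left when the goal is met."""
    return max(0.0, (time_limit_s - elapsed_s) / time_limit_s)
```

So a player who corrects 49 workers on medium fails (50 are required), while one finishing the easy objective with a quarter of the time left scores a performance of 0.25.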
The necessary prior knowledge is described in Table 2.

3.1.3. Web and Multimedia Developers

For the scenario involving work as web and multimedia developers, we have chosen to simulate working on the front end, one of the possible duties of a web design specialist. The goal is to replicate the template provided in Figure 6a on the work canvas in Figure 6b.
The template is generated based on a classic arrangement of web pages, consisting of distributing space into multiple elements like header, content, footer, aside, main, etc.
The header is the top part of a web page that contains the navigation menu. The next part of the web page is the “content,” which is divided into “aside” and “center”; the “aside” part is distributed on the right or left side of the content, and contains either a navigation menu, ads, banners, or important information briefly described. The “main” part contains a more exact description of the elements, videos, text blocks, and other details; this part defines the website’s purpose. The last part is the “footer” where links to social networks and contact information like phone number, email, or company address are placed. At the beginning of the session, the necessary information for achieving the objective is presented (see Figure 7). In this menu, the desired difficulty is chosen (upon which depends the complexity of the model that needs to be made). Pressing the “Instructions” button will display information about each element of a web page discussed earlier.
Replicating web elements from a model like the one in Figure 6a is carried out on the work canvas (see Figure 6b). The canvas is divided into two main parts: the left part contains the generated web elements, and the right part displays the menu for generating elements. The menu generates elements based on which section is selected. For example, in Figure 6b, the “content” option was selected; in this case, the menu contains a button that generates the “aside” part, another button that generates the “main” element, an “undo” button, and an “exit” button, which closes the menu. The “undo” button deletes the last generated element in that section. Each section (“content,” “footer,” or “header”) keeps a separate list of the objects generated in it, so that elements can be deleted in an intuitive way. If the “footer” section is selected, the menu changes and an option appears to distribute the social media icons on the right or left of the section.
The complexity of the templates that need to be replicated depends on the difficulty chosen at the start of the game.
For the easiest difficulty, the user must create a simple template consisting of a header (the top part of a website containing the navigation menu), two buttons, and a center (where the web page content is found), which is divided into content (usually the information provided by the website) and an aside (a secondary navigation bar or submenu with information). The last section is composed of a footer-type bar. The score is calculated based on how many buttons were pressed to create the example and the time used (the maximum for this difficulty is 10 min). The steps at this difficulty are represented in the activity diagram below.
At medium difficulty, the user must create a template composed of a header, three buttons positioned on the left, and the center, which is divided into content and contains four boxes for an aside. The last section is composed of a footer bar that has social media icons. The score is calculated in the same way (the maximum time for this difficulty is 7.5 min).
At the highest difficulty, the user must create a template composed of a header, three buttons (one positioned on the left and two centered), and the center, which is divided into content and contains six boxes and two asides, one on the right and one on the left. The last section is composed of a footer bar that has two social media icons on the left and two on the right. The score is again based on button presses and time used (the maximum for this difficulty is 5 min).
When the player selects the difficulty level, a timer and a score bar are generated. The score records the number of pressed buttons on the workboard; a low score means that the player made few mistakes in replicating the model.
Performance is assessed based on the final score and the time taken to complete the model. The highest difficulty level also imposes a score limit that must not be exceeded to complete the level successfully, in addition to its reduced working time.
In cases where the user performs exceptionally well and achieves certain special actions, they will unlock a series of achievements: web god—creating the template correctly on the first attempt; beginner’s luck—successfully completing the first level; Zuckerberg’s child—completing the easy level with the minimum number of presses; Bill Gates would be jealous—completing all three levels of difficulty.
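The time limits and achievement rules above can be sketched as follows. This is an illustrative Python model of the described logic, not the actual Unity (C#) implementation; names such as `check_achievements` and `within_time_limit` are our own.

```python
# Illustrative model of the web-developer scenario rules: each difficulty
# has its own time limit, and special actions unlock achievements.
# Assumption: a lower button-press count means fewer mistakes.

TIME_LIMITS_MIN = {"easy": 10.0, "medium": 7.5, "hard": 5.0}

def within_time_limit(difficulty: str, elapsed_min: float) -> bool:
    """A level only counts as completed if finished within its limit."""
    return elapsed_min <= TIME_LIMITS_MIN[difficulty]

def check_achievements(first_attempt_ok: bool, levels_done: set,
                       easy_presses: int, min_presses: int) -> list:
    """Return the unlocked achievements (names as listed in the scenario)."""
    unlocked = []
    if first_attempt_ok:                      # correct on the first attempt
        unlocked.append("web god")
    if "easy" in levels_done:                 # first level completed
        unlocked.append("beginner's luck")
    if easy_presses == min_presses:           # minimum number of presses
        unlocked.append("Zuckerberg's child")
    if {"easy", "medium", "hard"} <= levels_done:
        unlocked.append("Bill Gates would be jealous")
    return unlocked
```

For example, a player who replicates the template correctly on the first try after finishing all three levels with the minimal press count would unlock all four achievements.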
The necessary prior knowledge is described in Table 3.

3.1.4. Chemical Engineers

The chemical engineering scenario takes place in a hospital laboratory, where the user needs to perform a series of chemical analyses of varying complexities. Specific elements such as test tubes, pipettes, reagents, and analyzers are used. The virtual reality training process includes three levels of difficulty (easy, medium, hard), each with different tasks to be performed. On the laboratory wall is a panel displaying buttons to choose the difficulty level, as well as instructions related to the tasks that need to be completed (see Figure 8a).
The easiest level should take a maximum of 5 min; on the table, there is a single blood test tube and one opaque reagent tube (see Figure 8b), and the possible tasks are as follows:
  • Select the easy level difficulty on the panel;
  • The user must wash their hands for 5 s;
  • The user must put on gloves;
  • The user takes the blood test tube from the rack;
  • The user uses a pipette to take a drop of blood and places it into the second test tube with a reagent (opaque, without seeing the color of the reagent);
  • The user inserts the resulting test tube into the analyzer;
  • The analyzer screen displays the result “Analysis successfully completed”.
The medium level lasts a maximum of 8 min and the user’s goal is to identify the correct reagent for performing a glucose analysis; on the table, there is a single blood test tube and six reagent test tubes of different colors:
  • Select the medium difficulty level on the panel;
  • The user must wash their hands for 5 s;
  • The user must put on gloves;
  • The user takes the blood test tube from the rack;
  • The user uses a pipette to place a drop of blood in each reagent test tube;
  • The user sequentially inserts each resulting test tube into the analyzer and reads on the laptop screen whether the analysis was successfully completed, or an error occurred (there is only one test tube with the correct reagent);
  • The user selects the correct reagent color by pressing the corresponding button on the screen.
The hard level lasts 10 min maximum. Its goal is the correct classification of glucose values identified in blood samples; on the table, there are 5 blood test tubes, numbered, and 10 reagent test tubes of different colors—only 5 of these contain the correct reagent:
  • Select the hard difficulty level on the panel;
  • The user must wash their hands for 5 s;
  • The user must put on gloves;
  • The user uses a pipette to place a drop of blood from each blood test tube to be analyzed into each correct reagent test tube;
  • The five resulting test tubes are inserted into the analyzer and the result is read on the laptop screen; if there is at least one incorrect pairing, the process must be repeated;
  • If the analysis is successfully completed, the glucose result for each test tube is displayed: low—less than 65 mg/dL; normal—65–110 mg/dL; high—more than 110 mg/dL;
  • The user must label the test tubes based on values: (blue—low; green—normal; red—high).
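The glucose classification used at the hard level can be sketched directly from the stated thresholds. This is an illustrative Python model of that rule; the function name is our own.

```python
# Illustrative model of the hard-level glucose classification:
# low < 65 mg/dL, normal 65-110 mg/dL, high > 110 mg/dL,
# with the label colors from the scenario (blue/green/red).

def classify_glucose(value_mg_dl: float) -> tuple:
    """Return (category, label_color) for a glucose reading."""
    if value_mg_dl < 65:
        return ("low", "blue")
    if value_mg_dl <= 110:
        return ("normal", "green")
    return ("high", "red")
```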
Increasing difficulty is achieved through a series of elements: a higher number of tasks to complete, more complex tasks, less linear scenarios, and fewer indications on how tasks should be performed. Each level has a maximum duration to complete all tasks, and the score is calculated based on the time of completion. Additionally, tasks at higher difficulty levels require a good understanding of the virtual reality system, controls, and chemical engineering principles; otherwise, there is not enough time to complete all tasks. If the user performs exceptionally well and achieves certain special actions, they will unlock a series of achievements: geek god—choosing the correct reagent on the first try at medium or hard level; beginner’s luck—successfully completing the first level; speedy chemist—completing the easy level in under 1 min; mad scientist—completing all three levels of difficulty.
The necessary prior knowledge is described in Table 4.

3.1.5. Project Managers

The project manager scenario is set in a conference room, where the user needs to create either a Gantt chart, a work breakdown structure (WBS) diagram, or both, depending on the chosen difficulty; see Figure 9. Specific elements include work breakdown structure, Gantt chart, work package, and activity. The training process includes various tasks that can be divided into levels of difficulty: easy, medium, and hard. On one of the room’s walls, there is a panel with buttons from which the type of task to be performed can be selected.
The easiest level lasts a maximum of 10 min. At this difficulty, the user will only create the project’s WBS diagram. Possible tasks include the following:
  • Select the easy level difficulty from the panel;
  • The user must read the instructions;
  • Select the WBS diagram;
  • The user creates multiple columns to allocate tasks;
  • The user then creates tasks for each column based on requirements;
  • The user must name each task;
  • The user confirms the end of the activity.
The medium level also lasts a maximum of 10 min. At the medium difficulty, the user must create the Gantt chart of a project based on the requirements presented at the beginning of the scenario. Possible tasks are:
  • Select the medium level of difficulty;
  • The user must select the Gantt chart;
  • The user must create the necessary number of WPs (work packages);
  • The user must create the necessary number of activities for each WP;
  • The user must add the number of months needed to complete the project;
  • The user must select the boxes corresponding to the months in which the respective activity must be carried out.
The hard level lasts a maximum of 12 min. The goal is to create both the WBS and Gantt charts for the project, in approximately 6 min each at most. Possible tasks include the following:
  • Select the hard level of difficulty;
  • The user must read the instructions;
  • Select the WBS diagram;
  • The user creates multiple columns where tasks are allocated;
  • The user then creates tasks for each column based on requirements;
  • The user must name each task;
  • The user confirms the end of the activity for this diagram;
  • The user returns to the selection menu;
  • The user must select the Gantt chart;
  • The user must create the necessary number of WPs;
  • The user must create the necessary number of activities for each WP;
  • The user must add the number of months needed to complete the project;
  • The user must select the boxes corresponding to the months in which the respective activity must be carried out.
The increase in difficulty is achieved by adding additional tasks that need to be completed in designing the WBS or Gantt charts. Each level has a maximum duration to complete all tasks, and the score is calculated based on the time of completion and the number of deletions made. In addition, the tasks at higher difficulty levels require a good understanding of the virtual reality system, controls, and project management principles; otherwise, there is not enough time to complete all tasks.
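The hierarchy built in the Gantt chart task (a project made of work packages, which hold activities, which mark the months in which they run) can be sketched as a small data model. This is an illustrative Python sketch with hypothetical names; in the actual Unity scene, these elements are prefab UI objects.

```python
# Illustrative data model for the Gantt chart built in the scenario:
# WPs (work packages) contain activities, and each activity selects the
# month boxes in which it is carried out.

class Activity:
    def __init__(self, name, months):
        self.name = name
        self.months = set(months)   # month indices ticked on the chart

class WorkPackage:
    def __init__(self, name):
        self.name = name
        self.activities = []

def project_duration(work_packages):
    """Months needed to complete the project: the latest month used."""
    months = [m for wp in work_packages
                for a in wp.activities
                for m in a.months]
    return max(months) if months else 0
```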
The necessary prior knowledge is described in Table 5.

3.1.6. University Professors

This scenario takes place in a classroom equipped with desks, computers, a blackboard, and a projector: see Figure 10. Students are seated at desks engaging in various activities (programming, talking, raising hands to ask questions). At the easy difficulty level, a programming course—Data structures and algorithms (DSA)—is simulated, where the teacher needs to ensure that students are attentive; at the medium level, a laboratory scenario (same subject, DSA) is simulated, where practical exercises are performed, and the teacher checks the correctness of the students’ exercises. At the hard level, there is an emergency, and the teacher should react accordingly. On the blackboard (whiteboard) is a panel displaying buttons for choosing the difficulty level, and instructions related to the tasks to be performed are projected on the side wall of the classroom.
In the easy level (execution time 5 min), the user has a goal to teach a DSA course and alert inattentive students; there are 16 students (four rows of 4 students) seated at desks, without computers in front of them; students are in various states—either sitting quietly at desks, talking among themselves, gesturing, raising their hand to ask a question, or standing and talking on the phone. Possible tasks include the following:
  • Select the easy level on the panel;
  • The teacher goes to the lectern and turns on the computer;
  • A PowerPoint presentation about pointers appears on the projector and computer screen, which the teacher must navigate using the mouse;
  • When reaching the last slide, the teacher must draw a diagram/explanation on the board about double pointers;
  • There are inattentive students in the room; the teacher must identify them and hold their attention (those who are talking or on the phone); when five students (or all the inattentive ones, if fewer than five) are identified, the level is completed.
In the medium level (execution time 10 min), the user’s goal is to teach a DSA lab (introduction to programming in C) and verify the correctness of a programming exercise; there are 12 students (four rows of 3 students) seated at desks, with computers in front, programming. Possible tasks include the following:
  • Select the medium level on the panel;
  • The teacher goes to the lectern and opens the computer;
  • A PowerPoint presentation with the theoretical elements of the C language appears on the projector and computer screen, which the teacher must navigate using the mouse;
  • The last slide includes a coding exercise that students must solve; the teacher must walk between desks and classify the responses as correct/incorrect for five students;
  • The teacher returns to the lectern and shows a correct solution to the exercise.
In the hard level (execution time 15 min), the user’s goal is to evacuate a laboratory in an emergency (outbreak of a fire); there are 12 students (four rows of 3 students) seated at desks, with computers in front, programming. Possible tasks include the following:
  • Select the hard level on the panel;
  • The teacher goes to the lectern and opens the computer;
  • A PowerPoint presentation with the theoretical elements of the C language appears on the projector and computer screen, which the teacher must navigate using the mouse;
  • The last slide includes a coding exercise that students must solve; the teacher must walk between desks and classify the responses as correct/incorrect for five students;
  • Suddenly, a fire breaks out due to a short circuit; the teacher must go to each student’s desk and ensure they head towards the exit;
  • After the students are successfully evacuated, the teacher takes the fire extinguisher and extinguishes the fire.
The increase in difficulty is achieved through a series of elements: more complex tasks, the need for more advanced programming knowledge, and unforeseen situations. In addition, higher difficulty levels count the wrong decisions made by the user as errors (e.g., classifying a student’s solution as correct when it was incorrect, or vice versa). In the medium level, if more than seven wrong choices are made (making it impossible for the user to correctly classify the solutions of five students), the level is stopped and considered failed, requiring a restart from the beginning. Each level has a maximum duration to complete all tasks, and the score is calculated based on the time of completion. Tasks at higher difficulty levels require a good understanding of the virtual reality system, the controls, and the principles of programming in C; except for the easy level (designed for familiarization with the system), there is not enough time to complete all tasks without this knowledge. At the hard level, tasks are combined, including teaching, checking the correctness of student solutions, and following the evacuation protocol in case of a fire. If the user performs exceptionally well and accomplishes certain special actions, they will unlock a series of achievements: Snape’s disciple—attracting the attention of five students in less than 30 s (easy level); first-year teacher success—successfully completing the first level; overqualified teacher—completing the medium/hard level without any errors; honorary professor—completing all levels of difficulty.
The prior knowledge necessary for successfully performing the tasks in the scenario includes basic knowledge of various established terms in programming, particularly in the C language (variables—declaration and initialization, reading, writing with scanf/printf, format specifiers, address operator, pointers, arithmetic operators), and knowledge of mathematical operations (e.g., calculating the arithmetic mean).

3.2. Technical Details of Web VR Scenarios

3.2.1. Architecture of the CareProfSys WebVR Module

The architecture of the CareProfSys VR system is specific to all Unity applications (see Figure 11), based on the Entity–Component–System architectural model common in video game development. Each object in a scene (3D models, cameras, light sources, VR equipment) represents entities, the associated data (geometric transformations, materials, physical principles, etc.) are represented through components, and transformations from one state to another (the actual game logic) are executed through systems [48].

3.2.2. Technologies

For the development of a VR application that can run directly in a web browser, we used the Unity Engine along with specific packages such as WebXR and VRTK Tilia. An application can run in a web browser (see Figure 11) if its build type is WebGL, a JavaScript API for rendering 3D graphics without the need for additional plugins. WebXR Exporter is a Unity package that allows the development of VR applications in WebGL format, compatible with browsers such as Mozilla Firefox, Google Chrome, and Microsoft Edge on Windows, and Oculus Browser and Firefox Reality on Oculus Quest. The WebGL build contains two folders and an index page (HTML format): one folder contains the images displayed on the web page, and the other contains the game data, the JavaScript framework, the application loader, and the WebAssembly file. These resources are loaded by a script included in the index page.

3.2.3. VR Equipment

Because it is built for WebGL, the application is compatible with multiple VR equipment models; successful tests have been conducted with the HTC Vive Cosmos Elite, Oculus Rift, and Meta Quest (1 and 2). The technical specifications of each system are summarized in Table 6.

3.2.4. Teleportation and Movement in VR

Basic functionalities such as movement, rotation, and interaction were managed using the VRTK Tilia packages, which contain a collection of common features for VR environments. Two types of movement were configured with VRTK Tilia: teleportation, which uses a curved beam indicator for precise positioning (see Figure 12), and smooth axis-based movement that simulates walking. Both were integrated so that players who suffer from motion sickness can choose the teleportation method.
User interface (UI) interactions are managed by an event system that is mapped to the corresponding VR controller for the left hand and generates a spherical object pointer when that controller interacts with any type of user interface (buttons, menus, etc.). Button mapping was conducted with custom scripts, valid for multiple types of controllers, corresponding to various models of virtual reality headsets available on the market. UI elements are detailed in the dedicated subsection.

3.2.5. User Interface (UI)

The UI specific to virtual reality contains graphical elements in “world space,” meaning they have a fixed position and rotation in space and are not constantly displayed as an overlay on the screen. This behavior is typical in virtual reality because overlaying the interface on the screen of a VR headset can cause discomfort or distract the user if it is placed at the edge of the person’s field of view. The interface in world space can be comfortably viewed by the user whenever they direct their gaze to the graphic element’s position in the scene. Interaction in VR with UI elements is achieved through ray casting—a ray is sent from the top of the left-hand controller until it intersects with an interactable graphic element (such as buttons, sliders, or text input fields). Unity allows for custom configuration of event triggers so that user interaction with a graphic element is triggered when the sphere pointer enters the UI element’s region.

3.2.6. Physics

The logic of all scenes is based to some extent on physics. Objects that the user can interact with have Collider components attached, so that direct collisions between them and the player’s hands can be detected. This enables actions such as connecting cables between the correct components (in the networking scene) and pressing buttons to start computers (networking, teacher, or chemistry scenes). Most such interactions occur in the chemistry scene, where the player can put on gloves; contact between the pipette and a blood test tube picks up a blood drop; contact between the pipette and a reagent tube deposits the blood drop; contact between a test tube and the analyzer triggers the analysis; and dropping an object onto the floor triggers the automatic regeneration of an identical object in its original position (e.g., a test tube in its stand): see Figure 13. Additionally, most objects have a RigidBody component attached, which ensures that they obey the laws of physics (decor or task-related items can be picked up, items on tables fall if pushed, etc.), thus increasing realism.
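The drop-and-regenerate behavior can be sketched as a small state model. This is an illustrative Python sketch of the described logic, with our own names; in Unity, this would be a collision handler on the object.

```python
# Illustrative model of the drop-respawn behavior in the chemistry scene:
# when an interactable object touches the floor, an identical object is
# regenerated at its original position (e.g., a test tube in its stand).

class RespawnableObject:
    def __init__(self, name, home_position):
        self.name = name
        self.home_position = home_position  # original position in the rack
        self.position = home_position

    def on_collision(self, other: str):
        """Mimics a Unity collision handler: contact with the floor
        restores the object to its original position."""
        if other == "floor":
            self.position = self.home_position

tube = RespawnableObject("test tube", (0.0, 1.0, 0.0))
tube.position = (2.0, 0.0, 1.5)   # the player dropped it
tube.on_collision("floor")        # it reappears in the stand
```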

3.2.7. Character Animations

In various scenes, there are animated characters (e.g., for the civil engineering, university professor, and project manager scenarios). The animations are freely sourced from an official Adobe site under a CC0 license [49] and are made for humanoid-type characters. This allows the use of Unity’s Avatar System, through which Unity interprets an object as humanoid. Due to the similarity between the bone structures of different humanoid models (see Figure 14), animations can be reused for other characters through the process of Animation Retargeting.

3.2.8. Specific Technical Details for the Networking Scenario

To accelerate the implementation process, various existing Unity packages were customized for our needs. One example is a physical keyboard for entering the IP address and console commands (ping, ipconfig)—Figure 15. The user presses the corresponding keys using two drumstick-like pointers; each character pressed appears on the screen in real time.
Another predefined package is used for a console. The console allows customization of one’s commands, so we have defined the two commands, ipconfig and ping, with behavior like that offered by an operating system’s command line.
Also, the real-time procedural cable generator allows the user to create custom cables based on the elements that need to be connected in the network. Cables are generated in real time, with a certain number of internal points that are linearly interpolated based on parameters of density, oscillation, and tilt, as well as the start and end points of the cable (the points of contact of the user’s hand with the two network elements that need to be connected according to the displayed scheme).
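The idea of generating the cable’s internal points can be sketched as linear interpolation between the contact points, with an offset standing in for the oscillation parameter. This is an illustrative Python sketch under our own assumptions (a sinusoidal sag and 3D points as tuples); the actual Unity generator also uses density and tilt parameters as described above.

```python
# Illustrative sketch of procedural cable generation: `density` points
# are linearly interpolated between the start and end of the cable, with
# a sinusoidal vertical offset approximating the oscillation parameter.
import math

def cable_points(start, end, density=8, oscillation=0.0):
    """Return `density` points (x, y, z) from start to end."""
    points = []
    for i in range(density):
        t = i / (density - 1)                 # interpolation parameter
        x = start[0] + t * (end[0] - start[0])
        y = (start[1] + t * (end[1] - start[1])
             + oscillation * math.sin(t * math.pi))  # zero at both ends
        z = start[2] + t * (end[2] - start[2])
        points.append((x, y, z))
    return points
```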

3.2.9. Specific Technical Details for the Civil Engineering Scenario

All workers are animated with at least three types of active animations. As mentioned above, when the player approaches a worker, a menu with multiple options activates, including equipping a safety helmet, equipping an ID badge, and returning to work. Each of these interactions, if applicable to the worker, increases the player’s score. Figure 16 shows all animations that can be used by each worker: standing, walking, sitting, talking, multiple variations of falling, getting up, and a boxing animation. Their logic is defined in the WorkerBehaviour script and depends on the state of the parameters shown in Figure 16; an animation is selected not based on a single parameter that corresponds to it by name, but on the overall state of all parameters.

3.2.10. Specific Technical Details for the Web and Multimedia Developer Scenario

Each element generated on the specific front-end board has a script attached that handles the generation of sub-elements and the display of respective menus, with the generated elements being of prefab type and stored in separate lists.
The “Validate Template” button is used when the user believes they have accurately replicated the front end. Verification is performed by comparing the hierarchies of objects containing the page elements: if the order of the elements and their positions on the work canvas match those in the model, the hierarchies coincide and the session closes. This functionality is implemented in the WebDevLogic script, which contains functions that check the selected difficulty by counting the children of the active objects referencing that difficulty, functions for generating the first objects of the content, footer, and header types, and functions for displaying the menu.
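The hierarchy comparison behind template validation can be sketched with nested element trees. This is an illustrative Python sketch of the idea with a hypothetical `(tag, children)` representation; the Unity script compares GameObject hierarchies instead.

```python
# Illustrative sketch of template validation: the user's page and the
# model are represented as nested (tag, children) trees, and validation
# succeeds only when the hierarchies coincide element by element.

def hierarchies_match(built, model):
    """Recursively compare two (tag, children) trees."""
    if built[0] != model[0]:            # element type must match
        return False
    if len(built[1]) != len(model[1]):  # same number of sub-elements
        return False
    return all(hierarchies_match(b, m)
               for b, m in zip(built[1], model[1]))

# A hypothetical easy-difficulty model: header with two buttons,
# center with content and aside, and a footer.
model = ("page", [("header", [("button", []), ("button", [])]),
                  ("center", [("content", []), ("aside", [])]),
                  ("footer", [])])
```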

3.2.11. Specific Technical Details for the Chemical Engineer Scenario

A particular element of this scenario is the use of a dedicated particle system for simulating water flow. The system is made up of spheres (the droplets) and the trails they leave behind, creating the sensation of flow; see Figure 17. The time during which the hands remain in continuous collision with the water is measured by a function in the collision script attached to the water; when this time exceeds 5 s, the task is accomplished.
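The hand-washing check can be sketched as a per-frame timer that accumulates only while the hands collide with the water. This is an illustrative Python sketch with our own names; the assumption that leaving the water restarts the count is ours, based on the continuous-collision requirement.

```python
# Illustrative sketch of the hand-washing task: the time the hands spend
# in continuous collision with the water is accumulated frame by frame,
# and the task completes once the 5 s target is reached.

WASH_TARGET_S = 5.0

class HandWashTimer:
    def __init__(self):
        self.elapsed = 0.0
        self.done = False

    def update(self, dt, hands_in_water):
        """Called once per frame with the frame time dt (seconds)."""
        if self.done:
            return
        if hands_in_water:
            self.elapsed += dt
            if self.elapsed >= WASH_TARGET_S:
                self.done = True
        else:
            self.elapsed = 0.0  # assumption: leaving the water restarts
```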

3.2.12. Specific Technical Details for the Project Manager Scenario

One of the most important aspects of the project management scenario is the dynamic menu created using Unity’s automatic layout options (specifically Vertical Layout, Horizontal Layout, and Ignore Layout) to construct a Gantt chart; see Figure 18. Each element on the canvas is a prefab with a parent object that has a defined layout for WPs and activities; the parent has a Vertical Layout Group that automatically resizes its children to fit the available space. For the first text bar and the months of each WP or activity, a Horizontal Layout Group performs the same function horizontally. Using these layout functions, we were able to create the dynamic menu by instantiation and give it an appealing look.

3.2.13. Specific Technical Details for the University Professor Scenario

A significant technical element in this scene is the SpriteRenderer, responsible for rendering 2D sprite images. These are used for projecting the course/lab support in front of the class, with a script that handles the transition from one slide to another when the mouse buttons are pressed and mirrors the content from the computer screen onto the projection screen, as well as for displaying a student’s solution on their personal computer. The solutions are randomly drawn from a collection of provided images, ensuring that repeating a level never generates the same arrangement of solutions for the students in the class.

Another essential UI element in the scene is the Canvas, used not only for displaying the buttons corresponding to the difficulty levels but also for displaying the possible actions for the teacher (e.g., “Pay attention”, or classifying students’ solutions as “Correct” or “Wrong”). These canvases do not have a global position in the scene; they are generated in the vicinity of the directly targeted student and are visible only when the teacher is near that desk.

The states of the students are controlled using AnimationController elements (a finite automaton in which animations represent the states and transitions between them are made when certain conditions are fulfilled through scripts or user actions). At the easy level, there are five available animations: Sitting, Sitting and programming, Sitting and talking, Sitting and pointing, and Talking on the phone; the professor can only classify students as “inattentive” when they are in one of the states “Sitting and talking” or “Talking on the phone”. After this action, these students transition to the “Sitting” state. Each animation is initially assigned randomly, varying with each regeneration of the level and thereby avoiding monotony. At the medium level, all students are in the state Sitting and programming, working on the given exercise.
At the hard level, students initially play animations specific to programming activities; after the outbreak of a fire, they switch to one of the animations “scared” or “terrified”, and after the teacher’s intervention, they sequentially enter the animation of moving towards the exit. Additionally, a series of particle systems for flames and explosions is used, accompanied by corresponding sound effects for added realism. An example is shown in Figure 19, featuring a small fire (containing sparks, smoke, light, and sound) and the corresponding particle systems. When the teacher uses the extinguisher, the jet collides with the flames and extinguishes them.
For the hard level, an artificial intelligence script governs the movement of students during evacuation. Students have a NavMeshAgent component attached, which determines their automatic movement towards the door once the teacher is in their vicinity. A series of intermediate points/positions in space is defined, through which the agent must pass on the way to the door. When a student reaches the door, the corresponding object is destroyed, marking the successful evacuation of that person.
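The student finite automaton can be sketched as a transition function from (current animation state, triggering action) to the next state. This is an illustrative Python sketch covering only the transitions described above, with our own function and action names; the real implementation is a Unity AnimationController.

```python
# Illustrative sketch of the student AnimationController: animations are
# states, and teacher/scenario actions trigger transitions between them.

INATTENTIVE = {"Sitting and talking", "Talking on the phone"}

def next_state(state, action):
    """Return the student's next animation state for a given action."""
    if action == "pay_attention" and state in INATTENTIVE:
        return "Sitting"                   # inattentive student corrected
    if action == "fire_breaks_out":
        return "Scared"                    # or "Terrified" in the scene
    if action == "evacuate" and state in ("Scared", "Terrified"):
        return "Moving to exit"            # NavMeshAgent takes over
    return state                           # no transition applies
```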

4. Materials and Methods Used in CareProfSys Experiments

All experiments related to the CareProfSys job profiler system took place according to an internal protocol described below. The objective of the experiments was to test and analyze the utility and user experience (UX) of the CareProfSys prototype.
The experiments were conducted in two rounds, to allow the correction of shortcomings identified by users and then the validation of the implemented corrections. Each round lasted two working days. The experiments took place in the laboratories of the Career Counseling and Guidance Center of POLITEHNICA Bucharest. Two rooms were needed: one for the VR experiments and the other for filling out the feedback questionnaires.
The hardware and logistics used in the experiments were as follows: two laptops/computers, for the VR experiments and for filling in the questionnaires; an Oculus Rift or Meta Quest 2 VR headset; a suitably arranged room (a minimum free space of 1.5 m × 2 m without obstacles); and a VR guardian boundary scanned in the open space.
The software requirements for experiments are listed below: locally configured CareProfSys web application; Oculus application installed on the computer/laptop; version 2020.3.24f1 of Unity (when using the Oculus Rift headset); CareProfSys application in VR installed and updated in Unity on a test PC/laptop (in the case of using the Oculus Rift headset); CareProfSys application in VR installed on Meta Quest 2 (in case of using Meta Quest 2 headset); Meta Quest account and casting link accessed [50] (when using Meta Quest 2 headset).
The preconditions for experiment users were as follows:
  • Scheduling for the experiment;
  • (Optional) filling in a CV in Europass format, in order to upload it to the CareProfSys platform;
  • Account on social networks, to give the CareProfSys system access to the information available there;
  • (Optional) basic knowledge of the occupations for which VR scenarios were built.
For the experiments to run smoothly, 4 members of the development team were involved in each of the two rounds of experiments. The test procedure applied had 5 stages, as shown in Table 7.
In the first round, the participants were 30 first-year students (13 girls and 17 boys) from various specializations and linguistic branches of POLITEHNICA Bucharest; in the second round, we tested the system with 17 third- and fourth-year students (11 girls and 6 boys) from two specializations and linguistic branches. In total, 47 users participated in the experiments.
Since the VR module contains specific scenarios for only 6 professions, we also applied an introductory self-assessment questionnaire, through which users could choose, from among the 6 jobs with VR scenarios, the one closest to their own profile. At the same time, this allowed us to evaluate the accuracy of the recommendations given by the system. The images below (Figure 20, Figure 21, Figure 22, Figure 23, Figure 24, Figure 25 and Figure 26) capture participants in each of the 5 stages of the experiments:

5. Results

The results of the experiments are measured with the aid of four instruments:
  • The introductory self-assessment questionnaire shared with participants via a Google form;
  • The final feedback questionnaire, distributed to participants through a Google form;
  • VR simulation performance recording module, which exports the following information to text files for each user simulation session: start time of the training session; the level of difficulty achieved; last task achieved; total number of system errors; score; completion time of the training session;
  • Interview with users, after completing the final feedback.

5.1. First Round of Experiments

The first round of experiments involved 13 girls and 17 boys; there were no notable differences in opinions or performance between the two groups. All participants were first-year students at POLITEHNICA Bucharest, thus very close in mentality and profile to high school students.

5.1.1. Results of the Introductory Self-Assessment Questionnaire

The questionnaire completed before the actual testing of the system aimed at the self-assessment of a set of skills, knowledge, and professional interests. The results are available in Appendix A.
Thus, the following competencies were self-evaluated by the participants:
  • Handling and movement: 30% of respondents consider that they have a high level of such skills.
  • Information and information search: 50% of students believe they know how to find and use information.
  • Working with computers: over 80% of students believe they have knowledge of operating with computers.
  • Management: over 80% of respondents believe they have good and very good management skills.
  • Working with specialized machinery and equipment: over 74% of respondents believe they have the necessary skills to work with specialized machinery and equipment.
  • Construction: over 74% of respondents self-assess their construction skills as satisfactory.
  • Support and care: 90% have high and very high skills (over 56% of respondents have very high skills).
  • Communication, collaboration, and creativity: over 96% believe they have high communication, collaboration, and creativity skills.
  • Interests related to motivation to help others: over 96% of respondents say they are eager to help others.
  • Preferred subjects in high school: over 73% of surveyed students chose mathematics/computer science as their preferred subjects during high school; 10% had sciences (physics, chemistry, biology) as preferred subjects; the lowest percentages of choice were recorded for arts and sports, and economic and creative fields were absent from the choice.
  • Future profession: the most desired profession was programmer—50% of respondents; 16% of respondents chose cybersecurity engineer as their desired profession.
  • Matching with the six professions modeled within the CareProfSys system: the profession with which most of the surveyed students identified was web programmer (66.7%); in second place was the university teaching profession (over 53%); in third place was the profession of network engineer (40%).
It was observed that the same skills and interests were also extracted from the CareProfSys recommendation module, i.e., the answers provided to the form on the platform and the information extracted from the CV and social profiles were consistent with those chosen by students in the self-assessment questionnaire.
The motivations for choosing one of the six professions modeled within the system as suitable for the respondent included the following reasons:
  • For the web programmer, the reasons are related to choosing a profession with a future, passion for computers/programming, interest in cyber security, and problem-solving thinking;
  • For the university teacher, the reasons are perception as a prestigious profession, desire to share accumulated knowledge, pleasure in helping others and working with people, and management and communication skills;
  • For the network engineer, the motivations are computer work, preference to learn and perform practical activities, pleasure to program, calm temperament, and concern for cybersecurity.
The fact that, in the self-assessment questionnaire, every student chose at least one of the six professions as compatible with their personal profile is also explained by the fact that students were told that only six professions were available in VR, and all of them wanted to try the VR scenarios, although only 16 of the 30 received recommendations for professions that were available in VR. It should be noted, however, that more than half of the participants (16 out of 30) received recommendations from CareProfSys for professions that were among their wishes, which suggests that the recommendation algorithm performs correctly. For the other 14, the CareProfSys recommendations raised questions and a desire to learn more about the recommended professions.
Regarding the Europass CV, 50% of first-year students said that they had written their CV in this format, which meant that uploading it to the system would increase the chances of a more accurate profile and, therefore, a better recommendation. However, there was no difference in the quality of the recommendation results between the two groups of students: those who uploaded a Europass CV to the platform and those who did not. This means that the data provided by the form in the recommendation module and by the social media profiles are sufficient; each user must answer some questions in the recommendation module and may optionally upload a CV.

5.1.2. Results of the Feedback Questionnaire

The questionnaire completed after testing the system, available in Appendix B, covers the following dimensions of evaluation of this individual experience:
  • Environment chosen for testing: the most tested scenario was that of university professor (33.3%); the least tested was that of chemical engineer (3.3%).
  • Projection in the future regarding the choice of the tested profession: over 80% of responding students believed that they could practice the tested profession in the future.
  • Satisfaction with the activities carried out according to the scenario chosen for testing: 90% of respondents found these experiences satisfactory and very satisfactory.
  • Appreciation of the interface—intuitive and easy to use: over 95% of respondents believed that the interface was intuitive and easy to use.
  • Providing clear reasoning and explanations by the system: 90% of respondents believed they had received clear and sufficient explanations from the system.
  • Appreciation of the interface—navigating menus, interacting with objects, accessing functionalities: 90% of respondents considered these features to be easy.
  • Clarity and immersiveness of the visual part of the system: over 93% of respondents positively appreciated these qualities.
  • Appreciating the experience as realistic and immersive: over 73% of respondents considered the experience realistic and immersive.
  • Distractions and technical limitations that interrupted the VR immersion, identified by participants as follows: hardware limitations caused by sensor position; the time needed to get accustomed to the environment; getting used to the joystick; dizziness induced by movement with the joystick; unforeseen situations not being handled (adjustment of tone of voice and emotional intelligence); inaccurate placement of objects; difficulty in using the controllers; VR headset quality; problems when colliding with objects; and a suggestion to introduce gravity.
  • Promptness of the system in responding to commands and user movements: over 90% of the surveyed students appreciated the system as responsive to their commands and movements.
  • Variety of experiences to form a clear opinion about the job: over 76% believed they could relate to this experience for choosing a profession in the future.
  • Comfort and safety in using the VR equipment: 6 of the 12 responses provided identified the following types of risks: hitting walls/other objects in the room, difficulty reading text for visually impaired people, limited free space in the test room, motion sickness, the existence of neural disorders such as epilepsy and dizziness, and self-injury due to lack of experience in using the VR system.
  • General impression: over 96% of respondents rated this experience as positive.
As a future profession, the most desired was that of programmer (50% of respondents). This choice is in line with the pattern of skills that respondents considered highly developed: working with computers (over 80% of students consider that they have computer operating knowledge); working with specialized machines and equipment (over 74% of respondents); and information and information search (50% of students consider that they know how to find information). Regarding preferred subjects in high school, over 73% of surveyed students chose mathematics/computer science, and 10% chose sciences (physics, chemistry, biology). These interests support the skill set necessary for the profession of programmer.
In second place as a future professional choice was that of university professor (53% of respondents). This choice is in line with the set of skills needed for such a profession: assisting and caring (90% have high and very high skills, with over 56% of respondents reporting very high skills); communication, collaboration, and creativity (over 96% believe they have high skills in these areas); information and information search (50% of students believe they know how to find information); and management (over 80% of respondents believe they have good and very good management skills). In addition to the necessary skills comes the professional interest related to the motivation to help others: over 96% of respondents declared themselves eager to help others, showing a social interest specific to education and the university teaching occupation.
There is a correspondence between the most desired future profession, that of programmer (50% of respondents), and the preferred choice among the six professions modeled by the system: web programmer, with which most of the surveyed students identified (66.7%).
The post-test feedback questionnaire shows a positive/satisfactory appreciation of all measured dimensions: testing a profession in VR is an excellent opportunity to better understand that profession in an interactive way, as it projects the user into a realistic scenario. Due to the variety of experiences offered by the tested system, over 76% of respondents believe that they can rely on this experience when choosing a profession in the future.

5.1.3. Results from the VR Simulation Performance Recording Module

The automatic recording of user performance within CareProfSys proved that very few users reach the difficult level in a short time, which means that there is no need for more complicated scenarios in such a career recommendation tool. Most participants (19) could not pass the first, easy level; 10 reached level 1 (the medium one), and one reached level 2 (the difficult one). The scores themselves are not very enlightening, as not all scenarios use the same method to calculate the score. The student who reached the difficult level said she had never used VR before, which suggests that the teacher scenario is intuitive enough for novice users. Two students stated that the activities they carried out in VR were not as satisfying as they had expected and that they did not see themselves performing those activities further, yet they gave positive feedback on the web and VR experience. Analyzing their performance in the system, we noticed that one performed well (reached level 1), while the other did not (remained at 0 points). Therefore, the degree of satisfaction does not depend on the score accumulated in the VR gamified scenarios, which is acceptable given the purpose of the system: to highlight the activities specific to a job, both appealing and unappealing, even if presented in an attractive, game-like manner. There were no performance differences between the VR devices used.

5.1.4. Results from the Final Interview

The answers provided by participants in the unstructured interview strengthened the answers given in the feedback questionnaire: most students were delighted with the experience, with only one student disappointed with the project manager scenario. This is explained partly by the fact that he did not have the knowledge assumed by the scenario, and partly by some gaps in the scenario's functionality. Most students see themselves practicing the professions explored in VR, and the gamification elements were particularly appreciated. The interview revealed the reasons why students had applied to a certain specialization within POLITEHNICA Bucharest, and they admitted that a tool such as CareProfSys would have been useful to them during high school, when they were investigating what career to pursue next. However, students said that they would not fully trust the recommendations provided by an IT tool (recommendation engine, chatbot, etc.) and that they still feel the need for advice from trusted people: teachers, older friends, parents, or school counselors. They found the VR module both useful and engaging, due to the gamification elements. All of them considered the scenarios a learning experience, and most said they understood what a specialist should do after participating in this experiment.

5.2. Second Round of Experiments

The second round of experiments involved 11 girls and 6 boys; there were no significant differences in opinions or performance between the two groups. All participants were third- and fourth-year students at POLITEHNICA Bucharest, from two specializations and linguistic branches.

5.2.1. Results of the Introductory Self-Assessment Questionnaire

The results obtained in the self-assessment questionnaire applied before using the CareProfSys system are consistent with the results from the CareProfSys recommender module. All 17 users obtained recommendations for professions that had VR scenarios. This can be explained by the fact that these students were older than the participants in the first round of experiments and more determined about the career they wanted to pursue. The detailed results obtained from this questionnaire are available in Appendix C.

5.2.2. Results of the Feedback Questionnaire

The results obtained in the CareProfSys post-use feedback questionnaire (available in Appendix D) were better than those in the first round of experiments: everyone considered the experiment a pleasant experience. However, two participants considered that the activities performed in VR were not satisfactory enough; both had extensive experience in using VR, yet failed to reach the difficult level on the platform. Therefore, again, there was no correlation between the level of VR knowledge and the performance obtained when interacting with the system.

5.2.3. Results from the VR Simulation Performance Recording Module

The performance recorded by the system was slightly superior to that of the first round of experiments. Most participants (9) could not pass the first, easy level; however, 6 reached level 1 (the medium one), and 2 reached level 2 (the difficult one). Thus, in the first round of experiments, 63.33% of participants failed to pass the first level, while in the second round this percentage dropped to 52.94%.
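These percentages follow directly from the reported counts; as a quick sanity check (participant numbers taken from the text above):

```python
# Share of participants who did not pass the easy level, per round
round1_stuck, round1_total = 19, 30  # first round: 19 of 30 remained at the easy level
round2_stuck, round2_total = 9, 17   # second round: 9 of 17 remained at the easy level

rate1 = round(100 * round1_stuck / round1_total, 2)  # first-round failure rate, %
rate2 = round(100 * round2_stuck / round2_total, 2)  # second-round failure rate, %
print(rate1, rate2)  # 63.33 52.94
```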

5.2.4. Results from the Final Interview

The interview validated the answers given in the feedback questionnaire: most students were delighted with the experience and stated that the system has a friendly interface and that the idea of CareProfSys is useful and applicable in real life. Most of them also considered the scenario a learning experience, while others already knew the information tested there and therefore regarded the scenarios as testing experiences.

6. Discussion

After the first round of experiments, considering all the analyzed dimensions, as well as the recommendations and the motivations presented by respondents regarding their future profession, we can consider the CareProfSys system a useful, necessary, and functional tool for guiding the career and future professional choices of students in the technical field. The general impression was extremely good: over 96% of respondents rated the CareProfSys experience as positive, particularly based on their opinion of the VR scenarios. All functional or technical deficiencies noted during the first round were resolved before the second round; the only exception was that we decided not to increase the difficulty of the scenarios, as very few students reached the last level. The second round confirmed that the defects reported during the first round had been fixed. Following the proposals of the participants, a guide for using the VR scenarios was also created. There was no correlation between the users’ performance in VR and their prior experience with VR or their gender. Most of the students considered the VR scenarios learning or testing experiences.
Regarding threats to the validity of the results, we mention that the usefulness of a career recommendation system can only truly be assessed over time, when users do or do not choose their recommended profession. Students who wear glasses with diopters higher than +/−1 did not want to participate in the experiment, so we cannot say anything about the usefulness of the system for that category of users. Also, all our participants were students enrolled in engineering studies, so they were already drawn to technology, and all our VR scenarios were related to technical professions. We cannot anticipate the feedback of young people who do not have a technical background, even though the career counselors who were members of the project welcomed the idea of such an instrument.
The experiments were conducted according to a protocol, and all participants signed a data protection agreement and an informed consent form (available in Supplementary Materials): participants agreed that the experiments’ data would be used for teaching or research purposes. There are no known or foreseeable risks associated with the experimental protocol we used. In the unlikely event that side effects caused by using VR occur (e.g., vertigo, feeling nauseous or dizzy, lack of balance), participants should inform the persons in charge of conducting the experiments to discontinue the VR activities. Even if not registered, the following types of risks were identified as possible by our users: hitting walls/other objects in the room, difficulty reading text for visually impaired people, limited free space in the test room, motion sickness, the existence of neural disorders such as epilepsy and dizziness, and self-injury due to lack of experience in using the VR system. We did not test the system on persons with disabilities, so we cannot say anything related to the efficiency of VR scenarios for them.

7. Conclusions

The CareProfSys project [51] and the positive results obtained in the experiments with the 47 students from POLITEHNICA Bucharest presented in this study demonstrate another use case of VR: developing and exploring a suitable career by integrating scenarios for various professions within a broader recommendation system. Thus, the CareProfSys system aims to provide support for young people (high school pupils, university students, and professionals seeking retraining) in finding the ideal profession, as well as help for counselors in career guidance centers.
This article presents a set of VR scenarios which can be used to discover activities specific to six professions, in line with the broader use of VR/AR systems to develop domain-specific skills. The innovative feature of our VR scenarios is that they are web-based and integrated into a recommender system for jobs. We tested the scenarios and the idea of such a VR-based recommender using a complex methodology. Several instruments were applied: the introductory self-assessment questionnaire (available in Supplementary Material S4, with the raw results in Supplementary Materials S6 and S8); the questionnaire completed after testing the system (available in Supplementary Material S5, with the raw results in Supplementary Materials S7 and S9); unstructured interviews; and the VR simulation performance recording module. The majority of the participants received recommendations from CareProfSys for jobs that were among their wishes, which suggests that the recommendation algorithm is effective. Also, the VR scenarios gave our users a desire to learn more about the recommended professions, thus proving the formative feature of our system. Our study is also relevant in terms of how to increase users’ satisfaction with the immersive experience provided by a VR environment.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/electronics13132629/s1, S1: Informed consent, S2: General_Data_Protection_Regulation_Consent, S3: Theoretical Knowledge—Prerequisites_for_VR, S4: Introductory_self-assessment_questionnaire, S5: Feedback_questionnaire, S6: Introductory_self-assessment_questionnaire_responses_round1, S7: Feedback_questionnaire_responses_round1, S8: Introductory_self-assessment_questionnaire_responses_round2, S9: Feedback_questionnaire_responses_round2.

Author Contributions

Conceptualization, M.-I.D. and C.-N.B.; methodology, M.-I.D. and I.-C.S.; software, I.-C.S. and I.-A.B.; validation, M.-I.D., I.-C.S., and B.-I.U.; formal analysis, M.-I.D.; investigation, I.-C.S.; resources, C.-N.B.; data curation, B.-I.U.; writing—original draft preparation, M.-I.D., I.-C.S. and B.-I.U.; writing—review and editing, M.-I.D.; visualization, B.-I.U.; supervision, M.-I.D. and C.-N.B.; project administration, M.-I.D.; funding acquisition, M.-I.D. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by a grant of the Ministry of Research, Innovation and Digitization, CNCS—UEFISCDI, project number TE 151 from 14/06/2022, within PNCDI III: “Smart Career Profiler based on a Semantic Data Fusion Framework”.

Data Availability Statement

Data obtained in the experiments are available in the Supplementary Materials.

Acknowledgments

The authors thank the students of the Faculty of Engineering in Foreign Languages—POLITEHNICA Bucharest who participated in this study, and the members of UPB-CCOC—The Career Counseling and Guidance Center from POLITEHNICA Bucharest, for their valuable input during the implementation of the CareProfSys project.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Appendix A

The results from the introductory self-assessment questionnaire from the first round of experiments are further presented in Figure A1, Figure A2, Figure A3, Figure A4, Figure A5, Figure A6, Figure A7, Figure A8, Figure A9, Figure A10, Figure A11, Figure A12 and Figure A13.
Figure A1. Self-assess yourself related to handling and moving skills, first round of experiments.
Figure A2. Self-assess yourself related to information skills, first round of experiments.
Figure A3. Self-assess yourself related to working with computers, first round of experiments.
Figure A4. Self-assess yourself related to management skills, first round of experiments.
Figure A5. Self-assess yourself related to working with machinery and specialized equipment, first round of experiments.
Figure A6. Self-assess yourself related to constructing, first round of experiments.
Figure A7. Self-assess yourself related to assisting and caring, first round of experiments.
Figure A8. Self-assess yourself related to communication, collaboration and creativity, first round of experiments.
Figure A9. Self-assess yourself related to willingness to help others, first round of experiments.
Figure A10. What were your favorite disciplines during high school?—first round of experiments.
Figure A11. What profession do you want to have in the future?
Figure A12. From the below professions, what do you think suits you?—first round of experiments.
Figure A13. Do you have a Europass CV?

Appendix B

The results from the feedback questionnaire from the first round of experiments are further presented in Figure A14, Figure A15, Figure A16, Figure A17, Figure A18, Figure A19, Figure A20, Figure A21, Figure A22, Figure A23, Figure A24 and Figure A25.
Figure A14. Please select the environment you tested, first round of experiments.
Figure A15. Accuracy: Do you see yourself doing these activities in the future, in a workplace?—first round of experiments.
Figure A16. Relevance: Are the activities you performed as satisfying as expected?—first round of experiments.
Figure A17. User experience: How intuitive and user-friendly is the interface?—first round of experiments.
Figure A18. User experience: Does the system provide clear explanations or rationales for the scenarios?—first round of experiments.
Figure A19. User experience: How intuitive and user-friendly is the interface (navigating menus, interacting with objects, and accessing controls)?—first round of experiments.
Figure A20. User experience: Are the visuals displayed in the VR environment clear and immersive?—first round of experiments.
Figure A21. Immersion: How effectively does the VR system create a sense of presence and immersion within the virtual environment?—first round of experiments.
Figure A22. Immersion: Do you consider you had a realistic and engaging experience?—first round of experiments.
Figure A23. Performance: How responsive are the VR interactions to user inputs and movements?—first round of experiments.
Figure A24. Content quality: Does the VR scenario offer a sufficient variety of experiences for you to understand the job’s specificity?—first round of experiments.
Figure A25. Overall impression: Did you like it?—first round of experiments.

Appendix C

The results from the introductory self-assessment questionnaire from the second round of experiments are further presented in Figure A26, Figure A27, Figure A28, Figure A29, Figure A30, Figure A31, Figure A32, Figure A33, Figure A34, Figure A35, Figure A36, Figure A37 and Figure A38.
Figure A26. Self-assess yourself related to handling and moving skills, second round.
Figure A27. Self-assess yourself related to information skills, second round.
Figure A28. Self-assess yourself related to working with computers, second round.
Figure A29. Self-assess yourself related to management skills, second round.
Figure A30. Self-assess yourself related to working with machinery and specialized equipment, second round.
Figure A31. Self-assess yourself related to constructing, second round.
Figure A32. Self-assess yourself related to assisting and caring, second round.
Figure A33. Self-assess yourself related to communication, collaboration, and creativity, second round.
Figure A34. Self-assess yourself related to willingness to help others, second round.
Figure A35. What were your favorite disciplines during high school?—second round.
Figure A36. What profession do you want to have in the future?—second round.
Figure A37. From the below professions, what do you think suits you?—second round.
Figure A38. Do you have a Europass CV?—second round.

Appendix D

The results from the feedback questionnaire from the second round of experiments are further presented in Figure A39, Figure A40, Figure A41, Figure A42, Figure A43, Figure A44, Figure A45, Figure A46, Figure A47, Figure A48, Figure A49 and Figure A50.
Figure A39. Please select the environment you tested, second round.
Figure A40. Accuracy: Do you see yourself doing these activities in the future, in a workplace?—second round.
Figure A41. Relevance: Are the activities you performed as satisfying as expected?—second round.
Figure A42. User experience: How intuitive and user-friendly is the interface?—second round.
Figure A43. User experience: Does the system provide clear explanations or rationales for the scenarios?—second round.
Figure A44. User experience: How intuitive and user-friendly is the interface (navigating menus, interacting with objects, and accessing controls)?—second round.
Figure A45. User experience: Are the visuals displayed in the VR environment clear and immersive?—second round.
Figure A46. Immersion: How effectively does the VR system create a sense of presence and immersion within the virtual environment?—second round.
Figure A47. Immersion: Do you consider you had a realistic and engaging experience?—second round.
Figure A48. Performance: How responsive are the VR interactions to user inputs and movements?—second round.
Figure A49. Content quality: Does the VR scenario offer a sufficient variety of experiences for you to understand the job’s specificity?—second round.
Figure A50. Overall impression: Did you like it?—second round.

Figure 1. Network configurations to be reproduced in the WebVR scenarios from the CareProfSys system.
Figure 2. (a) Initial state—whiteboard; (b) cable connections and tasks.
Figure 3. (a) IP configuration; (b) console for specific commands (ping, ipconfig, etc.).
Figure 4. (a) Scene from the WebVR scenario dedicated to activities specific to the profession of civil engineer; (b) characters from the scenes specific to the civil engineer profession.
Figure 5. Tasks to be performed within the scenario assigned to the civil engineer profession.
Figure 6. (a) One of the models that need to be replicated in the scenario allocated to the profession of web and multimedia developers; (b) the editing canvas (workboard) in the scenario allocated to the profession of web and multimedia developers.
Figure 7. Initial instructions from the scenario allocated to the profession of web and multimedia developers.
Figure 8. (a) Scene from the scenario allocated to the chemical engineer profession; (b) initial scene from the scenario allocated to the chemical engineer profession.
Figure 9. (a) Scene related to WBS development from the scenario allocated to the project manager profession; (b) scene related to Gantt chart development from the scenario allocated to the project manager profession.
Figure 10. (a) Example of an action to be performed in the WebVR scenario for the university professor profession; (b) example of an emergency to be solved in the WebVR scenario for the university professor profession.
Figure 11. WebVR window.
Figure 12. Teleport in WebVR scenarios.
Figure 13. Physics elements.
Figure 14. Animation Retargeting procedure in the Unity Avatar System.
Figure 15. Custom Unity packages for CareProfSys scenarios.
Figure 16. Animator for characters in the civil engineering scenario.
Figure 17. Water flow simulation in the scenario for the chemical engineer profession.
Figure 18. Creating the Gantt chart in the scenario for the project manager profession.
Figure 19. Rendering of the fire in the virtual environment.
Figure 20. Participants in stage 1 of the experiment: filling in the introductory questionnaire, informed consent form, and personal data protection form.
Figure 21. Samples of initial documents signed by participants in experiments.
Figure 22. Participants in stage 2 of experiments: testing the CareProfSys recommender module.
Figure 23. Participants in stage 3 of the experiments: VR presentation and accommodation.
Figure 24. Participants in stage 4 of experiments: career exploration in VR (first round of experiments).
Figure 25. Participants in stage 4 of experiments: career testing in VR (second round of experiments).
Figure 26. Participants in stage 5 of experiments: completing feedback questionnaires.
Table 1. Networking terminology needed to use the WebVR scenarios specific to the profession of “Computer network specialist”.

Internet Protocol Address (IP): A unique identifier for the personal computer used to access the local network or the Internet. It consists of 4 sets of 8-bit numbers, such as 192.255.255.255. This 32-bit format is called IPv4 and has a limited number of addresses. The enhanced version, IPv6, is a 128-bit format that uses hexadecimal notation and incorporates the Internet Protocol Security (IPsec) protocol. Both types of addresses can be used in a network configuration.
ping: A utility command meant to test the connection between two devices.
ipconfig: A utility command that displays the currently configured interfaces when run without arguments; arguments can be used to configure the given interfaces.
Network mask: A 32-bit number intended to separate the host and network address parts of an IP. In a local network, IPs should have the same network address to allow communication.
Routing Information Protocol (RIP): A dynamic routing protocol that automatically establishes the best connection between configured networks by calculating the number of hops between networks.
Switch: A network device used to connect multiple hosts in a Local Area Network (LAN). The devices’ IPs require the same network address.
Router: A network device used to connect multiple LANs. In this case, the devices can have different network addresses.
Media Access Control (MAC) address: The physical address of the network interface controller, associated with the IP address to correctly locate the device in a local network.
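The netmask rule in Table 1 (hosts behind the same switch must share a network address) can be illustrated with a short Python sketch using the standard ipaddress module; the helper function `same_network` is our own illustrative name, not part of the CareProfSys code:

```python
import ipaddress

def same_network(ip_a: str, ip_b: str, mask: str) -> bool:
    """Return True if both hosts have the same network address under the given mask."""
    # strict=False lets us pass host addresses rather than network addresses
    net_a = ipaddress.ip_network(f"{ip_a}/{mask}", strict=False)
    net_b = ipaddress.ip_network(f"{ip_b}/{mask}", strict=False)
    return net_a.network_address == net_b.network_address

# Hosts on the same switch must share the network part of the IP:
print(same_network("192.168.1.10", "192.168.1.20", "255.255.255.0"))  # True
# Hosts in different networks need a router to communicate:
print(same_network("192.168.1.10", "192.168.2.20", "255.255.255.0"))  # False
```

This is the same check a user implicitly performs in the networking scenario when assigning IPs so that pinging between two machines succeeds.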
Table 2. Civil engineering terminology needed to use the CareProfSys WebVR scenarios specific to the civil engineer profession.

Safety helmet: A helmet used to prevent head injuries in case of impact with objects falling from height.
Identification badge: A document for identifying authorized personnel at the workplace.
Unauthorized items: Objects that are not allowed during working hours (e.g., alcoholic beverages).
Occupational safety regulations: Rules that must be followed for the proper and safe conduct of work activities.
Table 3. Web terminology needed to use the WebVR scenarios specific to the profession of web and multimedia developers.

Header: The top section of a website that appears consistently on all pages. Usually located above all other content, it contains elements such as the logo, website menu, and other important information.
Content: All elements used to communicate a message on a website. Website content is typically divided into two main categories: web copy (body copy), which refers to written text, and multimedia content, which includes images, videos, and audio.
Footer: The bottom section of content on a web page. It typically contains copyright notices, a link to a privacy policy, a site map, the logo, contact information, social media icons, and an email signup form.
Social media icons: Buttons that link to the social media pages associated with the website/organization for promotion purposes.
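The header/content/footer structure from Table 3, which users replicate on the editing canvas of the web developer scenario, can be sketched as a tiny page builder; the function name and markup below are illustrative assumptions, not the scenario's actual implementation:

```python
def build_page(logo: str, menu: list[str], body: str, year: int) -> str:
    """Assemble a minimal page from the three structural sections in Table 3."""
    nav = "".join(f"<a href='#'>{item}</a>" for item in menu)
    header = f"<header><img src='{logo}' alt='logo'><nav>{nav}</nav></header>"
    content = f"<main>{body}</main>"  # web copy and/or multimedia content
    footer = f"<footer>&copy; {year} Example Site</footer>"
    return f"<!DOCTYPE html><html><body>{header}{content}{footer}</body></html>"

page = build_page("logo.png", ["Home", "About"], "<p>Hello</p>", 2024)
```

The ordering of the three sections (header above content, footer below) mirrors the layout models users must reproduce in Figure 6.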
Table 4. Terminology required to use the WebVR scenarios specific to the chemical engineer profession.

Reagent: (DexOnline) “Chemical substance that undergoes a specific reaction in the presence of a certain ion or group of ions.”
Blood sugar: The concentration of glucose in the blood.
Blood sugar levels (measured in mg/dL): low, <65 mg/dL; normal, 65–110 mg/dL; high, >110 mg/dL.
Analyzer: An instrument that performs chemical analysis.
Test tube: A glass tube used in the laboratory.
Pipette: A device equipped with a pump for extracting small quantities (e.g., drops) of liquid.
Stand: A device used to hold laboratory vessels (e.g., test tubes) in place.
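The blood sugar thresholds from Table 4, which users apply when reading the analyzer in the chemical engineer scenario, can be encoded directly; `classify_blood_sugar` is a hypothetical helper written for illustration:

```python
def classify_blood_sugar(mg_dl: float) -> str:
    """Classify a glucose reading (mg/dL) using the thresholds from Table 4."""
    if mg_dl < 65:
        return "low"       # below 65 mg/dL
    if mg_dl <= 110:
        return "normal"    # 65-110 mg/dL inclusive
    return "high"          # above 110 mg/dL

print(classify_blood_sugar(90))  # normal
```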
Table 5. Terminology needed to use the WebVR scenarios specific to the project manager profession.

WBS (work breakdown structure): A project management technique used to break down a project into smaller, more manageable tasks.
Gantt chart: A project management technique for the graphical representation of the planning and duration of project tasks.
WP (work package): A structure that groups multiple activities or tasks of the same type; one of the main units used to divide a project.
Activity: A structure designed to monitor the progress of work packages, resource allocation, and time.
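The WBS and Gantt concepts from Table 5 reduce, at their simplest, to an earliest-start schedule: each task begins when all of its prerequisites finish. The task list and `schedule` function below are illustrative assumptions, not part of the CareProfSys project manager scenario:

```python
from datetime import date, timedelta

# Hypothetical work package: (task name, duration in days, prerequisite task names)
tasks = [
    ("design", 3, []),
    ("build", 5, ["design"]),
    ("test", 2, ["build"]),
]

def schedule(tasks, start: date) -> dict:
    """Earliest-start Gantt schedule; tasks are assumed topologically ordered."""
    finish = {}
    plan = {}
    for name, days, deps in tasks:
        # A task begins when its latest prerequisite finishes (or at project start)
        begin = max((finish[d] for d in deps), default=start)
        end = begin + timedelta(days=days)
        plan[name] = (begin, end)
        finish[name] = end
    return plan

plan = schedule(tasks, date(2024, 7, 1))
```

Plotting each task as a bar from its begin date to its end date yields exactly the kind of Gantt chart users assemble in the scenario shown in Figure 18.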
Table 6. VR equipment compatible with the developed WebVR module.

HTC Vive Cosmos Elite (HTC Corporation, New Taipei City, Taiwan): resolution 2880 × 1700 pixels (combined for both eyes); 110-degree horizontal field of view; wired; outside-in tracking (IR sensors).
Oculus Rift (Meta Platforms, Menlo Park, CA, USA): resolution 2160 × 1200 pixels (combined for both eyes); 87-degree horizontal field of view; wired; outside-in tracking (IR sensors).
Meta Quest (Meta Platforms, Menlo Park, CA, USA): resolution 2880 × 1600 pixels (combined for both eyes); 93-degree horizontal field of view; wireless; inside-out tracking.
Meta Quest 2 (Meta Platforms, Menlo Park, CA, USA): resolution 3664 × 1920 pixels (combined for both eyes); 97-degree horizontal field of view; wireless; inside-out tracking.
Table 7. Test procedure applied in CareProfSys experiments.

Stage 1. Goal: experiment presentation and users’ consent. Activities: explanation of the procedure and duration of the experiment; completion of the informed consent form (available in Supplementary Materials); GDPR agreement supplement (available in Supplementary Materials); filling in the introductory self-assessment form (available in Supplementary Materials). Estimated results: user feedback; informed consent and GDPR forms. Duration: 5 min.

Stage 2. Goal: testing the recommendation module. Activities: presentation of the recommender module and its functionalities; CV upload, social media profile, and feature exploration; getting a career recommendation. Estimated results: career recommendations. Duration: 5 min.

Stage 3. Goal: VR overview and accommodation. Only for users who received one of the 6 career recommendations (civil engineer, chemical engineer, teacher, project manager, web systems designer, networking specialist) or who evaluated themselves as having affinities towards one of the 6 jobs available in VR. Activities: reading a document with basic theoretical knowledge for the recommended profession (available in Supplementary Materials); system and hardware VR presentation; helmet adjustment; presentation of VR input for movement, action, and functionalities; software configuration according to the user profile (unique ID). Estimated results: unique ID, tested scene. Duration: 5–10 min (extra time may be needed for people with no previous VR experience).

Stage 4. Goal: exploring the VR scenarios. Only for users eligible under the conditions of Stage 3. Activities: the user must accomplish as many tasks as possible in the given time; all tests start at the easiest difficulty level, and on successful completion the user moves on to the next level (medium, then hard). Estimated results: score according to the logic of each game; mistakes (errors) in game logic; real-time feedback to identify bugs and suggestions for improvements. Duration: 10 min.

Stage 5. Goal: final feedback. Activities: filling in the end-of-test questionnaire, focused on usefulness, followed by an interview. Estimated results: user feedback. Duration: 5–10 min.