Article

A Study on Attention Attracting Elements of 360-Degree Videos Based on VR Eye-Tracking System

Haram Choi 1 and Sanghun Nam 2,*
1 Department of Culture and Technology Convergence, Changwon National University, Changwon 51140, Korea
2 Department of Culture Technology, Changwon National University, Changwon 51140, Korea
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2022, 6(7), 54; https://doi.org/10.3390/mti6070054
Submission received: 29 March 2022 / Revised: 8 July 2022 / Accepted: 12 July 2022 / Published: 14 July 2022

Abstract

In 360-degree virtual reality (VR) videos, users have increased freedom of gaze movement. As a result, their attention may not move according to the narrative intended by the director, and they may miss important parts of the narrative of the 360-degree video. It is therefore necessary to study directing techniques that can attract user attention in 360-degree VR videos. In this study, we analyzed the directing elements that can attract users' attention in a 360-degree VR video and classified them into five categories: object movement, hand gesture, GUI insertion, camera movement, and gaze angle variation. We then developed a 360 VR eye-tracking system and, based on it, conducted an experiment to analyze whether users' attention moves according to the five attention-attracting elements. The experimental results show that 'hand gesture' induced the second-largest attention shift among the subjects and that 'GUI insertion' induced the smallest.

1. Introduction

Three hundred and sixty degree virtual reality (VR) videos provide a greater sense of immersion, deliver more information, and possess the advantage of interactivity, compared to traditional media content. Owing to these characteristics, users who actively enjoy VR content are turning to 360-degree VR videos; therefore, this video format is garnering increased interest and investment from media companies worldwide [1,2]. VR equipment can range from expensive VR headsets, such as the Oculus Rift and HTC VIVE, to mobile VR devices that can be combined with smartphones, such as Google Cardboard. This has provided average consumers with easy access to 360-degree VR videos [3]. In addition, anyone can experience 360-degree VR videos on online platforms, such as YouTube, Facebook, or smartphone applications [4].
Existing 2D-based video is displayed on a rectangular flat screen; the movement of the user's gaze is limited, and it is easy to watch the video according to the flow intended by the director. In contrast, 360-degree VR videos are played on a screen that expands the field of view (FOV) to 360°. As a result, the range over which the users' gaze can move is increased, and users' attention may not move according to the narrative flow intended by the director. In such cases, the user may miss important parts of the narrative of a 360-degree VR video; hence, from the perspective of directing and planning, it is important to consider elements that can attract the users' attention [5,6,7,8].
Therefore, identifying the user's gaze information is essential for analyzing the elements that can attract users' attention in 360-degree VR videos and for determining whether the user's attention moves as intended by the director. Eye-tracking technology, which accurately identifies the user's gaze position, can reveal which objects are unconsciously glanced over, where users tend to focus, and how long users look at objects; even in VR environments, it can be used to track users' areas of interest in virtual spaces [9]. Eye-tracking technology extracts the user's area of attention by tracking eye movement and is widely used in fields that analyze human cognition and user interfaces, such as medicine, games, web services, and marketing. In VR environments, it is used in research related to human perception and behavior [10,11,12]. Users' intentions can be analyzed from their areas of interest and gaze retention times within a virtual space by tracking their gaze using a head-mounted display (HMD) with eye-tracking functionality [9,13,14,15].
This study analyzed the directing elements that can attract users' attention in 360-degree VR videos and developed a 360 VR eye-tracking system to investigate their effect on users. Among the 360-degree VR videos uploaded to YouTube from 2015 to 2020, the scope was limited to music videos, and 35 cases were analyzed. Based on this analysis, five elements that can attract users' attention in 360-degree VR videos were classified. To analyze whether these elements lead users to move their attention, an eye-tracking system was developed to track the gaze of users watching 360-degree VR videos, and an experiment was performed on 47 subjects who watched the videos using an HMD with a built-in eye-tracking device. The experimental results show that the 'hand gesture' element induced the second-largest attention movement among the subjects and that 'GUI insertion' caused the least.

2. Related Work

Existing 2D-based videos are played and watched on a rectangular screen; therefore, planning, directing, filming, and post-production are performed accordingly. However, 360-degree VR videos can be viewed in all directions, and the users' movement affects how they watch the media [2]. In 360-degree VR, users can participate in the story because they are inside the narrative as it unfolds.
In an article by Dolan (2016), from the perspective of existence, a user is classified as a participant if they exist as a character in the story and as an observer if they do not. From the perspective of influence, users are classified as active if they can affect the story and passive if they perceive the story as intended by the storyteller [16]. Media combined with VR technology can provide higher levels of immersion, presence, and embodiment than existing 2D-based video media, such as 2D movies and animations; therefore, research on storytelling and directing techniques for this new medium is needed [17]. Mateer (2017) argued that even though 360 VR is similar to existing film in terms of sequence, in 360-degree VR directing, elements such as the positioning of actions and specific viewing angles that control the users' movement are more important [6]. Xu and Ragan (2019) found that setting up a guiding character in a style that matches the virtual environment enhances interaction and storytelling effects and positively affects viewer satisfaction [18]. Fearghail et al. (2018) argued that in 360-degree VR video directing, the producer's intentional emphasis on the viewer's gaze increases visual discomfort; therefore, directing techniques that provide guidelines for 360-degree VR videos and utilize the spatial characteristics of VR are needed [8]. Danieau et al. (2017) argued that fade to black and desaturation are directing techniques through which users' gaze can be focused: fade to black darkens everything except the parts essential to the users' area of attention, emphasizing what users should attend to, while desaturation lowers the saturation of parts outside the users' area of attention to emphasize the parts on which users need to focus [19]. Studies have also examined the effect of visual guidance and user attention attraction in 360-degree VR videos. Nielsen et al. (2016) presented a classification system for approaches that effectively guide the user's attention in cinematic VR and compared controlling the body orientation with guiding the gaze [20]. Rothe et al. (2019), based on a literature review, investigated methods of guiding user attention in various media, including VR and AR, and classified the guidance methods applicable to cinematic VR [21]. Speicher et al. (2019) compared smartphones and HMDs to investigate the effect of visual guidance on users' attention and defined design guidelines for guiding user attention in 360-degree VR videos [22]. Schmitz et al. (2020) used eye-tracking technology to study the effects of a central cue (arrow) and a peripheral cue (flicker) on user attention in 360-degree VR videos and determined that the central cue (arrow) has a positive effect on users' attention [23].
In this study, attention-attracting elements are analyzed, and their effect on the user is studied; in particular, we analyze whether these elements attract the user's attention in 360-degree VR videos. To this end, an eye-tracking system suited to the 360-degree VR video format was developed and designed to record the users' gaze history while they watch the video.

3. Analysis of User’s Attention Attracting Elements in 360-Degree VR Videos

Because 360-degree VR videos expand the field of view up to 360°, they provide high levels of immersion and increase the user's freedom to move their attention; however, users' attention may not move in accordance with the narrative sequence intended by the director. Research on directing techniques for guiding user attention, suited to 360-degree VR videos, is needed to induce users to move their attention as the director intends.
In this study, attention-attracting elements refer to directing devices that guide the user's attention toward where it should be drawn in a 360-degree VR video, thereby increasing immersion. VR content can cause sickness if the experience time is long; in contrast, a 360-degree VR music video can be experienced within 3-5 min and is easily accessible on social media platforms, such as YouTube, which is why such videos received a lot of attention in the early days of virtual reality. In terms of user convenience, 360-degree VR music videos are more comfortable to watch and more accessible than other VR content. Therefore, the scope of analysis in this study was limited to 360-degree VR music videos.
Through an analysis of related work, we identified directing techniques that can attract users' attention in 360-degree VR videos [6,8,19,22,23]. Based on this analysis, 35 360-degree VR music videos uploaded to YouTube between 2015 and 2020 were examined for directing techniques that can attract the users' gaze. The keyword '360-degree VR Music Video' was searched on YouTube, and 35 music videos that clearly showed attention-attracting elements were selected for analysis. The analyzed videos are listed in Table 1, ordered by upload date.
For the video analysis, the researcher first watched each 360-degree VR music video using the 360-degree viewing function provided by YouTube and then watched it again with the HTC Vive Pro Eye HMD. After watching each video at least once with both methods, the elements that can attract users' attention were classified into five categories: object movement, hand gesture, GUI insertion, camera movement, and gaze angle variation.
In the object movement directing technique, the director guides the user's attention in the direction in which an object, here a musician appearing in the video, moves, so that the scene is watched from the intended position; attention is thus focused on moving objects. The object movement element can be seen in 20 of the 35 music videos. For example, in video No. 5 [28], object A, marked in Figure 1a, plays the leading role in the music video, and the users' attention is fixed on object A. As the video plays, object A moves to the left, as shown in Figure 1b,c. As shown in Figure 1d,e, when users move their attention to the left along with object A, they watch the scene where objects A and B play instruments together in the position intended by the director, as shown in Figure 1f.
In video No. 28 [51], object A moves to the left in the direction of the arrow, as shown in Figure 2a-c. When users move their attention to the left along with object A, they see the scenes in Figure 2d,e and end up watching the scene where objects A and B converse with each other, shown in Figure 2f, in the position intended by the director.
Hand gesture is a directing technique that induces users' attention movement through the gestures of an object, causing users to move their attention in the direction the object indicates so that they watch scenes as intended by the director. Hand gesture is often used together with object movement: when an object moves, its gestures may induce the user to follow it. The hand gesture element can be observed in 3 of the 35 music videos. For example, in video No. 24 [47], the marked object gestures to the left, as shown in Figure 3a, and users who shift their attention as the object moves, as shown in Figure 3b, see the scene shown in Figure 3c. When users move their attention in the direction the object points to, they see the scene where the marked objects in Figure 3d perform choreography in the position intended by the director.
In video No. 6 [29], the marked object A in Figure 4a beckons in the direction of the arrow; when users move their attention to the right, the direction in which object A beckons, object B in Figure 4b can also be seen beckoning to the right. When users move their attention in the direction in which object B beckons, they see the scene where object C, the main character, dances in the position intended by the director, as shown in Figure 4c.
In the GUI insertion directing technique, a GUI inserted during post-production into the pre-filmed music video allows the director to guide the user's attention toward a direction the user is not currently watching, so that the user watches the scene directed from the intended position. The GUI insertion element can be observed in 17 of the 35 music videos. For example, in video No. 8 [31], in the early part of the video the lighting is directed to be dark everywhere except the space where object A plays, as shown in Figure 5c; therefore, users fix their attention on object A, as shown in Figure 5a. At first, no GUI is inserted, as shown in Figure 5b, but when the performance of object A is over, the arrow marked in Figure 5d appears. Users recognize the cue to move their attention through the arrow and move their attention to the right along it, as shown in Figure 5e. They then see a scene where the lighting brightens and a new object appears in the previously dark space, as shown in Figure 5f, in the position intended by the director.
In the camera movement directing technique, the director guides the users' attention in the direction in which the camera moves and lets users see the scene directed from the intended position. When a fixed camera is used, filming is mostly performed in one space; however, when the camera movement technique is used, filming moves through several spaces or exploits a wide space. The camera movement technique appears in 13 of the 35 music videos. For example, in video No. 23 [46], the video begins without an object in the user's view, as shown in Figure 6a. As the video plays, the camera moves in the direction of the marked arrow in Figure 6a, and when users move their attention in the direction of the camera's movement, they recognize the position of object A, as shown in Figure 6b. As the camera continues to move toward object A, users see the scene where object A plays the piano in the location intended by the director, as shown in Figure 6c.
In video No. 14 [37], the director shows objects A and B alternately located in two spaces while moving the camera; the camera movement technique moves the users' attention from the scene where object A plays to a new scene where object B plays. In the early part of the video, as shown in Figure 7a,b, the camera moves to the left in the direction of the arrow from the space where object A plays. When users move their attention in the direction of the camera's movement, they pass through the space shown in Figure 7c and see the marked space shown in Figure 7d. The camera continues to move toward the marked space, as shown in Figure 7e, and users see the scene where object B plays, in the position intended by the director, as shown in Figure 7f.
Gaze angle variation is a directing technique that guides a user's attention by adjusting the range of gaze angles the user can see. It is used either to focus the users' attention by reducing the visible screen area and narrowing the gaze angle, or to move their attention to the position intended by the director by widening the screen area and expanding the gaze angle. This technique appears in 2 of the 35 music videos analyzed. For example, in video No. 10 [33], as seen in the marked area in Figure 8a, the space around object A is darkened so that the visible screen area shrinks and the users' gaze angle narrows; the users' attention is therefore focused on object A. As the video plays, the screen area broadens, as shown in Figure 8b, so that the user's visual angle widens from the center, where object A is positioned, in the direction of the arrow. Users then see the scene in which the entire space intended by the director enters their view, as shown in Figure 8c.

4. Development of 360-Degree Eye-Tracking System

In this study, a 360-degree VR eye-tracking system that analyzes the area of attention by tracking the user's gaze within a virtual reality environment was developed based on the Unity game engine (2020.1.7f1). Among eye-tracking measurement methods, the system is designed around a heatmap that visualizes the concentration of the user's gaze, in order to determine which part of the 360-degree VR video is focused on and whether the user's attention is moved by the attention-attracting elements. The VIVE Pro Eye was used as the eye-tracking hardware, and the eye-tracking algorithm uses the SDK provided with the VIVE Pro Eye. For reference, the eye-tracking accuracy of the VIVE Pro Eye was compared with that of an eye tracker from Tobii, a company specializing in eye-tracking technology: Tobii's eye tracker has an accuracy of 0.3°-0.6°, while the VIVE Pro Eye's accuracy, the offset between the recorded gaze point and the actual position, is 0.5°-1.1° within 20° of FOV [59,60].
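To relate this angular accuracy to the heatmap visualization, the worst-case offset can be converted into equirectangular pixels. The short calculation below is an illustrative sketch, assuming the 1280 × 720 frame size used later in the experiment; it is not part of the original system.

```python
# Illustrative arithmetic (assumption: 1280 x 720 equirectangular frames,
# where 360 degrees of yaw span the full frame width).
frame_width_px = 1280
px_per_degree = frame_width_px / 360.0        # ~3.56 px per degree of yaw
worst_case_offset_px = 1.1 * px_per_degree    # VIVE Pro Eye worst case: 1.1 deg
print(f"worst-case gaze offset: ~{worst_case_offset_px:.1f} px")  # ~3.9 px
```

An offset of roughly 4 px is small compared with the heatmap radius value of 20 adopted for the experiment (assuming the radius is expressed in comparable pixel units), so the headset's accuracy is adequate for this visualization.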
As shown in Figure 9, the 360-degree VR eye-tracking system consists of a 360-degree VR component that plays 360-degree VR videos, a heatmap component that records the users' eye-tracking results and visualizes the heatmap, and a 360-degree VR heatmap manager component that manages the visualized heatmap texture based on the eye-tracking data. The heatmap component, which carries a mesh collider, generates a ray hit point at the position where the mesh collider is struck by a ray cast in the user's gaze direction. A heatmap texture is then created as an alpha value from the position of the ray hit point and is overlaid on the video played by the 360-degree VR component. The overlaid 360-degree VR video and heatmap texture are recorded and stored using a camera in the heatmap component. As users watch 360-degree VR videos, as shown in Figure 10a, the result of the users' eye tracking is recorded, as shown in Figure 10b.
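The core geometric step, mapping a gaze direction to a pixel of the equirectangular video frame, can be sketched as follows. This is a minimal Python reconstruction of the logic described above, not the authors' Unity implementation; the coordinate convention (z forward, y up) and the frame size are assumptions.

```python
import numpy as np

def gaze_to_equirect_pixel(direction, width=1280, height=720):
    """Map a gaze direction in the viewer's frame to pixel coordinates
    on an equirectangular 360-degree video frame."""
    x, y, z = direction / np.linalg.norm(direction)
    yaw = np.arctan2(x, z)                     # longitude, -pi..pi (0 = front)
    pitch = np.arcsin(np.clip(y, -1.0, 1.0))   # latitude, -pi/2..pi/2
    u = (yaw / (2 * np.pi) + 0.5) * width      # left..right across the frame
    v = (0.5 - pitch / np.pi) * height         # top row = looking straight up
    return int(u) % width, min(max(int(v), 0), height - 1)

# Example: a gaze straight ahead lands at the frame center.
print(gaze_to_equirect_pixel(np.array([0.0, 0.0, 1.0])))  # -> (640, 360)
```

In the actual system this mapping is implicit: the ray hit point on the mesh collider carries UV coordinates that are converted to pixel coordinates of the heatmap texture, as described below.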
The eye-tracking system proposed in this paper is based on a heatmap, a data visualization technique that shows the magnitude of a phenomenon as color in two dimensions. Because the system targets 360-degree VR video, radius, weight, and duration parameters were designed to adjust the heatmap visualization to fit the frame and size of the videos used in the experiment. Pairs of values for radius (5, 20), weight (5, 20), and duration (2, 20) were compared to show how the visualization changes with each parameter; after testing various values, settings that clearly show the change were chosen. A simplified sketch of how the three parameters interact together is given after their descriptions below.
To compare the effect of the radius, a radius value of 5 is applied in Figure 11a and a radius value of 20 in Figure 11b, and the eye-tracking results are compared and analyzed. In the eye-tracking measurement, the radius determines the size of the surrounding range of the ray hit point: when a collision occurs, the surrounding range is represented as a circle of that radius around the ray hit point.
Under the same condition, in which one place is stared at continuously for approximately 5 s, a weight value of 5 is applied in Figure 12a and a weight value of 20 in Figure 12b, and the eye-tracking results are compared and analyzed. When one place is gazed at for 5 s, the heatmap color approaches red faster with a weight value of 20 than with a weight value of 5. The weight is thus a factor multiplied into the heatmap value of each frame per second, i.e., the per-second weighting of the heatmap. The UV coordinates of the generated heatmap texture are converted to pixel coordinates, RGB curve data are generated, and the color changes according to the accumulated time value of each texture pixel. In other words, a higher weight value drives the heatmap toward red more quickly as the user continues to stare at one place.
Under the same condition, in which the same section is cut into frames per second and listed in order, a duration value of 2 is applied in Figure 13a and a duration value of 20 in Figure 13b, and the eye-tracking results are compared and analyzed. The duration is the per-second persistence of the heatmap when the user moves their gaze, i.e., the time for which the heatmap remains where the gaze previously rested. As the duration value increases, the heatmap trace remains visible longer after the user's gaze has moved on.
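Read together, the three parameters amount to one per-frame update rule: fade the existing map according to the duration, then add weight inside a disc of the given radius around the current ray hit point. The class below is a simplified Python analogue of this behavior, assuming a pixel-space heatmap and a linear per-frame fade; the class name and the exact decay model are illustrative, not the authors' implementation.

```python
import numpy as np

class GazeHeatmap:
    """Simplified per-frame heatmap update driven by radius, weight,
    and duration, as described in the text (decay model assumed)."""

    def __init__(self, width=1280, height=720, radius=20, weight=20, duration=5):
        self.map = np.zeros((height, width), dtype=np.float32)
        self.radius, self.weight, self.duration = radius, weight, duration

    def update(self, hit_px, dt):
        # Fade the whole map so old traces persist roughly `duration` seconds.
        self.map *= max(0.0, 1.0 - dt / self.duration)
        # Accumulate `weight` per second inside a disc of `radius` pixels
        # around the current ray hit point.
        u, v = hit_px
        h, w = self.map.shape
        ys, xs = np.ogrid[:h, :w]
        disc = (xs - u) ** 2 + (ys - v) ** 2 <= self.radius ** 2
        self.map[disc] += self.weight * dt
```

With the settings adopted for the experiment (radius = 20, weight = 20, duration = 5), a spot stared at for a few seconds accumulates a high value, which the RGB color curve renders as red, while traces fade within roughly five seconds after the gaze moves on.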

5. Overview and Results of 360-Degree VR Eye-Tracking Experiment

The experiment's underlying research question is "Do the five attention-attracting elements affect the user's gaze movement?" We therefore analyzed, through experiments based on the developed eye-tracking system, whether the attention-attracting elements affect users' eye movements. We recruited participants and conducted eye-tracking experiments with 47 subjects, and the results were analyzed based on the heatmaps produced by the eye-tracking system. This section presents the experimental outline and procedure, the experimental results, the data analysis methods, and the analysis results.

5.1. Experimental Procedure

The structure of the eye-tracking system is shown in Figure 14. Each subject wore the VIVE Pro Eye HMD after completing the consent form and questionnaire, through which the subject's age, gender, experience with virtual reality devices, and experience with 360 VR videos were obtained. One experimental session lasted 30 min, including filling out the consent form and questionnaire and training on the use of the VR device. To control for effects of video order, the five experimental videos were presented to each subject in random order. No breaks were scheduled between videos so as to keep the subjects immersed; however, before the experiment, subjects were instructed that they could rest at any time if they felt fatigue or motion sickness, and subjects who felt motion sickness during the experiment were given a break of about 2 min.
Before the experiment started, the eye calibration process provided by the VIVE Pro Eye was performed. When calibration was complete, the researcher briefed the subject on the experiment, played the 360-degree VR video, and ran the eye-tracking system. The eye-tracking parameters used in the experiment were radius = 20, weight = 20, and duration = 5. When the subject finished watching a video, their eye-tracking data were recorded as a heatmap by the VR eye-tracking system; based on these data, the researcher can analyze the subject's areas of attention.

5.2. Experimental Participants and Material

The experiment was conducted with 47 subjects, as shown in Figure 15. The subjects were undergraduate and graduate students of Changwon National University, recruited through notice boards in the school and messenger announcements. The subjects' gender, age, use of virtual reality devices, and experience viewing 360-degree VR videos were investigated with a demographic questionnaire before the experiment, and informed consent was obtained from all participants. The subjects comprised 22 males and 25 females in their 20s and 30s (20s: 45 subjects; 30s: 2 subjects). In total, 85.1% of the subjects (40 people) had virtual reality device experience, and 63.8% (30 people) had 360 VR video experience.
From the 35 analyzed videos, one video per attention-attracting element was selected as the experimental stimulus: <SCHOOL OF ROCK: The Musical—"You're in the Band"> [28] for object movement, <SCANDOL 360: EXID_HOT PINK> [29] for hand gesture, <Naive New Beaters—Heal Tomorrow—Clip 360° ft. Izia> [31] for GUI insertion, <OneRepublic—Kids (360 version)> [37] for camera movement, and <Trevor Wesley—Chivalry is Dead> [33] for gaze angle variation. To provide equal experimental conditions to the subjects, all five videos were presented with a frame width of 1280 and height of 720.

5.3. Data Analysis Methods and Experimental Results

The experimental results were analyzed by establishing a standard for determining whether a subject performed an attention movement matching each of the five attention-attracting elements. The recorded heatmap of each subject was used to analyze whether the subject watched according to the attention-attracting elements. Subjects who moved their attention according to an attention-attracting element within the section where the element appeared were counted as valid data. Validity was judged by two criteria: the time of the attention movement, used to check whether the subject moved their attention when the element appeared, and the direction of the attention movement, used to check whether the subject moved their attention in the direction indicated by the element. The number of subjects whose attention movements satisfied both criteria was reported as a percentage of all subjects.
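As a concrete reading of these two criteria, the tally below sketches how the percentage of valid data could be computed for one attention-attracting element. The record fields are hypothetical names for the two criteria; the judgments themselves are read off the recorded heatmaps.

```python
# One record per subject; both criteria must hold for the data to be valid:
# the attention shift happened while the element was on screen (time), and
# it went in the direction the element indicated (direction).
subjects = [
    {"moved_during_element": True,  "moved_in_element_direction": True},
    {"moved_during_element": True,  "moved_in_element_direction": False},
    {"moved_during_element": False, "moved_in_element_direction": False},
    # ... one entry per subject, judged from the recorded heatmap
]

valid = sum(
    s["moved_during_element"] and s["moved_in_element_direction"]
    for s in subjects
)
print(f"valid data: {valid}/{len(subjects)} = {100 * valid / len(subjects):.1f}%")
```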
In the object movement experiment, the valid data amounted to 72.3% (34 subjects). The analysis shows that the subjects focused their attention on objects A and B at the beginning of the video, as shown in Figure 16a. As object A starts to move, as shown in Figure 16b, the subject's attention moves along with object A, as shown in Figure 16c. If the subject's attention reaches the position of object C, as shown in Figure 16d, the subject can be said to have moved along with the object movement element.
In the hand gesture experiment, the valid data amounted to 78.7% (37 subjects). The analysis shows that the subject first focuses on object A, as shown in Figure 17a, and then objects A and B point toward object C, as shown in Figure 17b. If the subject moves their attention to object C along with the gestures of objects A and B, the subject can be said to have moved along with the hand gesture element.
In the GUI insertion experiment, the valid data amounted to 29.8% (14 subjects). The analysis shows that the subject's attention is focused on object A before the GUI is inserted, as shown in Figure 18a. After the marked arrow GUI is inserted, as shown in Figure 18b, the attention moves to the GUI, as shown in Figure 18c. If the subject's attention then moves in the direction of the arrow GUI, as shown in Figure 18d, the subject has moved along with the GUI insertion element.
In the camera movement experiment, the valid data amounted to 91.5% (43 subjects). The analysis shows that the subject's attention is focused on object A before the camera moves, as shown in Figure 19a. As the video plays, the camera starts to move, the subject's attention moves from object A to the newly visible space, as shown in Figure 19b, and then to object B, as shown in Figure 19c; in this case, the attention moves along with the camera movement element.
In the gaze angle variation experiment, the valid data amounted to 72.3% (34 subjects). The analysis shows that at the beginning of the video, as shown in Figure 20a, most subjects' attention is fixed on object A, positioned in the center, because the gaze angle the subjects can see is relatively narrow. As the video plays, the gaze angle widens, as shown in Figure 20b-d; when the subject's attention moves along with the edge of the widening gaze angle, the attention can be said to move along with the gaze angle variation element.

6. Discussion and Conclusions

This paper proposed the need to identify directing techniques that can attract users' attention in 360-degree VR videos and analyzed 360-degree VR music videos to classify the attention-attracting elements into object movement, hand gesture, GUI insertion, camera movement, and gaze angle variation. An eye-tracking system for a VR environment was developed, and experiments were conducted to analyze whether the derived elements attract users' attention in 360-degree VR videos. The results obtained from tracking the subjects' gaze are summarized per attention-attracting element in Figure 21: the horizontal axis of the graph represents the five attention-attracting elements, the vertical axis represents the valid data, and the data judged valid in each experiment were as follows: object movement—34 subjects (72.3%), hand gesture—37 subjects (78.7%), GUI insertion—14 subjects (29.8%), camera movement—43 subjects (91.5%), and gaze angle variation—34 subjects (72.3%).
The results showed that the hand gesture element ranked second in attracting the subjects' attention, after camera movement. In the video using the hand gesture element, several objects appear and attract the subjects' attention; because the objects directly indicate where the subject should look, many subjects shifted their attention.
The video using the GUI insertion element caused the least eye movement among the subjects. This is because the exposure time of the GUI was very short and therefore not easily noticeable, causing many subjects to miss it. In such cases, increasing the exposure time or the number of GUI elements, or using a more noticeable color, could draw user attention more clearly. However, because the director's intentional effort to attract attention may cause visual discomfort or motion sickness, a directing technique that utilizes the spatial characteristics of VR is required [8].
This study has several limitations that call for further research. The 360-degree VR videos provided to the participants in the experiment were not randomized, so this aspect of the experimental design needs to be supplemented. In addition, the scope of this research on eye-tracking systems in a VR environment was limited to 360-degree VR music videos. In the future, the scope of VR content will be expanded, broadening the horizons for studies on VR space planning and directing.

Author Contributions

Methodology, H.C. and S.N.; writing—original draft preparation, H.C.; writing—review and editing, H.C. and S.N.; project administration, S.N. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2020R1I1A3051739).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Ban, Y.; Xie, L.; Xu, Z.; Zhang, X.; Guo, Z.; Wang, Y. Cub360: Exploiting cross-users behaviors for viewport prediction in 360 video adaptive streaming. In Proceedings of the 2018 IEEE International Conference on Multimedia and Expo (ICME), San Diego, CA, USA, 23–27 July 2018; pp. 1–6.
2. Dooley, K. Storytelling with virtual reality in 360-degrees: A new screen grammar. Stud. Australas. Cine. 2017, 11, 161–171.
3. Powell, W.; Powell, V.; Brown, P.; Cook, M.; Uddin, J. Getting around in Google Cardboard—exploring navigation preferences with low-cost mobile VR. In Proceedings of the 2016 IEEE 2nd Workshop on Everyday Virtual Reality (WEVR), Greenville, SC, USA, 20 March 2016; pp. 5–8.
4. Qian, F.; Ji, L.; Han, B.; Gopalakrishnan, V. Optimizing 360 video delivery over cellular networks. In Proceedings of the 5th Workshop on All Things Cellular: Operations, Applications and Challenges, New York, NY, USA, 3–7 October 2016; pp. 1–6.
5. Maranes, C.; Gutierrez, D.; Serrano, A. Exploring the impact of 360° movie cuts in users’ attention. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 73–82. Available online: https://ieeevr.org/2020/ (accessed on 15 October 2021).
6. Mateer, J. Directing for cinematic virtual reality: How the traditional film director’s craft applies to immersive environments and notions of presence. J. Media Pract. 2017, 18, 14–25.
7. Sheikh, A.; Brown, A.; Evans, M.; Watson, Z. Directing attention in 360-degree video. In Proceedings of the IBC 2016 Conference, Amsterdam, The Netherlands, 8–12 September 2016; pp. 1–9.
8. Fearghail, C.O.; Ozcinar, C.; Knorr, S.; Smolic, A. Director’s cut-analysis of aspects of interactive storytelling for VR films. In Interactive Storytelling; Springer: Berlin/Heidelberg, Germany, 2018; pp. 308–322.
9. Clay, V.; König, P.; König, S.U. Eye tracking in virtual reality. J. Eye Mov. Res. 2019, 12, 1–18.
10. Majaranta, P.; Bulling, A. Eye Tracking and Eye-Based Human–Computer Interaction; Springer: Berlin/Heidelberg, Germany, 2014.
11. Duchowski, A.T. Eye Tracking Methodology; Springer: London, UK, 2017.
12. Poole, A.; Ball, L.J. Eye tracking in HCI and usability research. In Encyclopedia of Human Computer Interaction; IGI Global: Hershey, PA, USA, 2005; pp. 211–219.
13. Duchowski, A.T.; Shivashankaraiah, V.; Rawls, T.; Gramopadhye, A.K.; Melloy, B.J.; Kanki, B. Binocular eye tracking in virtual reality for inspection training. In Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, Palm Beach Gardens, FL, USA, 6–8 November 2000; pp. 89–96.
14. Piumsomboon, T.; Lee, G.; Lindeman, R.W.; Billinghurst, M. Exploring natural eye-gaze-based interaction for immersive virtual reality. In Proceedings of the 2017 IEEE Symposium on 3D User Interfaces (3DUI), Los Angeles, CA, USA, 18–19 March 2017; pp. 36–39.
15. Pfeiffer, J.; Pfeiffer, T.; Meißner, M.; Weiß, E. Eye-tracking-based classification of information search behavior using machine learning: Evidence from experiments in physical shops and virtual reality shopping environments. Inf. Syst. Res. 2020, 31, 675–691.
16. Redefining the Axiom of Story: The VR and 360 Video Complex. Available online: https://techcrunch.com/2016/01/14/redefining-the-axiom-of-story-the-vr-and-360-video-complex/ (accessed on 8 December 2021).
17. Tricart, C. Virtual Reality Filmmaking; Taylor & Francis: New York, NY, USA, 2017.
18. Xu, Q.; Ragan, E.D. Effects of character guide in immersive virtual reality stories. In Proceedings of the International Conference on Human-Computer Interaction, Orlando, FL, USA, 26–31 July 2019; pp. 375–391.
19. Danieau, F.; Guillo, A.; Dore, R. Attention guidance for immersive video content in head-mounted displays. In Proceedings of the 2017 IEEE Virtual Reality (VR), Los Angeles, CA, USA, 18–22 March 2017; pp. 205–206.
20. Nielsen, L.T.; Møller, M.B.; Hartmeyer, S.D.; Ljung, T.C.; Nilsson, N.C.; Nordahl, R.; Serafin, S. Missing the point: An exploration of how to guide users’ attention during cinematic virtual reality. In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, New York, NY, USA, 2–4 November 2016; pp. 229–232.
21. Rothe, S.; Buschek, D.; Hußmann, H. Guidance in cinematic virtual reality-taxonomy, research status and challenges. Multimodal Technol. Interact. 2019, 3, 19.
22. Speicher, M.; Rosenberg, C.; Degraen, D.; Daiber, F.; Krüger, A. Exploring visual guidance in 360-degree videos. In Proceedings of the 2019 ACM International Conference on Interactive Experiences for TV and Online Video, Salford, Manchester, UK, 5–7 June 2019; pp. 1–12.
23. Schmitz, A.; MacQuarrie, A.; Julier, S.; Binetti, N.; Steed, A. Directing versus attracting attention: Exploring the effectiveness of central and peripheral cues in panoramic videos. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 63–72.
  24. Noa Neal ‘Graffiti’ 4K 360° Music Video Clip. Available online: https://www.youtube.com/watch?v=LByJ9Q6Lddo&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=1 (accessed on 25 October 2021).
  25. Avicii-Waiting For Love (360 Video). Available online: https://www.youtube.com/watch?v=edcJ_JNeyhg&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=2 (accessed on 25 October 2021).
  26. Yuko Ando /"360° Surround"-MV-(Short Ver.). Available online: https://www.youtube.com/watch?v=zDtETW9Tp7w&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=3 (accessed on 25 October 2021).
  27. INFINITE “Bad” Official MV (360 VR). Available online: https://www.youtube.com/watch?v=BNqW6uE-Q_o&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=4 (accessed on 25 October 2021).
  28. SCHOOL OF ROCK: The Musical—“You’re in the Band” (360 Video). Available online: https://www.youtube.com/watch?v=GFRPXRhBYOI&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=5 (accessed on 25 October 2021).
  29. SCANDOL 360: EXID_HOT PINK (VR). Available online: https://www.youtube.com/watch?v=iv_bNWjsQSM&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=6 (accessed on 25 October 2021).
30. Peggy Hsu. [Grow old with you] 360 VR MV (Official 360 VR). Available online: https://www.youtube.com/watch?v=k2qgvvXhOaI&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=7 (accessed on 25 October 2021).
31. Naive New Beaters-Heal Tomorrow-Clip 360° ft. Izia. Available online: https://www.youtube.com/watch?v=JxVVNm35rJE&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=8 (accessed on 25 October 2021).
  32. Devo-“What We Do” [OFFICIAL MUSIC VIDEO]. Available online: https://www.youtube.com/watch?v=Rr3akQ8XX3M&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=9 (accessed on 25 October 2021).
  33. Trevor Wesley-Chivalry is Dead (Official 360 Music Video). Available online: https://www.youtube.com/watch?v=jb0a9kGoZu0&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=10 (accessed on 25 October 2021).
  34. [M2] 360VR dance: GFriend—Rough. Available online: https://www.youtube.com/watch?v=uAt5srzLt4I&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=11 (accessed on 25 October 2021).
  35. Leah Dou-May Rain Official MV (360VR Version). Available online: https://www.youtube.com/watch?v=gI5g4YNORyo&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=12 (accessed on 25 October 2021).
  36. Run The Jewels-Crown (Official VR 360 Music Video). Available online: https://www.youtube.com/watch?v=JCNzOQ2Ok8s&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=13 (accessed on 25 October 2021).
37. OneRepublic-Kids (360 Version). Available online: https://www.youtube.com/watch?v=eppTvwQNgro&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=14 (accessed on 25 October 2021).
  38. Farruko Ft. Ky-Many Marley-Chillax [360° Official Video]. Available online: https://www.youtube.com/watch?v=c9QkJ6272rs&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=15 (accessed on 25 October 2021).
  39. Imagine Dragons-Shots (Official Music Video). Available online: https://www.youtube.com/watch?v=81fer9ulOeA&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=16 (accessed on 25 October 2021).
  40. Sampha-(No One Knows Me) Like The Piano (Official 360° VR Music Video). Available online: https://www.youtube.com/watch?v=V-ncE-yR8mI&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=17 (accessed on 25 October 2021).
  41. The Range-Florida (Official 360° Video). Available online: https://www.youtube.com/watch?v=L40WLGB7V2w&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=18 (accessed on 25 October 2021).
  42. La La Land Medley in VR!! Sam Tsui & Megan Nicole|Sam Tsui. Available online: https://www.youtube.com/watch?v=VavMLy0QzSA&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=19 (accessed on 25 October 2021).
  43. The Kills-Whirling Eye (Official 360° VR Video). Available online: https://www.youtube.com/watch?v=GMcr4-7hlKs&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=20 (accessed on 25 October 2021).
  44. Depeche Mode-”Going Backwards” (360 Version). Available online: https://www.youtube.com/watch?v=y3GXqq9V9a0&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=21 (accessed on 25 October 2021).
  45. Emri-ADORE-360° (Official Video). Available online: https://www.youtube.com/watch?v=PHqOki9dsgo&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=22 (accessed on 25 October 2021).
  46. Elton John-Farewell Yellow Brick Road: The Legacy (VR360). Available online: https://www.youtube.com/watch?v=0zXZIgnub6w&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=23 (accessed on 25 October 2021).
  47. Mamma Mia! Here We Go Again-Waterloo 360 Music Video. Available online: https://www.youtube.com/watch?v=HRC-A5BaY2Q&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=24 (accessed on 25 October 2021).
  48. The Dandy Warhols “Be Alright” 360° Official Music Video-Shot @ The Dandys’ studio The Odditorium. Available online: https://www.youtube.com/watch?v=peM92qyUy5o&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=25 (accessed on 25 October 2021).
  49. Christopher-Irony (Official VR Music Video). Available online: https://www.youtube.com/watch?v=Yeo8034XisQ&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=26 (accessed on 25 October 2021).
  50. [Official MV] (POY Muzeum)-Space High (Feat. TAKUWA, PUP) (360 VR camera). Available online: https://www.youtube.com/watch?v=fw5lUOMN4dY&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=27 (accessed on 25 October 2021).
  51. Popstar-Reflections (360 VR Experience). Available online: https://www.youtube.com/watch?v=60YGz-gMu-M&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=28 (accessed on 25 October 2021).
  52. Green Screen MV. Available online: https://www.youtube.com/watch?v=qGbHz2Q-YJ8&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=29 (accessed on 25 October 2021).
  53. Nolie-Come Over (Official Music Video). Available online: https://www.youtube.com/watch?v=Ow0C5i1a2pw&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=30 (accessed on 25 October 2021).
  54. DENISE VALLE-REPEAT AFFECTIONS-OFFICIAL 360 MUSIC VIDEO. Available online: https://www.youtube.com/watch?v=nRqP06uX_H0&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=31 (accessed on 25 October 2021).
  55. Gramofone-My Self|@gramoHOME 360°. Available online: https://www.youtube.com/watch?v=1MJmVTuUA34&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=32 (accessed on 25 October 2021).
  56. Joey Maxwell—Streetlights (360 Degree Video). Available online: https://www.youtube.com/watch?v=cC396zPuwTY&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=33 (accessed on 25 October 2021).
  57. Ably House|Grave Song|360° VR. Available online: https://www.youtube.com/watch?v=BrQgMxxgWBY&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=34 (accessed on 25 October 2021).
  58. Noon Fifteen: Easy: 360 Video. Available online: https://www.youtube.com/watch?v=wuz7ZUvR9es&list=PLpwTD9PEgyLMDq5CtdwJ7angHr7SjL9yQ&index=35 (accessed on 25 October 2021).
59. Sipatchin, A.; Wahl, S.; Rifai, K. Eye-tracking for clinical ophthalmology with virtual reality (VR): A case study of the HTC Vive Pro Eye’s usability. Healthcare 2021, 9, 180.
  60. Tobii Pro Spectrum. Available online: https://www.tobiipro.com/ko/product-listing/ko-tobii-pro-spectrum/ (accessed on 4 November 2021).
Figure 1. Directed Case 1 using “Object Movement” (video No. 5). (a) The user’s attention is fixed to object A, (b,c) Object A starts to move in the left direction, (d,e) The user’s attention moves to the left along object A, (f) Objects A and B play instruments together in the position intended by the director.
Figure 2. Directed Case 2 using “Object Movement” (video No. 28). (a–c) Object A moves to the left in the direction of the arrow, (d,e) The user’s attention moves to the left along with object A, (f) The scene where objects A and B have a conversation with each other.
Figure 3. Directed Case 1 using “Hand gesture” (video No. 24). (a) The marked objects gesture in the left direction, (b) The user’s attention moves to the left along with the objects’ gesture, (c) The user can watch other objects, (d) The user can see the marked objects performing choreography in the position intended by the director.
Figure 4. Directed Case 2 using “Hand gesture” (video No. 6). (a) The marked object A is beckoning in the direction of the arrow, (b) Objects A and B are beckoning to the right, (c) The user can see object C, the main character, dancing in the position intended by the director.
Figure 5. Directed Case using “GUI insertion” (video No. 8). (a) The user’s attention fixates on object A, (b) The scene where the GUI is not inserted, (c) The space is dark except for the space where object A plays, (d) The scene where the GUI is inserted, (e) The user’s attention moves to the right along the GUI, (f) The user can see the scene where the lighting is brightened and a new object appears.
Figure 6. Directed Case 1 using “Camera movement” (video No. 23). (a) The camera moves in the direction of the marked arrow, (b) The user’s attention moves to the right along the camera’s movement, (c) The user can see the scene where object A plays the piano in the location intended by the director.
Figure 7. Directed Case 2 using “Camera movement” (video No. 14). (a,b) The camera moves to the left in the direction of the arrow from the space where object A plays, (c) The user moves their attention in the direction where the camera moves, (d) The user can see the marked space, (e) The camera continues to move toward the marked space, (f) The user can see the scene where object B plays, in the position intended by the director.
Figure 8. Directed Case using “Gaze angle variation” (video No. 10). (a) The space except object A is darkened, (b) The screen area is broadened, (c) The user can see the scene in which the entire space intended by the director enters the user’s view.
Figure 9. The 360-degree VR eye-tracking system components.
Figure 10. The 360-degree VR eye-tracking system measurement results. (a) The original video; (b) the result of the users’ eye-tracking heatmap data.
Figure 11. Heatmap comparison based on radius value.
Figure 12. Heatmap comparison based on weight value.
Figure 13. Heatmap comparison based on duration value.
Figure 14. The 360-degree VR eye-tracking system structure.
Figure 15. The 360-degree VR eye-tracking experiment.
Figure 16. Eye movement according to attention-attracting element of “Object movement”. (a) The subjects focused their attention on objects A and B, (b) Object A starts to move, (c) The subject’s attention moves along object A, (d) The subject’s attention moves to the position where object C was positioned.
Figure 17. Eye movement according to attention-attracting element of “Hand gesture”. (a) The subject focuses on object A, (b) The subject moves their attention to object C along with the gestures of objects A and B.
Figure 18. Eye movement according to attention-attracting element of “GUI insertion”. (a) The subject’s attention is focused on object A before the GUI is inserted, (b) The marked arrow GUI is inserted, (c) The user’s attention moves to the GUI, (d) The subject’s attention moves in the direction of the arrow GUI.
Figure 19. Eye movement according to attention-attracting element of “Camera movement”. (a) The subject’s attention is focused on object A before the camera moves, (b) When the camera starts to move, the subject’s attention moves outward from object A, (c) The subject’s attention moves from object A to object B.
Figure 20. Eye movement according to attention-attracting element of “Gaze angle variation”. (a) At the beginning of the video, most subjects’ attention is fixed on object A, which is positioned in the center, (b–d) The subject’s attention moves along with the edge of the widening gaze angle.
Figure 21. The 360-degree VR eye-tracking experiment results.
Table 1. The 360-degree VR music videos.

Video Number  Video Title
1  Noa Neal ‘Graffiti’ 4K 360° Music Video Clip [24]
2  Avicii—Waiting For Love (360 Video) [25]
3  Yuko Ando—360° (Rubi: Zenhoi) Surround [26]
4  INFINITE—“Bad” Official MV (360 VR) [27]
5  SCHOOL OF ROCK—You’re in the Band [28]
6  SCANDOL 360: EXID—HOT PINK (VR) [29]
7  Peggy Hsu—Grow old with you [30]
8  Naive New Beaters—Heal Tomorrow ft. Izia [31]
9  Devo—What We Do [32]
10  Trevor Wesley—Chivalry is Dead [33]
11  [M2] 360 VR dance: GFriend—Rough [34]
12  Leah Dou—May Rain Official MV [35]
13  Run The Jewels—Crown [36]
14  OneRepublic—Kids (360 version) [37]
15  Farruko Ft. Ky-Many Marley—Chillax [38]
16  Imagine Dragons—Shots [39]
17  Sampha—No One Knows Me [40]
18  The Range—Florida (Official 360° Video) [41]
19  La La Land Medley in VR [42]
20  The Kills—Whirling Eye [43]
21  Depeche Mode—“Going Backwards” [44]
22  emri—ADORE—360° (Official Video) [45]
23  Elton John—Farewell Yellow Brick Road [46]
24  Mamma Mia! Here We Go Again—Waterloo [47]
25  The Dandy Warhols—Be Alright [48]
26  Christopher—Irony (Official VR Music Video) [49]
27  POY Muzeum—Space High [50]
28  Popstar—Reflections (360 VR Experience) [51]
29  Green Screen MV [52]
30  Nolie—Come Over (Official Music Video) [53]
31  DENISE VALLE—REPEAT AFFECTIONS [54]
32  Gramofone—My Self|@gramoHOME 360° [55]
33  joey maxwell—streetlights (360 degree video) [56]
34  Ably House|Grave Song|360° VR [57]
35  Noon Fifteen: Easy: 360 video [58]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
