Article

An Augmented Reality-Based Mobile Application Facilitates the Learning about the Spinal Cord

1 Northeast Biotechnology Network, Federal University of Piauí, Teresina 64049-550, Brazil
2 Neuro-innovation Technology & Brain Mapping Laboratory, Federal University of Delta do Parnaíba, Av. São Sebastião n 2819-Nossa Sra. de Fátima-Parnaíba, Parnaíba 64202-020, Brazil
3 Federal Institute of Maranhão, Araioses 65570-000, Brazil
* Author to whom correspondence should be addressed.
Educ. Sci. 2020, 10(12), 376; https://doi.org/10.3390/educsci10120376
Submission received: 3 November 2020 / Revised: 1 December 2020 / Accepted: 9 December 2020 / Published: 12 December 2020
(This article belongs to the Special Issue Smartphones: Challenges and Opportunities for Learning)

Abstract

Health education is one of the knowledge areas in which augmented reality (AR) technology is widespread, and it has been considered a facilitator of the learning process. In the literature, there are still few studies detailing the role of mobile AR in neuroanatomy. Specifically, for the spinal cord, the teaching–learning process may be hindered due to its abstract nature and the absence of three-dimensional models. In this sense, we implemented a mobile application with AR technology, named NitLabEduca, for studying the spinal cord with an interactive exploration of 3D rotating models at the macroscopic scale, theoretical content on its specificities, animations, and simulations regarding its physiology. To investigate NitLabEduca's effects, eighty individuals with and without previous neuroanatomy knowledge were selected and grouped into control and experimental groups. Divided into these groups, they performed learning tasks assessed through a questionnaire. We used the System Usability Scale (SUS) to evaluate the usability level of the mobile application and a complementary survey to verify the level of adherence to the use of mobile applications in higher education. As a result, we observed that participants of both groups who started the task with the application and finished with text had more correct results in the test (p < 0.001). SUS results were promising in terms of usability and the learning factor. We concluded that studying the spinal cord through NitLabEduca seems to favor learning when used as a complement to the printed material.

1. Introduction

The evolution of mobile technology has brought changes to education [1,2]. Initially, the advances were centered on devices such as Personal Digital Assistants (PDAs), tablets, notebooks, and mobile phones [3]; then, the goal was to allow students to learn outside the classroom [4]; and later, the concept of active spaces emerged, wherein student mobility is related to the real environment context, characterized by the use of mixed reality in learning [5]. The blending of virtual and real environments enabled immersion and interactivity through 3D displays, high-resolution graphics, and animations, so applications with this technology have been implemented as supplements to printed books [6].
Universities usually provide environments conducive to experiencing new educational processes, such as the concept of mobile learning (m-learning), which is characterized by autonomy, collaboration, and mobility among individual students [2]. Learning objects implemented for mobile devices are increasingly widespread in education. In the health area, augmented reality (AR) technology is one of the possible facilitators for learning human anatomy. Three-dimensionality increases student interaction with the study object [7,8], reduces cognitive overload, and allows the student to be an active, more dynamic subject during the learning process [9].
One of the difficulties of teaching, especially in anatomy, is migrating from theory to the student's clinical practice. Technologies have minimized the impact of this transition with AR and 3D printing applications [10], since clinical practice training with real patients carries risks, one of the motivating factors for adopting these tools in the health learning process [11]. In the traditional anatomical study with "wet" anatomical pieces complemented with synthetic molds [12], the costs of these materials present a difficulty [13], which also drives the search for new technological solutions for the teaching–learning process [14].
In that context, there are several benefits of mobile AR technology for education, such as stimulating collaboration and social interaction, enabling mobility and reducing costs as regards the study environment, since it allows students to learn outside the traditional teaching space [15]. It also provides spatial learning by promoting interaction and immersion by means of three-dimensional images, facilitating the understanding of abstract concepts. For example, the difficulty in learning dental morphology through 2D materials led the authors in [16] to propose a solution with mobile AR technology to improve the learning of anatomical factors of teeth; they achieved positive results for the students in terms of usability, flexibility, and satisfaction in the educational process.
In fact, with the evolution of hardware and software technologies, there is a tendency to create educational applications in the health area with three-dimensional modeling and reconstruction of anatomical structures with AR, as well as for 3D printing [17]. In this scenario, research with early-years medical students in the discipline of neuroanatomy has made evident the difficulty of learning the spinal cord due to its abstract nature and the lack of anatomical models. Moreover, there is a need for material supplementary to the educational process [18]. In this study, we hypothesized that a mobile AR application could favor the understanding of anatomical structures, which could facilitate the learning process regarding the spinal cord, thereby reducing the use of synthetic anatomical models, since they are more expensive.
In this sense, to evaluate the use of an AR solution as an auxiliary means in the teaching–learning process of the spinal cord, we developed a mobile application with AR technology, named NitLabEduca, that uses the interactive exploration of 3D rotating models at the macroscopic scale, animation, and theoretical content. Furthermore, we investigated the effects of the technology proposed in this study on the learning process for the spinal cord’s topographic and functional anatomy. We hope to improve student learning by using technological resources with a three-dimensional virtual dissection model of the spinal cord through spatial abstraction. Additionally, our proposed solution provides a low-cost synthetic model in a format printed on 3D printers.
The remainder of the paper is organized as follows. Section 2 presents preliminary information needed to understand this study. Section 3 focuses on research materials and methods. Section 4 presents the results, and Section 5 discusses and correlates them with the theory, acknowledges limitations, and outlines future research. Section 6 concludes our work.

2. Preliminaries

In this section, we first present our motivations for conducting this study. Second, we present the related works. Finally, we discuss the contributions of this paper, and show the research questions adopted by this study.

2.1. Motivation

Augmented reality is a technology that allows integrating components of the virtual world with physical reality in real time [19,20]. There is still no consensus on the definitions of the different interaction technologies such as AR, virtual reality (VR), and mixed reality (MR). The acronym "XR" has been used to refer to new and innovative forms of realities [21]; it can stand for terms such as extended or expanded reality, or the letter "X" can simply mean "anything". Therefore, AR is a generic term that can be seen as a subcategory of XR and seeks to create an environment where users cannot distinguish between virtual and real objects [21].
Augmented reality technology has been used in different contexts, including in the health domain [22]. For example, health professionals have used AR in clinical practice with effectiveness both in preoperative planning and during surgical intervention processes [23]. Professionals have also used AR-assisted telemedicine to virtually reconstruct the human body of a patient using a 3D model [24].
Augmented reality technology is especially useful in health education [25]. Health professionals have used AR to learn the theory and practice of procedural skills through remote learning and teaching supervision during clinical practice [26]. In addition, AR technology provides a simulation environment that can reduce complications in the initial practices of health students. AR-based solutions have assisted medical residents and future surgeons in virtually learning surgical techniques before, during, and after the procedures through simulations, since on-site training in the operating room exposes patients to risks due to lack of student experience [27,28]. These solutions enable the balance between patient safety and the educational background of the medical professional, and enable the teaching of complex skills in a controlled environment, which allows students to make mistakes without the adverse consequences of real life. Moreover, the learning with 3D technologies can increase motivation, student involvement, and spatial knowledge representation [29].
Technology has revolutionized the representation of the human body [12]. Medical students use AR for three-dimensional comprehension of human anatomy in an interactive way [30], since this technology seems to reduce cognitive overload during the learning process [9]. Three-dimensional anatomical structures are models that aid in the spatial interpretation of forms. The perspective of 2D images is restricted to length and width, limiting the student to abstraction, and although 3D images add depth, they are limited by not interacting with the user [31,32]. On the other hand, mobile devices allow for the continuous updating of the contents of health practice through self-directed learning more quickly, even outside the traditional teaching environment [2,33]. In this context, mobile AR technologies have the potential to stimulate spatial learning by materializing abstract concepts and simplifying the understanding of the content [34].

2.2. Related Work

There are different studies focused on m-learning [2] and mobile AR [35]. However, to the best of our knowledge, few studies have been developed on mobile AR technologies for human anatomy education.
Representing three-dimensional anatomical structures is not an easy task in education, but such models assist in the spatial interpretation of forms and describe the human body. Traditionally, anatomical teaching is performed with real or synthetic cadavers found in laboratories and classrooms in educational institutions. Ferrer-Torregrosa et al. [12] developed an augmented reality tool called ARBOOK, which provides content related to the anatomy of the lower limbs, helping students to learn the content independently, at desired times and places. The results demonstrated its usefulness for motivating students to learn autonomously and interpret spatial information.
The magic mirror system [36] is an augmented reality application that virtually displays the student’s anatomical organs, and enables interaction and immersion. As a result, students felt motivated to learn human anatomy. Another application with the same purpose, a magic mirror system [30], demonstrated a better three-dimensional understanding of the human body’s anatomical structures.
Birt et al. [22] examined two interventions with AR and VR technologies in two higher education classrooms. That study concentrated on student perceptions for learning physiology and anatomy, and skill acquisition in airway management. In the first classroom, the use of Oculus Rift resulted in greater student involvement and minimal distraction compared to a mobile AR application. In the second one, these technologies provided a significant improvement in student performance by using simulations.

2.3. Discussion and Research Questions

In contrast to the related studies, we focus on using a mobile AR application for human neuroanatomy education—specifically the process of learning the topographic and functional anatomy of the spinal cord. Our objective by performing this study was to develop NitLabEduca and evaluate its performance as an auxiliary means in the teaching–learning process of the spinal cord. Therefore, the contribution of this paper is threefold: (i) we propose and detail the mobile AR application NitLabEduca; (ii) we present the results of an experimental evaluation that compared the use of printed material and the proposed application to acquire knowledge; and (iii) we identify the usability and the learning ability factor of the proposed application. For those ends, we defined the following research questions (RQ):
  • (RQ1) Can NitLabEduca improve the teaching–learning process of the spinal cord?
  • (RQ2) What is the usability performance of NitLabEduca?
  • (RQ3) What is the learning ability factor of NitLabEduca?

3. Methodology

3.1. Research Characterization

This research is an experimental, comparative study that assessed the use of printed material and a mobile AR application as means of acquiring knowledge, through questionnaires applied to volunteers (i.e., students) after a learning task. To this end, NitLabEduca was implemented for studying the ascending and descending pathways of the spinal cord. NitLabEduca is a mobile application that uses AR technology and enables users to interact with a three-dimensional model of the spinal cord on a macroscopic scale during the learning process. The proposed application also has test resources that evaluate users' results in a statistical performance model.

3.2. Sampling

The study participants were 80 students from the neuroscience class in the physiotherapy course at the Federal University of Piauí (UFPI), Brazil. The inclusion criteria were: students of the neuroscience discipline in the physiotherapy course at UFPI, men and women, aged between 18 and 25 years. The exclusion criteria were: individuals with a history of biological determinants that could change results (e.g., psychotropic medications, fatigue, and alteration of body temperature); uncorrected audiovisual impairments; individuals with severe impairments when moving hands or fingers; individuals with thinking disorders (i.e., hallucinations); other neurological disorders and severe psychiatric disorders; musculoskeletal conditions that could cause bias; and individuals with global cognitive deterioration.
Participants signed a statement with details of the experimental conditions, the study objective, and informed consent terms. Data collection procedures were initiated only after prior approval by the Ethics and Research Committee of the UFPI (number 3,683,221).

3.3. Experimental Procedure

We gathered 40 participants in a classroom in the UFPI and organized them on school chairs. We divided them into two groups according to Figure 1: a group (n = 20) composed of subjects with previous knowledge of the spinal cord in the discipline (GPK) and another (n = 20) without prior knowledge of the spinal cord (GWK). Forty additional students repeated this procedure, giving a total of 80 participants in this experiment. All of them performed learning tasks in the classroom.
We treated the learning conditions as a crossover. That is, we conducted the experiment in two phases for both conditions, as illustrated in Figure 2. In the first phase, members of condition A received printed material, and members of condition B received a tablet with NitLabEduca installed and available for use. The mobile application and the printed material covered the same content in different formats: the spinal cord and its ascending and descending pathways. Before the initial study period, we gave participants in condition B 10 min to get to know the mobile device and the proposed software. At the end of 45 min of study in both conditions of the first phase, students took a 15 min test in which they had to describe the spinal cord's structures as presented in the materials made available for both conditions.
In the second phase (Figure 2), groups reversed the study conditions. In condition A, students received tablets, and members of the group in condition B received printed material. Similarly to the first phase, we allowed the participants in the group of condition A to get to know the mobile device and AR software for 10 min. After 45 min in the second phase, we applied a test in both conditions for 15 min. Similarly, they also had to describe the spinal cord structures as presented in the printed material and AR.
At the end of the second phase, participants answered the System Usability Scale (SUS) questionnaire in Portuguese language [37] according to Figure 2. This scale allowed us to subjectively evaluate the usability of the mobile application [38,39]. Besides usability, authors in [40] showed how to evaluate the learning ability factor through the analysis of items 4 and 10 of the SUS questionnaire.
The scale is composed of 10 items, each answered on a Likert scale from 1 to 5 points: 1 (totally disagree), 2 (disagree), 3 (neutral), 4 (agree), and 5 (totally agree) [41]; each item contributes between 0 and 4 points to the score. The SUS maximum score is 100 points. Equation (1) is used to calculate the SUS score [42].
SUS = 2.5 × [ Σ(odd-item scores − 1) + Σ(5 − even-item scores) ]  (1)
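As a worked sketch of Equation (1), the SUS scoring can be implemented in a few lines; the function name `sus_score` is ours, chosen for illustration, not part of the study's materials:

```python
def sus_score(responses):
    """Compute the SUS score (0-100) from ten Likert responses (1-5).

    Odd-numbered items are positively worded and contribute (score - 1);
    even-numbered items are negatively worded and contribute (5 - score).
    The summed contributions (0-40) are scaled by 2.5, so the maximum is 100.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses between 1 and 5")
    total = 0
    for i, score in enumerate(responses, start=1):
        total += (score - 1) if i % 2 == 1 else (5 - score)
    return 2.5 * total
```

For example, a respondent answering 5 to every odd item and 1 to every even item reaches the maximum score of 100.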
We prepared a complementary questionnaire for the purpose of gathering information to assist the research analysis and to verify the level of student adherence to the use of mobile applications in higher education. The questionnaire had the following questions (questions were originally written in Portuguese language and then translated to English in this manuscript): (1) Do you have a smartphone? (2) Do you like neuroanatomy? (3) Have you already used mobile software to study? (4) Do you believe that, by using mobile educational software, you understand better? (5) Is the possibility of using software at any time and anywhere interesting? (6) Would you use another mobile educational software to help your studies?

3.4. The Mobile Educational Application NitLabEduca

In this section, we first present the tools used to develop NitLabEduca. Next, we describe its main proposed features.

3.4.1. Implementation Aspects

For the NitLabEduca implementation, we used the software 3ds Max (https://www.autodesk.com/products/3ds-max/overview) for the three-dimensional modeling of the spinal cord; this tool can model 3D objects [43]. After applying texture to the object, we performed the image modeling and rendering process. We exported the spinal cord model in the Wavefront OBJ 3D file format. We manipulated the 3D objects in a project supported by the engine Unity (https://unity.com/), a software tool for developing 3D games on multiple platforms. We took advantage of the extension Vuforia (https://developer.vuforia.com/) to create mobile augmented reality environments and QR codes (https://www.qrcode.com/en/index.html) [44]. This extension helped us to implement the spinal cord animations with user interaction [45] via scripts written in the C# language. We used the integrated development environment Android Studio to develop the mobile application, which also enabled access to the questionnaires about the spinal cord and user performance statistics. The mobile application's database was Google Firebase.
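To illustrate the Wavefront OBJ format used for the exported model, the sketch below parses vertices and triangular faces from a minimal OBJ string. This is an illustrative Python example only (the actual pipeline used 3ds Max and Unity), and `parse_obj` is a hypothetical helper, not part of NitLabEduca:

```python
def parse_obj(text):
    """Parse vertices and faces from a minimal Wavefront OBJ string.

    Only 'v' (vertex) and 'f' (face) records are handled; face entries may
    use the 'v/vt/vn' index syntax, from which we keep the vertex index only.
    """
    vertices, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            vertices.append(tuple(float(x) for x in parts[1:4]))
        elif parts[0] == "f":
            # OBJ indices are 1-based; convert to 0-based vertex indices
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:]))
    return vertices, faces
```

A real loader (e.g., Unity's importer) also handles normals, texture coordinates, and materials; the sketch only shows the record structure of the format.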

3.4.2. Features

The mobile AR application enables the study of the spinal cord and its ascending and descending pathways through user interaction with rotating 3D models. On the first screen (Figure 3a), users can log in or register in the NitLabEduca application to access the study material. After logging in, different menu options are available (Figure 3b,c).
The button Medula Espinal (i.e., "spinal cord" in Portuguese) enables access to the complete image of the spinal cord with the possibility of visualizing individual pathways (Figure 4). This feature allows users to interact with the three-dimensional object by manipulating the whole spinal cord or the parts that compose it. Users can increase and reduce the size of the object. Each pathway of the spinal cord has theoretical content naming it and describing its technical aspects. All pathways can be studied individually (see the video in the supplementary material).
The button Quiz allows users to answer questions related to the covered content (Figure 5a). This feature is a questionnaire with multiple-choice questions (four answers each) that aims to assess knowledge about the spinal cord. The button Estatística (i.e., "statistics" in Portuguese) displays quantitative measurements of the user's performance in answering the quiz available in the application (Figure 5b). This feature is useful in that it allows users to check their progress in understanding the spinal cord. The button Print 3D provides a file in STL format containing the 3D model of the spinal cord, which can be printed on 3D printers. We sliced this model into several layers and provided the resulting file in NitLabEduca. This feature is useful when users prefer contact with real, physical objects.

3.5. Statistical Analysis

We performed a three-way ANOVA with group (with previous knowledge vs. without previous knowledge) as the inter-subject factor and condition (student started with printed material vs. student started with NitLabEduca) and moment (first test vs. second test) as intra-subject factors. We investigated the two-factor interactions using Student's t-tests. The effect size was estimated as partial eta squared (η²p) in the ANOVA analysis and as Cohen's d for the Student's t-tests. We used Mauchly's test to evaluate the sphericity hypothesis and the Greenhouse–Geisser procedure (G–G ε) to correct degrees of freedom. Data normality and homoscedasticity were previously verified with the Shapiro–Wilk and Levene tests. We calculated the statistical power and the 95% confidence interval (95% CI) for the dependent variables. We interpreted statistical power as low from 0.1 to 0.3 and high from 0.8 to 0.9. We interpreted effect magnitude using the recommendations suggested by [46]: insignificant < 0.19; small from 0.20 to 0.49; medium from 0.50 to 0.79; large from 0.80 to 1.29. We applied a Bonferroni correction of alpha for the interaction analyses, adjusting the significance threshold to p ≤ 0.025. We conducted all analyses using SPSS for Windows version 20.0 (SPSS Inc., Chicago, IL, USA).
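For reference, the paired Student's t statistic and Cohen's d for paired samples used in the interaction analyses can be sketched as below. This is an illustrative Python implementation, not the study's procedure (the analyses were run in SPSS), and the function names are ours:

```python
import math

def paired_cohens_d(before, after):
    """Cohen's d for paired samples: mean of the differences
    divided by the standard deviation of the differences."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    return mean_d / sd_d

def paired_t(before, after):
    """Paired-samples t statistic with n - 1 degrees of freedom."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    return mean_d / (sd_d / math.sqrt(n)), n - 1
```

Note that for paired data the two statistics are linked: t = d × √n, so either can be recovered from the other given the sample size.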

4. Results

4.1. Number of Hits

The three-way ANOVA showed an interaction between the condition and moment factors (F(1,152) = 15.897, p < 0.001, η²p = 0.10, power = 98%). In the interaction analysis, the paired t-test demonstrated that there was no statistically significant difference between the second test moments for subjects who started with NitLabEduca and finished with printed material (p > 0.05). On the other hand, a statistically significant difference was observed between the first and second test moments; t(38) = 7.616, p < 0.001, d = 0.26. Additionally, for subjects who started the study with printed material and finished with NitLabEduca, no statistically significant difference was observed (p > 0.05), whereas for those who started with NitLabEduca and ended with printed material, a statistically significant difference was found; t(38) = 9.894, p < 0.001, d = 0.27. These findings indicate that subjects who started the study with NitLabEduca and finished with printed material increased the number of hits in the test by 27%. In comparison, those who started with printed material and finished with NitLabEduca increased the hits by 11% (Figure 6).

4.2. NitLabEduca Usability

Table 1 presents the participants' answers to the 10 items of the SUS questionnaire. Participants responded favorably to the use of the mobile AR application: the scale values 2 (disagree) and 1 (totally disagree) show that most participants (74%) disagreed with the negative statements of the even-numbered questions. Additionally, the values 4 (agree) and 5 (totally agree) show that most respondents (68%) agreed with the positive statements of the odd-numbered questions. On the other hand, some participants were indifferent to both negative and positive statements, identified by the value 3 (neutral): 17% in even questions and 23% in odd ones.
Table 2 presents a descriptive analysis of the score for each of the ten items. Item 5 presented the lowest mean score of 2.50; 49% of the subjects agreed that the functions of the system were well integrated, 26% were neutral, and only 16% disagreed with the positive aspect of the item. Results for item 9 demonstrate that participants felt confident in using the system: 49% agreed and 13% fully agreed with the statement. Although items 8 and 9 have the same mean (2.6), item 8 has the largest standard deviation (1.07), showing a greater spread of the score on this question.
In the descriptive analysis of the SUS scale, the maximum score was 100 points, with a mean of 71, a median of 72.5, a standard deviation of 13.7, and a standard error of 1.53. NitLabEduca had good usability, with a mean score classified as "C" on the grade scale for the SUS questionnaire [47]. Figure 7 represents the distribution of the mean score of the SUS scale, in which scores between 90.1 and 100 corresponded to 5%, from 80.1 to 90 to 21%, and between 70.1 and 80 to 26%. Score distributions for both groups are presented in Figure 8 (GPK) and Figure 9 (GWK), in which each circle represents a participant.
Items 4 and 10 of the SUS, summed and multiplied by 12.5, define the learning ability factor; the other items, summed and multiplied by 3.125, determine the usability factor, according to [40]. Table 3 shows the usability and learning ability of the application. The learning factor shows a mean value of 77, suggesting easy learning through NitLabEduca.
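Following the decomposition above, and assuming the standard SUS item coding (odd items contribute score − 1, even items 5 − score), the two factors can be sketched as follows; `sus_factors` is an illustrative name of ours, not from [40]:

```python
def sus_factors(responses):
    """Split SUS into usability and learnability subscales (per Lewis & Sauro).

    Item contributions follow the standard SUS coding. Items 4 and 10 form
    the learnability factor (sum x 12.5, so the maximum is 100); the other
    eight items form the usability factor (sum x 3.125, maximum also 100).
    """
    contrib = [(s - 1) if i % 2 == 1 else (5 - s)
               for i, s in enumerate(responses, start=1)]
    learnability = (contrib[3] + contrib[9]) * 12.5
    usability = (sum(contrib) - contrib[3] - contrib[9]) * 3.125
    return usability, learnability
```

Both subscales are normalized to the 0-100 range so that they can be compared directly with the overall SUS score.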
Figure 10 presents the score distribution of the learning ability factor. Scores between 90.1 and 100 correspond to 15% of the total, those between 80.1 and 90 correspond to 26%, and scores between 70.1 and 80 correspond to the highest percentage (34%). Scores between 60.1 and 70 correspond to 15%, and 10% had scores below 50, representing unacceptable values [47].

4.3. Complementary Questionnaire

Results showed that 35% of the participants stated that they did not like neuroanatomy teaching, and 72.5% reported that after using the application, they believed they could understand the content better. In the GPK, there were more individuals who liked the discipline (45%), and 67.5% claimed to have learned better after using NitLabEduca. On the other hand, 75% of GWK participants stated that they liked neuroanatomy teaching, and 77.5% of those believed they understood the content better after using the application, according to Figure 11 and Figure 12.

5. Discussion

5.1. Main Findings and Theoretical Discussion

Our findings demonstrated that the NitLabEduca application seems to be more efficient at the beginning of the teaching–learning process, as a complement to the printed material (RQ1). In the experiment, we observed that using NitLabEduca at the beginning of the study facilitated the individual's spatial abstraction in manipulating visual patterns, which indicates a better understanding when passing to the content in printed material, thereby increasing performance in the learning process. We understand that, when participants visualized the structure of the spinal cord and its ascending and descending pathways through interaction with the virtual object, the proposed mobile application stimulated their spatial ability and facilitated theoretical understanding. In this case, participants' spatial ability seems to have been more strongly engaged, which facilitates the absorption of information [48]. Spatial skills at the early stage of the learning path are significant to the theory of knowledge [6]. In this context, the dynamic three-dimensional visual stimuli provided by NitLabEduca seem to favor learning.
The results demonstrated that stimulation to spatial ability is fundamental for learning spinal cord pathways, content covered in the neuroanatomy discipline. This fact was also observed by [13] during the learning process in the anatomy discipline. Therefore, representation of three-dimensional anatomical structures as models to aid in the spatial comprehension of forms is a factor that favors learning [12]. In this context, the NitLabEduca application’s performance was positively evaluated due to the positive impact of the three-dimensional teaching–learning model observed in our results, demonstrating the influence of this model on neuroanatomy education compared to the traditional teaching process [14].
In health education, the acquisition of knowledge is characteristically experiential, self-sufficient, and practical [32]. NitLabEduca brings together these characteristics, and in this sense, students can take advantage of it to acquire knowledge related to the spinal cord, because it allows users to combine physical world experiences with the virtual environment. These characteristics are in accordance with the constructivist learning theory [49,50] and the experiential learning theory [51].
The findings for the SUS, which assessed subjective feelings and satisfaction levels of individuals towards NitLabEduca, demonstrated good usability of the application (RQ2), which is considered adequate for the teaching–learning process in the interpretation of the SUS scores [39]. They demonstrated that the proposed mobile application is easy to use, allowing students to focus attention on the proposed study topic [52] and thus contributing to the effectiveness of the learning process [53]. According to the SUS, NitLabEduca allowed students to immerse themselves in the content and interact with it, and demonstrated good applicability in the learning process [40] (RQ3). Therefore, it can be understood as an evolution of the traditional teaching–learning model of the neuroanatomy discipline, which migrated from two-dimensional images in printed matter to AR 3D technology and digital dissection of organs of the human body [9,54].
The popularization of smartphones and their multifunctionality, omnipresence, and portability influenced the propagation of m-learning [55] and contributed to its diffusion among the "born digital" generation, which has technologies integrated into everyday life [56]. Mobile devices can incorporate AR, which assists immersion and interactivity through 3D animation [6] and enables the creation of various educational materials to support and facilitate learning [57]. This is in line with the cognitive theory of multimedia learning [58], since the use of mobile AR in the teaching–learning process involves realistic experiences through visual simulations [6].

5.2. Limitations and Future Work

Despite the results of the research, we noted that during the NitLabEduca evaluation, participants indicated some limitations. The most evident one was related to running the application and visualizing three-dimensional images on heavy tablets. To visualize the spinal cord, a direct line of sight between the camera of the mobile device and a QR code was necessary. This procedure seemed to cause muscular discomfort in students, since the tablets used in the research weighed 500 g and the task required handling the device for 45 min, which likely affected the perceived usability of the application. Indeed, Lee et al. [59] demonstrated the impact of tablet use on physical problems in the user's body, which can cause stiffness, pain, and discomfort in the back and shoulders. Further studies using lighter devices and shorter tasks, addressing the spinal cord pathways and other anatomical structures, may provide new insights into the applicability of AR in teaching and learning.
Of the individuals with prior knowledge of the neuroscience discipline, 13 were repeating the course and declared in the complementary questionnaire that they did not like neuroanatomy. These students' SUS evaluations of NitLabEduca were below the acceptable mean [47]. This outcome seems to have been influenced by the group's situation as repeating students and their possible difficulty with the topic studied. It may be related to the cognitive process being inseparable from motivated reasoning [60], in which emotions are inseparable from mental processes and motivated reasoning overrides evidence. Future research could assess whether the use of AR applications can generate positive inspiration [61] in neuroanatomy students, thus impacting their perceptions during the teaching–learning process.
Additional research may focus on comparative experiments that also consider 3D printed models of the spinal cord, in addition to mobile AR applications and printed material. Conducting this comparison would deepen our comparative analysis and answer additional research questions. In fact, the modeling and reconstruction of anatomical structures for 3D printing is a trend in supporting education [17]; therefore, adding an evaluation with 3D printed models would enrich the analysis [62,63].

6. Conclusions

The aim of this study was to develop a mobile AR solution focused on studying the spinal cord and to evaluate its use as an auxiliary means in the teaching–learning process. For this purpose, we implemented NitLabEduca and performed an experimental evaluation with 80 neuroanatomy students to identify its effects on learning and assess the usability of the proposed application.
From our research, we conclude that studying the spinal cord with NitLabEduca favors learning and may complement the traditional teaching–learning model, enhancing the knowledge acquisition process. NitLabEduca proved to be a resource with potential for both spatial abstraction and functional understanding of the spinal cord pathways. Moreover, it demonstrated good usability in the teaching–learning process and can supplement printed material, enriching the training of students and health professionals.

Supplementary Materials

The following is available online at https://www.mdpi.com/2227-7102/10/12/376/s1, Video S1: 3D Spinal Cord using Augmented Reality.

Author Contributions

Conceptualization, J.F., A.T. and S.T.; methodology, J.F., A.T. and S.T.; software, J.F.; validation, J.F. and S.T.; formal analysis, J.F. and S.T.; writing, J.F., A.T. and S.T.; supervision, S.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

The authors would like to sincerely thank the students of UFPI for voluntary participation in this study.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AR	Augmented Reality
UFPI	Federal University of Piauí
PDAs	Personal Digital Assistants
SUS	System Usability Scale

References

  1. Teri, S.; Acai, A.; Griffith, D.; Mahmoud, Q.; Ma, D.W.L.; Newton, G. Student use and pedagogical impact of a mobile learning application. Biochem. Mol. Biol. Educ. 2014, 42, 121–135. [Google Scholar] [CrossRef] [PubMed]
  2. Sung, Y.T.; Chang, K.E.; Liu, T.C. The effects of integrating mobile devices with teaching and learning on students’ learning performance: A meta-analysis and research synthesis. Comput. Educ. 2016, 94, 252–275. [Google Scholar] [CrossRef] [Green Version]
  3. Cheung, S.K.S. A survey on the use of mobile devices for learning purposes. Int. J. Innov. Learn. 2014, 16, 192–202. [Google Scholar] [CrossRef]
  4. Seprilia, D.; Handayani, P.; Pinem, A. User Acceptance Factors Affecting the Usage of Mobile Learning in Enriching Outside Classroom Learning at High School Level; IEEE: Jayapura, Indonesia, 2017; Volume 2018, pp. 1–6. [Google Scholar] [CrossRef]
  5. Dascalu, M.I.; Moldoveanu, A.; Shudayfat, E.A. Mixed reality to support new learning paradigms. In Proceedings of the 18th International Conference on System Theory, Control and Computing (ICSTCC), Sinaia, Romania, 17–19 October 2014; pp. 692–697. [Google Scholar] [CrossRef]
  6. Weng, C.; Rathinasabapathi, A.; Weng, A.; Zagita, C. Mixed Reality in Science Education as a Learning Support: A Revitalized Science Book. J. Educ. Comput. Res. 2018, 57, 1–31. [Google Scholar] [CrossRef]
  7. Dumančić, M.; Matijević, M.; Topolovčan, T. How Mobile Learning Can Change Education. Online Int. Interdiscip. Res. J. 2016, 6, 31–40. [Google Scholar] [CrossRef]
  8. Cook, C.W.; Sonnenberg, C. Technology Additionally, Online Education: Models For Change. Contemp. Issues Educ. Res. 2014, 7, 171–188. [Google Scholar] [CrossRef] [Green Version]
  9. Manrique-Juan, C.; Grostieta-Dominguez, Z.; Rojas-Ruiz, R.; Alencastre-Miranda, M.; Muñoz-Gómez, L.; Silva-Muñoz, C. A Portable Augmented-Reality Anatomy Learning System Using a Depth Camera in Real Time. Am. Biol. Teach. 2017, 79, 176–183. [Google Scholar] [CrossRef]
  10. Cheng, K.; Mukherjee, P.; Curthoys, I. Development and use of augmented reality and 3D printing in consulting patient with complex skull base cholesteatoma. Virtual Phys. Prototyp. 2017, 12, 241–248. [Google Scholar] [CrossRef]
  11. Chen, L.; Day, T.W.; Tang, W.; John, N.W. Recent Developments and Future Challenges in Medical Mixed Reality. In Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Nantes, France, 9–13 October 2017; pp. 123–135. [Google Scholar] [CrossRef] [Green Version]
  12. Ferrer-Torregrosa, J.; Torralba, J.; Jimenez, M.; García, S.; Barcia, J.M. ARBOOK: Development and Assessment of a Tool Based on Augmented Reality for Anatomy. J. Sci. Educ. Technol. 2015, 24, 119–124. [Google Scholar] [CrossRef]
  13. Touel, S.; Mekkadem, M.; Kenoui, M.; Benbelkacem, S. Collocated Learning Experience within Collaborative Augmented Environment (Anatomy Course); IEEE: Boumerdes, Algeria, 2017; pp. 1–5. [Google Scholar] [CrossRef]
  14. Azer, S.A.; Azer, S. 3D Anatomy Models and Impact on Learning: A Review of the Quality of the Literature. Health Prof. Educ. 2016, 2, 80–98. [Google Scholar] [CrossRef] [Green Version]
  15. Yilmaz, Z.; Batdi, V. A meta-analytic and thematic comparative analysis of the integration of augmented reality applications into education. Egit. Ve Bilim 2016, 41, 273–289. [Google Scholar] [CrossRef] [Green Version]
  16. Juan, M.; Alexandrescu, L.; Folguera, F.; Garcia-Garcia, I. A Mobile Augmented Reality System for the Learning of Dental Morphology. Digit. Educ. Rev. 2016, 30, 234–247. [Google Scholar]
  17. Fasel, J.; Aguiar, D.; Kiss-Bodolay, D.; Montet, X.; Kalangos, A.; Stimec, B.; Ratib, O. Adapting anatomy teaching to surgical trends: A combination of classical dissection, medical imaging, and 3D-printing technologies. Surg. Radiol. Anat. 2016, 38, 361–367. [Google Scholar] [CrossRef] [PubMed]
  18. Küçük, S.; Kapakin, S.; Göktaş, Y. Learning anatomy via mobile augmented reality: Effects on achievement and cognitive load. Anat. Sci. Educ. 2016, 9, 411–421. [Google Scholar] [CrossRef]
  19. Craig, A. Understanding Augmented Reality: Concepts and Applications; Elsevier Science: Amsterdam, The Netherlands, 2013. [Google Scholar]
  20. Yu, J.; Fang, L.; Lu, C. Key technology and application research on mobile augmented reality. In Proceedings of the 2016 7th IEEE International Conference on Software Engineering and Service Science (ICSESS), Beijing, China, 26–28 August 2016; pp. 547–550. [Google Scholar] [CrossRef]
  21. Setting the future of digital and social media marketing research: Perspectives and research propositions. Int. J. Inf. Manag. 2020, 102168. [CrossRef]
  22. Birt, J.; Stromberga, Z.; Cowling, M.; Moro, C. Mobile Mixed Reality for Experiential Learning and Simulation in Medical and Health Sciences Education. Information 2018, 9, 31. [Google Scholar] [CrossRef] [Green Version]
  23. Vávra, P.; Roman, J.; Zonča, P.; Ihnát, P.; Němec, M.; Kumar, J.; Habib, N.; El-Gendi, A. Recent Development of Augmented Reality in Surgery: A Review. J. Healthc. Eng. 2017, 2017, 1–9. [Google Scholar] [CrossRef]
  24. Wang, S.; Parsons, M.; Stone-McLean, J.; Rogers, P.; Boyd, S.; Hoover, K.; Meruvia-Pastor, O.; Gong, M.; Smith, A. Augmented Reality as a Telemedicine Platform for Remote Procedural Training. Sensors 2017, 17, 2294. [Google Scholar] [CrossRef]
  25. Gaved, M.; FitzGerald, E.; Ferguson, R.; Adams, A.; Mor, Y.; Thomas, R. Augmented Reality and Mobile Learning: The State of the Art. Int. J. Mob. Blended Learn. 2013, 5, 43–58. [Google Scholar] [CrossRef]
  26. Mather, C.; Barnett, T.; Broucek, V.; Saunders, A.; Grattidge, D.; Huang, W. Helping Hands: Using Augmented Reality to Provide Remote Guidance to Health Professionals. Stud. Health Technol. Inform. 2017, 241, 57–62. [Google Scholar] [CrossRef]
  27. Ward, M.; Gayet, B.; Tabchouri, N.; Moisan, F.; Donatelli, G.; Stättner, S.; Fuks, D. Technical advances and future perspectives in liver surgery. Eur. Surg. 2018, 50, 137–141. [Google Scholar] [CrossRef]
  28. Thompson, S.; Schneider, C.; Bosi, M.; Gurusamy, K.; Ourselin, S.; Davidson, B.; Hawkes, D.; Clarkson, M. In vivo estimation of target registration errors during augmented reality laparoscopic surgery. Int. J. Comput. Assist. Radiol. Surg. 2018, 13, 865–874. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  29. Pelargos, P.E.; Nagasawa, D.T.; Lagman, C.; Tenn, S.; Demos, J.V.; Lee, S.J.; Bui, T.T.; Barnette, N.E.; Bhatt, N.S.; Ung, N.; et al. Utilizing virtual and augmented reality for educational and clinical enhancements in neurosurgery. J. Clin. Neurosci. 2017, 35, 1–4. [Google Scholar] [CrossRef] [PubMed]
  30. Kugelmann, D.; Stratmann, L.; Nühlen, N.; Bork, F.; Hoffmann, S.; Samarbarksh, G.; Pferschy, A.; von der Heide, A.M.; Eimannsberger, A.; Fallavollita, P.; et al. An Augmented Reality magic mirror as additive teaching device for gross anatomy. Ann. Anat. Anat. Anz. 2018, 215, 71–77. [Google Scholar] [CrossRef]
  31. Bernardo, A. Virtual Reality and Simulation in Neurosurgical Training. World Neurosurg. 2017, 106, 1015–1029. [Google Scholar] [CrossRef]
  32. Birt, J.; Moore, E.; Cowling, M. Improving paramedic distance education through mobile mixed reality simulation. Australas. J. Educ. Technol. 2017, 33. [Google Scholar] [CrossRef] [Green Version]
  33. Patil, R.; Almale, B.; Patil, M.; Gujrathi, A.; Dhakne-Palwe, S.; Patil, A.; Gosavi, S. Attitudes and Perceptions of Medical Undergraduates Towards Mobile Learning (M-learning). J. Clin. Diagn. Res. 2016, 10, 6–10. [Google Scholar] [CrossRef]
  34. Lytridis, C.; Tsinakos, A.; Kazanidis, I. ARTutor—An Augmented Reality Platform for Interactive Distance Learning. Educ. Sci. 2018, 8, 6. [Google Scholar] [CrossRef] [Green Version]
  35. Pedaste, M.; Mitt, G.; Jürivete, T. What Is the Effect of Using Mobile Augmented Reality in K12 Inquiry-Based Learning? Educ. Sci. 2020, 10, 94. [Google Scholar] [CrossRef] [Green Version]
  36. Ma, M.; Fallavollita, P.; Seelbach, I.; Von Der Heide, A.M.; Euler, E.; Waschke, J.; Navab, N. Personalized augmented reality for anatomy education. Clin. Anat. 2016, 29, 446–453. [Google Scholar] [CrossRef]
  37. Martins, A.I.; Rosa, A.F.; Queirós, A.; Silva, A.; Rocha, N.P. European Portuguese Validation of the System Usability Scale (SUS). Procedia Comput. Sci. 2015, 67, 293–300, In Proceedings of the 6th International Conference on Software Development and Technologies for Enhancing Accessibility and Fighting Info-exclusion, Sankt Augustin, Germany, 10–12 June 2015. [Google Scholar] [CrossRef] [Green Version]
  38. Brooke, J. SUS—A quick and dirty usability scale. In Usability Evaluation in Industry; Taylor & Francis: London, UK, 1996; pp. 189–194. [Google Scholar]
  39. Bangor, A.; Kortum, P.T.; Miller, J.T. An Empirical Evaluation of the System Usability Scale. Int. J. Hum. Comput. Interact. 2008, 24, 574–594. [Google Scholar] [CrossRef]
  40. Lewis, J.; Sauro, J. The Factor Structure of the System Usability Scale. In Human Centered Design; Springer: Berlin, Germany, 2009; pp. 94–103. [Google Scholar] [CrossRef]
  41. Likert, R. A technique for the measurement of attitudes. Arch. Psychol. 1932, 22, 1–55. [Google Scholar]
  42. Adebiyi, A.; Sorrentino, P.; Bohlool, S.; Zhang, C.; Arditti, M.; Goodrich, G.; Weiland, J.D. Assessment of feedback modalities for wearable visual aids in blind mobility. PLoS ONE 2017, 12, e0170531. [Google Scholar] [CrossRef]
  43. Stojanov, I.; Ristevski, B.; Kotevski, Z.; Savoska, S. Application of 3ds Max for 3D Modelling and Rendering; University St. Kliment Ohridski Bitola: Bitola, Macedonia, 2016; pp. 133–144. [Google Scholar] [CrossRef] [Green Version]
  44. Cieza, E.; Lujan, D. Educational Mobile Application of Augmented Reality Based on Markers to Improve the Learning of Vowel Usage and Numbers for Children of a Kindergarten in Trujillo. Procedia Comput. Sci. 2018, 130, 352–358. [Google Scholar] [CrossRef]
  45. Dickson, P.E. Using Unity to Teach Game Development. In Proceedings of the ACM Conference on Innovation and Technology in Computer Science Education-ITiCSE ’15, Vilnius, Lithuania, 6–8 July 2015; pp. 75–80. [Google Scholar] [CrossRef]
  46. Cohen, J. Statistical Power Analysis for the Behavioral Sciences; Lawrence Earlbaum Associates: Hillsdale, NJ, USA, 1988; Volume 2. [Google Scholar]
  47. Brooke, J. SUS: A Retrospective. J. Usability Stud. 2013, 8, 29–40. [Google Scholar]
  48. Martín-Gutierrez, J.; Trujillo, R.N.; Acosta-Gonzalez, M. Augmented Reality Application Assistant for Spatial Ability Training. HMD vs Computer Screen Use Study. Procedia Soc. Behav. Sci. 2016, 93, 49–53. [Google Scholar] [CrossRef] [Green Version]
  49. Elliott, S.; Littlefield, J. Educational Psychology: Effective Teaching, Effective Learning; Brown & Benchmark: Madison, WI, USA, 1995. [Google Scholar]
  50. Matthews, M. Constructivism in Science Education: A Philosophical Examination; Springer: Dordrecht, The Netherlands, 1998. [Google Scholar]
  51. Kolb, D. Experiential Learning: Experience As The Source Of Learning And Development; FT Press: Upper Saddle River, NJ, USA, 1984; Volume 1. [Google Scholar]
  52. Sun, P.; Tsai, R.; Finger, G.; Chen, Y.; Yeh, D. What drives a successful e-Learning? An empirical investigation of the critical factors influencing learner satisfaction. Comput. Educ. 2008, 50, 1183–1202. [Google Scholar] [CrossRef]
  53. Calle Bustos, A.M.; Juan, M.C.; García García, I.; Abad, F. An augmented reality game to support therapeutic education for children with diabetes. PLoS ONE 2017, 12, e0184645. [Google Scholar] [CrossRef] [Green Version]
  54. Trelease, R. From chalkboard, slides, and paper to e-learning: How computing technologies have transformed anatomical sciences education. Anat. Sci. Educ. 2016, 9, 583–602. [Google Scholar] [CrossRef] [PubMed]
  55. Rothman, B.; Gupta, R.; McEvoy, M. Mobile Technology in the Perioperative Arena. Anesth. Analg. 2017, 124, 807–818. [Google Scholar] [CrossRef] [PubMed]
  56. Willicks, F.; Stehling, V.; Richert, A.; Isenhardt, I. The Students’ Perspective on Mixed Reality in Higher Education: A Status and Requirement Analysis; IEEE: Tenerife, Spain, 2018; Number 43; pp. 656–660. [Google Scholar] [CrossRef]
  57. Seralidou, E.; Douligeris, C. Exploring the Potential of Smartphones to Support Learning in Greece; IEEE: San Diego, CA, USA, 2016; Number October; pp. 65–69. [Google Scholar] [CrossRef]
  58. Mayer, R. Cognitive Theory and the Design of Multimedia Instruction: An Example of the Two-Way Street Between Cognition and Instruction. New Dir. Teach. Learn. 2002, 2002. [Google Scholar] [CrossRef]
  59. Lee, S.; Hsu, Y.; Bair, B.; Toberman, M.; Chien, L. Gender and posture are significant risk factors to musculoskeletal symptoms during touchscreen tablet computer use. J. Phys. Ther. Sci. 2018, 30, 855–861. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  60. Kunda, Z. The Case for Motivated Reasoning. Psychol. Bull. 1990, 108, 480–498. [Google Scholar] [CrossRef] [PubMed]
  61. Rauschnabel, P.A.; Felix, R.; Hinsch, C. Augmented reality marketing: How mobile AR-apps can improve brands through inspiration. J. Retail. Consum. Serv. 2019, 49, 43–53. [Google Scholar] [CrossRef]
  62. Kong, X.; Nie, L.; Zhang, H.; Wang, Z.; Ye, Q.; Tang, L.; Li, J.; Huang, W. Do Three-dimensional Visualization and Three-dimensional Printing Improve Hepatic Segment Anatomy Teaching? A Randomized Controlled Study. J. Surg. Educ. 2016, 73, 264–269. [Google Scholar] [CrossRef]
  63. Lim, K.; Loo, Z.; Goldie, S.; Adams, J.; McMenamin, P. Use of 3D printed models in medical education: A randomized control trial comparing 3D prints versus cadaveric materials for learning external cardiac anatomy. Anat. Sci. Educ. 2016, 9, 213–221. [Google Scholar] [CrossRef]
Figure 1. Group arrangement of the students in the classroom.
Figure 2. Study design.
Figure 3. Application main screen in Portuguese language with user login (a). Menu screens with options to access features (b,c).
Figure 4. Application screen for interacting with the spinal cord through AR.
Figure 5. A question of the quiz (a) asking, "Which structure is part of the spinal cord?" Statistical results of performance after finishing the quiz (b).
Figure 6. Number of hits in the groups GWK and GPK. The diamond represents the significant statistical difference. Results are presented as means and standard deviation.
Figure 7. Distribution of SUS scores.
Figure 8. Individual score of the GPK participants.
Figure 9. Individual score of the GWK participants.
Figure 10. Score distribution of the learning ability factor.
Figure 11. Score distribution of the GPK participants.
Figure 12. Score distribution of the GWK participants.
Table 1. Answers to the 10 items of the System Usability Scale (SUS) scale.
SUS AnswersQ1Q2Q3Q4Q5Q6Q7Q8Q9Q10
Disagree9371441341231939
Totally disagree125222012017129
Neutral231615721191117218
Agree3214163984013392
Totally agree15121170272102
Table 2. Descriptive analysis for each one of the 10 items of the SUS scale; SD = standard deviation; SE = standard error.
Item	Mean	Median	SD	SE
1. I think I would use this system often	2.64	3.00	0.96	0.11
2. I find the system unnecessarily complex	3.05	3.00	0.83	0.09
3. I found the system easy to use	2.98	3.00	0.86	0.10
4. I thought it would require the support of a technician to use the system	3.00	3.00	0.89	0.10
5. The functions of this system were well integrated	2.50	3.00	0.88	0.10
6. I found the system very inconsistent	2.71	3.00	0.85	0.09
7. I imagine most people would learn to use this system quickly	3.15	3.00	0.75	0.08
8. I found the system too complicated to use	2.60	3.00	1.07	0.12
9. I felt very confident with the system	2.60	3.00	0.90	0.10
10. I need to learn a lot of things before continuing to use this system	3.14	3.00	0.89	0.10
Table 3. Usability and learning ability factor.
Factor	Mean	Median	SE
Usability	69	71.88	1.66
Learning	77	75	1.9
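The usability factor in Table 3 follows Brooke's standard SUS scoring [38,47]: odd-numbered (positively worded) items contribute their response minus 1, even-numbered (negatively worded) items contribute 5 minus their response, and the summed contributions are multiplied by 2.5 to yield a 0–100 score. The following sketch illustrates that standard computation (the function name is ours; it is not part of NitLabEduca):

```python
def sus_score(responses):
    """Compute a standard SUS score from ten Likert responses (1-5).

    Odd-numbered items (index 0, 2, ...) are positively worded, so they
    contribute (response - 1); even-numbered items are negatively worded,
    so they contribute (5 - response). The sum (0-40) is scaled by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# A fully neutral respondent (all 3s) lands at the scale midpoint:
print(sus_score([3] * 10))  # 50.0
```

Under this scoring, a mean of 69 sits just below Bangor et al.'s "good" usability range [39], consistent with the interpretation given in the discussion.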
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Fernandes, J.; Teles, A.; Teixeira, S. An Augmented Reality-Based Mobile Application Facilitates the Learning about the Spinal Cord. Educ. Sci. 2020, 10, 376. https://doi.org/10.3390/educsci10120376

