Article

A Framework for Analysis and Development of Augmented Reality Applications in Science and Engineering Teaching

1 Department of Biology, University of Education Weingarten, 88250 Weingarten, Germany
2 Chair of Science Education, University of Konstanz, 78464 Konstanz, Germany
3 Chair of Science Education, Thurgau University of Education, 8280 Kreuzlingen, Switzerland
4 Department of Engineering, University of Education Weingarten, 88250 Weingarten, Germany
5 Media Education and Visualization Working Group (MEVIS), University of Education Weingarten, 88250 Weingarten, Germany
* Authors to whom correspondence should be addressed.
Educ. Sci. 2023, 13(9), 926; https://doi.org/10.3390/educsci13090926
Submission received: 17 August 2023 / Revised: 3 September 2023 / Accepted: 4 September 2023 / Published: 11 September 2023
(This article belongs to the Special Issue Learning and Teaching in a Virtual World)

Abstract:

As augmented reality (AR) becomes a promising technology for science and engineering teaching, the need arises for generally valid criteria and frameworks for the analysis, development, and setup of AR applications. In this article, we present an evaluation matrix to analyze current augmented reality approaches for life science teaching (biology, chemistry) and engineering and, simultaneously, to provide directives for future augmented reality application designs. Based on an extensive literature review followed by focus group discussions with experts, the evaluation matrix combines domain-specific aspects, technical features, and subject and media didactical principles in seven selected parameters. This ensures adequate coverage of the broad range of key considerations in the development of augmented reality technology for science and engineering teaching. Through cluster analysis, two groups of applications could be identified. The first group comprises applications whose development was more technology-driven. Applications in the second group take more didactic criteria into account and can be considered more holistic from an instructional perspective. No correlation could be observed between the design of an AR application and the intended learning effects. The parameters derived for the evaluation matrix contribute to specifying relevant criteria for the development of AR applications.

1. Introduction

Digital technologies have the potential to support high-quality education [1,2], as they can successfully address requirements for learning 21st-century skills [3]. These requirements include, for example, more collaborative, interactive, personalized, and student-centered learning [4,5]. For the use of digital technologies in class, it is important to ensure that the media provided (a) fit the objectives specified in the curriculum, (b) meet basic content, (media) pedagogical, and pedagogical content quality standards, and (c) ideally, have been empirically evaluated for their learning effectiveness. Moreover, teachers are needed who are able to select and adequately use digital technologies according to their objectives [6].
One digital technology of growing importance in education is augmented reality (AR). The term “AR” first appeared in science teaching in 2004 and saw a dramatic increase in publications between 2017 and 2020 [7], reflecting its increasing relevance across subjects in education.
Following Milgram [8], augmented reality is defined as part of a “reality–virtuality (RV) continuum”. Augmented reality is placed in the span between the two extremes of a purely real environment and a purely virtual environment. This intermediate range is defined as a “generic Mixed Reality (MR) environment as one in which real world and virtual world objects are presented together within a single display” ([8] p. 283). Augmented reality thus allows us to enrich the real-world environment with virtual content. Azuma [9] defines the technology as a system with three characteristics: (1) combination of real and virtual, (2) interactivity, and (3) real-time 3D registration.
Considering the devices used for augmented reality, Azuma and colleagues [10] define three groups: (1) mobile devices (e.g., smartphones and tablets), (2) stationary units, and (3) head-mounted displays. For application in class, mobile devices are especially promising. The portability and comparatively low cost of smartphones and tablets allow the use of augmented reality in various settings, such as the classroom or out in the field. A mobile device’s screen can be shared among students, facilitating cooperation and interaction. Most students own and bring their own devices, such as smartphones. Schools are working on connectivity and, especially since the COVID-19 pandemic, are increasingly stocking up on tablets for the classroom and beginning to implement them in traditional learning environments [11]. Today’s mobile devices are equipped with a camera, a microphone, and sensors such as GPS, compass, accelerometer, and gyroscope [12]. This makes it convenient to design augmented reality learning applications for tablets or smartphones: they are portable and affordable, and an application can incorporate various functions such as image and speech recognition, sensor data, or location-based features.
The application of augmented reality in class additionally depends on the triggers utilized. Pence [13] divides augmented reality into marker-less and marker-based approaches. Marker-less augmented reality uses location data, such as GPS, to identify where the user is before overlaying pertinent information. Marker-based augmented reality, on the other hand, relies on visual markers such as QR codes. Cheng and Tsai [14] supplement these two categories by differentiating between location-based and image-based augmented reality: location-based augmented reality is typically marker-less, whereas image-based augmented reality uses pictures as markers. For now, these divisions are helpful when designing and developing augmented reality resources. However, as the technology is rapidly evolving, future devices will carry a greater number of more sophisticated sensors for understanding the real world onto which overlaid content is presented; this may extend the number of categories accordingly or render this kind of division superfluous.
Augmented reality has been substantially categorized and defined mainly from a technological perspective, whereas its affordances and limitations from an educational point of view have mostly been discussed and defined in isolation from one another. Using augmented reality in science and technology teaching, students can explore unobservable phenomena or abstract concepts interactively and in real time. Examples from physics, chemistry, or biology include usually invisible structures, such as organs, DNA structure, atoms, or magnetic fields, or time-related phenomena, such as explosions, continental drift, or seed germination [12,15,16,17,18]. Wu, Chang, and Liang [19] state several conditions under which augmented reality can help visualize phenomena that are difficult to learn: learning content from a 3D perspective; visualizing the invisible; ubiquitous, collaborative, and situated learning; learners’ sense of presence, immediacy, and immersion; and bridging formal and informal learning.
Hanid et al. [20] investigated various augmented reality approaches from 2015 to 2019 according to the learning strategies applied. They identified augmented reality as beneficial to four dominant learning strategies: interactive learning, game-based learning, collaborative learning, and experiential learning. The first two strongly benefit the learner’s motivation and interest; the third can improve performance and enhance the learning process; the last uses experience as a medium for learning, actively supplementing knowledge. The authors identified five additional learning strategies for using augmented reality: blended learning, ubiquitous learning, argumentation-based learning, mobile learning, and interaction learning. They emphasized the need to equally consider the learner’s needs, the learning objective, the available equipment, and the learning strategy when developing and designing augmented reality learning environments.
The potentials and learning strategies described so far reflect a wide range of application scenarios for augmented reality in science and technology teaching. Given the growing relevance of augmented reality in STEM education [7], the development of new augmented reality learning environments [21,22,23], and teachers’ expanding need to distinguish between suitable and unsuitable learning environments, the question arises as to which parameters are particularly relevant and how these can be operationalized. Lund et al. [12] provide a first approach to answering these questions by analyzing educational augmented reality applications (AR apps) with respect to the design of new learning environments using augmented reality. Following the Delphi method, experts (researchers, teachers, and designers) elaborated the challenges, opportunities, and ideas for augmented reality in education. They generated a list of nine key aspects (see Table 1), each scaled from “not important” (0) to “very important” (6). The authors stress that the individual key aspects need to be balanced according to the set learning goal.
We also tackle the question of which parameters are significant for the development and assessment of augmented-reality-enriched learning environments in science and engineering teaching, focusing on the interplay between subject and media didactical parameters and parameters resulting from domain-specific content. We address four questions:
Q1: What are relevant subject-related and media didactic parameters to classify AR apps used in science and engineering teaching, with respect to application design and setup?
Q2: What patterns, similarities, and discrepancies can be identified in terms of the app setup?
Q3: Is there a relation between the setup and the dedicated learning effects of the reviewed AR apps?
Q4: How can the analysis of AR apps based on the developed parameters aid the development of new AR apps?

2. Materials and Methods

In the following section, we present the systematic procedure of our literature analysis. We first conducted a literature search with the software “Publish or Perish” [24], using the search engine Google Scholar, which, in a comparison of the twelve most frequently consulted scientific search engines and databases (including Web of Science and Scopus), offers the most comprehensive literature coverage [25].
For the search, we used the terms “Augmented Reality”, “STEM”, “STEAM”, “Biology”, “Chemistry”, and “Engineering”, applying the following syntax: “Augmented Reality” AND “Chemistry” OR “Biology” OR “STEM” OR “STEAM” OR “Engineering”. The search was conducted in September 2022 and yielded a total of 537 results (biology: 42, chemistry: 117, engineering: 378). The results were first screened based on their abstracts. Papers were considered relevant to the study if (1) an AR app was described in sufficient detail to perform the subsequent assessment, (2) the description was written in either English or German, and (3) the learning effects the AR app was intended to address were described. In the next step, content overlaps were sorted out, and additional publications mentioned in the retrieved reviews that also included AR apps were added. Based on this procedure, we ultimately obtained twenty-two papers on published apps from the field of biology, eleven from the field of chemistry, and thirteen from the field of engineering.
To identify relevant subject-specific aspects, pedagogical content knowledge, and media didactical parameters for the classification of AR apps, an evaluation matrix of seven key design parameters was adopted from Krug et al. [7,26] and augmented for “Engineering” (see Section 3.1). The key parameters were identified in a two-step process. In the first step, a literature review was conducted on subject and media didactic parameters for the development of AR apps, and the parameters found were compiled into a first set. In the second step, these parameters were presented in focus group interviews to experts (three each from media education, biology didactics, and chemistry didactics, and one from engineering didactics) and, based on their feedback, were shortened and refined.
Reviewing AR apps with the developed evaluation matrix (see Figure 1) allows the different approaches of AR apps to be mapped out on a comparable level. The abovementioned parameters form the key points of this evaluation matrix. Each parameter, with its respective indicators, was quantified and recorded in the matrix; each parameter comprises a scale of indicators or levels, and the individual amplitudes indicate compliance with the respective parameter. A description of the augmented reality content and the measured learning effects, as well as the source, are attached to this matrix to ensure traceability.
The parameters were used to rate existing AR apps. Rating was based on a handbook, which precisely describes each parameter and its respective indicators. AR app examples help clarify when certain indicators apply or not.
Rater training with the accompanying rating handbook was conducted to ensure unbiased ratings. Each rater completed the training in a multistep process in which test ratings of AR apps were compared. Strong rating discrepancies were narrowed down in discussion; the resulting agreements and clarifications were recorded in a protocol and added to the rating handbook. After rater training, a final test rating was performed on 10 percent of the review material, applying the evaluation matrix and the rating handbook with two separate raters per subject. This resulted in a Cohen’s kappa of 0.838 for biology AR apps, 0.814 for chemistry AR apps, and 0.811 for engineering AR apps.
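The agreement statistic used above can be computed directly from two raters’ category assignments. The following is a minimal sketch of Cohen’s kappa with made-up ratings (not data from this study): raw agreement is corrected for the agreement expected by chance.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of exact agreements.
    observed = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    # Chance agreement: sum over categories of the product of marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Toy example: two raters scoring ten apps on a 0-5 indicator scale.
a = [0, 1, 2, 2, 3, 3, 4, 5, 5, 1]
b = [0, 1, 2, 3, 3, 3, 4, 5, 5, 2]
print(round(cohens_kappa(a, b), 3))  # → 0.756
```

Values above 0.8, as reported in the study, are conventionally read as almost perfect agreement.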
As a counterpart to the descriptive parameter ratings of the AR apps, the measured learning effects were juxtaposed. While screening the review material and describing the AR apps, a wide range of learning effects was recorded. This selection was condensed to four general learning effects from educational research, as these four were consistently investigated in the majority of the reviewed papers: motivation, self-efficacy, self-regulation, and cognition (as in cognitive achievement).
When augmented reality technology is newly introduced into an established field such as science education, inhibition thresholds can hinder teachers and students from exploring the technology’s qualities and possibilities. The principles of motivation, self-efficacy, and cognitive load theory can help clear this hurdle by creating motivating, stimulating, and meaningful experiences. Therefore, when investigating digital media such as augmented reality for science and engineering teaching, it is of great importance to relate the findings to these learning effects.
A hierarchical cluster analysis was performed to find underlying structures or patterns in the data. The starting point for the cluster analysis was the scores of the evaluation matrix, which provide seven-dimensional cases formed by the seven parameters, each case representing one AR app. To examine entities of this complexity, it was necessary to break their scores down to a comparable level. This statistical technique groups similar samples into clusters based on the similarity of their variables. Arranged in a treelike structure, the result is a hierarchy of clusters displaying the similarity of the cases, which identifies patterns and interdependencies among them. The cases, here the rated AR apps, are thereby examined in terms of their resemblance or distance to one another.
The hierarchical cluster analysis was performed in SPSS (version 25) using the hierarchical agglomerative clustering method, with the similarity measure calculated using the Ward method. The proximity measure was calculated by scaling the seven parameter ratings of each AR app down to one representative value and comparing each of these cases with one another. To calculate the distance between AR apps, we used the Euclidean distance. Cases with a small distance were combined into a cluster; next, clusters with a small distance to one another were combined into bigger clusters, and so forth. The clusters can be seen in the dendrogram’s horizontal branches; the fusions are shown in the vertical conjunctions.
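The agglomerative procedure can be illustrated in a few lines. The sketch below is a pure-Python stand-in for the SPSS analysis, using Ward’s minimum-variance criterion over Euclidean distances; the seven-dimensional tuples are invented design profiles, not ratings from the study.

```python
def centroid(points):
    """Componentwise mean of a list of equal-length tuples."""
    dims = len(points[0])
    return [sum(p[d] for p in points) / len(points) for d in range(dims)]

def ward_cost(a, b):
    """Increase in total within-cluster variance if clusters a and b merge."""
    ca, cb = centroid(a), centroid(b)
    sq_dist = sum((x - y) ** 2 for x, y in zip(ca, cb))  # squared Euclidean distance
    return (len(a) * len(b)) / (len(a) + len(b)) * sq_dist

def agglomerate(points, n_clusters):
    """Greedy bottom-up clustering: repeatedly merge the cheapest pair."""
    clusters = [[p] for p in points]
    while len(clusters) > n_clusters:
        i, j = min(
            ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
            key=lambda ij: ward_cost(clusters[ij[0]], clusters[ij[1]]),
        )
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

# Made-up profiles: two apps score moderately everywhere ("holistic"),
# two peak in a single parameter ("technology-driven").
apps = [
    (3, 3, 4, 3, 3, 3, 3), (4, 3, 3, 4, 3, 4, 3),
    (1, 1, 7, 1, 1, 1, 1), (1, 2, 7, 1, 1, 1, 2),
]
groups = agglomerate(apps, 2)
print(sorted(len(g) for g in groups))  # → [2, 2]
```

On this toy data, the two moderate profiles end up in one cluster and the two peaked profiles in the other, mirroring the two-cluster structure reported in Section 3.2.3.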

3. Results

As augmented reality in science and engineering teaching is a relatively young trend, there are different approaches to guiding the design and use of augmented reality. Below, we present the results of an analysis of parameters based on media didactic and pedagogical content knowledge sources.

3.1. Selection of Relevant Parameters

In the review by Krug et al. [7,26], a total of seven design parameters were identified. In the study presented here, we used the same search matrix as in that review and replaced the term “physics” with “engineering” (see the Methods section). In total, the same seven parameters were found, which are briefly summarized here: “Interaction” (Int), “Immersion” (Imm), “Congruence with Reality” (CwR), “Content Proximity to Reality” (CPtR), “Adaptivity” (Adapt), “Game Elements” (GE), and “Complexity” (Comp).
“Interactivity” is considered a key parameter from both a media didactic and a pedagogical content knowledge perspective, with the latter based on assumptions from cognitive psychology [28,29]. To operationalize interactivity, the Taxonomy of Multimedia Component Interactivity by Schulmeister [30] offers a scale that classifies multimedia learning on six levels (1. Viewing still pictures; 2. Viewing video (including play, stop, speed, repeat, rewind, etc.); 3. Manipulating video display and viewing order (rotating, zooming, jumping to other parts of a video); 4. Manipulating video or visualization contents through data input; 5. Generating videos or visualizations through programs or data; and 6. Receiving feedback on manipulations of visual objects).
The second key aspect is “presence”, as it describes the user’s feeling of being deeply involved in the setting, in the real as well as the virtual world. This illusion of being present in an alternate reality, broadly understood as “presence”, is covered by several different definitions and approaches. Heeter [31] splits the experience of presence into three dimensions: personal, social, and environmental; in the personal dimension, the sensation of being present can be reinforced by addressing the five senses. Witmer and Singer [32] argue that experiencing presence in a virtual environment requires the ability to focus on one meaningfully coherent set of stimuli in the virtual environment to the exclusion of unrelated stimuli in the physical location; in addition, involvement and immersion are necessary for experiencing presence. Witmer and Singer grouped presence into four categories: control factors (degree of control, immediacy of control, anticipation, mode of control, physical environmental modifiability); sensory factors (sensory modality, environmental richness, multimodal presentation, consistency of multimodal information, degree of movement perception, active search); distraction factors (isolation, selective attention, interface awareness); and realism factors (scene realism, consistency of information with the objective world, meaningfulness of experience, separation anxiety/disorientation).
Slater and Wilbur [33] describe presence as a binary state: presence in the real world is distinguished from presence in the virtual world. They approach presence from a new perspective: instead of examining when presence occurs, they aim to identify when and why presence is interrupted. For the complex field of perception when using AR, the work of Slater and Wilbur breaks presence down to the term “immersion”, understood as a component of presence that can help enhance the perception of being present. As immersion, in terms of presence, is an important aspect of AR technology, this parameter was included in the framework. Following the common denominator of the abovementioned concepts, sensory factors, Slater and Wilbur state that the experience of feeling present and immersed in a virtual environment can be reinforced by stimulating one’s senses (sight, hearing, touch, smell, taste). Consequently, and to guarantee objectivity, immersion was abstracted and is classified as the number of addressed senses, checked off if the application stimulates the user’s visual, auditory, olfactory, gustatory, or kinesthetic senses.
Aside from immersion, realism in the displayed content can also foster involvement and spark interest and is especially important when using technology to display scientific phenomena [34]. Therefore, “congruence with reality” is a key parameter that describes the extent of realism in the AR environment. Based on the work of McMahan [35], this parameter is split into factors of social and cognitive realism. Social realism covers intangible aspects such as interactions, activities, and events, whereas cognitive realism addresses the appearance and effect of objects, phenomena, or the environment. Combined, these aspects reflect how optically realistic and perceptibly naturalistic the displayed content appears to the user. This parameter entails seven indicators (1. Plausibility, 2. Fidelity, 3. Light effects, 4. Shadow effects, 5. Proportions, 6. Photorealism, and 7. Functionality of 3D registration).
In addition to measuring the extent of realism, another parameter is added to ensure that realism is also considered from a didactical perspective. Since this framework aims to classify AR apps for science and engineering teaching, it is crucial to also rate the apps on a content-related level, as misleading information or illogical content can foster misconceptions. The parameter “content proximity to reality” rates the level of accuracy regarding physical laws and cause–effect relations, in addition to the initiating process of the application (trigger/marker) [26]. Five indicators define the correct and logical display of content: 1. Temporal plausibility, 2. Local plausibility, 3. Causal plausibility, 4. Appropriate choice of tracking method, and 5. Appropriate tracking display.
Another key aspect resulting from the expert interviews is “adaptivity”. Söldner [36] defines adaptivity as the general process that is triggered by an incident or a change of circumstances and leads to the adaptation of a software element or service. Paramythis and Loidl-Reisinger [37] outline adaptivity in terms of adaptive learning environments: a learning environment is considered adaptive if its system is capable of monitoring user activity and then dynamically interpreting and deducing user requirements and preferences based on underlying knowledge models. Adaptivity is thus set as the capacity of an application to respond to situational occurrences and/or user activity. It reflects the AR app’s flexibility using a taxonomy of four levels (1. No possible alterations: neither the user nor the app itself can change settings or react to any incidents; 2. Adaptivity by adjusting settings before the actual start of the app (adjustment of skill, topic, user target group, etc.); 3. Adaptivity by the user manually adjusting a parameter during the use of the application (adjustment of degree of difficulty, alteration of topic or content, manual change of values in experimental settings); and 4. Adaptivity by automatic and situational adaptation of the application according to the user’s needs and skills (adjustment based on user behavior or situational factors, e.g., an increase in temperature)).
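The four-level adaptivity taxonomy above can be expressed as an ordered scale. The following sketch maps hypothetical app capabilities (the boolean flags are illustrative assumptions, not criteria from the rating handbook) to the highest applicable level:

```python
from enum import IntEnum

class Adaptivity(IntEnum):
    NONE = 1           # level 1: no alterations possible at all
    PRE_START = 2      # level 2: settings adjustable only before the app starts
    MANUAL_IN_USE = 3  # level 3: user adjusts parameters during use
    AUTOMATIC = 4      # level 4: app adapts itself to user behavior or situation

def classify_adaptivity(pre_start_settings, in_use_controls, auto_adaptation):
    """Return the highest adaptivity level an app's capabilities support."""
    if auto_adaptation:
        return Adaptivity.AUTOMATIC
    if in_use_controls:
        return Adaptivity.MANUAL_IN_USE
    if pre_start_settings:
        return Adaptivity.PRE_START
    return Adaptivity.NONE

# A viewing-only app vs. one with an in-app difficulty slider:
print(classify_adaptivity(False, False, False).name)  # → NONE
print(classify_adaptivity(True, True, False).name)    # → MANUAL_IN_USE
```

Because the levels are ordered, `IntEnum` values can feed directly into the numeric parameter ratings of the evaluation matrix.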
An additional key aspect resulting from the interviews was gamification, which was included as the parameter “game elements”. The experts agreed that augmented reality can be used to boost the affordances of games. This argument was substantiated by the possibility of overlaying real images with virtual objects and by further opportunities for stimulating collaboration between players. In this work, the broad field of gamification is understood as the implementation of game-like elements in nongame contexts [38]. Gamification applied to learning environments promises to foster interactivity, involvement, and engagement when implemented in a meaningful and appropriate manner [39,40,41]. Based on the research of Bedwell et al. [42], distinct elements from the field of game design were selected as indicators. The initial assortment of game elements consists of nine categories of distinct, nonoverlapping elements, of which the ninth is “immersion”, which has already been addressed as its own parameter. Consequently, the remaining eight categories are used as indicators for the parameter “game elements” to reflect the level of gamification (1. Goal/rules, 2. Conflict/challenge, 3. Control, 4. Assessment, 5. Action language, 6. Human interaction, 7. Environment, 8. Story).
“Complexity” is a parameter that addresses didactical demands. It is based on the ESNAS model [43], a framework for evaluating science teaching; more precisely, one of the model’s three dimensions is extracted for this purpose. This factor, complexity, is used to hierarchize tasks in terms of their difficulty. The scale entails five levels, ranging from one fact, multiple facts, a single interrelation, and multiple interrelations to a superordinate concept.

3.2. Evaluation and Comparison of Rated AR Applications

The key parameters presented in Section 3.1 were compiled into an evaluation matrix. Applying the evaluation matrix to each individual AR app yields a distinct matrix profile assigned to that app. This profile represents the details of the app setup. Arranged as a heptagon, each rated AR app setup is displayed in its respective shape.
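A matrix profile of this kind can be modeled as a seven-value vector, normalized per parameter so that the heterogeneous scales become comparable heptagon axes. The maxima below follow the indicator counts given in Section 3.1 (six interactivity levels, five senses, seven realism indicators, five plausibility indicators, four adaptivity levels, eight game elements, five complexity levels), but the scoring semantics and the sample app are illustrative assumptions, not values from the study.

```python
PARAMS = ["Int", "Imm", "CwR", "CPtR", "Adapt", "GE", "Comp"]

# Maximum attainable raw score per parameter, from the indicator counts above.
MAX_SCORE = {"Int": 6, "Imm": 5, "CwR": 7, "CPtR": 5, "Adapt": 4, "GE": 8, "Comp": 5}

def profile(ratings):
    """Normalize raw parameter scores into a comparable [0, 1] heptagon profile."""
    return {p: ratings[p] / MAX_SCORE[p] for p in PARAMS}

# A hypothetical viewing-only app: maximal realism, little else.
viewer_app = {"Int": 1, "Imm": 1, "CwR": 7, "CPtR": 4, "Adapt": 1, "GE": 0, "Comp": 1}
shape = profile(viewer_app)
print(max(shape, key=shape.get))  # → CwR
```

Plotting each normalized vector on seven radial axes produces the heptagon shapes compared in the following subsections.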

3.2.1. Measured Learning Effects

The investigation of the rated AR apps and their measured learning effects shows a variety of different approaches. In terms of recorded measurements, we confine the wide range to the four shown in the following table: motivation, cognition (as in cognitive achievement), self-regulation, and self-efficacy. A checkmark indicates that the described AR app has proven effective for the particular learning effect (see columns of Table 2). An increase in motivation was recorded in 30, cognition in 29, self-regulation in 3, and self-efficacy in 4 of the overall 46 reviewed publications. In the further analyses, these findings are contrasted with the corresponding setup of each AR app, represented by its parameter rating.

3.2.2. Graphical Overlay

By overlaying the heptagons of the rated AR apps by subject, a first analysis was conducted, as seen in the figure below. This visual comparison showed a wide variety of approaches in the individual subjects of biology (Figure 2), chemistry, and engineering. While screening the papers and rating the AR apps, we also found that the review material covered a vast range of subject-specific topics. Additionally, the reviewed and rated AR apps differ strongly in their app setup and concept, i.e., their respective heptagons. The resulting charts and heptagons show no visible conformity or pattern in terms of the key parameters.
Some ratings show strong peaks in one direction, meaning a high rating of one specific parameter; others have a more well-rounded heptagonal shape. Regarding the first finding, we filtered out heptagons with a singular high peak and six other comparably low scores. Many of these extreme-shaped heptagons reached the highest score in either “complexity” or “congruence with reality”.
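The filtering step described above can be sketched as a simple shape test on the normalized scores. The dominance ratio here is an illustrative assumption, not a criterion stated in the paper:

```python
def has_singular_peak(scores, ratio=2.0):
    """True if the top score dominates every other score by at least `ratio`.

    `scores` is a sequence of normalized parameter ratings (one heptagon).
    """
    ordered = sorted(scores, reverse=True)
    top, rest = ordered[0], ordered[1:]
    return all(top >= ratio * s for s in rest)

# One dominant peak vs. a well-rounded profile:
print(has_singular_peak([1.0, 0.2, 0.3, 0.1, 0.4, 0.2, 0.3]))  # → True
print(has_singular_peak([0.6, 0.5, 0.7, 0.6, 0.5, 0.6, 0.4]))  # → False
```

Such a predicate makes the visual "peak vs. well-rounded" distinction reproducible across raters.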
An exemplary app for these rating results is the textbook “Arthropoda” [26] (Figure 3). It displays different poisonous or otherwise dangerous species for students to observe. The highest possible score is reached in congruence with reality, while the other parameters score comparably low. The objects are well designed with realistic features, but no interactions, adaptations, or activities other than viewing are mentioned.
The smartphone-based AR app of Hoog et al. [45] is another example of these singular high peaks (Figure 4). The app displays the crystal structures of the green fluorescent protein (GFP) of broccoli and spinach on an object with the corresponding marker. In this case, the markers are attached to beverage koozies as well as lab handbooks. Aside from exploring the structure–function relationship by viewing the structure, no further interactions were described. Once again, “congruence with reality” scored high relative to the rest of the parameters.
The smartphone AR app of Qamari and Ridwan [52] also peaks in “congruence with reality” (Figure 5). This app uses images of plants on cards as markers to display a virtual 3D representation of the pictured plant. In this context, it is used to introduce five species of dicotyledonous plants. Students can view the plants from different angles and take snapshots. The high rating for “congruence with reality” stems from its highly photorealistic design, using light and shadow effects, seemingly appropriate scaling, and a photorealistic style. Nevertheless, it does not include further features to engage the students in the activity: the AR app’s description shows no elements to foster interactivity or levels of adaptivity, and a game-like character or particular game elements are not described. In addition, as it aims at studying the appearance of plants, it scores low on complexity. Therefore, the respective heptagon results in a shape with mostly low scores, except for the protruding peak concerning realism.
An exemplary AR app with a well-rounded rating output is the concept-based approach of the “virtual butterfly ecological system” [57] (Figure 6). As a GPS-based application, the virtual breeding process is combined with the school campus’s botanical garden. The goal is to complete the lifecycle of a certain butterfly species as an avatar. Answering quiz questions yields points, which can be used to purchase necessary food. Further features are the appropriate choice of host plants for breeding as well as the challenge of having to fight natural enemies, such as birds. The app is set up as a gamified learning environment in which the user is challenged to reach a certain goal. It therefore checks off most of the game elements, missing only a flexible game environment and human interaction among users. Meanwhile, the “interaction” level is high, as the user can actively make decisions and influence and shape the course of progress. Situational feedback and different levels result in a high adaptivity score as well. Both realism parameters, “congruence with reality” and “proximity to reality”, also score extremely high. Reasons for this rating include the high detail in the design of the butterfly movement and the photorealism, the integration of the real-life environment with the addition of a virtual leaf for better handling of the breeding process, and effects that are causally and locally plausible, though not rendered in real time. On the complexity scale, it reached 4 out of 5, as it depicts multiple interrelating dependencies but no full superordinate concept.
This AR app is conceptualized as a game with a variety of tasks and activities, such as quiz questions, points as currency, different levels to undergo, avatars to choose, challenges to overcome, and situational feedback. It covers a biological topic in the curriculum (invertebrate lifecycle) in a broad manner and uses features of both the real and the virtual world: augmented butterflies, natural enemies, and host plants in combination with the real world’s botanical garden, integrated into the app system by GPS. This well-rounded game-like concept is matched with a highly detailed 3D design; e.g., the butterfly movement and lifecycle stages were closely investigated and considered in the design. In this way, it integrates both didactical principles and gamification aspects, which can be seen in the rating and is reflected in the homogeneous amplitudes of its heptagon.
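The heptagonal rating output used throughout this analysis is essentially a radar (spider) chart over the seven parameters. As an illustration only, the following sketch plots such a heptagon; the parameter names follow the evaluation matrix, while the scores and the 0–5 scale are assumed for demonstration and are not the published ratings:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

# The seven parameters of the evaluation matrix; scores (0-5) are illustrative
params = ["Interaction", "Immersion", "Congruence with reality",
          "Proximity to reality", "Adaptivity", "Game elements", "Complexity"]
scores = [4, 4, 5, 5, 4, 4, 4]  # a well-rounded, homogeneous profile

# One axis per parameter, evenly spaced around the circle
angles = np.linspace(0, 2 * np.pi, len(params), endpoint=False).tolist()
scores_closed = scores + scores[:1]   # repeat first point to close the polygon
angles_closed = angles + angles[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
ax.plot(angles_closed, scores_closed, linewidth=1.5)
ax.fill(angles_closed, scores_closed, alpha=0.25)
ax.set_xticks(angles)
ax.set_xticklabels(params, fontsize=8)
ax.set_ylim(0, 5)
fig.savefig("heptagon.png", bbox_inches="tight")
```

A homogeneous score vector produces the near-regular heptagon described for the butterfly app, whereas a profile peaking on one realism parameter produces the spiked, mostly flat shape described for the plant-card app.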

3.2.3. Statistical Clustering and Relation between Setup and Learning Effect

To further examine the AR apps, we conducted a hierarchical cluster analysis that revealed two distinct clusters (Figure 7). This applies to all three domains: biology, chemistry, and engineering. The final clustering was reached after six to eleven iterations and was chosen based on the strength of the clusters in the dendrogram, reflected in the length of the horizontal branches (which remained stable across numerous calculation runs), together with manual verification of each branch with regard to the cluster’s sample content. The two clusters are clearly differentiated, or, rather, the AR apps within each cluster are clearly similar with respect to the app’s setup. The different design approaches and setup foci are reflected in the two-cluster result, displayed in the dendrogram’s final branches. The first cluster combines AR apps that reached an overall moderate to high score in all parameters, while AR apps in the other cluster reached markedly high scores in only one or two technical parameters and far lower scores in the remaining parameters. The first group pursues a holistic approach and addresses technical design aspects as well as the implementation of didactical principles. In comparison, AR apps of the second group focus on optical aspects and strive for high accuracy and detail in terms of modeling and display, while largely disregarding the remaining parameters of this evaluation matrix.
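The clustering step can be sketched with standard tools; the ratings below are hypothetical stand-ins for the per-parameter scores (two holistic profiles, two visualization-driven profiles), and the choice of Ward linkage is an assumption for illustration, not a statement of the exact method used in the study:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical ratings: rows = AR apps, columns = the seven parameters (0-5).
# Rows 0-1 mimic holistic apps (moderate-to-high everywhere);
# rows 2-3 mimic apps peaking on 1-2 technical/realism parameters only.
ratings = np.array([
    [4, 4, 5, 4, 4, 4, 4],
    [4, 3, 4, 4, 5, 4, 3],
    [1, 1, 5, 4, 0, 0, 1],
    [1, 2, 5, 5, 1, 0, 2],
])

# Hierarchical clustering (Ward linkage on Euclidean distances),
# then cut the dendrogram into two groups
Z = linkage(ratings, method="ward")
clusters = fcluster(Z, t=2, criterion="maxclust")
print(clusters)  # the two holistic apps land in one group, the other two apps in the second
```

With profiles this distinct, the two-group cut cleanly separates the holistic from the visualization-driven apps, mirroring the two-cluster result in the dendrogram.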
In addition, the rough distribution found in the graphical overlay maps onto this statistical grouping. More precisely, the two exemplary AR apps discussed above illustrate the differences between these two groups.
Clustering of the rating results, as seen in the dendrogram (Figure 7) and the overlay of the heptagonal shapes (Figure 2), shows no correlation to the apps’ specific topics or content.
There was no significant relation between the cluster distribution or the heptagonal molding and the measured learning effects.

4. Discussion

The aim of this study was to confirm parameters for the evaluation of AR apps that are significant in terms of subject and media didactics, and to extend them to a further subject (engineering). These parameters were used to analyze existing AR apps with regard to their patterns, similarities, and differences, and, on this basis, to derive suggestions for the future development of AR-supported learning environments. Since the design parameters from Krug et al. (biology, chemistry, and physics) [7,25] can also be applied to engineering, we assume there is a high likelihood that the results of the cluster analysis in this study can also be transferred to the field of physics, even though we have not directly investigated this.
Hanid, Said, and Yahaya [20] focus in their work on learning strategies that can be found in AR applications. Nielsen, Brandt, and Swensen [12] investigate general key aspects of AR for education. Together, these works provide an important basis for assessing and developing AR applications. Complementary to those previous studies, this work aimed to justify parameters for the development and evaluation of AR apps for science teaching from a media didactic perspective and the perspective of pedagogical content knowledge, and to develop an evaluation tool with operationalizable parameters based on them. The presented evaluation tool consists of seven key parameters that are theoretically sound, thoroughly described, underlined with exemplary AR apps from science and engineering education, and individually differentiated with defining indicators or levels. The detailed description of the parameters and indicators facilitates the comprehensibility and accessibility of the tool for teachers and developers of AR applications.
Examining the individual AR apps of each cluster, we found the apps to follow either a more didactics-driven or a more technology-driven development. We conclude that the development of about half of the analyzed AR apps (group 2) was driven by technical aspects rather than following a criteria-based approach, and we hypothesize that this is due to the lack of a general framework for AR app creation in science and engineering teaching.
Combining the goals of technical accuracy and didactical meaningfulness, the seven parameters we suggest (interaction, immersion, congruence with reality, content proximity to reality, adaptivity, game elements, and complexity) can serve as key factors in app development. They could close this gap and thereby help create holistic learning apps that are technically precise while implementing meaningful pedagogical principles in the AR app’s setup and concept.
We have seen that certain AR app setups—in terms of the parameter rating and heptagonal output—are used for certain topics or types of learning. The specific focus on augmented reality technical aspects and 3D modeling is seen in AR apps for visualizing scientific phenomena or objects. An extensive deliberation of various factors, such as interactive elements, reward systems, situational feedback, and many more, can be found in AR apps aiming to teach superordinate concepts or complex interrelations. In accordance with our findings, we can roughly distinguish two different objectives in augmented reality development: visualization of content versus interactive learning.
Modern visualization technology allows objects and/or phenomena that cannot easily be explained by text or 2D images to be displayed comprehensibly. With modeling and simulation, we can manipulate the factor of time or scale objects up and down. However, augmented reality can achieve more when conceptually integrated into the learning process by combining elements of the real and virtual worlds in real time. Learning is understood as an interactive confrontation with the learning material [30], and learning strategies, especially in science, emphasize experience rather than mere viewing and observing. Therefore, implementing augmented reality in a science-learning environment can go further than solely displaying highly accurate content. Following this line of thought, Hanid, Said, and Yahaya [20] categorize the implementation of augmented reality in education into four approaches: interactive, game-based, collaborative, and experiential learning. These features stand foremost in the conceptualization process. As the reviewed AR apps are designed to cover a specific scientific topic, the AR app is embedded into the lesson’s teaching material. This leads us to the next step: deliberating which status the AR app should have within the learning environment as a whole. Should the AR app be integrated into an extensive setting as one part of the learning environment, or should it stand alone, as a learning environment in itself?
The latter will allow for a universal implementation in various settings. An AR app that strives for this needs to be highly self-sufficient. In the design process, the dimension of the learning environment should therefore be defined: where the app stands on the continuum from standalone to combined, as characterized by its instructions and/or teacher assistance, a preceding session with further information, and/or a concluding discussion with additional work material, e.g., worksheets.
The seven key parameters can serve as a guideline for implementing the design aspects needed to develop an AR app with a high degree of self-sufficiency. An AR app that considers all associated indicators in the design process will reach a high degree of self-sufficiency, in the sense of being a complete learning environment in itself. Nonetheless, it is noteworthy that an AR app need not score highly on all parameters to be effective for learning.
We could not show a correlation between learning effects and the rated parameters, which may stem from three reasons: (1) a deeper differentiation of learning effects may be needed; (2) further advancement in the conceptualized and structured development of AR apps for science and engineering teaching will likely yield more usable data; and (3) more scientific research, such as evaluation and testing of developed science and engineering teaching AR apps, is needed.
As mentioned above, the parameters determine relevant aspects to be considered when developing AR apps for science and engineering teaching. This assortment of subject-didactic, media-didactic, and AR-specific aspects therefore facilitates a more holistic development approach that addresses both technical and didactical principles.
Moreover, selecting one of the learning strategies mentioned above [20] must be the first step of development. When designing an AR app for science and engineering teaching, it is crucial to identify a learning strategy and to define further aspects, such as the learning objective, the user’s needs, and the available equipment. Based on this, we postulate the following basic key considerations as a first step:
  • Target audience (age, level of knowledge, prior experience);
  • Learning objective (skills, knowledge, content/topic);
  • AR technology (equipment, restrictions/possibilities);
  • AR elements (use of advantages/benefits of this technology);
  • Type of user action (learning strategy: interactive, game-based, collaborative, experiential).
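These key considerations can be captured as a simple design brief before development begins. The following sketch is purely illustrative; the record type `ARDesignBrief` and its field names are our hypothetical rendering of the checklist, not part of the published framework:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical design-brief record mirroring the five key considerations above
@dataclass
class ARDesignBrief:
    target_audience: str                       # age, level of knowledge, prior experience
    learning_objective: str                    # skills, knowledge, content/topic
    ar_technology: str                         # equipment, restrictions/possibilities
    ar_elements: List[str] = field(default_factory=list)  # advantages of AR exploited
    learning_strategy: str = "interactive"     # interactive | game-based | collaborative | experiential

# Example brief, loosely modeled on the butterfly-ecosystem app [57]
brief = ARDesignBrief(
    target_audience="secondary school students, little prior AR experience",
    learning_objective="butterfly lifecycle and ecological dependencies",
    ar_technology="GPS-capable smartphones, outdoor use on campus",
    ar_elements=["location-based overlays", "animated 3D models", "situational feedback"],
    learning_strategy="game-based",
)
print(brief.learning_strategy)
```

Filling in such a brief forces the learning strategy and objective to be fixed before any technical design decisions, in line with the development order argued for above.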

Limitations

In the context of this article, it is important to note that this constitutes a literature review rather than a systematic literature review. This restriction gives rise to certain limitations that must be considered: for example, the included papers were not strictly screened for quality, and a certain degree of subjectivity may have entered the selection of the publications examined, which would have been less prevalent in a systematic literature review.
Another limitation lies in the novelty and complexity of the augmented reality technology domain, which explains why the selection of key parameters was not exhaustive but only served to fulfill the specific objectives of this study. Consequently, we do not assert completeness regarding technical aspects of augmented reality, educational and media didactic principles, or design and programming perspectives. Rather, the parameter selection aimed to bridge these diverse domains through a balanced compilation.
In this paper, we focused on examining existing AR app approaches in terms of measured learning effects; however, it is important to note that the presented learning effects are confined to motivation, self-efficacy, self-regulation, and cognitive performance. Due to the limited resources and scope of this study, we were unable to consider all effects reported in the individual papers, which could potentially impact other aspects of learning.
Furthermore, the literature search was limited to the Google Scholar database, the “title words” search mechanism, and the English and German languages to obtain an extensive yet feasible amount of review material. This may lead to the omission of some relevant works in other languages or other databases, which is another limitation of this review.

5. Conclusions

The integration of augmented reality into learning environments within (life) sciences and engineering teaching is a relatively nascent but rapidly expanding domain. Thus far, only a few criteria have been available for the development and analysis of AR-assisted teaching and learning environments in the context of science and engineering education. To enable efficient learning from the perspective of subject didactics, an evaluation tool should consider not only technical and media didactic criteria but also subject didactic criteria. Furthermore, it should be compact enough to be used both for the analysis of AR apps and for the development of AR applications. The work presented in this article can thus be understood as a step towards addressing this gap in research. The evaluation tool proved useful as a reviewing tool to systematically cluster AR learning apps from different scientific domains according to their design approach and app setup. The juxtaposition of the heptagonal output and the respective learning effects showed no significant relation. Thus far, there is also no pattern in the heptagonal output in terms of the addressed topic or content. These results indicate a conceptual gap between the design of the AR application and the respective learning objectives, which could be addressed through more intensive subject-didactic analysis in advance of or during app development.
Hence, the parameters forming the evaluation matrix can further be used to deduce relevant design criteria for the development of future AR learning environments. Transformed into key design aspects, the parameters’ indicators can guide the design process and ensure consideration of both technical and educational aspects. The combined parameters from different fields shall aid the systematic development of AR apps that function smoothly, are user-friendly, and are didactically meaningful. The seven determined parameters cover all three fields: parameters 1, 3, and 5 apply to technical aspects; parameters 1, 2, 5, and 6 contribute to an immersive experience; and parameters 2, 4, 5, and 7 address the educational principles for meaningful learning environments.

Author Contributions

Conceptualization, all authors; methodology, V.C., M.K. and S.M.; validation, V.C., M.K. and S.M.; formal analysis, V.C., M.K. and S.M.; investigation, V.C., M.K. and S.M.; data curation, V.C., M.K. and S.M.; writing—original draft preparation, V.C., M.K., J.H. and H.W.; writing—review and editing, V.C., M.K., J.H. and H.W.; visualization, V.C.; supervision, J.H., S.K., W.M. and H.W.; project administration, J.H. and H.W.; funding acquisition, J.H., S.K., W.M. and H.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the German Ministry of Science (“ARtiste: Augmented Reality Teaching in Science Technology Education”, grant number reference 34-7811.533-4/3/5), and the APC was funded by the University of Konstanz.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author upon reasonable request.

Conflicts of Interest

There is no conflict of interest to declare.

References

  1. European Commission. Digital Education Action Plan 2021–2027 Resetting Education and Training for the Digital Age. 2020. Available online: https://education.ec.europa.eu/focus-topics/digital-education/action-plan (accessed on 9 March 2023).
  2. KMK Lehren und Lernen in der Digitalen Welt. Die Ergänzende Empfehlung zur Strategie Bildung in der Digitalen Welt. 2021. Available online: https://www.kmk.org/fileadmin/Dateien/veroeffentlichungen_beschluesse/2021/2021_12_09-Lehren-und-Lernen-Digi.pdf (accessed on 9 March 2023).
  3. OECD The Future of Education and Skills Education 2030. 2018. Available online: https://www.oecd.org/education/2030/E2030%20Position%20Paper%20(05.04.2018).pdf (accessed on 9 July 2023).
  4. Van Laar, E.; Van Deursen, A.J.A.M.; Van Dijk, J.A.G.M.; De Haan, J. The relation between 21st-century skills and digital skills: A systematic literature review. Comput. Hum. Behav. 2017, 72, 577–588. [Google Scholar] [CrossRef]
  5. González-Pérez, L.I.; Ramírez-Montoya, M.S. Components of Education 4.0 in 21st Century Skills Frameworks: Systematic Review. Sustainability 2022, 14, 1493. [Google Scholar] [CrossRef]
  6. von Kotzebue, L.; Meier, M.; Finger, A.; Kremser, E.; Huwer, J.; Thoms, L.-J.; Becker, S.; Bruckermann, T.; Thyssen, C. The Framework DiKoLAN (Digital Competencies for Teaching in Science Education) as Basis for the Self-Assessment Tool DiKoLAN-Grid. Educ. Sci. 2021, 11, 775. [Google Scholar] [CrossRef]
  7. Krug, M.; Czok, V.; Huwer, J.; Weitzel, H.; Müller, W. Challenges for the design of augmented reality applications for science teacher education. INTED Proc. 2021, 6, 2484–2491. [Google Scholar] [CrossRef]
  8. Milgram, P.; Takemura, H.; Utsumi, A.; Kishino, F. Augmented reality: A class of displays on the reality-virtuality continuum. In SPIE Proceedings, Telemanipulator and Telepresence Technologies; Das, H., Ed.; SPIE: Boston, MA, USA, 1995; pp. 282–292. [Google Scholar]
  9. Azuma, R.T. A Survey of Augmented Reality. Presence Teleoper. Virtual Environ. 1997, 6, 355–385. [Google Scholar] [CrossRef]
  10. Azuma, R.; Baillot, Y.; Behringer, R.; Feiner, S.; Julier, S.; MacIntyre, B. Recent advances in augmented reality. IEEE Comput. Graph. Appl. 2001, 21, 34–47. [Google Scholar] [CrossRef]
  11. Sepúlveda, A. The Digital Transformation of Education: Connecting Schools, Empowering Learners. 2020. Available online: https://unesdoc.unesco.org/ark:/48223/pf0000374309 (accessed on 5 July 2023).
  12. Nielsen, B.L.; Brandt, H.; Swensen, H. Augmented Reality in science education–affordances for student learning. Nord. Stud. Sci. Educ. 2016, 12, 157–174. [Google Scholar] [CrossRef]
  13. Pence, H.E. Smartphones, Smart Objects, and Augmented Reality. Ref. Libr. 2010, 52, 136–145. [Google Scholar] [CrossRef]
  14. Cheng, K.H.; Tsai, C.C. Affordances of Augmented Reality in Science Learning: Suggestions for Future Research. J. Sci. Educ. Technol. 2013, 22, 449–462. [Google Scholar] [CrossRef]
  15. Celik, C.; Guven, G.; Cakir, N.K. Integration of mobile augmented reality (MAR) applications into biology laboratory: Anatomic structure of the heart. Res. Learn. Technol. 2020, 28, 2355. [Google Scholar] [CrossRef]
  16. Tschiersch, A.; Krug, M.; Huwer, J.; Banerji, A. Augmented Reality in chemistry education—An overview. ChemKon 2021, 28, 241–244. [Google Scholar] [CrossRef]
  17. Probst, C.; Fetzer, D.; Lukas, S.; Huwer, J. Effects of using augmented reality (AR) in visualizing a dynamic particle model. ChemKon 2021, 29, 164–170. [Google Scholar] [CrossRef]
  18. Huwer, J.; Banerji, A.; Thyssen, C. Digitalisierung—Perspektiven für den Chemieunterricht. Nachrichten Chem. 2020, 68, 10–16. [Google Scholar] [CrossRef]
  19. Wu, H.; Lee, S.W.; Chang, H.; Liang, J. Current status, opportunities and challenges of augmented reality in education. Comput. Educ. 2013, 62, 41–49. [Google Scholar] [CrossRef]
  20. Hanid, M.F.A.; Said, M.N.H.M.; Yahaya, N. Learning Strategies Using Augmented Reality Technology in Education: Meta-Analysis. Univers. J. Educ. Res. 2020, 8, 51–56. [Google Scholar] [CrossRef]
  21. Karayel, C.E.; Krug, M.; Hoffmann, L.; Kanbur, C.; Barth, C.; Huwer, J. ZuKon 2030: An Innovative Learning Environment Focused on Sustainable Development Goals. J. Chem. Educ. 2022, 100, 102–111. [Google Scholar] [CrossRef]
  22. Syskowski, S.; Huwer, J. A Combination of Real-World Experiments and Augmented Reality When Learning about the States of Wax—An Eye-Tracking Study. Educ. Sci. 2023, 13, 177. [Google Scholar] [CrossRef]
  23. Krug, M.; Huwer, J. Safety in the Laboratory—An Exit Game Lab Rally in Chemistry Education. Computers 2023, 12, 67. [Google Scholar] [CrossRef]
  24. Harzing, A.-W. Publish or Perish User’s Manual. 2016. Available online: https://harzing.com/resources/publish-or-perish/manual/using/query-results/accuracy (accessed on 9 March 2023).
  25. Gusenbauer, M. Google Scholar to overshadow them all? Comparing the sizes of 12 academic search engines and bibliographic databases. Scientometrics 2019, 118, 177–214. [Google Scholar] [CrossRef]
  26. Krug, M.; Czok, V.; Müller, S.; Weitzel, H.; Huwer, J.; Kruse, S.; Müller, W. Ein Bewertungsraster für Augmented-Reality-Lehr-Lernszenarien im Unterricht. ChemKon 2022, 29, 312–318. [Google Scholar] [CrossRef]
  27. Destiara, M. The Practicality of Teaching Material Biology of Islamic-Science based on Augmented Reality. BIO-INOVED J. Biol. Pendidik. 2020, 2, 117–122. [Google Scholar] [CrossRef]
  28. Chi, M.T.H.; Wylie, R. The ICAP Framework: Linking Cognitive Engagement to Active Learning Outcomes. Educ. Psychol. 2014, 49, 219–243. [Google Scholar] [CrossRef]
  29. Puentedura, R. Transformation, Technology, and Education. 2006. Available online: http://hippasus.com/resources/tte/puentedura_tte.pdf (accessed on 9 March 2023).
  30. Schulmeister, R. Interaktivität in Multimedia-Anwendungen. 2005. Available online: https://www.e-teaching.org/didaktik/gestaltung/interaktiv/InteraktivitaetSchulmeister.pdf (accessed on 9 March 2023).
  31. Heeter, C. Being There: The subjective experience of presence. Presence Teleoper. Virtual Environ. 1992, 1, 262–271. [Google Scholar] [CrossRef]
  32. Witmer, B.G.; Singer, M.J. Measuring Presence in Virtual Environments: A Presence Questionnaire. Presence Teleoper. Virtual Environ. 1998, 7, 225–240. [Google Scholar] [CrossRef]
  33. Slater, M.; Wilbur, S. A framework for immersive virtual environments (FIVE): Speculations on the role of presence in virtual environments. Presence Teleoper. Virtual Environ. 1997, 6, 603–616. [Google Scholar] [CrossRef]
  34. Standen, P. Realism and imagination in educational multimedia simulations. In The Learning Superhighway: New World? New Worries? Proceedings of the Third International Interactive Multimedia Symposium, Perth, Western Australia, 21–25 January 1996; McBeath, C., Atkinson, R., Eds.; Promaco Conventions Pty Limited: Bateman, WA, USA, 1996; pp. 384–390. Available online: https://ascilite.org/archived-journals/aset/confs/iims/1996/ry/standen.html (accessed on 1 July 2023).
  35. McMahan, A. Immersion, Engagement, and Presence: A Method for Analyzing 3-D Video Games. In The Video Game Theory Reader; Wolf, M.J.P., Perron, B., Eds.; Routledge: Oxfordshire, UK, 2003; pp. 67–86. [Google Scholar]
  36. Söldner, G. Semantische Adaption von Komponenten. Ph.D. Thesis, Friedrich Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany, 2012. Available online: https://d-nb.info/1029374171/34 (accessed on 9 March 2023).
  37. Paramythis, A.; Loidl-Reisinger, S. Adaptive Learning Environments and e-Learning Standards. Electron. J. E-Learn. 2004, 2, 181–194. [Google Scholar]
  38. Deterding, S.; Dixon, D.; Khaled, R.; Nacke, L. From Game Design Elements to Gamefulness: Defining “Gamification”. In Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments, Proceedings from MindTrek 11, New York, NY, USA, 28–30 September 2011. [Google Scholar]
  39. Nah, F.F.H.; Zeng, Q.; Telaprolu, V.R.; Ayyappa, A.P.; Eschenbrenner, B. Gamification of education: A review of literature. In Proceedings of the International Conference on HCI in Business, Heraklion, Greece, 22–27 June 2014; Available online: https://link.springer.com/content/pdf/10.1007/978-3-319-07293-739.pdf (accessed on 1 July 2023).
  40. Parong, J.; Mayer, R.E. Learning science in immersive virtual reality. J. Educ. Psychol. 2018, 110, 785–797. [Google Scholar] [CrossRef]
  41. Hung, H.; Yeh, H. Augmented-reality-enhanced game-based learning in flipped English classrooms: Effects on students’ creative thinking and vocabulary acquisition. J. Comput. Assist. Learn. 2023, 1–23, Early View. [Google Scholar] [CrossRef]
  42. Bedwell, W.L.; Pavlas, D.; Heyne, K.; Lazzara, E.H.; Salas, E. Toward a Taxonomy Linking Game Attributes to Learning. Simul. Gaming 2012, 43, 729–760. [Google Scholar] [CrossRef]
  43. Kauertz, A.; Fischer, H.E.; Mayer, J.; Sumfleth, E.; Walpuski, M. Standardbezogene Kompetenzmodellierung in den Naturwissenschaften der Sekundarstufe. Z. Didakt. Nat. 2010, 16, 135–153. [Google Scholar]
  44. Yusof, C.S.; Amri, N.L.S.; Ismail, A.W. Bio-WtiP: Biology lesson in handheld augmented reality application using tangible interaction. IOP Conf. Ser. Mater. Sci. Eng. 2020, 979, 12002. [Google Scholar] [CrossRef]
  45. Hoog, T.G.; Aufdembrink, L.M.; Gaut, N.J.; Sung, R.; Adamala, K.P.; Engelhart, A.E. Rapid deployment of smartphone-based augmented reality tools for field and online education in structural biology. Biochem. Mol. Biol. Educ. 2020, 48, 448–451. [Google Scholar] [CrossRef] [PubMed]
  46. Rodríguez, F.C.; Frattini, G.; Krapp, L.F.; Martinez-Hung, H.; Moreno, D.M.; Roldán, M.; Salomón, J.; Stemkoski, L.; Traeger, S.; Dal Peraro, M.; et al. MoleculARweb: A Web Site for Chemistry and Structural Biology Education through Interactive Augmented Reality out of the Box in Commodity Devices. J. Chem. Educ. 2021, 98, 2243–2255. [Google Scholar] [CrossRef]
  47. Kofoglu, M.; Dargut, C.; Arslan, R. Development of Augmented Reality Application for Biology Education. Tused 2020, 17, 62–72. [Google Scholar] [CrossRef]
  48. Weng, C.; Otanga, S.; Christianto, S.M.; Chu, R.J.-C. Enhancing Students’ Biology Learning by Using Augmented Reality as a Learning Supplement. J. Educ. Comput. Res. 2020, 58, 747–770. [Google Scholar] [CrossRef]
  49. Garcia-Bonete, M.-J.; Jensen, M.; Katona, G. A practical guide to developing virtual and augmented reality exercises for teaching structural biology. Biochem. Mol. Biol. Educ. A Bimon. Publ. Int. Union Biochem. Mol. Biol. 2019, 47, 16–24. [Google Scholar] [CrossRef]
  50. Korenova, L.; Fuchsova, M. Visualisation in Basic Science and Engineering Education of Future Primary School Teachers in Human Biology Education Using Augmented Reality. Eur. J. Contemp. Educ. 2019, 8, 92–102. [Google Scholar] [CrossRef]
  51. Safadel, P.; White, D. Facilitating Molecular Biology Teaching by Using Augmented Reality (AR) and Protein Data Bank (PDB). TechTrends 2019, 63, 188–193. [Google Scholar] [CrossRef]
  52. Qamari, C.N.; Ridwan, M.R. Implementation of Android-based augmented reality as learning and teaching media of dicotyledonous plants learning materials in biology subject. In Proceedings of the 3rd International Conference on Science in Information Technology (ICSITech), Bandung, Indonesia, 25–26 October 2017; pp. 441–446. [Google Scholar] [CrossRef]
  53. Küçük, S.; Kapakin, S.; Göktaş, Y. Learning anatomy via mobile augmented reality: Effects on achievement and cognitive load. Anat. Sci. Educ. 2016, 9, 411–421. [Google Scholar] [CrossRef]
  54. Hung, Y.-H.; Chen, C.-H.; Huang, S.-W. Applying augmented reality to enhance learning: A study of different teaching materials. J. Comput. Assist. Learn. 2017, 33, 252–266. [Google Scholar] [CrossRef]
  55. Huang, T.-C.; Chen, C.-C.; Chou, Y.-W. Animating eco-education: To see, feel, and discover in an augmented reality-based experiential learning environment. Comput. Educ. 2016, 96, 72–82. [Google Scholar] [CrossRef]
  56. Aivelo, T.; Uitto, A. Digital gaming for evolutionary biology learning: The case study of parasite race, an augmented reality location-based game. LUMAT 2016, 4, 1–26. [Google Scholar] [CrossRef]
  57. Tarng, W.; Yu, C.-S.; Liou, F.-L.; Liou, H.-H. Development of a virtual butterfly ecological system based on augmented reality and mobile learning technologies. Virtual Real. 2013, 19, 674–679. [Google Scholar] [CrossRef]
  58. Jamali, S.S.; Shiratuddin, M.F.; Wong, K.W.; Oskam, C.L. Utilising Mobile-Augmented Reality for Learning Human Anatomy. Procedia—Soc. Behav. Sci. 2015, 197, 659–668. [Google Scholar] [CrossRef]
  59. Chiang, T.; Yang, S.; Hwang, G.-J. An Augmented Reality-based Mobile Learning System to Improve Students’ Learning Achievements and Motivations in Natural Science Inquiry Activities. Educ. Technol. Soc. 2014, 17, 352–365. [Google Scholar]
  60. Kamarainen, A.M.; Metcalf, S.; Grotzer, T.; Browne, A.; Mazzuca, D.; Tutwiler, M.S.; Dede, C. EcoMOBILE: Integrating augmented reality and probeware with environmental education field trips. Comput. Educ. 2013, 68, 545–556. [Google Scholar] [CrossRef]
  61. Hsiao, K.-F.; Chen, N.-S.; Huang, S.-Y. Learning while exercising for science education in augmented reality among adolescents. Interact. Learn. Environ. 2012, 20, 331–349. [Google Scholar] [CrossRef]
  62. Berry, C.; Board, J. A Protein in the palm of your hand through augmented reality. Biochem. Mol. Biol. Educ. A Bimon. Publ. Int. Union Biochem. Mol. Biol. 2014, 42, 446–449. [Google Scholar] [CrossRef]
  63. Cai, S.; Wang, X.; Chiang, F.-K. A case study of Augmented Reality simulation system application in a chemistry course. Comput. Hum. Behav. 2014, 37, 31–40. [Google Scholar] [CrossRef]
  64. Mahmoud Mohd Said Al Qassem, L.; Al Hawai, H.; AlShehhi, S.; Zemerly, M.J.; Ng, J.W. AIR-EDUTECH: Augmented immersive reality (AIR) technology for high school Chemistry education. In Proceedings of the 2016 IEEE Global Engineering Education Conference (EDUCON), Abu Dhabi, United Arab Emirates, 10–13 April 2016; pp. 842–847. [Google Scholar] [CrossRef]
  65. Abu-Dalbouh, H.M.; AlSulaim, S.M.; AlDera, S.A.; Alqaan, S.E.; Alharbi, L.M.; AlKeraida, M.A. An Application of Physics Experiments of High School by using Augmented Reality. IJSEA 2020, 11, 37–49. [Google Scholar] [CrossRef]
  66. Nishihamat, D.; Takeuchi, T.; Inoue, T.; Okada, K.I. AR chemistry: Building up augmented reality for learning chemical experiment. Int. J. Inform. Soc. 2010, 2, 43–48. [Google Scholar]
  67. Wan, A.T.; San, L.Y.; Omar, M.S. Augmented Reality Technology for Year 10 Chemistry Class. Int. J. Comput.-Assist. Lang. Learn. Teach. 2018, 8, 45–64. [Google Scholar] [CrossRef]
  68. Cai, S.; Liu, C.; Wang, T.; Liu, E.; Liang, J.-C. Effects of learning physics using Augmented Reality on students’ self-efficacy and conceptions of learning. Br. J. Educ. Technol. 2021, 52, 235–251. [Google Scholar] [CrossRef]
  69. Maier, P.; Klinker, G.J. Evaluation of an Augmented-Reality-based 3D User Interface to Enhance the 3D-Understanding of Molecular Chemistry. In Proceedings of the 5th International Conference on Computer Supported Education, Aachen, Germany, 6–8 May 2013; pp. 294–302. [Google Scholar] [CrossRef]
  70. Chun Lam, M.; Suwadi, A. Smartphone-based Face-to-Face Collaborative Augmented Reality Architecture for Assembly Training. In Proceedings of the 2nd National Symposium on Human-Computer Interaction 2020, Kuala Lumpur, Malaysia, 8 October 2020. [Google Scholar]
  71. Kumta, I.; Srisawasdi, N. Investigating Correlation between Students’ Attitude toward Chemistry and Perception toward Augmented Reality, and Gender Effect. In Proceedings of the 23rd International Conference on Computers in Education (ICCE2015), Hangzhou, China, 30 November–4 December 2015. [Google Scholar]
  72. Nandyansah, W.; Suprapto, N.; Mubarok, H. Picsar (Physics Augmented Reality) as a Learning Media to Practice Abstract Thinking Skills in Atomic Model. J. Phys. Conf. Ser. 2020, 1491, 12049. [Google Scholar] [CrossRef]
  73. Chen, S.-Y.; Liu, S.-Y. Using augmented reality to experiment with elements in a chemistry course. Comput. Hum. Behav. 2020, 111, 106418. [Google Scholar] [CrossRef]
  74. Astriawati, N.; Wibowo, W.; Widyanto, H. Designing Android-Based Augmented Reality Application on Three Dimension Space Geometry. J. Phys. Conf. Ser. 2020, 1477, 22006. [Google Scholar] [CrossRef]
  75. Bazarov, S.E.; Kholodilin, I.Y.; Nesterov, A.S.; Sokhina, A.V. Applying Augmented Reality in practical classes for engineering students. IOP Conf. Ser. Earth Environ. Sci. 2017, 87, 32004. [Google Scholar] [CrossRef]
  76. Gonzalez-Franco, M.; Pizarro, R.; Cermeron, J.; Li, K.; Thorn, J.; Hutabarat, W.; Tiwari, A.; Bermell-Garcia, P. Immersive Mixed Reality for Manufacturing Training. Front. Robot. AI 2017, 4, 3. [Google Scholar] [CrossRef]
  77. Guo, W. Improving Engineering Education Using Augmented Reality Environment. In Learning and Collaboration Technologies. Design, Development and Technological Innovation; Zaphiris, P., Ioannou, A., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2018; pp. 233–242. [Google Scholar] [CrossRef]
  78. Liarokapis, F.; Anderson, E.F. Using Augmented Reality as a Medium to Assist Teaching in Higher Education; Eurographics: Saarbrücken, Germany, 2010. [Google Scholar]
  79. Luwes, N.J.; van Heerden, L. Augmented reality to aid retention in an African university of technology engineering program. In Proceedings of the 6th International Conference on Higher Education Advances (HEAd’20), Virtually, 3–5 June 2020. [Google Scholar]
  80. Martín-Gutiérrez, J.; Contero, M. Improving Academic Performance and Motivation in Engineering Education with Augmented Reality. In Proceedings of HCI International 2011—Posters’ Extended Abstracts, Part II, Orlando, FL, USA, 9–14 July 2011; Springer: Berlin/Heidelberg, Germany, 2011; pp. 509–513. [Google Scholar] [CrossRef]
  81. Gutiérrez, J.; Fernández, M.D.M. Applying augmented reality in engineering education to improve academic performance & student motivation. Int. J. Eng. Educ. 2014, 30, 625–635. [Google Scholar]
  82. Sahin, C.; Nguyen, D.; Begashaw, S.; Katz, B.; Chacko, J.; Henderson, L.; Stanford, J.; Dandekar, K.R. Wireless communications engineering education via Augmented Reality. In Proceedings of the 2016 IEEE Frontiers in Education Conference (FIE), Erie, PA, USA, 12–15 October 2016; pp. 1–7. [Google Scholar] [CrossRef]
  83. Sanchez, A.; Redondo, E.; Fonseca, D.; Navarro, I. Academic performance assessment using Augmented Reality in engineering degree course. In Proceedings of the 2014 IEEE Frontiers in Education Conference (FIE), Madrid, Spain, 22–25 October 2014; pp. 1–7. [Google Scholar] [CrossRef]
  84. Shirazi, A.; Behzadan, A. Assessing the pedagogical value of Augmented Reality-based learning in construction engineering. In Proceedings of the 13th International Conference on Construction Applications of Virtual Reality, London, UK, 30–31 October 2013; pp. 416–426. [Google Scholar]
  85. Singh, G.; Mantri, A.; Sharma, O.; Dutta, R.; Kaur, R. Evaluating the impact of the augmented reality learning environment on electronics laboratory skills of engineering students. Comput. Appl. Eng. Educ. 2019, 27, 1361–1375. [Google Scholar] [CrossRef]
  86. Theodorou, P.; Kydonakis, P.; Botzori, M.; Skanavis, C. Augmented Reality Proves to Be a Breakthrough in Environmental Education. Prot. Restor. Environ. 2018, 7, 219–228. [Google Scholar]
Figure 1. Exemplary rating of the augmented reality app of Destiara [27] using the evaluation matrix. Each parameter is rated either on defined levels or via a number of indicator elements to be checked off, indicating the degree to which each parameter is realized in the rated augmented reality app.
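The article does not publish code for the evaluation matrix itself; the following is a minimal sketch of the two scoring modes the caption describes (rating on ordinal levels vs. checking off indicator elements), with both modes normalized to a common 0–1 score. The parameter names and scale sizes below are hypothetical, not taken from the study.

```python
from dataclasses import dataclass

@dataclass
class Parameter:
    """One evaluation-matrix parameter, rated in one of two modes:
    on ordinal levels (max_level > 0) or by checking off indicator
    elements (n_elements > 0). Both modes normalize to a 0-1 score."""
    name: str
    max_level: int = 0
    n_elements: int = 0

    def score(self, level: int = 0, checked: int = 0) -> float:
        if self.max_level:
            return level / self.max_level
        return checked / self.n_elements

# Hypothetical parameters -- names and scale sizes are illustrative only.
interactivity = Parameter("Interactivity", max_level=4)
congruence = Parameter("Congruence with reality", n_elements=5)

print(interactivity.score(level=3))   # level 3 of 4 -> 0.75
print(congruence.score(checked=5))    # all 5 elements checked -> 1.0
```

Normalizing both modes to the same 0–1 range is what makes the seven parameters comparable on one heptagonal plot.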
Figure 2. Output of all biology augmented reality apps overlaid. Each heptagonal shape represents one augmented reality app’s setup, content, and focus in its development.
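The heptagonal shapes overlaid in Figure 2 are radar charts of the seven matrix parameters. A minimal sketch, assuming each app is described by seven 0–1 ratings, of how one rating profile maps onto heptagon vertices (the profile values below are illustrative, not the study’s scores):

```python
import numpy as np

def heptagon_vertices(ratings):
    """Map seven 0-1 ratings onto radar-chart ('heptagon') vertices.

    Returns an (8, 2) array of x/y points; the first vertex is
    repeated at the end so the polygon closes when drawn."""
    ratings = np.asarray(ratings, dtype=float)
    # One vertex per parameter, spaced evenly around the circle.
    angles = np.linspace(0.0, 2.0 * np.pi, len(ratings), endpoint=False)
    pts = np.column_stack([ratings * np.cos(angles), ratings * np.sin(angles)])
    return np.vstack([pts, pts[:1]])

# Illustrative profile with one dominant parameter, as in Figures 3-5.
verts = heptagon_vertices([0.2, 0.3, 0.1, 0.2, 0.9, 0.3, 0.1])
```

Drawing each such polygon with a translucent fill and overlaying all of them would reproduce the kind of composite view shown in Figure 2.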
Figure 3. Rating result of the augmented reality app of Destiara [27]. The heptagon clearly shows a strong peak in the visual parameter congruence with reality.
Figure 4. Rating result of the augmented reality app of Hoog et al. [45]. Again, this augmented reality app’s heptagon clearly shows a strong peak in the visual parameter congruence with reality and moderate rating values in interactivity and adaptivity.
Figure 5. The rating result of the augmented reality app of Qamari and Ridwan [52] is another example of a substantial peak in the visual parameter congruence with reality, while the 6 other parameters score low.
Figure 6. The rating result of the augmented reality app of “virtual butterfly ecological system” [57] shows strong rating results in 6 of 7 parameters, with immersion being the only lower score.
Figure 7. Hierarchical cluster analysis of the rated biology augmented reality apps, identifying two distinct clusters at iteration seven.
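The article does not publish its clustering code; as a sketch of the technique behind Figure 7, the seven-parameter rating profiles could be grouped with agglomerative hierarchical clustering (here Ward linkage via SciPy). The rating vectors below are synthetic and only mimic the two profile types the article describes — a technology-driven group peaking on a single parameter and a didactically broader group — not the study’s actual scores.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic seven-parameter rating profiles (NOT the study's data):
# one group peaks on a single parameter, the other scores broadly.
rng = np.random.default_rng(0)
peaked = rng.uniform(0.0, 0.2, size=(6, 7))
peaked[:, 4] = rng.uniform(0.85, 1.0, size=6)   # single dominant parameter
balanced = rng.uniform(0.6, 0.9, size=(6, 7))   # broad, balanced profiles
profiles = np.vstack([peaked, balanced])

# Ward linkage on Euclidean distances; cutting the dendrogram into two
# groups mirrors the two clusters found in the article.
Z = linkage(profiles, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
```

With profiles this well separated, the two-cluster cut recovers the two synthetic groups exactly; on real rating data the cut height (or cluster count) would be chosen from the dendrogram.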
Table 1. Key aspects for the development of augmented-reality-enriched learning environments for science and engineering teaching (adapted from [12]).
Key Aspect | Description
Interactive | From observing to user interaction with the material.
Creator | From consuming to creating content.
Collaborative | From individual to collaborative work.
Situated learning | The degree to which the design situates augmented reality in its real-world setting.
Inquiry-based science | The degree of facilitating an inquiry-based perspective as opposed to learning facts and concepts.
Real-world augmentation | From virtual reality to real-world augmentation.
3D visualization | From 2D content such as illustrations to 3D designs.
Juxtaposing | From viewing the design from one perspective to flexibly altering content and perspective.
Data-driven | From static content to actively collecting and presenting data.
Table 2. Listing of all screened augmented reality apps and the accordingly measured learning effects.
Domain | Author (Year) of Published Augmented Reality App | Motivation | Cognition | Self-Regulation | Self-Efficacy
Biology | Yusof, C. et al. (2020) [44] | X | – | – | –
Biology | Hoog, T. G. et al. (2020) [45] | X | – | – | –
Biology | Destiara, M. (2020) [27] | X | – | – | –
Biology | Rodriguez, F. et al. (2021) [46] | X | X | – | –
Biology | Celik, C. et al. (2020) [15] | X | X | – | –
Biology | Kofoglu, M. et al. (2020) [47] | X | X | – | –
Biology | Weng et al. (2019) [48] | X | X | – | –
Biology | Garcia-Bonete, M.-J. et al. (2019) [49] | X | – | – | –
Biology | Korenova, L. & Fuchsova, M. (2019) “Brain iExplore” [50] | – | X | X | –
Biology | Korenova, L. & Fuchsova, M. (2019) “Anatomy 4D” [50] | – | X | X | –
Biology | Safadel, P. & White, D. (2018) [51] | X | – | – | –
Biology | Qamari, C. & Ridwan, M. (2017) [52] | X | X | – | –
Biology | Kücük, S. et al. (2016) [53] | – | X | X | –
Biology | Hung, Y.-H. et al. (2017) [54] | X | – | – | –
Biology | Huang, T.-C. et al. (2016) [55] | X | – | – | X
Biology | Avielo, T. & Uitto, A. (2016) [56] | X | – | – | –
Biology | Tarng, W. et al. (2015) [57] | X | X | – | –
Biology | Jamali, S. S. et al. (2015) [58] | X | X | – | –
Biology | Chiang, T.-H.-C. et al. (2014) [59] | X | X | – | –
Biology | Kamarainen, A. M. et al. (2013) [60] | X | – | – | X
Biology | Hsiao, K.-F. et al. (2012) [61] | X | – | – | X
Biology | Berry, C. & Board, J. (2014) [62] | X | – | – | –
Result | Biology (number of augmented reality apps measuring…) | 19 | 11 | 3 | 3
Chemistry | Cai, S., Wang, X., & Chiang, F.-K. (2014) [63] | – | X | – | –
Chemistry | Al Qassem, L. M. M. S. et al. (2016) [64] | X | X | – | –
Chemistry | Abu Dalbouh, H. et al. (2020) [65] | X | X | – | –
Chemistry | Nishihama, D. et al. (2010) [66] | – | X | – | –
Chemistry | Wan, A.T. et al. (2018) [67] | X | X | – | –
Chemistry | Cai, S. et al. (2020) [68] | – | X | – | X
Chemistry | Maier, P., & Klinker, G. J. (2013) [69] | X | – | – | –
Chemistry | Meng, C. L. et al. (2020) [70] | – | X | – | –
Chemistry | Kumta, I., & Srisawasdi, N. (2015) [71] | – | X | – | –
Chemistry | Nandyansah, W. et al. (2020) [72] | – | X | – | –
Chemistry | Chen, S.-Y., & Liu, S.-Y. (2020) [73] | X | X | – | –
Result | Chemistry (number of augmented reality apps measuring…) | 5 | 10 | 0 | 1
Engineering | Astriawati, N. et al. (2020) [74] | – | X | – | –
Engineering | Bazarov, S. E. et al. (2017) [75] | X | – | – | –
Engineering | Gonzalez-Franco, M. et al. (2017) [76] | – | X | – | –
Engineering | Guo, W. (2018) [77] | – | X | – | –
Engineering | Liarokapis, F. & Anderson, E. F. (2010) [78] | X | – | – | –
Engineering | Luwes, N. & Van Heerden, L. (2020) [79] | – | X | – | –
Engineering | Martín-Gutiérrez, J. & Contero, M. (2011) [80] | X | X | – | –
Engineering | Martín-Gutiérrez, J. & Meneses Fernández, M. (2014) [81] | X | – | – | –
Engineering | Sahin, C. et al. (2016) [82] | – | X | – | –
Engineering | Sanchez, A. et al. (2013) [83] | – | X | – | –
Engineering | Shirazi, A. & Behzadan, A. (2013) [84] | X | – | – | –
Engineering | Singh, G. et al. (2019) [85] | X | – | – | –
Engineering | Theodorou, P. et al. (2018) [86] | – | X | – | –
Result | Engineering (number of augmented reality apps measuring…) | 6 | 8 | 0 | 0
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Czok, V.; Krug, M.; Müller, S.; Huwer, J.; Kruse, S.; Müller, W.; Weitzel, H. A Framework for Analysis and Development of Augmented Reality Applications in Science and Engineering Teaching. Educ. Sci. 2023, 13, 926. https://doi.org/10.3390/educsci13090926

