
Development and Evaluation of Digital Learning Tools Promoting Applicable Knowledge in Economics and German Teacher Education

Department of Law and Economics, Johannes Gutenberg University Mainz, 55128 Mainz, Germany
German Institute, Johannes Gutenberg University Mainz, 55128 Mainz, Germany
Author to whom correspondence should be addressed.
Educ. Sci. 2023, 13(5), 481;
Received: 5 April 2023 / Revised: 5 May 2023 / Accepted: 6 May 2023 / Published: 10 May 2023


Abstract

Digital teaching interventions allow for tailor-made university teaching. This is especially relevant for teacher education, where applicable professional teaching knowledge needs to be promoted for later professional success. Digital teaching tools have been shown to be a promising supplement for this purpose. Even though the corresponding demands in teacher education have been increasing in recent years, the need to develop digital learning tools usable in instruction is still urgent. The TWIND project develops digital learning tools for teacher education and evaluates them in a quasi-experimental design. The present work investigates the usability and application of these newly developed tools. Sixty-three trainee teachers worked independently over four weeks with one of two digital learning tools, focusing on either ‘Multilingualism in Classrooms’ or ‘Professional Communication in Classrooms.’ This study includes a pre-post-test of pedagogical knowledge facets as well as student and instructor ratings on the digital tools. The digital learning tools led to a positive change in the respective target facets of pedagogical knowledge. The student and instructor feedback reflected positively on the usability and usefulness of the new digital tools. Based on these findings, the limitations of the study as well as implications for further research and teacher education practice are outlined.

1. Relevance

Due to efforts on digitalization in teaching and learning in recent years, studying has become increasingly location- and time-independent, individually tailored, and multimodal [1]. Today’s students routinely work with flexible learning opportunities, which can be used regardless of time and place [2,3]. To address individual learning needs in university practice and the correspondingly high heterogeneity of student bodies, more precise tailoring of teaching practices is becoming essential [4,5]. This leads to a growing demand for effective, high-quality digital teaching-and-learning tools that include different media to enable higher levels and rates of student success in higher education [6,7,8]. These efforts have become necessary not only because of students’ growing demand for flexibility and reduced attendance requirements but also, more recently, because the pandemic made an immediate transition to digital teaching and learning imperative [9]. The demand for digital teaching-and-learning tools in university teaching has increased exponentially in recent years [9], and the immense growth of digital teaching at universities was accelerated by the pandemic [10,11]. These digital tools are often designed to complement or replace face-to-face teaching and mostly depend on students’ self-directed learning activities [12,13]. Whereas high-quality digital learning opportunities play a central role in educational practice [1], their development and implementation represent a challenge for university teachers [14].
Digital teaching-and-learning material is particularly relevant in teacher education, since universities are not only responsible for imparting knowledge to students but also for fostering action-related (i.e., applicable) knowledge that enables students to successfully teach in the classroom [15]. Newly graduated teachers entering the profession face enormous challenges regarding the complex demands of their daily pedagogical work [16,17]. Career starters in particular often have difficulties in applying the knowledge acquired in their study programs to solve practical problems and to transfer their knowledge into situational action in the classroom [18,19]. The situation-specificity and immediacy of teaching require teachers to develop not only professional theoretical (e.g., content) knowledge (CK) but also applicable pedagogical knowledge (PK) [20]. Systematically and effectively promoting knowledge that is relevant to teaching in practice can prepare students early on for teaching demands in classrooms [21]. Therefore, teaching-and-learning tools in teacher education should enable students to reflect on the complexity of teaching and to develop applicable professional knowledge.
To meet the requirements for classroom teaching and to provide high-quality digital learning tools in teacher education, the project TWIND (‘Technology and Economics: Integrated Didactics’), funded by the Federal Ministry of Education and Research, focuses on the development and evaluation of new digital teaching-and-learning tools. The overarching goal is to improve teacher education through effective preparation for professional work with digital teaching-and-learning tools that promote the applicable PK of prospective teachers. These tools focus on applicable PK as a crucial disposition of trainee teachers for their future professional performance, and they can be integrated into training independently of the specific domain (such as German or economics).
Currently, the trend toward digitization is producing a broad range of digital interventions in higher teacher education [22,23]. However, the extent to which these actually result in a change in knowledge, such as applicable PK in the teacher education context, has barely been investigated so far. Systematic evaluations are therefore required to investigate the extent to which digital learning tools can influence applicable PK in teacher training [24]. This paper addresses the evaluation of newly developed digital teaching-and-learning tools in university teaching to promote facets of applicable PK in prospective teachers. The study presented here demonstrates how the newly developed digital learning tools can be used in teacher training practice. This study examined the application and usability of two newly developed tools among trainee teachers of economics education and German education in a quasi-experimental pre–post design with intervention and comparison groups. To this end, we focused on two overarching objectives:
To investigate the effects of the digital teaching-and-learning tools on applicable PK among trainee teachers in a pre-post-test;
To investigate the subjective ratings of the digital teaching-and-learning tools reported by students and instructors.
The evaluation of the new teaching-and-learning tools takes stock of the impact of the digital tools on regular teacher education practice.

2. Applicable Pedagogical Knowledge

According to Weinert [27], professional teaching competence represents the ability of an individual to behave in an appropriate, thoughtful, and individually and socially responsible manner in social, professional, and private situations [25,26]. It is viewed as a multifaceted construct with several components, including facets of knowledge and beliefs [28,29]. Blömeke and colleagues [30] describe teaching competence as a continuum, where (situation-specific) skills represent the link between dispositions such as professional knowledge and teacher beliefs and actual professional behavior (e.g., manifested action in a classroom). In addition, a number of affective and motivational dispositions are important for developing and using professional skills [30,31]. In summary, the extended model in ref. [30] indicates that:
There are different types of teacher knowledge such as content knowledge (CK), pedagogical content knowledge (PCK), pedagogical knowledge (PK; [32]), and knowledge of media didactics (TPACK; [33]).
These dispositions may determine the level of situation-specific skills [2,4]. In this context, PK can be considered as the basis (besides CK) for developing PCK as well as TPACK. Thereby, PK can be seen as the knowledge of processes, practices, and methods of learning and teaching [32,33,34]. According to Koehler and Mishra ([33], p. 64), “this generic form of knowledge applies to understanding how students learn, general classroom management skills, lesson planning, and student assessment.” This is in line with the assumption by Voss and colleagues ([35], p. 953), who define general PK as “the knowledge needed to create and optimize teaching-learning situations”.
Furthermore, situation-specific skills can be differentiated according to the two major professional demands in teaching practice, both of which build on the professional knowledge base (CK, PK, PCK, and TPACK): action-related skills (AS) in the classroom and reflection skills (RS) before and after teaching a class (see Figure 1).
The concept of PK is closely associated with applicable knowledge that enables teaching demands to be met [37,38,39]. Specifically, it refers to adjusting instructional designs to the learners’ prerequisites [40]. More precisely, applicable PK can be understood as a teacher’s ability to adapt the planning and implementation of instruction to the individual prerequisites and abilities of the learners in such a way that the most favorable conditions for individual learning are created and maintained during learning [41]. For instance, applicable PK includes knowledge about teaching methods, classroom management, self-regulated learning, conflict management, and communication in classrooms, as well as language comprehension and language education against the background of learners’ heterogeneity in terms of language skills [37,38,39,40,42]. Therefore, applicable PK is comprehensively relevant across disciplines and domains [43] and represents one of the central facets of professional teaching competence. As classrooms in schools become more diverse, professional development in applicable PK is becoming more and more important, but it is proving to be challenging [44]. This calls for effective new methods that promote applicable PK early in university teacher education.

3. Promoting Applicable Pedagogical Knowledge in Teacher Education—Current Developments and Instructional Designs

3.1. Current Challenges for Instructional Designs

Teaching skills are contextualized by the characteristics of the teaching-and-learning processes and—in view of promoting applicable PK—require the development of knowledge that not only meets the challenges of teacher education but is also transferable to real professional practice. Prospective teachers need to be prepared for the demands of the contextualized application of knowledge in complex and varied teaching situations. This preparation should ideally take place at an early stage in teacher education and can be achieved through an active and problem-oriented examination of the demands of teaching [6,45,46,47].
Despite this demand, current teacher education at universities often does not incorporate sufficient practical elements and thus does not provide training that is close to everyday professional life [48,49]. As a consequence, (prospective) teachers are often not able to apply their professional knowledge in practical teaching. This stresses the frequently formulated need for linking theory and practice in higher education [50,51] and underlines the relevance of imparting applicable PK in teacher education [18,19,52,53]. Therefore, to adequately prepare prospective teachers for their later professional life, instructional designs that enable the effective promotion of applicable PK are needed.
One approach to developing connections between prospective teachers’ knowledge and professional behavior is to construct a learning situation (in the case of active learning) as a situation for action, including its specific context (e.g., a simulated authentic teaching situation [53]), with a focus on teaching–learning arrangements [54]. This requires designing university courses in such a way that they approximate the actual situational requirements of realistic future teaching-and-learning situations.
In view of the changing teaching environment and corresponding challenges, concepts such as lifelong learning, action-relation, self-regulation, and the associated strong learner-centeredness have gained importance in education and training [55]. To address such challenges, learner-adaptive approaches such as problem-based learning are often proposed as effective, characterized by an open learning environment that allows students a high degree of self-direction [56,57]. Integrating problem-based learning into teacher training is considered an effective way to acquire appropriate applicable PK for future teaching practice [58]. Therefore, instructional designs are often problem-centered [59]. In addition, other approaches such as the example-based approach are also useful [46], particularly in the early stages of knowledge development (see also [60,61]; for the situated cognition approach, see [62]).
The application of PK can be achieved through the use of digital multimedia teaching–learning tools during teacher education [63,64]. The realistic instructional scenarios presented in video vignettes showing typical classroom situations often meet the criteria of a problem-based learning situation and are therefore considered a promising medium for promoting applicable knowledge [65]. Particularly in the context of the COVID-19 pandemic and the resulting restrictions on classroom teaching at universities, such video-based approaches are increasingly being integrated into digital learning opportunities as part of e-learning [66]. For instance, tasks using video vignettes with typical situations in (commercial) teaching to promote applicable PK have already been adapted for use at universities on a site-specific basis [67].
To meet the requirements of self-regulated learning, however, isolated video vignettes are not sufficient; rather, they must be appropriately embedded in a learning framework. For instance, Gaudin and Chalies [68] point out the need to vary different design aspects of working with videos in online teaching (e.g., video material, forms of exchange and feedback) in intervention studies. The selection of a suitable instructional design is therefore an important educational decision when developing digital teaching–learning tools. University teacher education is still often oriented towards traditional and teacher-centered learning models, even when it comes to digital learning [11]. A serious shortcoming in the design of digital teaching–learning materials is the choice of a learning model that does not consider the diversity of students and does not allow learner-centered learning [69]. When applicable PK is the focus, formats that start from a practical teaching problem motivate the students and initiate the learning process. Despite the importance of problem-centered and self-regulated learning through the use of digital tools, corresponding developments in teacher education and training are still lacking [70].

3.2. Design of the New Digital Tools

In the TWIND project, digital learning tools have been developed based on preliminary research as new teaching–learning tools to promote teachers’ applicable PK for daily professional practice [71]. The need for new development arises from the goal of developing tools that promote applicable knowledge as well as from the deficit of corresponding materials on overarching pedagogical topics. Brückner and colleagues [71] described the conceptualization of the digital learning tools in detail. In summary, the modular structure of the digital tools follows the instructional design approach of the ‘pebble-in-the-pond’ model [59], which allows for the learner-centered and self-regulated use of these tools by students and (prospective) teachers. They can flexibly work through the individual components of the tools according to their individual needs to achieve the intended learning goals. To promote individual facets of knowledge, including PK, individual tools that incorporate various multimedia and modal learning media as micro-educational elaborated teaching–learning situations can be “tied together” [71]. Continuous monitoring of the learning process is made possible via audio moderation and guidance throughout, taking into account (in the case of self-regulated learning) the possible absence of a moderating, interactive, impulse-giving instructor (for more information on the structure and modules of the new tools, see [71]).
Based on this model, both video-based and text-based instructional tasks were developed and embedded in digital tools to create constructively and educationally coordinated teaching–learning units of 20–30 min each. The digital tools, as multimedia teaching–learning units, include a description of the learning context and objectives, accompanying learning texts, audio-visual media, tasks, and PK descriptions, as well as reflection and control elements. To make the teaching–learning unit effective and to integrate it into an adequate learning situation, it contains complex learning tasks to promote applicable PK (for details, see [71]).

4. Topics of the New Digital Teaching-and-Learning Tools and Their Application in Teacher Education

4.1. Focal Topics

This paper focuses on two new digital learning tools developed in TWIND, intended for use in teacher education at university level, each of which addresses one facet of applicable PK. Since PK includes knowledge about communication in classrooms that enables flexible reactions to classroom situations as well as knowledge that sensitizes (prospective) teachers to the specific learning requirements of a heterogeneous student group (e.g., sensitivity to multilingualism in classrooms [35,72]), one tool focuses on ‘Professional Communication in Classrooms’ and the other on ‘Multilingualism in Classrooms’.

4.1.1. Digital Tool: ‘Professional Communication in Classrooms’

‘Professional Communication in Classrooms’ can be seen as a facet of cross-domain applicable PK [32,72], which is embedded, firstly, in regulative communication contexts where communication is used to influence or shape student behavior. Secondly, it is embedded in interpersonal conversational contexts where teachers use communication to mediate situational subjective feelings and negotiate conflicts [37]. Teachers who use their knowledge for professional communication in classrooms may, e.g., demonstrate more personal, situationally adapted communication with students in regulative and interpersonal contexts. They use communicative strategies not only in response to their students’ behavior but also in individual teacher–student relationships [37].
‘Professional Communication in Classrooms’ is of particular importance for (prospective) teachers since a key challenge of this profession lies in their instructional behavior and whether they are successful in achieving student learning outcomes within a given instructional setting (e.g., materials, objectives, procedures; see [73]). Teachers who are able to professionally implement ‘Professional Communication in Classrooms’ using their applicable PK are expected to offer better instruction and increase the learning outcomes of their students [73,74].
‘Professional Communication in Classrooms’ as a part of conversation management is characterized by varying degrees of teacher direction [75,76]. Professional communication in classrooms stands for all situations involving classroom talk or communication between the teacher and learners or between all learners on a specific topic [74]. From this, the qualitative characteristics of good classroom communication can be empirically extracted, and (minimum) standards can be formulated. For instance, a class discussion should proceed in a structured way (structure), be educationally justified (coherence), be worthwhile for the learners (yield), be discursive in nature (discursivity), and the learners should feel valued in the process (learning atmosphere), which is also reflected in the (minimum) standards [75].
Teachers can professionalize their ‘Professional Communication in Classrooms’ through theoretical knowledge (e.g., knowledge of communication rules and principles), case knowledge (e.g., knowledge related to their own experiences), and action knowledge (e.g., knowledge of situational options for action; see [75]). According to Richards [75], flexible, non-conclusive knowledge of available action points (theoretically substantiated and practiced based on case scenarios) is required to be able to deal with surprises professionally in the flow of a conversation. In addition to the (minimum) standards of classroom communication, there are also proven strategies for improving communication, such as listening, opening up, giving time, revisiting contributions, etc. [75]. Due to the complexity of converting learned content into practice, applicable PK on the facet ‘Professional Communication in Classrooms’ may often not be applied in realistic instruction [18,19]. This stresses the need for the development of corresponding learning tools. Prospective teachers usually learn how to conduct successful communication in classrooms through guided training, which is theoretically substantiated and practiced on a case-by-case basis [75]. However, the promotion of corresponding applicable PK has so far hardly been explicitly integrated into teacher education at the university level.
Based on the conceptual–theoretical considerations, the digital learning tool ‘Professional Communication in Classrooms’ is structured in four sections (see Figure 2) and aims at the promotion of the respective PK. The first section provides a problem-based introduction to the relevance of the topic and, using explanatory videos, deals with the topics ‘communication forms of teaching,’ ‘strategies for successfully conducting classroom communication,’ and ‘communication rules.’ In addition, learners have the opportunity to engage in self-study with additional, more detailed literature on professional communication. The goal of this section is to build a PK base that serves as a foundation for working through the other sections of the digital multimedia teaching-and-learning tool. The following three sections contain a progressive task structure to systematically build applicable PK. The next section involves closed assessment tasks (rated on a Likert scale) based on complex teaching situations. Finally, open-ended questions based on audio recordings of teaching situations are used for reflection and knowledge transfer.

4.1.2. Digital Tool: ‘Multilingualism in Classrooms’

‘Multilingualism in Classrooms’ can also be seen as a facet of cross-domain applicable PK and is related to teachers’ knowledge of professional communication and sensitivity to students’ linguistic abilities. School students show increasing heterogeneity in terms of German language skills, educational proximity, and working techniques [77]. For (prospective) teachers, this poses a particular challenge. The promotion of language skills is one of the everyday tasks of a teacher and should be implemented as a consistent teaching principle in all subjects, learning areas, and learning fields [25].
However, this so-called everyday task is anything but trivial, and teachers therefore need special facets of PK for it. The promotion of educational language skills can only be successful if the teacher knows which language skills the students already have and which language requirements the planned lesson, and the materials used in it, contain. On this basis, the teacher can design an adequate lesson plan in line with the scaffolding approach [78,79], in which subject and language learning go hand in hand [75]. To meet these challenges, teachers need basic linguistic knowledge, i.e., knowledge about language and its linguistic levels (e.g., [80,81]). Furthermore, teachers need to reflect on what the acquisition of a second language means for a learner, especially given that the learner has to follow the lessons and complete all tasks in this second language. Therefore, it is necessary for a teacher to be aware of the differentiation between acquiring a first and a second language [80,81]. Against this background, a teacher should be aware of the consequences of their own teaching actions. On this basis, skills in the areas of diagnostics and support can be built up.
Teachers of different subjects, especially those without a focus on German education, are not very sensitized to language-sensitive lesson design and mostly do not consider the promotion of students’ language skills as their teaching task [82]. Consequently, sensitivity to students’ individual linguistic heterogeneity and their corresponding language knowledge is related to applicable PK [83]. Therefore, corresponding learning offers are necessary in teacher education and training. These offers should address all (prospective) teachers since the promotion of language skills cannot be considered a task of German teachers alone.
The digital learning tool ‘Multilingualism in Classrooms’ consists of four sections and has two objectives: (1) extending knowledge of language, language acquisition, and multilingualism and (2) raising awareness of how to act professionally in the context of language diagnostics. The sections contain different types of inputs (explanatory video, audio clip, short presentation, interview, text) to activate diverse and deep cognitive processes in the learners. In addition, each input presentation is complemented by an optional additional task, e.g., focusing on self-reflection (see Figure 3).

4.2. Application and Evaluation of the New Tools in a Quasi-Experimental Study

The development of new digital teaching-and-learning tools to complement the existing teacher education curriculum proceeded as described above. Although the design of the tools was guided by current research, the question arose as to what extent changes in the targeted applicable PK facets could be measured as a result of learning with these tools. In research evaluating the application and impact of instructional interventions, the general question is how an intervention (here, the new tools) affects learners (here, prospective teachers) and the targeted constructs (here, facets of applicable PK on professional communication in classrooms and multilingualism in classrooms, respectively).
Learning success in an instructional intervention can be defined as an increase in the targeted knowledge facets (e.g., [84,85]). Consequently, we implemented the two digital teaching-and-learning tools described above in a quasi-experimental field study in a pre–post design with intervention and comparison groups in higher teacher education. Guided by the overarching objectives (see Section 1) and based on the conceptual and methodological framework presented below, we focused on the following research questions (RQ 1) and hypotheses (H1):
RQ 1:
How does the use of newly developed digital teaching-and-learning tools change the facets of the applicable PK of trainee teachers in a pre–post comparison?
H1:
A significant positive development of the respective applicable PK facet can be observed between t1 and t2 in the intervention groups compared to the comparison groups.
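A hypothesis of this form can be illustrated with a minimal analysis sketch: gain scores (t2 minus t1) are compared between an intervention and a comparison group using Welch’s t-test, with Cohen’s d on the gains as an effect-size estimate. The paper does not report which statistical procedure was used, so the procedure, function names, and scores below are purely illustrative assumptions, not the study’s actual analysis or data.

```python
import numpy as np
from scipy import stats

def gain_scores(pre, post):
    """Per-student knowledge gain between pre-test (t1) and post-test (t2)."""
    return np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)

def compare_groups(pre_int, post_int, pre_cmp, post_cmp):
    """Welch's t-test on gain scores (intervention vs. comparison group),
    plus Cohen's d on the gains as an effect-size estimate."""
    g_int = gain_scores(pre_int, post_int)
    g_cmp = gain_scores(pre_cmp, post_cmp)
    t, p = stats.ttest_ind(g_int, g_cmp, equal_var=False)  # Welch correction
    pooled_sd = np.sqrt((g_int.var(ddof=1) + g_cmp.var(ddof=1)) / 2)
    d = (g_int.mean() - g_cmp.mean()) / pooled_sd
    return t, p, d

# Invented illustrative PK test scores (four students per group):
t, p, d = compare_groups(
    pre_int=[10, 12, 11, 13], post_int=[14, 15, 15, 16],
    pre_cmp=[11, 12, 10, 13], post_cmp=[12, 12, 11, 13],
)
```

A repeated-measures or mixed-model analysis would serve the same purpose; the gain-score comparison is merely the simplest design-consistent option for a two-group pre–post setup.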
In the context of evaluation, the results of learning in terms of both the effectiveness of the new digital formats for learning success and the usefulness of the tool’s content and format are important [86]. For instance, the intended learning processes might be triggered in a targeted way only if an intervention is prepared in a way that is appropriate for the target group [87]. To facilitate the transfer of newly developed tools into the regular practice of teacher training in higher education, therefore, not only do the effects on the targeted applicable PK facets need to be considered but also how the digital teaching-and-learning tools are perceived by students and instructors in terms of their usefulness and applicability. This resulted in three further evaluation questions (RQ 2, RQ 3, RQ 4) in the present quasi-experimental study.
RQ 2:
How do students use the new digital teaching-and-learning tools and how does this usage behavior influence their change in applicable PK facets in a pre–post comparison?
RQ 3:
How do students rate the digital teaching-and-learning tools in terms of their usability and usefulness, and are these ratings connected to their change in applicable PK facets?
RQ 4:
How do instructors rate digital teaching-and-learning tools in terms of their usability and usefulness?

5. Methods

5.1. Study Design

The study was conducted within the project TWIND [70] in the subprojects of German Teacher Education and Economics Teacher Education at a German university (for the project description, see [71]). In this study, both teaching-and-learning tools aimed to promote applicable PK in a specific area (see Section 4.1). To gain insights into the applicability and usability of the newly developed tools, they were implemented in two regular courses at one German university in the summer term of 2021, with a total of 63 students attending these courses. In each of these courses, one of the newly developed digital tools was included and used as an ‘intervention’ (see Figure 4). In this study design, each group represented an ‘intervention’ group for one of the two newly developed tools, respectively. Two groups of students were necessary because the objective of the study was to investigate both newly developed tools; the intervention would have been too extensive to use both tools in one course. Students were not randomly assigned to courses, resulting in a quasi-experimental design with two intervention groups.
The decision to have two intervention groups that also served as a ‘comparison’ group for the respective other tool, rather than a typical ‘control-only’ group, was based on ethical considerations. A classical control group design would not be appropriate for a field study in teacher education because the ‘control group’ would not benefit from any newly developed learning tools and would therefore be disadvantaged compared to participating students in the intervention groups. Thanks to the integration of the learning tools into ongoing courses, the intervention did not create any additional work for the students alongside their regular studies. Common guidelines for the ethical conduct of research studies were followed in the study implementation. Participants were informed transparently about participation, the aim of the study, and the purposes for which the data would be used. No sensitive personal data were collected. No participants were confronted with any harmful threats or information. Data collection and storage were anonymized. The participants agreed to participate after being informed in detail. There were no disadvantages for students who did not participate in the online surveys.
Each tool was implemented in a synchronous online course over four weeks. The students’ work with the digital tools was not actively supervised by the course instructor, leading to a mostly self-regulated learning intervention. Since the respective tool was included as part of the course, students were not given an external incentive to engage with it. Every course participant was given access to only one of the two tools. The group for the intervention tool ‘Multilingualism in Classrooms’ (Group A (M, for multilingualism)) consisted of economics education students. The group for the intervention tool ‘Professional Communication in Classrooms’ (Group B (C, for communication)) consisted of two subgroups of pre-service German teachers (see Figure 4). Since these two groups were pursuing two different degrees at two different faculties, interaction between them was very unlikely, and they could thus be seen as independent samples.
Before (t1) and after (t2) the intervention, students participated in an online pre- and post-survey. These surveys included test items on applicable PK facets as well as feedback on the usage and evaluation of the tools' usefulness (for more details, see Section 5.2). The test items had originally been designed as exam questions for an end-of-term evaluation of the courses and were repurposed for the pre-post evaluation in this study. To detect potential changes in the students' applicable PK facets, the test items were specifically aligned with the contents of the intervention tool.
To make valid statements about the effects of the tools, groups that did not receive the intervention were required for comparison. For this purpose, all students in the intervention groups were additionally examined using the pre- and post-tests on the applicable PK facet promoted in the respective other group. Accordingly, German teacher education students were also tested on PK regarding multilingualism in classrooms, and economics teacher education students were also tested on PK regarding professional communication in classrooms. Participation in the surveys was voluntary. However, the use of the learning tools was integrated into the regular university course and was therefore mandatory. Students were asked to take part in the surveys during their courses and received specific incentives for their participation. Links to the online surveys were made available to the students through the learning management system used in the course. After the intervention was completed, the instructors were also invited via email to provide feedback on the usability of the tools.

5.2. Instruments

Data were collected online using SoSci Survey [88]. The surveys were conducted in German; the items were translated into English for this paper. In addition to the tests on the applicable PK facets (see Section 5.2.1 and Section 5.2.2) at both measurement points, socio-biographical (e.g., gender, age, migration background) and study-related (e.g., UEQ grade, study phase) data were also collected at t1. At t2, information on usage behavior as well as detailed feedback on the digital learning tools was collected. Student ratings were obtained on (a) usage behavior during the intervention (three items: frequency, duration, and learning form) and (b) the evaluation of the digital tools (fifteen items on an agreement scale covering different aspects, e.g., the overall evaluation of the tool, motivational aspects, its workability, and the tool's individual impact as rated by the students). Feedback from the instructors was obtained after the intervention in an open format, while all other data were collected in a closed format.

5.2.1. Instrument on Professional Communication in Classrooms

The newly developed tool is described in detail in Section 4.1.1. The applicable PK facets regarding professional communication in classrooms were assessed online using a self-developed standardized test containing eight single- and multiple-choice items written in German, which covered applicable knowledge aspects in exemplary teaching situations (for an example item, see Figure 5). Self-development was necessary due to the lack of established measurement instruments for these specific PK facets. A pilot phase of the newly developed instruments preceded their deployment in this study. No partial points were assigned, i.e., only completely correct responses were scored with one point per item, resulting in a sum score ranging from 0 to 8. The theoretical assumption of a global underlying factor for the applicable PK facet 'Professional Communication in Classrooms' was confirmed in a CFA, with fit indices for a one-dimensional model in a satisfactory range (for t1: SRMR = 0.056; RMSEA = 0.028; CFI = 0.869; TLI = 0.816). Although the fit values did not reach commonly cited optimal cut-offs, they should be evaluated in light of the relatively small sample (matched sample: n = 63), as suggested in the literature (e.g., [89,90]). Theoretically and conceptually, both constructs are multifaceted, broad constructs with possible specifics depending, e.g., on the study domain. In this regard, the heterogeneity of the sample should also be considered, especially with regard to the affiliation with a particular study domain in teacher education.
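The all-or-nothing scoring rule described above can be sketched as follows. This is a minimal illustration only; the item keys and response format are hypothetical and not taken from the study instrument:

```python
# Hypothetical illustration of all-or-nothing scoring: an item earns
# one point only if the selected options exactly match the key;
# partially correct responses score zero, so eight items yield a sum
# score between 0 and 8.

def score_item(selected: set, key: set) -> int:
    """Return 1 only for a completely correct response, else 0."""
    return 1 if selected == key else 0

def sum_score(responses: list) -> int:
    """Sum of item scores over all (selected, key) pairs."""
    return sum(score_item(sel, key) for sel, key in responses)

# Example: a multiple-choice item keyed {a, c}; choosing only {a}
# scores 0, choosing exactly {a, c} scores 1.
responses = [({"a", "c"}, {"a", "c"}), ({"a"}, {"a", "c"}), ({"b"}, {"b"})]
print(sum_score(responses))  # 2
```

The strict equality check is what distinguishes this rule from partial-credit scoring, where each correctly selected option would contribute a fraction of a point.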

5.2.2. Instrument on Multilingualism in Classrooms

The digital learning tool 'Multilingualism in Classrooms' is intended to provide basic knowledge about language, language acquisition, and multilingualism, which serves as an important disposition for corresponding teaching actions in practice (see the description in Section 4.1.2 for details on the tool). The respective multilingualism-related PK facets were assessed as part of the online survey, again using self-developed test items, as no validated instrument on this specific topic could be found. The test on the applicable multilingualism-related PK facets consisted of 13 single-choice items written in German. Again, only completely correct responses were scored, resulting in a sum score ranging from 0 to 13. In a CFA, the theoretically expected one-dimensional internal structure was confirmed (for t1: SRMR = 0.068; RMSEA = 0.040; CFI = 0.848; TLI = 0.817). For both applicable PK facets, the same items were used at t2 as at t1.

5.3. Sample Description

A total of 94 students were surveyed at t1: 21 students in group A (M) and 73 students in group B (C). At t2, 63 students were reached in total, of which 16 were from group A (M) and 47 from group B (C). A self-generated pseudonymous code was used to combine the two datasets in Stata [91]. Thus, a matched pre-post sample of 63 persons was identified (16 in group A, 47 in group B), whereas the other participants took part only at t1. A description of the matched sample can be found in Table 1.
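The matching of the pre- and post-survey datasets via the self-generated pseudonymous code (performed in Stata in the study) can be sketched in Python as follows. The data and the column names ("code", "score_t1", "score_t2") are illustrative assumptions, not the study's data:

```python
import pandas as pd

# Illustrative pre- and post-survey datasets keyed by a self-generated
# pseudonymous code (synthetic data; column names are assumptions).
pre = pd.DataFrame({"code": ["AB12", "CD34", "EF56"], "score_t1": [4, 5, 3]})
post = pd.DataFrame({"code": ["AB12", "EF56", "GH78"], "score_t2": [6, 5, 4]})

# An inner join keeps only participants present at both measurement
# points, yielding the matched pre-post sample; participants surveyed
# only at t1 or only at t2 drop out of the matched file.
matched = pre.merge(post, on="code", how="inner")
print(len(matched))  # 2
```

In the study, the same logic reduced the 94 participants at t1 to a matched sample of 63 who also completed t2.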

5.4. Statistical Procedure

The change in the applicable PK facets (RQ 1) was examined with comparisons of means (t-tests on the scores at t1 and t2). In addition to descriptive statistics on usage behavior (RQ 2), regression analyses were performed with the usage behavior variables as independent variables. This was done solely in group B (C) due to the low sample size and lack of variance in group A (M). Since the data did not have a nested structure, no multilevel model was considered. The respective difference score was used as the dependent variable; it was calculated by subtracting the sum score of the test items at t1 from the sum score at t2, so a positive difference score indicates that the sum score at t2 was higher than at t1. With regard to the student evaluations (RQ 3), descriptive statistics as well as regression analyses on the difference score were performed. Participants' responses to the open feedback questions are reported, and the instructor feedback (RQ 4) is presented in excerpts. A manipulation check was also performed while investigating RQ 2 to RQ 4, i.e., checking the extent to which the tools were actually used. The statistical analyses were performed using Stata version IC16 V5 [91]. The qualitative data collected in open format regarding feedback from students and instructors were analyzed only at a descriptive level for this study; that is, the information, e.g., on the potential for improvement of the tools, was screened, and multiple identical responses were deleted to avoid repetition. Open-ended responses from students in Section 6 are reported with an anonymous participant code in parentheses that replaces a case number. Feedback from the two instructors is provided with a case number.
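The core of this procedure, computing the difference score and testing the pre-post change with a paired-samples t-test, can be sketched as follows. The study used Stata; this Python sketch with synthetic data only illustrates the logic:

```python
import numpy as np
from scipy import stats

# Synthetic matched sum scores at t1 and t2 (0-8 scale, as for the
# communication test); these are NOT the study's raw data.
rng = np.random.default_rng(0)
t1 = rng.integers(0, 9, size=30)
t2 = np.clip(t1 + rng.integers(0, 3, size=30), 0, 8)

# Difference score as defined in the text: t2 minus t1, so a positive
# value indicates a knowledge gain.
diff = t2 - t1

# Paired-samples t-test comparing the two measurement points.
res = stats.ttest_rel(t2, t1)

# Cohen's d for paired samples: mean difference over the standard
# deviation of the differences.
d = diff.mean() / diff.std(ddof=1)
print(round(float(d), 2))
```

The difference score then also serves as the dependent variable in the regression analyses on usage behavior and student ratings.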

6. Results

6.1. Changes in PK Facets (RQ 1)

To examine the changes in the applicable PK facets, mean values and standard deviations were calculated (see Table 2). A significant positive development of the respective target applicable PK facet between t1 and t2 was identified in both groups (H1; group A (M): t(15) = −3.552, p = 0.003, d = 0.764; group B (C): t(42) = −2.675, p = 0.011, d = 0.409), and no significant change was observed in the respective non-targeted comparison facets (group A (M): t(14) = −0.924, p = 0.371, d = 0.253; group B (C): t(43) = −1.544, p = 0.130, d = 0.214).
Regarding the applicable PK facet 'Professional Communication in Classrooms', comparisons of the two groups within the measurement points showed no significant differences (p > 0.05, see Table 2). For the language knowledge scores, a significant difference between group A (M) and group B (C) (p < 0.001) was found at t1 but not at t2 (p = 0.941, see Table 2). This finding should be considered in light of the sample composition: group B (C) consisted of students of German teacher education, for whom prior knowledge in the area of multilingualism could be assumed.
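A between-group comparison of this kind can be sketched as follows, again with synthetic data and illustrative group sizes taken from the matched sample (16 vs. 47); the group means are invented for the example:

```python
import numpy as np
from scipy import stats

# Synthetic sum scores for two independent groups at one measurement
# point (invented means; only the group sizes mirror the study).
rng = np.random.default_rng(1)
group_a = rng.normal(5.0, 1.0, size=16)   # e.g., group A (M), n = 16
group_b = rng.normal(7.5, 1.0, size=47)   # e.g., group B (C), n = 47

# Welch's t-test does not assume equal variances, which is a prudent
# default with unequal group sizes.
res = stats.ttest_ind(group_b, group_a, equal_var=False)
print(res.pvalue < 0.05)  # True
```

With real data, such a significant difference at t1 but not at t2 is the pattern reported above for the language knowledge scores.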

6.2. Usage Behavior and Its Influence on Changes in PK (RQ 2)

During the 4-week intervention, most of the students used the digital tools up to once a week. In terms of the duration of use, the students were distributed fairly evenly between those who used the tools for up to 90 min and those who used them for more than 90 min. All but one of the students used the digital tools alone (see Table 3).
Taking a closer look at the frequency of use and the duration of processing in group B (C) (see Table 4), there was a positive effect of repetition (overall duration: t(41) = −2.042, d = 0.714). For group A (M), we found no variance in the frequency of use, possibly due to the small sample size.
The regression analyses focusing on the influence of usage behavior on the difference score in group B (C) again provided evidence for a positive effect of the frequency of use (see Table 5). Working together with other students also had a positive effect; however, this finding should be interpreted with caution in view of the small number of cases.
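The regression of the difference score on a usage-behavior predictor can be sketched with an ordinary least squares fit. The data, the effect size of frequency, and the variable names are hypothetical; only the group size (n = 47) mirrors group B (C):

```python
import numpy as np

# Synthetic example: does more frequent use of the tool go along with
# a larger pre-post knowledge gain? (Data and effect are invented.)
rng = np.random.default_rng(2)
n = 47                                          # size of group B (C)
freq = rng.integers(1, 5, size=n).astype(float) # hypothetical frequency of use
diff_score = 0.8 * freq + rng.normal(0.0, 1.0, size=n)

# Ordinary least squares fit of the difference score on frequency;
# a positive slope corresponds to the positive effect reported above.
slope, intercept = np.polyfit(freq, diff_score, 1)
print(slope > 0)  # True
```

In the study itself, several usage variables entered the regression jointly, and the reported coefficients are those in Table 5.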

6.3. Usability and Usefulness from the Students’ Perspectives (RQ 3)

Students were also asked for feedback regarding the various facets of the digital tools, using both closed and open-ended questions. Overall, the students indicated medium to high satisfaction and considered the tools helpful for their future professional life and an innovative addition to the existing curriculum (see Table 6). The contents of the tools were considered relevant and were seen as added value in the course of their studies. Working with the tools aroused interest and was associated with positive affect. Furthermore, the goal of fostering self-regulated learning could be achieved through clear objectives and intuitive use. At the same time, even though knowledge was expanded, revisions of the tools may be required to further optimize learning success.
These user ratings were also correlated with the difference score of the assessed targeted applicable PK facets (see Table 7). The subjectively perceived knowledge gain and highly rated learning progress were also reflected in the difference score.
In their open responses, the students both highlighted what was particularly well done and pointed out issues for improvement (see Table 8). Overall, the learning goals were clear, the different types of media were perceived positively, and the addition to the existing curriculum was rated as purposeful. The media packages in particular were considered suitable for introducing new topics, summarizing them, or initiating a reflection process. Criticisms of the tools focused on some aspects of the technical and formal implementation in regular practice. In particular, the students wished for the tools to be embedded in a setting guided by instructors.

6.4. Usability and Usefulness from the Instructors’ Perspective (RQ 4)

In addition, from the instructors' perspective, the tools were perceived as a helpful addition to the existing curriculum, especially for self-regulated learning processes. However, instructors integrating the tools' contents into classroom teaching could further stimulate learning processes. Feedback from the two instructors is given as an example in Table 9.

7. Discussion

7.1. Summary and Interpretation of the Findings

This study aimed to investigate, in a quasi-experimental design, the application of newly developed digital multimedia teaching-and-learning tools as self-learning tools in practical teacher training at university. Two groups of pre-service teachers were observed, each of which worked with one digital tool and participated in a pre- and post-survey. Based on the developments in the applicable PK facets, the assumed effects were found in the intervention groups (RQ 1): both groups showed a significant positive change in the targeted facet, whereas they showed no significant change in the non-targeted facet (H1). Notably, the students did not differ in terms of the test score in 'Professional Communication in Classrooms' (at t1), even when different study phases (bachelor vs. master) were considered. Student teachers thus do not seem to regularly build up applicable PK in this area over the course of their studies. This finding is consistent with the low curricular implementation of tools fostering applicable PK in teacher education to date, and it underlines the relevance of developing corresponding teaching-and-learning tools.
The differences between the two groups with regard to the test score in 'multilingualism' can be attributed to the composition of the subsamples. Compared to the applicable PK facet for professional communication in classrooms, which was only marginally included in the study programs of both groups, the German study program had a large overlap with the content of the language-related digital learning tool. Therefore, the higher applicable PK facet scores in the group of German students were expected. Even within this short period, the students in the intervention group were able to catch up with those in the 'expert' comparison group with respect to test scores. Overall, the effect sizes suggest promising outcomes from including the new digital teaching-and-learning tools as a complementary component of teacher education.
To further understand the developments in the test scores before and after the interventions, we took a closer look at the students' usage of the tools (RQ 2). The tools replaced a course unit of approximately 1.5 h per week over four weeks. The use of the tools in a mostly individual, self-regulated learning setting should be viewed in light of the COVID-19 pandemic and the resulting restrictions. In particular, repeated use resulted in more significant changes in applicable PK, which is in line with previous research (e.g., [92]).
In the ratings (RQ 3), the students reported that the digital tools were useful and valuable additions to their studies. Overall, the tools were perceived as enabling self-regulated learning processes, providing clear learning goals and intuitive usage. Furthermore, the students described a positive effect on their applicable PK. These ratings were in line with the changes in the test scores.
Consistent with the student ratings, the initial feedback on the digital tools from the instructors (RQ 4) suggests that the tools were seen as a meaningful and useful addition to the existing curriculum of higher teacher education. The tools were considered suitable both as self-learning material and as a basis for in-depth group discussions. There was also interest in further tools and in the further development of existing tools for transfer into teaching practice.
In summary, the results of this study, including the change in the targeted applicable PK facets, the (influence of the) usage behavior, and the feedback from students and instructors, suggest that the new digital tools are useful and usable, and that their content focusing on applicable PK is indeed relevant across domains. Furthermore, through the conception of the digital multimedia tools (as described in Section 4), self-learning processes were triggered. The results of this study support the findings on the importance of problem-based tasks for promoting applicable knowledge in teacher education. Further use of the newly developed digital tools, e.g., in blended learning settings, seems promising.

7.2. Limitations and Outlook

The findings of this study should be viewed critically against the background of limiting factors resulting, for example, from the study design. Study participation was voluntary, resulting not only in selection bias but also in panel mortality in the pre-post study. Due to the study's design as a field study, and hence its reliance on existing courses, the resulting sample sizes were small and limited the possibilities for statistical analysis.
The heterogeneous characteristics of the groups examined (i.e., different teaching subjects) as well as of the instructors may have influenced the results. No other universities were included in this first study, which is, however, essential for deriving more generalizable conclusions. The use and evaluation of the tools in other courses and at other universities thus represent the next step to test the replicability of the results presented here. In the context of these limitations, it is particularly remarkable that the digital tools showed an effect on the prospective teachers' applicable PK.
In addition, the usage data (RQ 2) were collected via retrospective self-reports instead of process-generated data (e.g., log-in times, access data, and downloads of individual elements of the tools), which are another important source of information on learning processes. Such data could provide further, more differentiated insights into usage behavior in future studies. In particular, process-generated data allow conclusions to be drawn about difficulties and critical moments in learning processes (e.g., through data on task completion and the problems encountered in the process).
Another limitation lies in the explanation of the change in the applicable PK facets, since potentially relevant influencing factors have not been included in the study thus far. Especially with regard to digital learning tools, digital skills (often referred to as information and communication technology (ICT) literacy) might influence self-regulated use and may be quite heterogeneously developed among students [93,94,95]. There is still a research deficit regarding, e.g., the extent to which support for digital skills is initially required to enable the successful self-regulated use of digital teaching-and-learning tools.
Finally, the students’ feedback (RQ 3) represents subjective evaluations, which can also be distorted (e.g., effects of social desirability). However, the results were in line with the (subjective) ratings by the instructors (RQ 4) on various aspects of the digital tools.

7.3. Implications for Research and Practice

Even if initial indications of the impact of digital learning tools on applicable PK facets were found (RQ 1), more in-depth studies are required. In particular, different modes of use, i.e., different degrees of integration into digital and on-site regular teaching, need to be examined, as does the use of the tools in other domains of teacher education and at other universities. The extent to which more experienced (teaching) students still benefit from the digital teaching-and-learning tools must also be examined for their use in further phases of teacher training, e.g., during supervised teaching, as well as in later professional development (in Germany, teacher training is structured in multiple phases, with a usually five-year course of university study (including practical training units) being followed by one to two years (depending on the federal state) of supervised teaching at schools). In addition, it is also important to examine the effect of the newly developed digital tools on teacher beliefs, which play a central role in transferring knowledge into practice [31].
Based on the insights and feedback from the students and instructors (RQ 3 and RQ 4), the digital learning tools could be further optimized to expand the curriculum in a practice-oriented way. The development of further digital learning tools for (non-)domain-specific teacher PK facets for integration into teacher education seems promising. Accordingly, further digital learning tools have been developed and evaluated in the TWIND project, e.g., in the area of PK (e.g., self-regulated learning, classroom management) and in the area of media-didactic challenges, which are especially important in times of the digitalization of teaching. Teacher education practice ought to be advanced by publishing digital tools on public platforms as open educational resources (OER) to ensure free, immediate, and permanent access to educational resources [96].

Author Contributions

Conceptualization, J.R.-S., O.Z.-T., S.B. and A.M.; Methodology, J.R.-S. and O.Z.-T.; Software, J.R.-S.; Validation, O.Z.-T. and A.M.; Formal analysis, J.R.-S.; Investigation, J.R.-S., O.Z.-T., K.F. and A.M.; Resources, O.Z.-T. and A.M.; Data curation, J.R.-S. and A.M.; Writing—original draft, J.R.-S., K.F. and M.S.; Writing—review & editing, O.Z.-T., S.B. and A.M.; Visualization, J.R.-S. and K.F.; Supervision, O.Z.-T.; Project administration, O.Z.-T. and A.M.; Funding acquisition, O.Z.-T. and A.M. All authors have read and agreed to the published version of the manuscript.


Funding

This study was funded by the German Federal Ministry of Education and Research with the funding number 01JA2042A.

Institutional Review Board Statement

Ethical review and approval were not required for the study of human participants in accordance with the local legislation and institutional requirements.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors without undue reservation.


Acknowledgments

We would like to thank all the students in teacher training who contributed to the development of the digital learning tools and to their evaluation.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.


  1. Minea-Pic, A. Innovating teachers' professional learning through digital technologies. In OECD Education Working Papers; OECD Publishing: Paris, France, 2020; Volume 237. [Google Scholar] [CrossRef]
  2. Mason, L.; Boldrin, A.; Ariasi, N. Searching the Web to learn about a controversial topic: Are students epistemically active? Instr. Sci. 2010, 38, 607–633. [Google Scholar] [CrossRef]
  3. Yu, R.; Gadiraju, U.; Holtz, P.; Rokicki, M.; Kemkes, P.; Dietze, S. Predicting User Knowledge Gain in Informational Search Sessions. In Proceedings of the 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, Ann Arbor, MI, USA, 8–12 July 2018; pp. 75–84. [Google Scholar] [CrossRef]
  4. Laurillard, D. Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies; Routledge: Abingdon, UK, 2002. [Google Scholar] [CrossRef]
  5. Spooner-Lane, R.; Tangen, D.; Mercer, K.L.; Hepple, E.; Carrington, S. Building intercultural competence one “patch” at a time. Educ. Res. Int. 2013, 2013, 394829. [Google Scholar] [CrossRef]
  6. Aljawarneh, S.A. Reviewing and exploring innovative ubiquitous learning tools in higher education. J. Comput. High. Educ. 2020, 32, 57–73. [Google Scholar] [CrossRef]
  7. Biggs, J.; Tang, C. Teaching for Quality Learning at University, 4th ed.; McGraw-Hill: New York, NY, USA, 2011; ISBN 9780335242757. [Google Scholar]
  8. Ribeiro, L.R.C. The Pros and Cons of Problem-Based Learning from the Teacher’s Standpoint. J. Univ. Teach. Learn. Pract. 2011, 8, 34–51. [Google Scholar] [CrossRef]
  9. UNESCO. Education: From Disruption to Recovery. Available online: (accessed on 21 March 2023).
  10. Mulenga, E.M.; Marbán, J.M. Is COVID-19 the Gateway for Digital Learning in Mathematics Education? Contemp. Educ. Technol. 2020, 12, ep269. [Google Scholar] [CrossRef] [PubMed]
  11. Zawacki-Richter, O. The current state and impact of COVID-19 on digital higher education in Germany. Hum. Behav. Emerg. Technol. 2021, 3, 218–226. [Google Scholar] [CrossRef]
  12. Kuzmanovic, M.; Martic, M.; Popovic, M.; Savic, G. A new approach to evaluation of university teaching considering heterogeneity of students’ preferences. Int. J. High. Educ. Res. 2013, 66, 153–171. [Google Scholar] [CrossRef]
  13. Woods, R.; Baker, J.D.; Hopper, D. Hybrid structures: Faculty use and perception of web-based courseware as a supplement to face-to-face instruction. Internet High. Educ. 2004, 7, 281–297. [Google Scholar] [CrossRef]
  14. Bond, M.; Marín, V.I.; Dolch, C.; Bedenlier, S.; Zawacki-Richter, O. Digital transformation in German higher education: Student and teacher perceptions and usage of digital media. Int. J. Educ. Technol. High. Educ. 2018, 15, 1–20. [Google Scholar] [CrossRef]
  15. Breen, P. Developing Educators for the Digital Age: A Framework for Capturing Knowledge in Action; University of Westminster Press: London, UK, 2018. [Google Scholar] [CrossRef]
  16. Fitchett, P.G.; McCarthy, C.J.; Lambert, R.G.; Boyle, L. An examination of US first-year teachers’ risk for occupational stress: Associations with professional preparation and occupational health. Teach. Teach. 2018, 24, 99–118. [Google Scholar] [CrossRef]
  17. Keller-Schneider, M.; Zhong, H.F.; Yeung, A.S. Competence and challenges in professional development: Teacher perceptions at different stages of career. J. Educ. Teach. 2020, 46, 36–54. [Google Scholar] [CrossRef]
  18. Cooper, J.M. Classroom Teaching Skills; Wadsworth Cengage Learning: Belmont, CA, USA, 2010; ISBN-13: 978-1-133-60276-7. [Google Scholar]
  19. Kersting, N.B.; Sutton, T.; Kalinec-Craig, C.; Stoehr, K.J.; Heshmati, S.; Lozano, G.; Stigler, J.W. Further exploration of the classroom video analysis (CVA) instrument as a measure of usable knowledge for teaching mathematics: Taking a knowledge system perspective. ZDM 2016, 48, 97–109. [Google Scholar] [CrossRef]
  20. Oser, F.; Salzmann, P.; Heinzer, S. Measuring the competence-quality of vocational teachers: An advocatory approach. Empir. Res. Vocat. Educ. Train. 2009, 1, 65–83. [Google Scholar] [CrossRef]
  21. DeLoach, S.B.; Perry-Sizemore, E.; Borg, M.O. Creating quality undergraduate research programs in economics: How, when, where (and why). Am. Econ. 2012, 57, 96–110. [Google Scholar] [CrossRef]
  22. Branch, J.W.; Burgos, D.; Serna, M.D.A.; Ortega, G.P. Digital Transformation in Higher Education Institutions: Between Myth and Reality. In Radical Solutions and eLearning. Lecture Notes in Educational Technology; Burgos, D., Ed.; Springer: Berlin/Heidelberg, Germany, 2020; pp. 41–50. [Google Scholar] [CrossRef]
  23. Ugur, N.G. Digitalization in higher education: A qualitative approach. Int. J. Technol. Educ. Sci. 2020, 4, 18–25. [Google Scholar] [CrossRef]
  24. Falloon, G. From digital literacy to digital competence: The teacher digital competency (TDC) framework. Educ. Tech. Res. Dev. 2020, 68, 2449–2472. [Google Scholar] [CrossRef]
  25. van Lier, L. Action-based Teaching, Autonomy and Identity. Innov. Lang. Learn. Teach. 2007, 1, 46–65. [Google Scholar] [CrossRef]
  26. Piccardo, E.; North, B. The Action-Oriented Approach: A Dynamic Vision of Language Education; Multilingual Matters: Bristol, UK, 2019. [Google Scholar] [CrossRef]
  27. Weinert, F.E. Concept of competence: A conceptual clarification. In Defining and Selecting Key Competencies; Rychen, D.S., Salganik, L.H., Eds.; Hogrefe & Huber: Bern, Switzerland, 2001; pp. 45–65. [Google Scholar]
  28. Hakim, A. Contribution of Competence Teacher (Pedagogical, Personality, Professional Competence and Social) On the Performance of Learning. Int. J. Eng. Sci. 2015, 4, 1–12. [Google Scholar] [CrossRef]
  29. König, J. Teachers’ Pedagogical Beliefs. Definition and Operationalisation, Connections to Knowledge and Performance, Development and Change; Waxmann: Münster, Deutschland, 2012. [Google Scholar] [CrossRef]
  30. Blömeke, S.; Gustafsson, J.-E.; Shavelson, R.J. Beyond dichotomies: Competence viewed as a continuum. Z. Für Psychol. 2015, 223, 3–13. [Google Scholar] [CrossRef]
  31. Gregoire, M. Is it a challenge or a threat? A dual-process model of teachers’ cognition and appraisal processes during conceptual change. Educ. Psychol. Rev. 2003, 15, 147–179. [Google Scholar] [CrossRef]
  32. Shulman, L.S. Those who understand: Knowledge growth in teaching. Educ. Res. 1986, 15, 4–14. [Google Scholar] [CrossRef]
  33. Koehler, M.J.; Mishra, P. What is technological pedagogical content knowledge? Contemp. Issues Technol. Teach. Educ. 2009, 9, 60–70. [Google Scholar] [CrossRef]
  34. Chai, C.; Koh, J.; Tsai, C.-C. A Review of Technological Pedagogical Content Knowledge. J. Educ. Technol. Soc. 2013, 16, 31–51. [Google Scholar]
  35. Voss, T.; Kunter, M.; Baumert, J. Assessing Teacher Candidates’ General Pedagogical/Psychological Knowledge: Test Construction and Validation. J. Educ. Psychol. 2011, 103, 952–969. [Google Scholar] [CrossRef]
  36. Heinze, A.; Dreher, A.; Lindmeier, A.; Niemand, C. Akademisches versus schulbezogenes Fachwissen—Ein differenzierteres Modell des fachspezifischen Professionswissens von angehenden Mathematiklehrkräften der Sekundarstufe [Academic versus school-based professional knowledge—A more sophisticated model of the subject-specific professional knowledge of prospective secondary mathematics teachers]. Z. Für Erzieh. 2016, 19, 329–349. [Google Scholar]
  37. Applegate, J.L. Adaptive communication in educational contexts: A study of teachers’ communicative strategies. Commun. Educ. 2009, 29, 158–170. [Google Scholar] [CrossRef]
  38. De Corte, E. Constructive, Self-Regulated, Situated, and Collaborative Learning: An Approach for the Acquisition of Adaptive Competence. J. Educ. 2020, 192, 33–47. [Google Scholar] [CrossRef]
  39. Morris-Rothschild, B.K.; Brassard, M.K. Teachers’ conflict management styles: The role of attachment styles and classroom management efficacy. J. Sch. Psychol. 2006, 44, 105–121. [Google Scholar] [CrossRef]
  40. Brühwiler, C.; Vogt, F. Adaptive teaching competency. Effects on quality of instruction and learning outcomes. J. Educ. Res. 2020, 1, 119–142. [Google Scholar] [CrossRef]
  41. Vogt, F.; Rogalla, M. Developing adaptive teaching competency through coaching. Teach. Teach. Educ. 2009, 25, 1051–1060. [Google Scholar] [CrossRef]
  42. Bell, S.R. Power, territory, and interstate conflict. Confl. Manag. Peace Sci. 2017, 34, 160–175. [Google Scholar] [CrossRef]
  43. Morine-Dershimer, G.; Kent, T. The complex nature and sources of teachers’ pedagogical knowledge. In Examining Pedagogical Content Knowledge; Gess-Newsome, J., Lederman, N.G., Eds.; Springer: Berlin/Heidelberg, Germany, 1999; pp. 21–50. [Google Scholar] [CrossRef]
  44. Schipper, T.; Goei, S.L.; de Vries, S.; van Veen, K. Professional growth in adaptive teaching competence as a result of Lesson Study. Teach. Teach. Educ. 2017, 68, 289–303. [Google Scholar] [CrossRef]
  45. Huang, K.; Lubin, I.A.; Ge, X. Situated learning in an educational technology course for pre-service teachers. Teach. Teach. Educ. 2011, 27, 1200–1212. [Google Scholar] [CrossRef]
  46. Renkl, A. Toward an instructionally oriented theory of example-based learning. Cogn. Sci. 2014, 38, 1–37. [Google Scholar] [CrossRef] [PubMed]
  47. Catalano, A. The effect of a situated learning environment in a distance education information literacy course. J. Acad. Librariansh. 2015, 41, 653–659. [Google Scholar] [CrossRef]
  48. Nuthall, G. Relating classroom teaching to student learning: A critical analysis of why research has failed to bridge the theory-practice gap. Harv. Educ. Rev. 2004, 74, 273–306. [Google Scholar] [CrossRef]
  49. Allen, J.M. Valuing practice over theory: How beginning teachers re-orient their practice in the transition from the university to the workplace. Teach. Teach. Educ. 2009, 25, 647–654. [Google Scholar] [CrossRef]
  50. Korthagen, F.A.; Loughran, J.J.; Russell, T. Developing fundamental principles for teacher education programs and practices. Teach. Teach. Educ. 2006, 22, 1020–1041. [Google Scholar] [CrossRef]
  51. Dicke, T.; Elling, J.; Schmeck, A.; Leutner, D. Reducing reality shock: The effects of classroom management skills training on beginning teachers. Teach. Teach. Educ. 2015, 48, 1–12. [Google Scholar] [CrossRef]
  52. Walstad, W.B.; Salemi, M.K. Results from a Faculty Development Program in Teaching Economics. J. Econ. Educ. 2011, 42, 283–293. [Google Scholar] [CrossRef]
  53. Codreanu, E.; Sommerhoff, D.; Huber, S.; Ufer, S.; Seidel, T. Between authenticity and cognitive demand: Finding a balance in designing a video-based simulation in the context of mathematics teacher education. Teach. Teach. Educ. 2020, 95, 1–12. [Google Scholar] [CrossRef]
  54. Schaeper, H. Development of competencies and teaching–learning arrangements in higher education: Findings from Germany. Stud. High. Educ. 2009, 34, 677–697. [Google Scholar] [CrossRef]
  55. Cullen, R.; Harris, M.; Hill, R.R. The Learner-Centered Curriculum: Design and Implementation; John Wiley & Sons: Hoboken, NJ, USA, 2012; ISBN 978-1-118-17102-8. [Google Scholar]
  56. Loyens, S.M.; Magda, J.; Rikers, R.M. Self-directed learning in problem-based learning and its relationships with self-regulated learning. Educ. Psychol. Rev. 2008, 20, 411–427. [Google Scholar] [CrossRef]
  57. Loeng, S. Self-directed learning: A core concept in adult education. Educ. Res. Int. 2020, 2020, 3816132. [Google Scholar] [CrossRef]
  58. Pourshafie, T.; Murray-Harvey, R. Facilitating problem-based learning in teacher education: Getting the challenge right. J. Educ. Teach. 2013, 39, 169–180. [Google Scholar] [CrossRef]
  59. Merrill, M.D. First principles of instruction. Educ. Technol. Res. Dev. 2002, 50, 43–49. [Google Scholar] [CrossRef]
  60. Seidel, T.; Stürmer, K. Modeling and Measuring the Structure of Professional Vision in Pre-Service Teachers. Am. Educ. Res. J. 2014, 51, 739–771. [Google Scholar] [CrossRef]
  61. van Es, E.A.; Sherin, M.G. Learning to Notice: Scaffolding New Teachers’ Interpretations of Classroom Interactions. J. Technol. Teach. Educ. 2002, 10, 571–596. [Google Scholar]
  62. Ataizi, M. Situated Cognition. In Encyclopedia of the Sciences of Learning; Seel, N.M., Ed.; Springer Science & Business Media: Boston, MA, USA, 2012; pp. 3082–3084. ISBN 978-1-4419-1428-6. [Google Scholar]
  63. Hirsch, S.E.; Kennedy, M.J.; Haines, S.J.; Thomas, C.N.; Alves, K.D. Improving preservice teachers’ knowledge and application of functional behavioral assessments using multimedia. Behav. Disord. 2015, 41, 38–50. [Google Scholar] [CrossRef]
  64. Kuhn, C.; Alonzo, A.C.; Zlatkin-Troitschanskaia, O. Evaluating the pedagogical content knowledge of pre- and in-service teachers of business and economics to ensure quality of classroom practice in vocational education and training. Empir. Res. Vocat. Educ. Train. 2016, 8, 1–18. [Google Scholar] [CrossRef]
  65. Alonzo, A.C.; Kobarg, M.; Seidel, T. Pedagogical content knowledge as reflected in teacher-student interactions: Analysis of two video cases. J. Res. Sci. Teach. 2012, 49, 1211–1239. [Google Scholar] [CrossRef]
  66. Pal, D.; Patra, S. University Students’ Perception of Video-Based Learning in Times of COVID-19: A TAM/TTF Perspective. Int. J. Hum. Comput. Interact. 2021, 37, 903–921. [Google Scholar] [CrossRef]
  67. Chen, G.; Chan, C.K.; Chan, K.K.; Clarke, S.N.; Resnick, L.B. Efficacy of video-based teacher professional development for increasing classroom discourse and student learning. J. Learn. Sci. 2020, 29, 642–680. [Google Scholar] [CrossRef]
  68. Gaudin, C.; Chalies, S. Video viewing in teacher education and professional development: A literature review. Educ. Res. Rev. 2015, 16, 41–76. [Google Scholar] [CrossRef]
  69. Almara’beh, H.; Amer, E.F.; Sulieman, A. The effectiveness of multimedia learning tools in education. Int. J. 2015, 5, 761–764. [Google Scholar]
  70. Waddill, D. Action E-Learning: An Exploratory Case Study of Action Learning Applied Online. Hum. Resour. Dev. Int. 2006, 9, 157–171. [Google Scholar] [CrossRef]
  71. Brückner, S.; Saas, H.; Reichert-Schlax, J.; Zlatkin-Troitschanskaia, O.; Kuhn, C. Digitale Medienpakete zur Förderung handlungsnaher Unterrichtskompetenzen [Digital media packages for the promotion of action-oriented teaching skills]. Bwp@ Berufs- und Wirtschaftspädagogik–Online 2021, 40, 1–25. [Google Scholar]
  72. Dadvand, B.; Behzadpoor, F. Pedagogical knowledge in English language teaching: A lifelong-learning, complex-system perspective. Lond. Rev. Educ. 2020, 18, 107–126. [Google Scholar] [CrossRef]
  73. Brown, D.F. The Significance of Congruent Communication in Effective Classroom Management. Clear. House A. J. Educ. Strateg. Issues Ideas 2005, 79, 12–15. [Google Scholar] [CrossRef]
  74. Pecore, J.L. Beyond beliefs: Teachers adapting problem-based learning to preexisting systems of practice. Interdiscip. J. Probl. Based Learn. 2013, 7, 7–33. [Google Scholar] [CrossRef]
  75. Richards, K. ‘Being the teacher’: Identity and classroom conversation. Appl. Linguist. 2006, 27, 51–77. [Google Scholar] [CrossRef]
  76. Zimmerman, D.H. Discoursal identities and social identities. In Identities in Talk; Antaki, C., Widdicombe, S., Eds.; Sage: Thousand Oaks, CA, USA, 1998; pp. 87–106. [Google Scholar] [CrossRef]
  77. Lindner, K.-T.; Nusser, L.; Gehrer, K.; Schwab, S. Differentiation and Grouping Practices as a Response to Heterogeneity–Teachers’ Implementation of Inclusive Teaching Approaches in Regular, Inclusive and Special Classrooms. Front. Psychol. 2021, 12, 1–16. [Google Scholar] [CrossRef]
  78. Gibbons, P. Scaffolding Language, Scaffolding Learning: Teaching Second Language Learners in the Mainstream Classroom; Heinemann: Portsmouth, NH, USA, 2002. [Google Scholar]
  79. Hammond, J.; Gibbons, P. What is scaffolding? Teach. Voices 2005, 8, 8–16. [Google Scholar]
  80. Müller, A. Profession und Sprache: Die Sicht der (Zweit-)Spracherwerbsforschung [Profession and language: The view of (second) language acquisition research]. In Kindheit und Profession—Konturen und Befunde eines Forschungsfeldes [Childhood and Profession—Contours and Findings of a Field of Research]; Betz, T., Cloos, P., Eds.; Beltz Juventa: Weinheim, Germany, 2014; pp. 66–83. [Google Scholar]
  81. Hopp, H.; Thoma, D.; Tracy, R. Sprachförderkompetenz pädagogischer Fachkräfte [Language promotion competence of pedagogical staff]. Z. Für Erzieh. 2010, 13, 609–629. [Google Scholar] [CrossRef]
  82. Eberhardt, A.; Witte, A. Professional Development of Foreign Language Teachers: The Example of the COMENIUS Project Schule im Wandel (School Undergoing Change). Teanga 2016, 24, 34–43. [Google Scholar] [CrossRef]
  83. Okoli, A.C. Relating Communication Competence to Teaching Effectiveness: Implication for Teacher Education. J. Educ. Pract. 2017, 8, 150–154. [Google Scholar]
  84. Kuh, G.D.; Kinzie, J.; Buckley, J.A.; Bridges, B.K.; Hayek, J.C. What Matters to Student Success: A Review of the Literature; National Postsecondary Education Cooperative: Washington, DC, USA, 2006. [Google Scholar]
  85. Gardner, M.M.; Hickmott, J.; Ludvik, M.J.B. Demonstrating Student Success: A Practical Guide to Outcomes-Based Assessment of Learning and Development in Student Affairs; Stylus Publishing, LLC: Sterling, VA, USA, 2012. [Google Scholar]
  86. Aristovnik, A.; Keržič, D.; Tomaževič, N.; Umek, L. Demographic determinants of usefulness of e-learning tools among students of public administration. Interact. Technol. Smart Educ. 2016, 13, 289–304. [Google Scholar] [CrossRef]
  87. Turner, W.D.; Solis, O.J.; Kincade, D.H. Differentiating instruction for large classes in higher education. Int. J. Teach. Learn. High. Educ. 2017, 29, 490–500. [Google Scholar]
  88. Leiner, D.J. SoSci Survey, Version 3.1.06. Computer Software. 2019. Available online: (accessed on 5 May 2023).
  89. Hu, L.T.; Bentler, P.M. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct. Equ. Model. 1999, 6, 1–55. [Google Scholar] [CrossRef]
  90. Marsh, H.W.; Hau, K.T.; Wen, Z. In search of golden rules: Comment on hypothesis-testing approaches to setting cutoff values for fit indexes and dangers in overgeneralizing Hu and Bentler’s (1999) findings. Struct. Equ. Model. 2004, 11, 320–341. [Google Scholar] [CrossRef]
  91. Stata Corp. Stata Statistical Software: Release 16; StataCorp LLC: College Station, TX, USA, 2019. [Google Scholar]
  92. Ledermüller, K.; Fallmann, I. Predicting learning success in online learning environments: Self-regulated learning, prior knowledge and repetition. Z. Für Hochsch. 2017, 12, 79–99. [Google Scholar] [CrossRef]
  93. Ezziane, Z. Information Technology Literacy: Implications on Learning and Teaching. Educ. Technol. Soc. 2007, 10, 175–191. [Google Scholar] [CrossRef]
  94. Educational Testing Services [ETS]. Digital Transformation. A Framework for ICT Literacy. Available online: (accessed on 21 March 2023).
  95. Anthonysamy, L.; Choo, A. Investigating Self-Regulated Learning Strategies for Digital Learning Relevancy. Malays. J. Learn. Instr. 2021, 18, 29–64. [Google Scholar] [CrossRef]
  96. Jhangiani, R.S. The Philosophy and Practices That Are Revolutionizing Education and Science; Ubiquity Press: London, UK, 2017. [Google Scholar] [CrossRef]
Figure 1. Expanded model of teachers’ actionable instructional competencies (adapted from [36]) augmented by PK.
Figure 2. Structure of the learning tool ‘Professional communication in classrooms’.
Figure 3. Structure of the learning tool ‘Multilingualism in classrooms’.
Figure 4. Study design. Notes: Group A (M) = group with intervention regarding multilingualism. Group B (C) = group with intervention regarding professional communication. PK = pedagogical knowledge. LMS = learning management system.
Figure 5. Example test item. Note: For this paper, the example item was translated from German to English.
Table 1. Descriptive statistics: complete sample and divided into subgroups.

| Variable | Complete Sample (n = 63) | Group A (M) (na = 16) | Group B (C) (nb = 47) | Differences between the Groups, p * |
|---|---|---|---|---|
| Gender, male, n/N (%) | 11/63 (17.46%) | 7/16 (43.75%) | 4/47 (8.51%) | 0.001 |
| Age, M ± SD | 23.98 (±4.039) | 25.00 (±3.033) | 23.63 (±4.307) | 0.246 |
| UEQ grade, M ± SD | 2.36 (±0.624) | 2.18 (±0.529) | 2.43 (±0.647) | 0.179 |
| Study phase, master’s program, n/N (%) | 32/63 (50.79%) | 16/16 (100%) | 16/47 (34.04%) | <0.001 |

Note: UEQ = university entrance qualification. * Depending on the scale level, a t-test or a χ²-test was used for significance testing. Bold and italicized significance levels indicate significance at the 5% level.
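The group comparisons in Table 1 can be reproduced from the summary statistics alone. The following sketch (standard library only) recomputes the χ²-test for the gender distribution and the Welch t-statistic for age; the exact test variants the authors used are not reported, so the values are approximations rather than a replication.

```python
import math

# Gender distribution (Table 1): male/female counts per group.
observed = [[7, 9],    # Group A (M): 7 male, 9 female
            [4, 43]]   # Group B (C): 4 male, 43 female

row = [sum(r) for r in observed]
col = [sum(c) for c in zip(*observed)]
n = sum(row)

# Pearson chi-square statistic over the 2x2 table.
chi2 = sum((observed[i][j] - row[i] * col[j] / n) ** 2 / (row[i] * col[j] / n)
           for i in range(2) for j in range(2))

# For df = 1, the chi-square p-value reduces to erfc(sqrt(chi2 / 2)).
p_gender = math.erfc(math.sqrt(chi2 / 2))
print(round(chi2, 2), round(p_gender, 3))  # ≈ 10.29, 0.001 (matches Table 1)

# Welch t-statistic for age (unequal variances), Group A vs. Group B.
m1, s1, n1 = 25.00, 3.033, 16
m2, s2, n2 = 23.63, 4.307, 47
t = (m1 - m2) / math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)
print(round(t, 2))  # ≈ 1.39, consistent with the non-significant p = 0.246
```

The gender imbalance is significant at the 5% level, matching the reported p = 0.001, while the age difference is not.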
Table 2. Descriptive statistics regarding scores at t1 and t2 (matched data).

| Content | Measurement Point | Group A (M) | Group B (C) | Differences between Groups, p |
|---|---|---|---|---|
| Score PK on conversation, M ± SD | t1 | 3.67 (±1.291) | 3.05 (±1.290) | 0.191 |
| | t2 | 4.00 (±1.464) | 3.65 (±1.325) | 0.396 |
| Differences between time points, p | | 0.371 | 0.011 | |
| Score PK on multilingualism, M ± SD | t1 | 6.38 (±2.705) | 9.20 (±1.651) | <0.001 |
| | t2 | 9.19 (±2.007) | 9.59 (±1.317) | 0.369 |
| Differences between time points, p | | 0.002 | 0.130 | |

Note: Bold and italicized significance levels indicate significance at the 5% level.
Table 3. Descriptive statistics on usage behavior.

| Variable | Answer Options | Group A (M): n/N (%) | Group A (M): Difference Score, M ± SD | Group B (C): n/N (%) | Group B (C): Difference Score, M ± SD |
|---|---|---|---|---|---|
| Frequency | At least several times a week | 0/16 (0.00%) | – | 12/47 (25.53%) | 1.36 ± 1.433 |
| | Up to once a week | 16/16 (100.00%) | 2.81 ± 3.167 | 35/47 (74.47%) | 0.34 ± 1.428 |
| Duration | <90 min | 8/16 (50.00%) | 1.13 ± 2.748 | 19/47 (40.43%) | 0.61 ± 1.650 |
| | >90–120 min | 8/16 (50.00%) | 4.50 ± 2.726 | 28/47 (59.57%) | 0.60 ± 1.384 |
| Learning form | Alone | 16/16 (100.00%) | 2.81 ± 3.167 | 45/47 (97.83%) | 0.55 ± 1.452 |
| | With others | 0/16 (0%) | – | 1/47 (2.17%) | 3.00 ± 0.000 |
Table 4. Results from t-tests on usage behavior with focus on frequency and duration in group B (C).

| Variable | Groups | M ± SD | Differences between Groups, p |
|---|---|---|---|
| Comparison of duration groups | | | |
| Frequency: overall | <90 min | 0.61 (±1.649) | 0.981 |
| | >90 min | 0.60 (±1.384) | |
| Frequency: at least several times a week | <90 min | 2.67 (±1.528) | 0.059 |
| | >90 min | 0.88 (±1.126) | |
| Frequency: up to once a week | <90 min | 0.20 (±1.373) | 0.601 |
| | >90 min | 0.47 (±1.505) | |
| Comparison of frequency groups | | | |
| Duration: overall | At least several times a week | 1.36 (±1.433) | 0.048 |
| | Up to once a week | 0.34 (±1.428) | |
| Duration: <90 min | At least several times a week | 2.67 (±1.528) | 0.013 |
| | Up to once a week | 0.20 (±1.373) | |
| Duration: >90 min | At least several times a week | 0.88 (±1.126) | 0.507 |
| | Up to once a week | 0.47 (±1.505) | |

Note: Bold and italicized significance levels indicate significance at the 5% level.
Table 5. Results from regression analyses regarding usage behavior on difference score in group B (C).

F(3, 39) = 2.88, p = 0.048, R² = 0.181

| Usage Behavior | ß | B | SE | p |
|---|---|---|---|---|
| Frequency of use, at least several times a week | 0.349 | 1.170 | 0.497 | 0.024 |
| Duration, >90 min | 0.103 | −0.306 | 0.442 | 0.492 |
| Social form, with others | 0.297 | 2.890 | 1.430 | 0.050 * |
| Constant | | 1.586 | 0.529 | 0.005 |

Note: * not significant at a 5% α-level. Bold and italicized significance levels indicate significance at the 5% level.
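The standardized coefficients (ß) in Table 5 relate to the unstandardized ones (B) via ß = B · SD(x)/SD(y). As a rough plausibility check (standard library only), the SD of the frequency dummy can be derived from the 12/47 split in Table 3, and the SD of the difference score pooled from the two frequency subgroups there; the regression sample (n = 43, implied by F(3, 39)) differs slightly from Table 3’s n = 47, so only approximate agreement can be expected.

```python
import math

# Frequency dummy (Table 3, Group B): 12 of 47 used the tool at least several times a week.
n, k = 47, 12
p = k / n
sd_x = math.sqrt(p * (1 - p) * n / (n - 1))  # sample SD of a 0/1 dummy

# Pooled SD of the difference score from the two frequency subgroups (Table 3).
groups = [(12, 1.36, 1.433), (35, 0.34, 1.428)]  # (n_i, mean_i, sd_i)
grand_mean = sum(ni * mi for ni, mi, _ in groups) / n
ss = sum((ni - 1) * si ** 2 + ni * (mi - grand_mean) ** 2 for ni, mi, si in groups)
sd_y = math.sqrt(ss / (n - 1))

# Implied standardized coefficient for B = 1.170 (Table 5).
beta_hat = 1.170 * sd_x / sd_y
print(round(beta_hat, 2))  # ≈ 0.35, close to the reported ß = 0.349
```

The reported and implied values agree to within rounding, which supports the internal consistency of Tables 3 and 5.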
Table 6. Descriptives on students’ evaluations.

| Variable | Item | Answer Options | Group A (M), M ± SD | Group B (C), M ± SD |
|---|---|---|---|---|
| Overall evaluation of the tool | | | | |
| Overall satisfaction | Overall satisfaction with tool. | 1 (fully satisfied) to 5 (not satisfied/deficient) | 2.00 ± 0.756 | 2.48 ± 1.151 |
| Usefulness | I found the tool useful for my (future) work as a teacher. | 1 (not at all) to 4 (full agreement) | 3.07 ± 0.704 | 3.23 ± 0.859 |
| Innovative addition | The tool is an innovative addition to the curriculum. | 1 (not at all) to 4 (full agreement) | 3.20 ± 0.775 | 2.95 ± 0.914 |
| Relevance of content | The contents are highly relevant for me. | 1 (not at all) to 4 (full agreement) | 2.87 ± 0.640 | 3.18 ± 0.691 |
| Recommendation | Would you recommend the tool to other students? | 1 (not at all) to 4 (full agreement) | 3.13 ± 0.640 | 3.07 ± 0.950 |
| Added value | Overall, the tool was very well done and represents an added value in my studies. | 1 (not at all) to 4 (full agreement) | 3.13 ± 0.516 | 3.20 ± 0.851 |
| Motivational facets | | | | |
| Arousing interest | The tool aroused my interest and attention. | 1 (not at all) to 4 (full agreement) | 2.93 ± 0.704 | 3.02 ± 0.876 |
| Enjoyable | I enjoyed working with the tool. | 1 (not at all) to 4 (full agreement) | 3.13 ± 0.834 | 2.73 ± 0.872 |
| Working with the tool | | | | |
| Self-regulated learning | The tool enables self-regulated learning. | 1 (not at all) to 4 (full agreement) | 3.73 ± 0.458 | 3.59 ± 0.542 |
| Clarity of goals | The goals of the tool became clear to me. | 1 (not at all) to 4 (full agreement) | 3.33 ± 0.724 | 3.36 ± 0.718 |
| Intuitive usage | The tool enabled simple intuitive use. | 1 (not at all) to 4 (full agreement) | 3.33 ± 0.724 | 3.39 ± 0.655 |
| Impact of the tool | | | | |
| Concrete actions learned | I have learned not only facts but also concrete actions. | 1 (not at all) to 4 (full agreement) | 2.53 ± 0.640 | 3.09 ± 0.802 |
| Expanded knowledge | My professional knowledge was expanded. | 1 (not at all) to 4 (full agreement) | 3.20 ± 0.561 | 3.32 ± 0.829 |
| Successful performance | I consider my performance successful. | 1 (not at all) to 4 (full agreement) | 2.80 ± 0.561 | 2.89 ± 0.538 |
| Learning progress | How would you rate your learning progress with the tool? | 1 (very high) to 5 (very low) | 2.67 ± 0.617 | 2.30 ± 0.765 |

Note: Green color indicates tendency to positive evaluation (scale split). More saturated color indicates more positive evaluations.
Table 7. Results from regression analyses regarding subjective evaluations on difference score in group B (C).

F(15, 27) = 2.08, p = 0.047, R² = 0.536

| Subjective Evaluations | ß | B | SE | p |
|---|---|---|---|---|
| Overall satisfaction *, not satisfied/deficient | −0.445 | −0.579 | 0.320 | 0.081 |
| Usefulness, fully agree | −0.505 | −0.870 | 0.644 | 0.188 |
| Innovative addition, fully agree | −0.048 | −0.078 | 0.503 | 0.878 |
| Relevance of content, fully agree | −0.041 | −0.086 | 0.376 | 0.820 |
| Recommendation, fully agree | −1.134 | −1.770 | 0.513 | 0.002 |
| Added value, fully agree | −0.380 | −0.655 | 0.628 | 0.306 |
| Arousing interest, fully agree | 0.057 | 0.097 | 0.435 | 0.826 |
| Enjoyable, fully agree | 0.270 | 0.454 | 0.418 | 0.287 |
| Self-regulated learning, fully agree | −0.252 | −0.685 | 0.450 | 0.181 |
| Clarity of goals, fully agree | 0.247 | 0.508 | 0.457 | 0.275 |
| Intuitive usage, fully agree | 0.214 | 0.483 | 0.575 | 0.408 |
| Concrete actions learned, fully agree | −0.917 | −1.703 | 0.458 | 0.001 |
| Expanded knowledge, fully agree | 0.982 | 1.750 | 0.601 | 0.007 |
| Successful performance, fully agree | 0.047 | 0.136 | 0.463 | 0.771 |
| Learning progress *, very low | −0.783 | −1.503 | 0.550 | 0.011 |
| Constant | | 12.916 | 3.964 | 0.003 |

Note: * Lower numbers indicate higher agreement. Green color indicates positive correlations, yellow indicates negative ones. Bold and italicized significance levels indicate significance at the 5% level.
Table 8. Excerpts of feedback from students.

This was particularly well done:
- Preparation and design of the materials. (an19daj)
- Prejudices that I knew from myself have been reflected. (el15chf)
- Diverse media. (li24dat)
- Podcasts. (ud31ppn)
- Explanation videos. (ab03dka)
- The various tasks on the topics. (ar02ey1)
- Summary of the main points. (ar05ldv)
- The videos did a good job of introducing the topics. (ar10dkf)
- Clarity of the content. (ar27ttl)
- It was a good addition and appropriate for the course. (at02cht)
- Contents are very useful for the teaching profession. (at02cht)
- Contents were not known to me before. (at02cht)
- Give specific goals for action, such as the 10 most important aspects of conducting a conversation. (at02cht)
- The tasks were very varied. (er24ese)
- Structure and elaboration were clear. (ir20esl)
- Practical examples. (we25tet)
- Such media packages should be integrated more often in other educational study programs. (we25tet)

This should be revised/Potential for improvement:
- Such a media package is difficult to implement as a pure self-study program. Also, the situations are partly not clear, as they can only be considered in isolation. (ax20dkj)
- Self-regulated control of the free text tasks with sample solution in the form of the test was tedious and frustrating. (te24iel)
- Because it still needs to be improved [audio tracks]. (ab03dka)
- The audio in the videos partly overlaps. (on10tbl)
Table 9. Excerpts of feedback from instructors.

| Instructor No. | Positive Feedback | Critical Feedback |
|---|---|---|
| 1 | [The digital media tool] was very well prepared and informative; the benefit was evident. Especially the explanatory videos [...] were well received for self-education. The examples in the videos were also very appropriate. | The reflection prompt at the end was rated by students as problematic for self-regulated learning; many [students] wished for discussions in the group and a more detailed application of the contents. |
| 2 | The participation of the students in the processing of the digital media tools can be assessed as quite positive. The tool is overall helpful. [The digital media tools are] a very nice opportunity for interested and committed students to further engage with the contents as a self-learning offer. Videos [...] also address interesting interactions in the classroom. | The tools could be extended, or complementary tools could be developed on other relevant aspects. |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Reichert-Schlax, J.; Zlatkin-Troitschanskaia, O.; Frank, K.; Brückner, S.; Schneider, M.; Müller, A. Development and Evaluation of Digital Learning Tools Promoting Applicable Knowledge in Economics and German Teacher Education. Educ. Sci. 2023, 13, 481.

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
