Article

The Framework DiKoLAN (Digital Competencies for Teaching in Science Education) as Basis for the Self-Assessment Tool DiKoLAN-Grid

1 Biology Education, University of Salzburg, 5020 Salzburg, Austria
2 Didactics of Biology, University of Kassel, 34132 Kassel, Germany
3 Heidelberg School of Education, Universität Heidelberg, 69115 Heidelberg, Germany
4 Institute for Biology, Biology Education, Leipzig University, 04103 Leipzig, Germany
5 Department of Physics, Technical University of Darmstadt, 64289 Darmstadt, Germany
6 Chair of Science Education, University of Konstanz, 78464 Konstanz, Germany
7 University of Education Thurgau, 8280 Kreuzlingen, Switzerland
8 Chair of Physics Education, LMU München, 80333 Munich, Germany
9 Digital Education Research Group, University of Cologne, 50923 Cologne, Germany
10 Institute of Education, Leibniz University Hannover, 30159 Hannover, Germany
11 Biology Education Research Group, University of Kaiserslautern, 67663 Kaiserslautern, Germany
* Author to whom correspondence should be addressed.
Educ. Sci. 2021, 11(12), 775; https://doi.org/10.3390/educsci11120775
Submission received: 20 October 2021 / Revised: 16 November 2021 / Accepted: 26 November 2021 / Published: 30 November 2021

Abstract:
For the planning and implementation of lessons with digital technologies, subject-specific technology-related professional competence of teachers is of central importance. However, the competency frameworks developed so far remain at a general level and do not explicitly address subject-specific issues. Furthermore, digital competencies are predominantly measured with subject-unspecific self-assessment instruments, as subject-specific operationalizations for this area are not yet available in a differentiated form. In this article, the framework for Digital Competencies for Teaching in Science Education (DiKoLAN), a subject-specific framework for pre-service science teachers, is introduced, and first results of a self-assessment tool based on the framework are described. DiKoLAN defines competency areas highly specific to science, as well as more general competency areas that include aspects common to all subjects. Each competency area is described by competency expectations, which, in turn, are structured with reference to the four technology-related dimensions of the TPACK framework (Technological Pedagogical Content Knowledge) and three levels of performance (Name, Describe, Use/Apply). Derived from DiKoLAN, a corresponding self-assessment instrument (DiKoLAN-Grid) was developed and empirically tested with biology student teachers for the two competency areas Presentation (n = 118) and Information Search and Evaluation (n = 90). By means of path models, tendencies regarding structural correlations of the four components Special Tools (TK), Content-specific Context (TCK), Methods and Digitality (TPK), and Teaching (TPACK) are presented and discussed for both competency areas, also in comparison with previously conducted, subject-unspecific surveys.

1. Introduction

The steadily increasing use of digital technologies for teaching and learning in class is accompanied by media-specific processes of change in schools and teacher training, which include, in particular, the structuring and operationalization of media-specific competence requirements. In order to use the potential of digital technologies in teaching-learning processes and to be able to plan and implement high-quality teaching using digital technologies, teachers need specific technology-related competencies. Associated demands on training structures and content were already being addressed in the 21st century skills movements launched more than a decade ago. The 21st century skills are partly pupil-focused and only partially address skills for using Information and Communication Technology (ICT) (e.g., [1]: ICT literacy). However, they are also related to various curricular changes internationally and to the need for extended teaching skills (e.g., [2]). Digitalization in education has now also been a driver for transforming instructional and educational goals for a number of years and is consequently reflected in a variety of standards for educating learners in the field. What was initiated in the international/Anglo-American area by the National Educational Technology Standards for Students [3,4] or the International Society for Technology in Education (ISTE) Standards for Students [5] has been initiated in Germany, in particular, by the formulation of the Standards for Education in a Digitized World by the Standing Conference of the Ministers of Education and Cultural Affairs of the Länder in the Federal Republic of Germany [6]. Along with the skills and knowledge that learners should acquire in ICT, the question arose internationally, nationally, and regionally as to which competencies teachers need in order to support learners appropriately and to design lessons accordingly. An interdisciplinary working group (Working Group Digital Core Competencies) was formed to answer this question with a focus on the natural sciences. For the genesis of the framework DiKoLAN (Digital Competencies for Teaching in Science Education), international ICT frameworks and models for the description of teacher digital competencies were first analyzed.

1.1. ICT Frameworks and Models for the Description of Teacher Digital Competencies

The development of the orientation framework for subject-specific digital basic competencies for student teachers presented in this article was accompanied by an analysis of existing frameworks. Published frameworks were analyzed in order to identify the basic lines of the digitally supported competency profile laid out here. Given their variety and versatility, both internationally and, in particular, nationally, no claim to an all-encompassing review is made at this point. Rather, the aim is to summarize essential frameworks that serve as a basis for the development of other frameworks, that are frequently addressed in teacher education, and/or that have already built up a body of empirical research.
With the ICT Competency Framework for Teachers, UNESCO set an international benchmark in the description of the competency facets required for ICT teaching at a very early stage [7]. This framework is intended less as a concrete specification than as a guide for the design of educational concepts that harmonize digitally oriented teacher training with the respective national development goals. The target groups at the content level are teachers in primary and secondary schools, whose ICT teaching activities are described in terms of six aspects: ICT, ICT in education, curriculum and assessment, pedagogy, organization and administration, and teacher professional learning [7] (p. 9). In line with the intended and functional importance of the UNESCO ICT Competency Framework for Teachers, many other (inter-)national models were based on it, such as the ICT-enhanced Teacher Standards for Africa (ICTeTSA) [8]. The ICTeTSA focuses primarily on the use of digital technologies in the school context. It describes six domains, each of which is divided into sub-competencies (Knowledge, Skills, and Attitudes) and, in turn, differentiated into performance levels (Emerging, Applying, Infusing, Transforming).
Other frameworks, such as the ISTE Standards for Educators [9], predominantly in the Anglo-American area, and the European Framework for the Digital Competence of Educators (DigCompEdu [10]), for the European area, have a similar basic function in providing a more general description of guidelines and standards for teacher educators and teachers themselves. The ISTE Standards for Educators are presented as an informative map for educators who, as actors in different spheres of action (including Learner, Leader, Citizen), help to shape changes in a digitally enabled future of teaching and learning. In addition to a superordinate description of the competencies required in these areas of action in order to integrate technologies effectively into learning and teaching, performance indicators that educators must fulfill in their professional practice are listed in each case. The ISTE standards, with their generalized orientation, are used in the development or further development of university training programs, such as the University of Northern Iowa (UNI) Preservice Teacher Technology Competencies, which are extended by a technology-related facet for specific content areas [11]. The European Union has developed a framework, DigCompEdu, which focuses on the pedagogical aspects of teachers' handling and use of ICT. In the DigCompEdu framework, five subject-unspecific competency areas describe skills that teachers need in order to exploit the potential of digital technologies in teaching-learning processes [10]. Teaching digital competencies to students is also considered in a sixth, subject-unspecific competency area. The DigCompEdu framework, with its associated self-assessment tool [12], also distinguishes between different teaching institutions with three versions (for teachers at schools, at universities, and in adult education). The specifications described in DigCompEdu have, in turn, been incorporated into national curricular developments in many countries; for example, the national frameworks in Spain (Common Digital Competence Framework for Teachers—CDCFT; INTEF [13]) and in Norway (Professional Digital Competence Framework for Teachers [14]) are both based on it. Both are similar in structure to DigCompEdu but define key competency areas in a much more differentiated way via sub-competencies at different performance levels. Going beyond a general description of competencies for teachers regardless of the phase of their career, the Austrian digi.kompP framework incorporates digital literacy acquisition into the different phases of teacher education [15]. The framework assigns eight categories chronologically to the individual phases of education. In doing so, university-based teacher education (stage 1) is clearly delineated from subsequent stages. In particular, prerequisites at the beginning of the study program are also included as expected competencies at a starting point (level 0). The framework serves, among other things, for self-assessment (DIGIcheck) and for the continuous professional development of teachers.
In addition to the rather normative conceptualization in the frameworks for teacher digital competence, the descriptive-theoretical framework of Technological Pedagogical Content Knowledge (TPACK) plays an important role in the description of the teacher competencies necessary for successful teaching with digital technologies. Mishra and Koehler’s [16] TPACK framework represents an extension of Shulman’s (1986) PCK framework. In the overlap with the existing knowledge and skills domains of the PCK framework, it becomes apparent that knowledge and skills about technology (TK) in teaching have a content component (TCK) and a pedagogical component (TPK), as well as a pedagogical content component (TPACK). Despite the clearly anchored subject reference in the knowledge and skills domains, the framework neither covers subject-specific content nor specifies competency expectations [17]. The structure of TPACK with its content components was taken into account during the development of the frameworks mentioned earlier. Examples include the teacher digital competence (TDC) framework [18] and the K19 framework for core competencies of teachers for teaching in a digitalized world [19], which is established in the national, German-speaking field. Based on the latter, the K19 framework, a scenario-based instrument, IN.K19, was validated that simultaneously assesses technology-related teaching skills and attitudes via self-assessment [20]. The competency domains of the TPACK framework have already been investigated in numerous studies in terms of their respective characteristics in different target groups, as well as with regard to correlations between the dimensions and other person variables. Most of these studies used non-specific self-assessment instruments (e.g., [21,22]). The significance of the individual knowledge domains that can be derived from such studies is inconsistent in that, for example, a direct influence of TK on TPACK can be shown in some studies, but not in others [23,24,25,26].
In summary, competency frameworks and/or models that are to be used profitably in teacher education should incorporate various aspects: orientation to classroom practice, outlining a progression in education, and concretization at an interdisciplinary, as well as a subject-specific, level. On such a basis, appropriate curricula and assessments can be derived. With regard to the extent to which these aspects are taken into account, currently available frameworks and models differ greatly and show deficits. The frameworks and models outlined above provide an important orientation for structuring and training ICT competencies in the teaching profession. At the same time, in order to fulfill their broad international or national tasks, they are kept quite general. Therefore, they often consider mainly the pedagogical perspective on ICT for learning, or name only general performance goals. A concrete subject-specific description of the individual competencies that a teacher needs at the subject level in order to integrate digital technologies into classroom activities in a goal-oriented, subject-related way is, therefore, largely missing. Although the content dimension, which is ultimately subject-specific, is structurally implemented in the DigCompEdu and TPACK frameworks, it is not specified in terms of the digital technology-related subject-specific content and competencies teachers need. No subject-specific digital competencies or levels for structuring teacher training in terms of curriculum elements can be operationalized on the basis of the existing competency frameworks and models, because subject-specific formulations, as well as references between the formulated competencies and concrete teaching situations, are missing. Although these frameworks and models provide guidance, they must also be specified for individual subjects and the different phases of training.

1.2. The Framework DiKoLAN

Following up on existing frameworks (e.g., TPACK, DigCompEdu; see Section 1.1), DiKoLAN describes the digital competencies of student teachers that are relevant for the design and implementation of digitally supported science education (also see [17]). These include digitization-related knowledge, as well as methodological skills, which are important for a wide range of objectives in school practice. DiKoLAN distinguishes between four more general, less subject-specific competency areas (e.g., documentation, presentation) and three more subject-specific competency areas (e.g., data acquisition, data processing) (Figure 1). In addition to these seven central competency areas in university teaching, further peripheral competency areas must be considered. These include the general Technical Core Competencies (TCC), which describe the basic individual skills and abilities to name, describe, and use common connection systems and interfaces (e.g., HDMI, USB, and their connector formats). Furthermore, wireless connection standards should be named, and their range, as well as connection processes, should be described. The aim is to be able to set up and use functional working environments independently and to solve arising technical problems. In addition, the legal framework must be considered. The competency area Legal Framework (LEG) describes the individual ability to identify legal issues when using digital technologies and platforms in schools, such as data protection regulations for the processing and storage of personal data, licensing regulations, age and content restrictions, and copyright. This includes the basic knowledge required to clarify the situation prior to the use of digital technologies. These two areas, which are superordinate to the content-related competency areas, are not specified in DiKoLAN, as they are subject to constant change due to technical progress and the changing legal basis, as well as country-specific regulations and licensing situations. Furthermore, the central core of DiKoLAN addresses university teacher training, which, e.g., in Germany, is supplemented by a consecutive practical phase in which the legal scope of action of teachers is a particular focus.
In addition to the peripheral competencies TCC and LEG, which are basic prerequisites for the integration of digital technologies but can be classified as rather subject-unspecific, seven subject-specific core competency areas are at the center of DiKoLAN. These seven core competency areas can be divided into more general and more subject-specific competency areas.
The more general competency areas include competencies that are essential for the implementation of digitally supported teaching in all subjects. These competencies should be applied to the methodological design with digital technologies in subject-specific implementations. For example, although the competency area of Presentation is of central importance in subjects such as art and music, as well as in the natural sciences, content is nevertheless presented and taught using different, subject-specific formats and tools in each case. The competencies described in DiKoLAN for the more general competency areas relate to the natural sciences and can be divided into the areas of Documentation, Presentation, Communication/Collaboration, and Information Search and Evaluation (see Figure 1). The more subject-specific competency areas include digital competencies that are differentiated domain-specifically in individual subjects and corresponding disciplines. In the natural sciences, competency requirements are partly tied to subject-specific teaching methods and instructional strategies. In these methods and instructional strategies, hands-on work in the lab and in the field involves specific digital tools and media. This hands-on work includes, for example, the use of data loggers, handling digital microscopes, and likewise working in virtual laboratories and/or with specific software [27]. The more subject-specific competency areas of Data Acquisition, Data Processing, and Simulation and Modeling can be described by taking into account various authors [28,29] and basing the competency areas on core elements of the natural sciences, that is, subject-specific working methods for data acquisition, data processing, and data representation [30].

1.2.1. Central Content-Related Competency Areas of DiKoLAN

DiKoLAN provides guidance with regard to the (subject-specific) performance level of basic competencies to be aimed for during the study phase. The framework is an open system in which the already formulated competency areas can be adapted and supplemented with further competency areas according to the subject. With a changing subject-specific orientation (here, currently, the natural sciences) and changing requirements of future teaching, as well as of the structure of teacher training, further competency areas for university teacher training, such as assessment and feedback, are also conceivable. In the following, the seven competency areas are defined in terms of their content orientation [17] and briefly described (see Figure 1).
The competency area Documentation (DOC) comprises the individual ability to use digital tools for the systematic filing and permanent storage of data and information in order to use them professionally. This also includes taking, editing, and integrating photos, images, and videos, combining and saving different media, structuring and archiving information, and displaying processes and contexts (see Figure 2). Using digital technologies in the classroom is associated, among other things, with the generation, storage, management, and archiving/backup of data. This use of digital technologies includes the digital documentation of work products, as well as the protected management of student data. Knowledge of tools for effective documentation seems essential for making informed decisions [31]. As a result, the Documentation competency area addresses facets of data literacy that are often still underrepresented in teacher education [32], and it differentiates them subject-specifically.
The competency area Presentation (PRE) describes the individual ability to use digital media in a targeted and addressee-oriented manner for the knowledge acquisition and communication process, as well as knowledge of the limits and potentials of different digital presentation media (see Figure 3). Presenting content, results, and processes is one of the central elements of (science) teaching. The range of applications extends from the use of presentation software to the design of one’s own digital presentation media. This includes, for example, the presentation of a process, such as time-lapse recordings of the growth of plants, as well as the presentation of images and videos made during microscopy. It is crucial that (future) teachers are able not only to master the individual digital presentation media confidently but also to recognize their advantages and disadvantages for lesson design (e.g., with regard to forms of organization and presentation, time required, and motivation). Lessons should move beyond purely frontal presentation and give students the opportunity to experience various presentation possibilities. For this purpose, suitable presentation media and forms must be known, and a situation-specific selection must be made in order to unlock their potential for the learning process. Therefore, design aspects [33,34] and the principles of multimedia learning (CTML [35]) should be known and applied in the classroom.
The competency area Communication/Collaboration (COM) comprises the individual ability to plan synchronous or asynchronous work of individuals or groups with digital tools towards a common goal and to carry it out with the learners. For this purpose, shared files or products are created and processed, common data pools are created and dealt with, and systems for assigning rights are planned and implemented (see Figure 4). Teaching depends on communication and on information exchange and management (e.g., [36]). Accordingly, the integration of appropriate communication techniques and phases, as well as of collaborative elements for participation in the learning process, are key elements of teaching. Digitalization opens up opportunities here, e.g., simultaneous and location-independent work on the same document and the merging of data or results collected in a split work process into an overall solution. In view of the resulting time savings, more complex problems can be solved collaboratively in the process of knowledge acquisition. Digital communication tools thereby support the communicative coordination of such processes.
The competency area Information Search and Evaluation (ISE) includes the individual ability to use digital tools to obtain information on given subject areas or to solve problems, and to structure and evaluate this information. For this purpose, search targets are defined, and various information sources are integrated and evaluated (see Figure 5). Due to the complexity of the available information and the diversity of editorial control mechanisms, skills for a successful internet-based information search have increased in importance. Therefore, in addition to technical skills for information retrieval, both cognitive skills for information search and evaluation and metacognitive skills for evaluating the search process are necessary [37]. The IPS-I model (information problem solving while using the internet [38]) formulates steps for answering problem-oriented questions that should be followed, at least in part, during a successful internet-based information search (e.g., [38,39]). The competent use of scientific databases is also a subject-specific requirement for science teachers.
The competency area Data Acquisition (DAQ) describes the individual ability to collect data directly or indirectly with digital tools by entering data, digitizing analogue data, taking images and making films, using probes, sensors, and programs (or apps), and obtaining data from documentation media, such as images or videos (see Figure 6). Digital measurement and data acquisition offer access to scientific phenomena that are difficult to capture in analog form (e.g., high-speed and thermal imaging or the concentration of invisible gases, e.g., [40]). When investigating particularly fast or slow processes, computer-aided measurement acquisition has an advantage over analog measurement technology (e.g., [41]). In addition, the obtained measured values can be presented in different display formats, one after the other or simultaneously, several measurement series can be displayed comparatively in the same coordinate system, and the axis scaling can be changed as desired. Mobile devices, in which not only cameras but also sensors are integrated, expand the range of applications for measurement acquisition due to their independence from the power grid and the integration of a wide variety of sensors [42]. As these devices are available to students, instructional scenarios in science that use digital measurement should take into account prior knowledge of the intended use, forms of organization, methods, and personal and social consequences of the intended action.
The competency area Data Processing (DAP) describes the individual ability to process data with digital tools. This includes filtering, calculating new quantities, processing, statistical analysis, and merging of data sets (see Figure 7). Data processing is particularly important in science teaching since it is only through the (further) processing of measurement data that insights can be gained or further research questions can be raised. In science teaching, digital data processing opens up subject-specific access to specific data sets, which also adequately represent the current methods of the corresponding discipline and are authentic in this sense (e.g., the automatic evaluation of slopes in calibration curves for concentration measurements [43]). In addition, data collections and formats (in the form of measurement series, images, videos, audio files, or texts) can be filtered, recoded, and analyzed using digital tools, such as statistical programs or simple spreadsheets, in preparation or in class. A range of tools for filtering, statistical analysis, and image, audio, and video analysis, including visualization options, is available for this purpose [44]. Digital data processing automates time-consuming processes so that more learning time is available in class for preparing and following up on the experiment.
The competency area Simulation and Modeling (SIM) describes the individual ability to perform computer-aided modeling and to use existing digital simulations in a targeted and addressee-oriented manner for the knowledge acquisition and communication process, as well as knowledge of the limits and potentials of models and simulations in the process of knowledge acquisition (see Figure 8). Based on underlying models and a limited number of variables, computer simulations can be used to model scientific processes. Simulations are used to analyze systems and processes, as well as to predict the development of rule-based systems over time. Since simulations respond to user input, influences and interrelationships can be investigated interactively and experienced directly, which ultimately facilitates an adaptation of the students’ mental models to the scientific processes. While, in simulations, the dynamic models are usually already implemented, modeling programs offer the possibility to develop and test one’s own dynamic models of an issue or phenomenon. Furthermore, different models can be tested, and their predictive power and vividness can be compared. The very large range of applications for different end devices also requires a deeper examination of the limitations of existing software in order to ensure professional and appropriate use [17].

1.2.2. Structure and Gradations in the Competency Areas of DiKoLAN

For these seven central competency areas, competencies are described in tabular overviews based on the four technology-related knowledge dimensions of the TPACK framework ([45]; see Figure 2, Figure 3, Figure 4, Figure 5, Figure 6, Figure 7 and Figure 8): Special Tools (S/TK), Content-specific Context (C/TCK), Methods and Digitality (M/TPK), and Teaching (T/TPACK). The formulated competencies are further differentiated in the tables on the basis of the performance levels Name (N), Describe (D), and Use/Apply (A). According to this differentiation, explicit knowledge facets are listed first (e.g., ISE.C.N3: Name at least two quality criteria for evaluating digital sources from a discipline perspective […]), then procedures are to be described (e.g., ISE.C.D5: Describe at least two of the quality criteria listed in ISE.C.N3 […]), and, finally, particular actions are derived to apply the knowledge in lesson planning (e.g., ISE.C.A1: Conduct a subject-specific search according to the quality criteria and evaluate the results found). In formulating the individual competencies, care was taken to ensure that the domains are as distinct as possible and, thus, independent of one another.
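As an illustration of how this coding scheme can be read, the following minimal Python sketch models a single competency descriptor such as ISE.C.N3, composed of the competency area, the TPACK-related component, the performance level, and a running index. The class and field names are hypothetical and added here only for clarity; they are not part of DiKoLAN itself.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CompetencyDescriptor:
    # Field names are illustrative, not taken from DiKoLAN.
    area: str        # competency area, e.g., "ISE" (Information Search and Evaluation)
    component: str   # "S" (Special Tools/TK), "C" (TCK), "M" (TPK), "T" (TPACK)
    level: str       # performance level: "N" (Name), "D" (Describe), "A" (Use/Apply)
    index: int       # running number within the area/component/level cell
    text: str        # the formulated competency expectation

    @property
    def code(self) -> str:
        """Short code in the DiKoLAN notation, e.g., 'ISE.C.N3'."""
        return f"{self.area}.{self.component}.{self.level}{self.index}"


example = CompetencyDescriptor(
    area="ISE", component="C", level="N", index=3,
    text="Name at least two quality criteria for evaluating digital sources "
         "from a discipline perspective.",
)
print(example.code)  # -> ISE.C.N3
```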
These formulated competency descriptions can be used, among other things, to construct suitable tasks for competency tests with a fit to the corresponding competency areas. This would, for example, also enable transparent, subject-specific evaluation within the framework of teacher training.

1.3. Aims and Research Questions

One of the main objectives of this paper is to present the framework DiKoLAN, which explicitly refers to the digital competencies of science student teachers. It considers a progression of competencies and describes and operationalizes competencies in detail. The competency areas delineated in the framework DiKoLAN, and the competence descriptions formulated for them (see Section 1.2), can serve as target dimensions of competence-oriented, digitization-related training in university teaching, as well as a reference for the self-assessment of student teachers. For this purpose, the competence descriptions must be operationalized.
Following the development of the framework, another goal was to derive items based on the competence descriptions formulated in DiKoLAN and to develop a corresponding instrument for self-assessment or external assessment (DiKoLAN-Grid). In this paper, the competencies of biology student teachers in two areas were recorded and analyzed with the self-assessment tool DiKoLAN-Grid. In this context, the following research questions arise:
  • How do biology student teachers rate their competencies in Presentation, as well as in Information Search and Evaluation, in the four components of Teaching (T/TPACK), Methods and Digitality (M/TPK), Content-specific context (C/TCK), and Special Tools (S/TK)? Which parallels and differences appear in the level of competency assessment between the four components?
  • Which correlations of the four components can be seen within the two competency areas Presentation and Information Search and Evaluation?

2. Materials and Methods

2.1. Participants

The sample consisted of 118 biology student teachers (♀ = 75.5%, ♂ = 24.6%; M_age = 21.40 years, SD = 3.43) from two universities in Germany. The questionnaire for the competency area Presentation (PRE) was completed by n = 118 students, and the one for Information Search and Evaluation (ISE) by n = 90 students. The survey took place in the context of courses for biology education in the summer term 2020 and the winter term 2020/2021. The participating biology student teachers were, on average, in their 2nd semester (M = 2.28, SD = 2.06).

2.2. Procedure

The online survey was administered by means of questionnaires in a PDF document directly at the beginning of each semester. Care was taken to ensure that the same test conditions prevailed at both universities. The questionnaire was completed asynchronously outside the regular course time, and the participants were able to take as much time as they needed. Prior to participation, subjects were informed that study participation was voluntary and anonymous, and they signed an appropriate consent form. The competency area Presentation was assessed by a total of 20 items and the competency area Information Search and Evaluation by a total of 27 items, for which the participants were asked to assess their competence on a Likert scale from 1 (do not agree at all) to 8 (agree completely). The individual items were assigned to the four competence components (TK, TCK, TPK, and TPACK). The survey instrument can be found at https://dikolan.de/downloads (last accessed on 28 November 2021).
The reliabilities of the scales of all eight competence components are in the good to excellent range (Table 1 and Table 2).
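As a sketch of how such scale reliabilities and component scores can be computed from the raw Likert responses, Cronbach's alpha for one component could be obtained as shown below. The file and column names are hypothetical and used only for illustration; the study itself calculated these values with SPSS 26.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for one scale; rows = respondents, columns = items."""
    items = items.dropna()                            # listwise exclusion of missing values
    k = items.shape[1]                                # number of items in the scale
    sum_item_var = items.var(axis=0, ddof=1).sum()    # sum of the item variances
    total_var = items.sum(axis=1).var(ddof=1)         # variance of the sum score
    return k / (k - 1) * (1 - sum_item_var / total_var)

# Hypothetical wide-format export: one row per student, Likert responses 1-8,
# columns named by area and component, e.g., "PRE_TPK_1", "PRE_TPK_2", ...
data = pd.read_csv("dikolan_grid_responses.csv")
tpk_pre_items = data.filter(regex=r"^PRE_TPK_")

print(f"Cronbach's alpha (TPK-PRE): {cronbach_alpha(tpk_pre_items):.2f}")

# Component score per respondent = mean of the assigned items
data["TPK_PRE"] = tpk_pre_items.mean(axis=1)
```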

2.3. Data Analysis

First, descriptive statistics were compiled and analyzed. The unidimensionality of the competency components was tested individually by confirmatory factor analysis (estimator: ML), in which each construct represents the only latent variable. Since all items loaded highly on their respective factor (factor loadings above 0.50) and could, therefore, be assigned to their construct without doubt from an empirical point of view, all items were included in the further calculations. Subsequently, path models were used to analyze the interrelationships of the four competence components (TK, TCK, TPK, TPACK) within each of the two competency areas, Presentation and Information Search and Evaluation. The number of items used for modeling the competency components varied between four and nine, depending on the scale (Table 1 and Table 2). The model fit indices χ², χ²/df, the Comparative Fit Index (CFI), and the Root Mean Square Error of Approximation (RMSEA) were used for the global assessment of the goodness of fit of the CFAs and the path models. The analyses reported below were performed using the statistical programs AMOS, R, and Mplus 8. Descriptive statistics and reliability analyses were calculated using SPSS 26 (listwise case exclusion for missing values).
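Formally, each of these path models corresponds to a structural regression of the TPACK component on the three technology-related components, which are allowed to covary freely. The notation below is added here only for illustration and is not taken from the original analysis scripts:

$$ \text{TPACK} = \beta_{1}\,\text{TK} + \beta_{2}\,\text{TCK} + \beta_{3}\,\text{TPK} + \zeta, \qquad \operatorname{Cov}(\text{TK},\text{TCK}),\ \operatorname{Cov}(\text{TK},\text{TPK}),\ \operatorname{Cov}(\text{TCK},\text{TPK}) \text{ freely estimated}, $$

where the standardized path coefficients β and the explained variance R² of TPACK are reported in Section 3.2.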

3. Results

3.1. Descriptive Statistics

In the competency area Presentation, the student teachers rated their skills regarding TK (Special Tools) as best developed (Figure 9; M_TK-PRE = 5.42, SD_TK-PRE = 1.36), followed by TCK (Content-specific Context) (M_TCK-PRE = 4.40, SD_TCK-PRE = 1.42). TPK (Methods and Digitality), but also TPACK (Teaching), were assessed worse in comparison (M_TPK-PRE = 3.86, SD_TPK-PRE = 1.34; M_TPACK-PRE = 4.05, SD_TPACK-PRE = 1.49) [46]. In the competency area Information Search and Evaluation, the picture is quite balanced, with the difference between the lowest (M_TCK-ISE = 3.78, SD_TCK-ISE = 1.28) and the highest mean value (M_TPK-ISE = 4.58, SD_TPK-ISE = 1.65) being just 0.8. In addition, students rated their abilities similarly high regarding TK (M_TK-ISE = 4.10, SD_TK-ISE = 1.28) and TPACK (M_TPACK-ISE = 4.11, SD_TPACK-ISE = 1.55). Except for M_TK-PRE and M_TPK-ISE, all scores are below the scale mean value of 4.5. Nevertheless, differences can be seen between the two competency areas. When paired t-tests are calculated, significant differences between the self-assessed competencies in the two areas are evident for TK (t(87) = 7.33, p < 0.001), for TCK (t(87) = 3.22, p < 0.01), and for TPK (t(87) = −4.62, p < 0.001). Only TPACK showed no significant differences (t(87) = 0.15, p = 0.88). Thus, biology student teachers rated themselves significantly better in TK and TCK in Presentation than in Information Search and Evaluation, and significantly better in TPK in Information Search and Evaluation than in Presentation.
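For reference, each paired t-test compares, within each student, the two area-specific scores of the same component; with the n = 88 complete pairs implied by df = 87, the standard test statistic (added here for clarity) is

$$ t = \frac{\bar{d}}{s_d/\sqrt{n}}, \qquad df = n - 1 = 87, $$

where $\bar{d}$ and $s_d$ denote the mean and standard deviation of the within-person differences (e.g., the TK score for Presentation minus the TK score for Information Search and Evaluation).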

3.2. Path Models

3.2.1. Competency Area Presentation

All four theoretically postulated constructs of the Presentation competency area (TK-PRE, TCK-PRE, TPK-PRE, and TPACK-PRE) were modeled as latent variables in a joint confirmatory factor analysis, with the assigned items all loading significantly onto their respective factor. The standardized loadings of the items are in the good to very good range: from 0.59 to 0.85 for TK-PRE, from 0.59 to 0.81 for TCK-PRE, from 0.60 to 0.85 for TPK-PRE, and from 0.68 to 0.91 for TPACK-PRE.
In the next step, a path model was calculated to gain deeper insight into the interrelationships of the four components and, thus, into the inner structure of the Presentation competency area (see Figure 10). In the path model, the three components TK-PRE, TCK-PRE, and TPK-PRE represent predictors of the component TPACK-PRE. The model fit shows very good values (χ² = 0.437, df = 1, χ²/df = 0.191, CFI = 1.00, RMSEA = 0.00). Both TCK-PRE (β = 0.44, SE = 0.091, p < 0.001) and TPK-PRE (β = 0.42, SE = 0.082, p < 0.001) show a highly significant effect on TPACK-PRE. In contrast, there is no predictive effect of TK-PRE on TPACK-PRE (β = 0.00, SE = 0.089, p = 0.959). TCK-PRE and TPK-PRE account for 58% of the variance in TPACK-PRE. As expected, the three predictors (TK-PRE, TCK-PRE, TPK-PRE) correlate significantly with each other, with values between 0.49 and 0.68, indicating that these components should not be regarded as independent constructs.

3.2.2. Competency Area Information Search and Evaluation

A parallel approach was taken for the competency area Information Search and Evaluation. Again, all four theoretically postulated constructs (TK-ISE, TCK-ISE, TPK-ISE, and TPACK-ISE) were modeled as latent variables in a joint confirmatory factor analysis, with the assigned items all loading significantly onto their respective factor. The standardized loadings of the items are again in the good to very good range (TK-ISE: 0.65–0.78, TCK-ISE: 0.57–0.76, TPK-ISE: 0.78–0.92, TPACK-ISE: 0.76–0.87). In the subsequently calculated path model, the three components TK-ISE, TCK-ISE, and TPK-ISE represent predictors of TPACK-ISE. Again, the path model shows very good model values (χ² = 0.516, df = 1, χ²/df = 0.52, CFI = 1.00, RMSEA = 0.00; see Figure 11). There is a significant effect of TK-ISE on TPACK-ISE (β = 0.19, SE = 0.112, p = 0.027), which is comparable in magnitude to the effect of TCK-ISE (β = 0.22, SE = 0.099, p = 0.007). In comparison, the effect of TPK-ISE is approximately three times higher and highly significant (β = 0.58, SE = 0.064, p < 0.001). The correlations between the individual predictors are all significant and very similar in strength to the values determined for Presentation. A total of 70% of the variance of TPACK-ISE can be explained by the three predictors.

4. Discussion of the Research Questions

Derived from DiKoLAN, a newly developed science-specific self-assessment tool for the digital competencies of student teachers was used for the first time for the competency areas Presentation and Information Search and Evaluation, with a scientific focus. The theory-based components Teaching (T/TPACK), Methods and Digitality (M/TPK), Content-specific Context (C/TCK), and Special Tools (S/TK) of both competency areas could be reliably assessed (in terms of Cronbach’s alpha values). Possible differences in the correlations of TK, TCK, and TPK with TPACK shown for individual areas of DiKoLAN indicate that the sub-instruments used for the respective areas capture different constructs.
In the context of the first research question, it was analyzed how biology student teachers assess their competencies in Presentation, as well as in Information Search and Evaluation, in the four components (TK, TCK, TPK, and TPACK), and what differences emerge between the components and competency areas. The competency assessments available for the area Information Search and Evaluation can be interpreted both with respect to their levels and with respect to the instrument used, DiKoLAN-Grid. Compared to Information Search and Evaluation, the higher values in the area of Presentation, as a central element of teaching and of university courses, could result from comparatively frequent previous experience with it in school and university courses: up to the second or third semester, subject-specific presentations and, less frequently, the associated methodological implications are passively received, and technical foundations are built up in course presentations performed individually by students. This could explain the higher level of competency here. However, the planned and structured integration of presentation methods into school teaching and learning settings is not specifically practiced. Information Search and Evaluation, on the other hand, is more often carried out jointly within the context of cooperative group work, so that experience can be gained with regard to methodological and social interaction in practice. This would favor the area of Presentation, but not in the component TPACK, which is in line with the observed data in the components TK and TCK. According to this interpretation, the perspective of planning lessons in consideration of a broad and situation-adapted integration of presentation techniques and procedures would still be missing. This could explain a level of competency comparable to the measured level. Furthermore, this would explain the higher assessed competency levels for TPK in the area of Information Search and Evaluation. The position of the values slightly below the scale mean value can be interpreted as plausible and fitting to the educational level; the students are only at the beginning of their studies. In addition, with regard to the survey instrument, the measured levels provide useful information. The location of the mean values and the distribution of the data (Figure 9, boxplots) indicate a good fit of the items with respect to the levels of competency to be measured, as the items were neither too easy nor too difficult. Consequently, the scales can also be used to measure progress in competency levels, possibly up to graduation. For clarification in this regard, either longitudinal studies or comparative surveys with students in higher semesters need to be conducted.
The second research question aimed to analyze the internal structure of the two competency areas. The development of TPACK as a central component presupposes knowledge in all other components, which is why it should be possible to demonstrate corresponding correlations in the path models. Previous studies are not clear regarding the relevance of the individual components in terms of their specific influence on TPACK. In particular, the role of TK, as a significant component for the extension of PCK to TPACK, is unclear (e.g., [26,47,48,49,50]). To further investigate this issue, path models were used to examine the correlations of the four components. For both competency areas, the three components TK, TCK, and TPK correlate similarly highly with each other. However, differences emerge with regard to the predictive effect on TPACK. Only in the area Information Search and Evaluation does TK have a significant influence on TPACK, as shown in the path models. The fact that an influence of TK can sometimes be demonstrated and sometimes not matches the inconsistent findings available so far. The question remains whether there are reasons for the missing or existing influence of TK and what these are. These studies (e.g., [26]) have in common that the surveys were conducted with subject-unspecific scales or items. What differs, however, are the subjects and fields on which the participants were surveyed. Unfortunately, there are no comparable studies to date. The reason for the divergent results regarding the influence of TK could be differences in the composition of the samples, as well as in the related content and skill areas (subject disciplines, teaching experience, etc.). Depending on the subjects and related competency areas covered, the influence of TK could actually vary in magnitude or not be present at all, as can be hypothesized in the following considerations related to our case: there is certainly a difference between the competencies that a teacher must have and those that pupils should develop. The latter can be considered a subset of the competencies available to teachers. The more the competencies available to teachers match the competencies that pupils should develop (e.g., as a consequence of an ICT syllabus), the smaller the expectable difference should be. For example, the TK competencies in the area of biological Information Search and Evaluation should have higher intersections for both groups (e.g., ISE.S.N2: List aspects of the need for a research strategy (problem analysis, keywords, synonyms, and search services)) than, for example, in the area of Presentation (e.g., PRE.S.N1: Name in each case several technical ways of presentation: of content at different scales (e.g., document camera, video camera, smartphone, tablet, microscope camera), of processes on different time scales (e.g., slow motion, time lapse), for a larger auditorium (e.g., beamer, interactive boards), for multiple groups (e.g., display on multiple terminals), or for a single receiver). Here, the competencies required for teaching go far beyond what needs to be taught to pupils, and not all of them need to be integrated as learning objectives in the classroom, resulting in a less strict correlation between existing TK and TPACK.
In terms of correlations between the components in the TPACK model, it is interesting to observe that the correlations between TK, TPK, and TCK are almost identical for the individual domains. Comparing these to other studies in completely different content contexts and with different addressees [26,51,52] reveals both similar and completely different correlations between these components. Again, similar to the varying influence of TCK on TPACK shown in this study, this could indicate subject- or domain-specific correlations. Ultimately, TCK, as content-related knowledge, implies an influence of the actual content area under consideration. Depending on the overall scope of the respective related TK and TCK, different correlations and effects can be assumed with respect to TPACK.

5. Conclusions

DiKoLAN can be considered a science teaching- and practice-oriented approach and a first step towards a domain-specific structuring and assessment of TPACK, as subject-specific descriptions of TPACK have been limited so far [53]. Subject non-specificity of TPACK test instruments may lead to reduced validity [54]. With regard to practical implications, the specifically formulated competency areas, levels, and goals of DiKoLAN result in numerous fields of application and potential use cases. For example, they can guide the creation of curricula for the university phase of teacher education, as well as the evaluation of competency levels and development processes of student teachers in science. The construction of tasks for the measurement of competencies, as well as for self-assessment, requires corresponding formulations of goals and descriptions of desired competencies, as they are given by DiKoLAN. Lecturers can use such formulations to create subject-specific competence profiles and, thus, identify existing strengths and areas for development in the competency profiles of students. Based on this, tailored courses and learning environments can be designed, or individual support offers can be composed. The precise identification of development areas of individual students also allows individual competence support with digital online offerings and self-learning courses, the content of which can be specifically tailored to the relevant competency levels and components in conjunction with the basic digital competences formulated in DiKoLAN. Here, the subject- or science-specific perspective provides more concrete results, clearly related to the subject taught, than general and subject-unspecific tools for competence analysis or reflection, such as the self-assessment tools corresponding to, for example, DigCompEdu [12]. With regard to the possible domain- or area-specificity of competency characteristics and the correlations of their impact, a structuring of digital competencies is also a basic prerequisite for being able to compare and relate data from research projects. A very comprehensive and unspecific view and interpretation of the competency components, as in the model developed by Mishra and Koehler [16], may not be sufficient for research purposes in certain respects. In addition, for the conception of addressee-related support measures, knowledge of the concretely existing knowledge and competence base is indispensable if target-oriented support and subsequent evaluation are to be carried out.
In the long run, the results collected with the self-assessment tool DiKoLAN-Grid should and could be used to derive indications for an effective promotion of TPACK. Based on the data of the two competency areas presented in the present study, it could be shown that TPACK is predicted by TK, TCK, and TPK to different extents depending on the competency area. Thus, TPK shows high predictive values for TPACK in the competency area of Information Search and Evaluation, but also in the competency area of Presentation. This result is evident in many similar studies [24,26,55,56,57], suggesting a particular importance of methodological knowledge about teaching with (the support of) digital media. In contrast, the predictive values of TCK on TPACK are lower for the competency area Information Search and Evaluation than for the competency area Presentation. To date, however, data are only available for two competency areas and from a small number of tested students. Further studies will also explore the interrelationships of the four technology-related components in the remaining five competency areas. An interesting question will be, among others, whether similar findings with a relatively low influence of TCK can be found for the subject-specific competency areas, such as Data Acquisition, or whether the subject-specific component TCK has a stronger influence on TPACK in these areas.
In conclusion, DiKoLAN and the corresponding self-assessment tool DiKoLAN-Grid presented here provide, for the first time, an application-oriented, detailed overview, incorporating the perspectives of several subject-specific didactics, of the digital competencies for the natural sciences that should be acquired during teacher education. DiKoLAN is already used as a framework supporting concepts for teacher education [58,59]. Furthermore, structuring the competency expectations within a competency area, based on the TPACK framework, into areas relevant to school practice enables the competencies and respective levels to be built up to be assigned to different courses and actors in teacher education (content knowledge, pedagogical content knowledge, pedagogical knowledge). However, DiKoLAN is subject to constant change. Some basic competencies are quite stable over time, while others are shaped by rapidly developing digitization. This is true not only for the area of education but also for scientific research in the natural sciences. The use of new approaches, such as augmented reality (e.g., [60]), 3D printing (e.g., [61]), or artificial intelligence (e.g., [62]), may also change teaching and learning [63] and, thus, the digital competencies required of teachers. Competency areas resulting from the currently increased use of remote and virtual labs in research and teaching (distance learning) may also expand DiKoLAN in the future. At the latest, if these competencies prove to be useful in the long term, this must be reflected in teacher training (see [47]). In general, an evolving classroom culture and organization will necessitate extensions and/or revisions of such a framework.

Author Contributions

Developing DiKoLAN, all authors; Conceptualization, all authors; methodology, all authors; validation, C.T. and L.v.K.; investigation, C.T. and M.M.; data curation, C.T., M.M. and L.v.K.; writing—original draft preparation, all authors; writing—review and editing, all authors; visualization, L.-J.T., C.T., L.v.K. and S.B.; project administration, C.T.; funding acquisition, C.T., S.B. and J.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partially funded by Joachim Herz Foundation, Hamburg, Germany.

Institutional Review Board Statement

All participants were students at two German universities. They took part voluntarily and signed an informed consent form. Pseudonymization of participants was guaranteed during the study. Due to all these measures in the implementation of the study, an audit by an ethics committee was waived.

Informed Consent Statement

Written informed consent was obtained from all participants involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to the ongoing study.

Acknowledgments

Our thanks go to the participating student teachers and to Jenny Meßinger-Koppelt and Jörg Maxton-Küchenmeister (Joachim Herz Foundation) for the organizational and financial support.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Partnership for 21st Century Skills. A State Leader’s Action Guide to 21st Century Skills: A New Vision for Education; Partnership for 21st Century Skills: Tucson, AZ, USA, 2006; Available online: http://apcrsi.pt/website/wp-content/uploads/20170317_Partnership_for_21st_Century_Learning.pdf (accessed on 1 October 2021).
  2. Saavedra, A.R.; Opfer, V.D. Learning 21st-Century Skills Requires 21st-Century Teaching. Phi Delta Kappan 2012, 94, 8–13. [Google Scholar] [CrossRef]
  3. International Society for Technology in Education. National Educational Technology Standards for Students; International Society for Technology in Education: Eugene, OR, USA, 1998; Available online: http://cnets.iste.org (accessed on 1 October 2021).
  4. Wiebe, J.H.; Taylor, H.G.; Thomas, L.G. The national educational technology standards for PK–12 students: Implications for teacher education. J. Comput. Teach. Educ. 2000, 16, 12–17. [Google Scholar]
  5. Brooks-Young, S. ISTE Standards for Students. A practical Guide for Learning with Technology; International Society for Technology in Education: Eugene, OR, USA, 2016. [Google Scholar]
  6. KMK. Strategie der Kultusministerkonferenz; Bildung in der digitalen Welt: Berlin, Germany, 2016. [Google Scholar]
  7. United Nations Educational, Scientific and Cultural Organization. UNESCO ICT Competency Framework for Teachers; UNESCO: Paris, France, 2011; Available online: http://unesdoc.unesco.org/images/0021/002134/213475e.pdf (accessed on 20 September 2021).
  8. UNESCO International Institute for Capacity Building in Africa (IICBA). ICT-Enhanced Teacher Standards for Africa (ICTeTSA); UNESCO-IICBA: Addis Ababa, Ethiopia, 2012; Available online: https://unesdoc.unesco.org/ark:/48223/pf0000216105 (accessed on 20 September 2021).
  9. Crompton, H. ISTE Standards for Educators. A Guide for Teachers and Other Professionals; International Society for Technology in Education: Washington, DC, USA, 2017. [Google Scholar]
  10. Redecker, C. European Framework for the Digital Competence of Educators: DigCompEdu; Publications Office of the European Union: Luxembourg, 2017. [Google Scholar]
  11. Krueger, K.; Hansen, L.; Smaldino, S. Preservice Teacher Technology Competencies: A Model for Preparing Teachers of Tomorrow to Use Technology. TechTrends Link. Res. Pract. Improv. Learn. 2000, 44, 47–50. Available online: https://www.learntechlib.org/p/90326/ (accessed on 4 September 2021).
  12. Ghomi, M.; Redecker, C. Digital Competence of Educators (DigCompEdu): Development and Evaluation of a Self-assessment Instrument for Teachers’ Digital Competence. In Proceedings of the 11th International Conference on Computer Supported Education; SCITEPRESS—Science and Technology Publications: Crete, Greece, 2019; pp. 541–548. [Google Scholar] [CrossRef]
  13. National Institute of Educational Technologies and Teacher Training (INTEF). Common Digital Competence Framework for Teachers—October 2017. 2017. Available online: https://aprende.intef.es/sites/default/files/2018-05/2017_1024-Common-Digital-Competence-Framework-For-Teachers.pdf (accessed on 20 September 2021).
  14. Kelentrić, M.; Helland, K.; Arstorp, A.T. Professional Digital Competence Framework for Teachers; The Norwegian Centre for ICT in Education: Oslo, Norway, 2017; pp. 1–74. [Google Scholar]
  15. Brandhofer, G.; Kohl, A.; Miglbauer, M.; Nárosy, T. digi.kompP—Digitale Kompetenzen für Lehrende. Open Online J. Res. Educ. 2016, 6, 38–51. [Google Scholar]
  16. Mishra, P.; Koehler, M.J. Technological pedagogical content knowledge: A framework for teacher knowledge. Teach. Coll. Rec. 2006, 108, 1017–1054. [Google Scholar] [CrossRef]
  17. Becker, S.; Bruckermann, T.; Finger, A.; Huwer, J.; Kremser, E.; Meier, M.; Thoms, L.-J.; Thyssen, C.; von Kotzebue, L. Orientierungsrahmen Digitale Kompetenzen für das Lehramt in den Naturwissenschaften—DiKoLAN. In Digitale Basiskompetenzen—Orientierungshilfe und Praxisbeispiele für die universitäre Lehramtsausbildung in den Naturwissenschaften; Becker, S., Meßinger-Koppelt, J., Thyssen, C., Eds.; Joachim Herz Stiftung: Hamburg, Germany, 2020; pp. 14–43. [Google Scholar]
  18. Falloon, G. From digital literacy to digital competence: The teacher digital competency (TDC) framework. Educ. Technol. Res. Dev. 2020, 68, 2449–2472. [Google Scholar] [CrossRef] [Green Version]
  19. Schultz-Pernice, F.; von Kotzebue, L.; Franke, U.; Ascherl, C.; Hirner, C.; Neuhaus, B.J.; Ballis, A.; Hauck-Thum, U.; Aufleger, M.; Romeike, R.; et al. Kernkompetenzen von Lehrkräften für das Unterrichten in einer digitalisierten Welt. Merz Medien + Erziehung Zeitschrift für Medienpädagogik 2017, 61, 65–74. [Google Scholar]
  20. Sailer, M.; Stadler, M.; Schultz-Pernice, F.; Franke, U.; Schöffmann, C.; Paniotova, V.; Fischer, F. Technology-related teaching skills and attitudes: Validation of a scenario-based self-assessment instrument for teachers. Comput. Hum. Behav. 2021, 115, 106625. [Google Scholar] [CrossRef]
  21. Koehler, M.J.; Mishra, P.; Kereluik, K.; Shin, T.S.; Graham, C.R. The technological pedagogical content knowledge framework. In Handbook of Research on Educational Communications and Technology; Spector, J.M., Merrill, D.M., Elen, J., Bishop, M.J., Eds.; Springer: New York, NY, USA, 2014; pp. 101–111. [Google Scholar]
  22. Schmidt, D.A.; Baran, E.; Thompson, A.D.; Mishra, P.; Koehler, M.J.; Shin, T.S. Technological pedagogical content knowledge (TPACK). J. Res. Technol. Educ. 2009, 42, 123–149. [Google Scholar] [CrossRef]
  23. Absari, N.; Priyanto, P.; Muslikhin, M. The Effectiveness of Technology, Pedagogy and Content Knowledge (TPACK) in Learning. J. Pendidik. Teknol. Dan Kejuru. 2020, 26, 43–51. [Google Scholar] [CrossRef]
  24. Koh, J.H.L.; Chai, C.S.; Tsai, C.-C. Examining practicing teachers’ perceptions of technological pedagogical content knowledge (TPACK) pathways: A structural equation modeling approach. Instr. Sci. 2013, 41, 793–809. [Google Scholar] [CrossRef]
  25. Celik, I.; Sahin, I.; Akturk, A.O. Analysis of the Relations among the Components of Technological Pedagogical and Content Knowledge (Tpack): A Structural Equation Model. J. Educ. Comput. Res. 2014, 51, 1–22. [Google Scholar] [CrossRef]
  26. Schmid, M.; Brianza, E.; Petko, D. Developing a short assessment instrument for Technological Pedagogical Content Knowledge (TPACK.xs) and comparing the factor structure of an integrative and a transformative model. Comput. Educ. 2020, 157, 103967. [Google Scholar] [CrossRef]
  27. Šorgo, A.; Špernjak, A. Digital competence for science teaching. In Proceedings of the CECIIS: Central European Conference on Information and Intelligent Systems: 28th International Conference, Varazdin, Croatia, 27–29 September 2017; pp. 45–51. [Google Scholar]
  28. Nerdel, C. Grundlagen der Naturwissenschaftsdidaktik: Kompetenzorientiert und Aufgabenbasiert für Schule und Hochschule; Springer Spektrum: Heidelberg, Germany, 2017. [Google Scholar]
  29. Duit, R.; Gropengießer, H.; Stäudel, L. Naturwissenschaftliches Arbeiten: Unterricht und Material 5–10; Erhard Friedrich Verlag: Seelze-Velber, Germany, 2004. [Google Scholar]
  30. Bäuml, M.A. Fachspezifische Arbeitsweisen im grundlegenden Biologieunterricht. Sachunterr. Math. Grundsch. 1976, 12, 580–589. [Google Scholar]
  31. Wolff, A.; Gooch, D.; Cavero Montaner, J.; Rashid, U.; Kortuem, G. Creating an Understanding of Data Literacy for a Data-driven Society. J. Community Inform. 2016, 12, 9–26. [Google Scholar] [CrossRef]
  32. Ridsdale, C.; Rothwell, J.; Smit, M.; Ali-Hassan, H.; Bliemel, M.; Irvine, D.; Kelley, D.; Matwin, S.; Wuetherick, B. Strategies and Best Practices for Data Literacy Education. Knowledge Synthesis Report. 2015. Available online: https://dalspace.library.dal.ca/bitstream/handle/10222/64578/Strategies%20and%20Best%20Practices%20for%20Data%20Literacy%20Education.pdf?sequence=1 (accessed on 1 August 2021).
  33. Palmer, S.E. Vision Science: Photons to Phenomenology; MIT Press: Cambridge, MA, USA, 1999. [Google Scholar]
  34. Wertheimer, M. Untersuchungen zur Lehre von der Gestalt. II. Psychol. Forsch. 1923, 4, 301–350. [Google Scholar] [CrossRef]
  35. Mayer, R.E. The Cambridge Handbook of Multimedia Learning, 2nd ed.; Cambridge University Press: Cambridge, UK, 2014. [Google Scholar]
  36. Kron, F.W. Grundwissen Didaktik, 2nd ed.; UTB: München, Germany, 1994. [Google Scholar]
  37. Johnston, B.; Webber, S. Information Literacy in Higher Education: A review and case study. Stud. High. Educ. 2003, 28, 335–352. [Google Scholar] [CrossRef]
  38. Brand-Gruwel, S.; Wopereis, I.; Walraven, A. A descriptive model of information problem solving while using internet. Comput. Educ. 2009, 53, 1207–1217. [Google Scholar] [CrossRef]
  39. Walraven, A.; Brand-Gruwel, S.; Boshuizen, H.P. A descriptive model of information problem solving while using internet. Comput. Hum. Behav. 2008, 24, 623–648. [Google Scholar] [CrossRef]
  40. Von Kotzebue, L.; Fleischer, T. Experimentieren mit digitalen Sensoren—Unsichtbares sichtbar machen. In Digitale Basiskompetenzen—Orientierungshilfe und Praxisbeispiele für die universitäre Lehramtsausbildung in den Naturwissenschaften; Becker, S., Meßinger-Koppelt, J., Thyssen, C., Eds.; Joachim Herz Stiftung Verlag: Hamburg, Germany, 2020; pp. 58–61. [Google Scholar]
  41. Vollmer, M.; Möllmann, K.-P. High Speed—Slow Motion: Technik digitaler Hochgeschwindigkeitskameras. Phys. Unserer Zeit 2011, 42, 144–148. [Google Scholar] [CrossRef]
  42. Thyssen, C.; Huwer, J.; Krause, M. Digital Devices als Experimentalwerkzeuge—Potenziale digitalen Experimentierens mit Tablet und Smartphone. Unterricht Biologie 2020, 451, 44–47. [Google Scholar]
  43. Thyssen, C.; Huwer, J. Promotion of transformative education for sustainability with ICT in real-life contexts. In Building Bridges across Disciplines for Transformative Education and a Sustainable Future; Eilks, I., Markic, S., Ralle, B., Eds.; Shaker Verlag: Aachen, Germany, 2018; pp. 85–96. [Google Scholar]
  44. Kuhn, J. iMobilePhysics. Smartphone-Experimente im Physikunterricht. Comput. + Unterr. 2015, 97, 20–22. [Google Scholar]
  45. Koehler, M.J.; Mishra, P.; Cain, W. What is technological pedagogical content knowledge (TPACK)? J. Educ. 2013, 193, 13–19. [Google Scholar] [CrossRef] [Green Version]
  46. Meier, M.; Thyssen, C.; Becker, S.; Bruckermann, T.; Finger, A.; Kremser, E.; Thoms, L.-J.; von Kotzebue, L.; Huwer, J. Digitale Kompetenzen für das Lehramt in den Naturwissenschaften—Beschreibung und Messung von Kompetenzzielen der Studienphase im Bereich Präsentation. In Bildung in der digitalen Transformation; Wollersheim, H.-W., Pengel, N., Eds.; Waxmann: Münster, Germany, 2021; pp. 185–190. [Google Scholar]
  47. Thoms, L.-J.; Meier, M.; Huwer, J.; Thyssen, C.; von Kotzebue, L.; Becker, S.; Kremser, E.; Finger, A.; Bruckermann, T. DiKoLAN—A Framework to Identify and Classify Digital Competencies for Teaching in Science Education and to Restructure Pre-Service Teacher Training. In Proceedings of the Society for Information Technology & Teacher Education International Conference, Online, 29 March 2021; Association for the Advancement of Computing in Education (AACE): Waynesville, NC, USA, 2021; pp. 1652–1657. [Google Scholar]
  48. Angeli, C.; Valanides, N. Epistemological and methodological issues for the conceptualization, development, and assessment of ICT-TPCK: Advances in technological pedagogical content knowledge (TPCK). Comput. Educ. 2009, 52, 154–168. [Google Scholar] [CrossRef]
  49. Jang, S.-J.; Chen, K.-C. From PCK to TPACK: Developing a transformative model for pre-service science teachers. J. Sci. Educ. Technol. 2010, 19, 553–564. [Google Scholar] [CrossRef]
  50. Jin, Y. The nature of TPACK: Is TPACK distinctive, integrative or transformative? In Proceedings of the Society for Information Technology & Teacher Education International Conference, Las Vegas, NV, USA, 18 March 2019; Association for the Advancement of Computing in Education (AACE): Chesapeake, VA, USA, 2019; pp. 2199–2204. [Google Scholar]
  51. Koh, J.H.L.; Sing, C.C. Modeling pre-service teachers’ technological pedagogical content knowledge (TPACK) perceptions: The influence of demographic factors and TPACK constructs. In Proceedings of the ASCILITE—Australian Society for Computers in Learning in Tertiary Education Annual Conference, Hobart, Australia, 4–7 December 2011; pp. 735–746. [Google Scholar]
  52. Sahin, I.; Celik, I.; Oguz Akturk, A.; Aydin, M. Analysis of Relationships between Technological Pedagogical Content Knowledge and Educational Internet Use. J. Digit. Learn. Teach. Educ. 2013, 29, 110–117. [Google Scholar] [CrossRef]
  53. Akyuz, D. Measuring technological pedagogical content knowledge (TPACK) through performance assessment. Comput. Educ. 2018, 125, 212–225. [Google Scholar] [CrossRef]
  54. Deng, F.; Chai, C.S.; So, H.J.; Qian, Y.; Chen, L. Examining the validity of the technological pedagogical content knowledge (TPACK) framework for preservice chemistry teachers. Australas. J. Educ. Technol. 2017, 33, 1–14. [Google Scholar] [CrossRef]
  55. Dong, Y.; Chai, C.S.; Sang, G.-Y.; Koh, J.H.L.; Tsai, C.-C. Exploring the profiles and interplays of pre-service and in-service teachers’ technological pedagogical content knowledge (TPACK) in China. J. Educ. Technol. Soc. 2015, 18, 158–169. [Google Scholar]
  56. Pamuk, S.; Ergun, M.; Cakir, R.; Yilmaz, H.B.; Ayas, C. Exploring relationships among TPACK components and development of the TPACK instrument. Educ. Inf. Technol. 2015, 20, 241–263. [Google Scholar] [CrossRef]
  57. Von Kotzebue, L. (sub.). Two is better than one — Examining biology-specific TPACK and its T-dimensions from two angles.
  58. Große-Heilmann, R.; Riese, J.; Burde, J.-P.; Schubatzky, T.; Weiler, D. Erwerb und Messung physikdidaktischer Kompetenzen zum Einsatz digitaler Medien. PhyDid B 2021, 171–178. [Google Scholar]
  59. Kubsch, M.; Sorge, S.; Arnold, J.; Graulich, N. Lehrkräftebildung neu gedacht—Ein Praxishandbuch für die Lehre in den Naturwissenschaften und deren Didaktiken; Waxmann: Münster, Germany, 2021. [Google Scholar]
  60. Thees, M.; Kapp, S.; Strzys, M.P.; Beil, F.; Lukowicz, P.; Kuhn, J. Effects of augmented reality on learning and cognitive load in university physics laboratory courses. Comput. Hum. Behav. 2020, 108. [Google Scholar] [CrossRef]
  61. Assante, D.; Cennamo, G.M.; Placidi, L. 3D Printing in Education: An European perspective. In Proceedings of the 2020 IEEE Global Engineering Education Conference (EDUCON), Porto, Portugal, 27–30 April 2020; pp. 1133–1138. [Google Scholar] [CrossRef]
  62. Küchemann, S.; Becker, S.; Klein, P.; Kuhn, J. Classification of students’ conceptual understanding in STEM education using their visual attention distributions: A comparison of three machine-learning approaches. In Proceedings of the 12th International Conference on Computer Supported Education, Prague, Czech Republic, 2–4 May 2020; Volume 2: CSEDU. Lane, H.C., Zvacek, S., Uhomoibhi, J., Eds.; SciTePress: Setúbal, Portugal, 2020; pp. 36–46. [Google Scholar] [CrossRef]
  63. Bennett, E.E.; McWhorter, R.R. Digital technologies for teaching and learning. In Handbook of Adult and Continuing Education, 2020 Edition (Chapter 18); Rocco, T.S., Smith, M.C., Mizzi, R.C., Merriweather, L.R., Hawley, J.D., Eds.; American Association for Adult and Continuing Education: Sterling, VA, USA, 2021. [Google Scholar]
Figure 1. The DiKoLAN framework (https://dikolan.de/en/, last access on 28 November 2021).
Figure 2. Competencies in the area of Documentation (DOC).
Figure 3. Competencies in the area of Presentation (PRE).
Figure 4. Competencies in the area of Communication/Collaboration (COM).
Figure 5. Competencies in the area of Information Search and Evaluation (ISE).
Figure 6. Competencies in the area of Data Acquisition (DAQ).
Figure 7. Competencies in the area of Data Processing (DAP).
Figure 8. Competencies in the area of Simulation and Modeling (SIM).
Figure 9. Boxplots of the competency areas Presentation and Information Search and Evaluation.
Figure 10. Path model of interrelationships in the competency area of Presentation; *** p < 0.001; + covariation fixed to increase df, without fixation p < 0.001.
Figure 11. Path model of interrelationships in the competency area of Information Search and Evaluation; *** p < 0.001; * p < 0.05; + covariation fixed to increase df, without fixation p < 0.001.
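The path coefficients in Figures 10 and 11 relate the four components TK, TCK, TPK, and TPACK to one another. For a fully recursive model with observed component scores, such coefficients can be estimated regression by regression on standardized scores. The following Python sketch illustrates this under stated assumptions: the simulated data, the variable names, and the assumed structure (TK predicting TCK and TPK, and all three predicting TPACK) are illustrative only and do not reproduce the exact model specification reported in the article.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical component mean scores, one row per student; in the study these
# would be the self-assessed TK, TCK, TPK, and TPACK scales of DiKoLAN-Grid.
rng = np.random.default_rng(0)
n = 118
tk = rng.normal(3.0, 0.8, n)
tck = 0.6 * tk + rng.normal(0, 0.6, n)
tpk = 0.5 * tk + rng.normal(0, 0.6, n)
tpack = 0.3 * tck + 0.4 * tpk + rng.normal(0, 0.5, n)

z = pd.DataFrame({"TK": tk, "TCK": tck, "TPK": tpk, "TPACK": tpack})
z = (z - z.mean()) / z.std(ddof=1)  # standardize so betas act as path coefficients

def path(outcome, predictors):
    """Fit one regression of the recursive path model; return betas and p-values."""
    fit = sm.OLS(z[outcome], sm.add_constant(z[predictors])).fit()
    return fit.params.drop("const").round(2), fit.pvalues.drop("const").round(3)

# Assumed structure: TK -> TCK, TK -> TPK, and TK/TCK/TPK -> TPACK.
for outcome, predictors in [("TCK", ["TK"]),
                            ("TPK", ["TK"]),
                            ("TPACK", ["TK", "TCK", "TPK"])]:
    betas, pvals = path(outcome, predictors)
    print(outcome, "<-", dict(betas), "p:", dict(pvals))
```

Dedicated SEM software would additionally report fit indices and allow individual covariances to be fixed, as noted in the figure captions.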
Table 1. Presentation.
Component | Number of Items | Example Item | Cronbach’s Alpha
TK-PRE | 5 | I can describe at least one possibility of technical implementation for each type of presentation (e.g., of content, processes/for several groups or individual recipients), including the necessary procedure with reference to current hardware and software, as well as related technical standards. (PRE.S.D1) | 0.85
TCK-PRE | 4 | I can name several subject-specific/specialist scenarios and contexts, as appropriate for digital forms of presentation and digital presentation of processes (e.g., time-lapse for osmosis) and for the use of presentation hardware (e.g., microscope cameras, mobile devices with cameras). (PRE.C.N1) | 0.81
TPK-PRE | 5 | I can select, adapt, and use existing and created presentation media of my own, taking into account technical possibilities and limitations, as well as principles/criteria for audience-appropriate design. (PRE.M.A1) | 0.85
TPACK-PRE | 6 | I can use digital media to simplify subject matter for the school context and make them easier to understand. (PRE.T.A1) | 0.91
Table 2. Information Search and Evaluation.
Component | Number of Items | Example Item | Cronbach’s Alpha
TK-ISE | 4 | I can name search options for digital research, e.g., search functions of library sites (including university library); subject databases (including electronic journal library); electronic full texts (including e-books, electronic dissertations). (ISE.S.N1) | 0.80
TCK-ISE | 9 | I can name at least two quality criteria for evaluating digital sources from a subject-specific perspective, e.g., recency; necessary scope/style/design; professionalism, scientificity, neutral language style; validity and reliability; review procedures, references. (ISE.C.N3) | 0.86
TPK-ISE | 5 | I can describe advantages, disadvantages, and limitations of digital databases and search engines for use in teaching-learning scenarios. (ISE.M.D1) | 0.94
TPACK-ISE | 8 | I can plan and implement science instructional scenarios incorporating the steps of a successful internet-based information search or problem solving, such as defining the problem to be solved, researching information, presenting the information. (ISE.T.A2) | 0.94
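The reliability coefficients in Tables 1 and 2 are Cronbach’s alpha values for each component scale. As a minimal sketch of how such a coefficient is obtained from raw Likert-type self-assessment responses, the following Python snippet implements the standard formula alpha = k/(k - 1) * (1 - sum of item variances / variance of the total score); the response matrix shown is invented for illustration and is not data from the study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the sum score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point self-assessment answers of six students on four items
# (e.g., a component scale such as TK-ISE).
responses = np.array([[4, 4, 3, 4],
                      [2, 3, 2, 2],
                      [5, 5, 4, 5],
                      [3, 3, 3, 2],
                      [4, 5, 4, 4],
                      [1, 2, 2, 1]])
print(round(cronbach_alpha(responses), 2))
```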
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
