Declared and Real Level of Digital Skills of Future Teaching Staff

Abstract: Digital competence is undoubtedly one of the key skills that teaching staff should possess. Currently, there are many theoretical frameworks and ways to measure skills and knowledge related to the use of information and communication technologies (ICT). This article is an attempt to show the real and declared level of digital skills among future teaching staff. The research was conducted in Poland among 128 students of pedagogical faculties (first-year undergraduate studies). The research used a triangulation of research methods and techniques: a diagnostic survey and competency tests related to the use of word processors and spreadsheets, and the level of knowledge about the use of ICT. The competency tests were in accordance with the European Computer Skills Certificate (ECDL) standard. The collected data showed the following: (1) more than half of the students rate their own skills in the use of word processors and spreadsheets, and their overall theoretical knowledge, as high or very high; (2) in the case of the real assessment of digital competence, fewer than 20% reached the passing threshold in the areas of word processors and theoretical knowledge, with only 1.6% passing in the area of spreadsheets; (3) the declared and actual levels of digital competence were moderately related in the surveyed group; (4) attitudes towards new media, self-assessment of digital skills, and previous learning experience in handling ICT are not predictive factors for ECDL test results.


Introduction
Operating computers, smartphones, and the Internet are elementary skills within modern society. Limitations in this area generate a phenomenon defined as digital exclusion [1]. Therefore, for more than two decades, the issue of shaping the effective use of ICT in private and everyday life has been a priority for formal, non-formal, and informal education. Digital knowledge and skills are crucial for many professional groups. Recent months have shown how important it is for teachers and students to have appropriate ICT knowledge and skills [2]. Nowadays, the level of digital competence among teachers is attracting increasing interest from public and school-related stakeholders, as well as experts in media pedagogy. All of this has translated into increasingly bold attempts to operationalise the concept of digital competence along with its simultaneous measurement.
This text is part of the scientific discourse related to the digital competence of future teaching staff. It is the new generation of teachers that will soon change the face of school digitalisation. Moreover, this text is an attempt to validate research tools that measure different key aspects of digital skills, such as data processing, text editing, and elementary knowledge related to the use of ICT. The text also shows the relationship between the self-assessment of digital competences and the actual results of competence tests. The text fills an important gap by emphasising the importance of moving away from measuring digital competences through self-declaration [3][4][5][6][7][8] in favour of real measurement based on competency tests, as is the case in determining the level of other skills, e.g., language skills [9,10].

Theoretical Framework
Digital competences are now firmly established in the canon of skills that teaching staff must possess [11]. In the literature, digital competencies are defined in different ways. Usually, these skills are included in the general category of the ability to use hardware and software. High proficiency in operating software and hardware is something natural in many professions and is not subject to in-depth analysis [12]. Systematic analysis of the concept reveals that digital literacy is most often understood as the ability to use information and communication technologies (ICT) [13]. However, this is not an exhaustive definition. Media scholars indicate that digital literacy should not be considered and defined in isolation from information processes or the understanding of social phenomena occurring through new media [14]. Reflecting on the mechanisms related to the impact of digital media on individuals and groups is thus an integral component of digital competence [15].
The definition of digital competences depends on the context in which these skills are situated. For example, researchers around the Digital Centre indicate that digital skills are a concept that is closely related to human needs. Therefore, analysis of these key skills should be in relation to the group that they concern [16]. This type of flexible approach to defining digital competences is also evident in other studies, where ICT skills are strictly assigned to occupation, potential occupation, or age category [17,18]. This approach makes the development of an unambiguous and subject-independent definition of digital competence a difficult task, fraught with the errors of reductionism. The complexity of this concept manifests itself primarily in the creation of definitions that depend on the time of the implementation of the research itself (i.e., the stage of development of the information society) [19,20], the needs and the level of computerisation of a given institution, or the represented field. Nevertheless, attempts to construct definitions of digital competences and their measurement are a valuable activity, allowing for intentional activities to be conducted, aimed at supporting individuals in their professional work and everyday life in the information society. In this text, digital competences were intentionally reduced to the use of selected software: those most often used in education and higher education. Nevertheless, digital competence is a very complex key skill that evolves in accordance with the changing sociotechnical environment. In this study, ICT skills and knowledge of new media are included as digital competences. Although there is some divergence in the definitions of digital literacy, digital skills, and digital competence in the literature, in this text, all three terms include basic software skills and ICT knowledge. 
This text does not claim to be definitive, but merely an attempt to measure the elementary components of all three terms. The main defining characteristics of digital literacy (DL) are presented in Figure 1.
Digital competence is, therefore, a complex concept that goes beyond the simple operation of software and hardware. Definitions of digital competence clearly emphasise that it is a skill; therefore, its measurement should be realised through the performance or solution of specific activities using target hardware and software. However, in the current literature, it is increasingly common to study this key competence using self-assessment, i.e., without the use of computers, smartphones, the Internet, or software, but only one's own intuition or conclusions resulting from the experiences of the respondents' use of ICT. Looking at research carried out in the last five years in Poland, it is immediately apparent that digital competences as measured by media educators are not realised through the use of the more complex and laborious measurement procedure, the competence test.
A brief summary of the results of research conducted among future media educators in this area is presented in Table 1. The table includes, among others, the following findings: an analysis of data from an e-learning platform [24], showing that students of pedagogical faculties have a low level of information literacy and self-education in relation to operating e-learning platforms; a diagnostic survey based on self-evaluation by Pulek and Staniek (2017) [25], in which students highly rated their skills in operating entertainment websites (e.g., social networking sites, music and video sites), while rating their skills in operating office software much lower; and a diagnostic survey based on self-evaluation by Romaniuk and Łukasiewicz-Wieleba (2020) [26], in which most students rated their own digital skills as good.
Such diverse methodologies for measuring digital competences are, on the one hand, a richness, as they allow for an understanding of a selected fragment of digital competences to be reached, and offer a new perspective on the operationalisation of the concept. On the other hand, increasing the number of indicators makes it impossible to conduct comparative research or to create universal research tools. Moreover, taking into account the issue of the great freedom in interpreting the notion of key competences, this area is characterised by increasing distortions resulting from the misuse of measurement techniques based on self-declarations.
Another issue, and one that should be clearly stated when conducting research on digital competences among future teaching staff, is the conditions associated with the specific characteristics of the current generation of students training to become teachers. This is a group whose formal education falls at an intensive time in the development of the information society. It is a collective characterised by a relatively high level of confidence in the effectiveness of the use of ICT in education, and who very frequently use media [27,28] while at the same time recognising the possibilities of using software and hardware not only in education, but also in the effective management of leisure time [29]. Simultaneously, the prior preparation of pedagogical students in the effective use of ICT might raise concerns and reservations about the level of digital competence that they possess [30]. Therefore, the training of new pedagogical staff for an increasingly computerised school requires analysis using standardised knowledge tests, showing the real level of possessed skills, as these are the foundation for building more complex, specialised knowledge linking digital competence to pedagogy [31]. On the basis of the above theoretical analyses, the following research hypotheses are posited:

Hypothesis 1 (H1).
Students rate their own digital competence as good or very good.

Hypothesis 2 (H2).
The real level of digital competence differs from the declared one.

Hypothesis 3 (H3).
Digital competence is related to previous educational experiences.

Research Problems
The aim of the research was to measure digital competence among future teaching staff. The goal was achieved using a triangulation of research tools based primarily on a competency test showing the actual level of knowledge and skills in operating the most popular office software. The indirect aim of the research was to validate the research tools that test skill level in the operation of selected office suite software.
The specific aim of the research was to juxtapose the results of the competency test with self-declarations in terms of assessing one's own level of DL and assessing the validity of using ICT in education, and one's own experiences of shaping DL at earlier stages of education.
The study data comprised answers from a questionnaire survey. In addition, the results of competence tests on the use of word processors, spreadsheets, and ICT knowledge were examined.
The following research problems were assumed in the study:

1. What is the level of knowledge and skills related to the use of ICT equipment among students of education?
2. How do future educators assess their own digital competencies?
3. To what extent is the level of digital competence in using an office suite related to the self-diagnosis of digital competence, the assessment of the relevance of using ICT in education, and previous experience with formal education in the development of digital competence; in short, how well does the objective align with the subjective in the evaluation of digital competence?

Test Procedure
The research was conducted at the largest Polish state university orientated towards the education of pedagogical personnel, the Pedagogical University of Kraków. Research was conducted using a triangulation of research methods and techniques. Measurement was conducted during three meetings that took place within the framework of an academic course in information technology. During the first meeting, the students, future pedagogical staff, completed an online questionnaire composed of questions related to the self-assessment of the level of their digital competence, a self-assessment of the speed at which they learn to use new software and hardware, their attitude towards the use of ICT in the teaching process, and their own experiences related to the formation of digital competence in secondary school. In addition, during the first meeting, the students also completed a knowledge assessment test on the basics of handling IT equipment (ECDL Module I). The students had 45 min to complete the test.
The second stage of the research, which took place a week later, involved a test of skills connected with the use of text editors. These activities were also conducted in the computer lab of the Pedagogical University in Krakow. The students had 45 min to complete a series of practical tasks associated with operating word processors (Word 2013) in accordance with the ECDL standard (word processing module). After completing the tasks, the number of correctly solved commands was separately recorded in the protocol for each of the students.
The third and final stage, which took place during the third week, was the spreadsheet skills test. The students completed the ECDL compliant test (spreadsheets module) using hardware and software resources available in the computer lab. The students had 45 min to complete 32 tasks. Results were checked by the tutor and entered against each student's name. Figure 2 presents the entire research procedure.

Research Tools
The whole research was embedded in the quantitative stream of pedagogical research, allowing for the measurement of digital competence and attitudes towards the use of new media. A triangulation of research methods and techniques was used in the study. The measurement of digital competence was realised according to the standard of the European Computer Skills Certificate to capture the real level of knowledge and skills related to the use of ICT possessed by the respondents. The following tests were used in the implementation of the ECDL standard [32,33]:

1. Operation of digital devices and knowledge of IT equipment (theoretical test), consisting of 32 questions with single-choice answers. A maximum of 1 point could be obtained for each question. The points were then converted into a percentage of correct answers on a scale of 0 to 100%. Each student had 45 min to complete the test.
2. Use of word processing software at a basic level. Each student received a set of instructions in a PDF file and a set of work files in which the activities were performed. Each student was given 45 min to complete 32 tasks. Each correctly completed task was awarded 1 point. The points were then converted into percentages on a scale from 0 to 100%.
3. Spreadsheet maintenance. As in the previous modules, students were given work files and 32 tasks to complete with a time limit of 45 min. Results were then compiled and stored as percentages.
All of the ECDL assessment tasks contained within this module were completed in accordance with the ECDL Foundation Syllabus: Computer Fundamentals Module B1 Syllabus-Version 1.0; ECDL/ICDL Word Processing Module B3 Syllabus-Version 5.0; ECDL/ICDL Spreadsheets Module B4 Syllabus-Version 5.0.
Additionally, the objective measurement of digital competence was preceded by a diagnostic survey consisting of:
• A general self-assessment of digital competence level (5 questions) using a Likert scale from 1 (very low) to 5 (very high). Previously used tools [34] were employed to develop the scale. The Cronbach alpha coefficient was 0.778.
• Self-assessment of digital competence in using an office suite, consisting of 5 questions using a Likert scale from 1 (very low) to 5 (very high). This scale was used in previous comparative studies in Visegrad countries [35,36]. The Cronbach alpha coefficient was 0.780.
• An assessment of the ability to use new hardware and software (3 questions), with responses on a 5-point Likert scale from 1 (very rarely) to 5 (very often). Previous multiauthor studies from the Smart Ecosystem for Learning and Inclusion (SELI) project [37] were used to create the scale. The Cronbach alpha coefficient was 0.720.

• An assessment of the validity of the use of ICT in education, consisting of 6 questions using a 5-point response scale from 1 (very much disagree) to 5 (very much agree). The scale was the author's own and was derived from Serbian research on the use of ICT in education [38]. The Cronbach alpha coefficient was 0.630.
Cronbach's alpha for the entire tool was 0.816. Exploratory factor analysis (EFA) was conducted for the entire survey questionnaire. Results are attached as Appendix A.
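To illustrate how reliability coefficients of this kind can be obtained, a minimal sketch in Python is given below. The response matrix is entirely hypothetical (it is not the study's data); the function implements the standard Cronbach's alpha formula based on item variances and the variance of the total score.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical Likert responses (1-5) from five respondents to a 3-item scale
scores = np.array([
    [4, 5, 4],
    [3, 3, 2],
    [5, 5, 5],
    [2, 3, 2],
    [4, 4, 5],
])
print(round(cronbach_alpha(scores), 3))  # → 0.934
```

Values above roughly 0.7, as reported for the scales used here, are conventionally treated as acceptable internal consistency.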

Sampling Procedure and Characteristics of the Study Sample
The sample selection was purposive in nature. The respondents had to meet the following conditions: (1) having the status of a student of pedagogy; (2) participating in the academic course "information technology" as first-year students; (3) being able to use ICT at a basic level. The research should be treated as a pilot study aimed primarily at testing the time-consuming measurement of digital competences with the use of the ECDL standard, going beyond self-evaluation. The collected results do not authorise generalisation to the whole population of students in Poland. The research requires the procedure to be repeated with the use of sampling frames.
The research involved 128 first-year full-time and extramural students. The research was conducted in Poland at the largest national pedagogical university, namely, the Pedagogical University of Kraków. The research comprised 94.53% females and 5.47% males, which reflects the structure of students in pedagogical faculties. The students came from the following areas: large city (26.56%), small city (16.40%), village (39.84%), medium city (17.2%). The vast majority of the respondents graduated from a general secondary school (82.03%), while 17.97% had a diploma from a secondary technical school. Data were collected in the academic year 2020/2021.

Research Ethics
Participation in the study was linked to the completion of mandatory classes in the academic subject of information technology. The students were informed about the purpose of the study. Completing the online survey and competency tests is a standard procedure that first-year undergraduate students undergo. Participation in the research had no bearing on obtaining credit for the course, and was solely related to the diagnostic activities. The results of the study were not personally linked to the students in any way, and results were anonymised prior to analysis. After the final results had been recorded, the protocols with the results of individual tests were secured by removing the students' personal data, i.e., name and surname.

Declared Level of Digital Literacy-A Diagnostic Survey
Future teachers rate their own skills in operating software for creating multimedia presentations the highest, with more than half of the respondents declaring that they have high or very high skills in this field. A little more than half of the respondents also assessed their digital competence in operating text editors as very high. Almost one-quarter of the respondents declared having high skills in handling spreadsheets. The respondents rated their skills in creating and using databases the lowest. The detailed percentage distribution of responses related to self-assessment is presented in Figure 3.

Objective Level of Digital Skills-Skills Test
Each of the ECDL skills tests allows a maximum of 32 points to be gained, i.e., 1 point for each correctly solved task. The points were converted into percentages on a scale from 0% to 100%. On the basis of the collected data, it is clear that the respondents are least able to handle spreadsheets, while significantly better results were obtained in word processing and theoretical knowledge related to the use of ICT. Thus, in this group, word processing skills and theoretical knowledge are at a higher level than mathematical calculations in a spreadsheet. Basic descriptive statistics for the competence tests are presented in Table 2. The pass mark for the ECDL competency tests was calibrated at 75% and above. Taking into account the ECDL principles, the highest pass rates were for the word processing skills test and the ICT knowledge test. However, in both cases, fewer than 20% of the respondents reached the official pass mark. For the handling of spreadsheets, only slightly more than 1.5% of the respondents achieved a positive result. The detailed distribution of results is presented in Figure 4.
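The conversion of raw module scores into percentages and the calculation of a pass rate against the 75% threshold can be sketched as follows. The raw scores below are hypothetical; only the 32-point maximum and the 75% pass mark come from the procedure described above.

```python
import numpy as np

PASS_THRESHOLD = 75.0  # ECDL pass mark, in percent
MAX_POINTS = 32        # tasks/questions per module, 1 point each

def pass_rate(raw_scores) -> float:
    """Share of respondents (in %) at or above the pass mark for one module."""
    pct = np.asarray(raw_scores) / MAX_POINTS * 100  # convert points to 0-100%
    return (pct >= PASS_THRESHOLD).mean() * 100

# Hypothetical raw scores (out of 32) for one module;
# 24/32 = 75% counts as a pass, so three of five respondents pass here
print(pass_rate([30, 12, 25, 24, 8]))  # → 60.0
```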

Declared vs. Actual Level of Digital Skills
The declared and actual levels of digital competences were positively correlated. This applies primarily to the use of word processors and spreadsheets. Only the relationship between the self-evaluation of theoretical knowledge and the actual result of the ECDL knowledge test was not statistically significant. However, the correlations between self-evaluation and actual test scores were only slightly above the threshold of moderate strength. This means that the two forms of assessment were not fully compatible with each other; therefore, self-evaluation is not the same as hard evaluation by means of standardised tests. Correlation values are shown in Table 3.
Table 3. Spearman's correlation coefficient between self-evaluation and ECDL test scores.

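A correlation of the kind reported in Table 3 can be computed with Spearman's rank correlation, e.g., via SciPy. The paired self-ratings and test scores below are hypothetical illustrations, not the study's data.

```python
from scipy.stats import spearmanr

# Hypothetical paired data: self-rated skill (Likert 1-5) and ECDL test score (%)
self_rating = [3, 4, 2, 5, 3, 4, 2, 5]
test_score = [45, 60, 30, 70, 50, 40, 35, 80]

# Spearman's rho correlates the ranks of the two variables,
# which suits ordinal self-assessment data better than Pearson's r
rho, p_value = spearmanr(self_rating, test_score)
print(f"rho = {rho:.2f}, p = {p_value:.3f}")
```

Because Likert self-ratings are ordinal and often tied, a rank-based coefficient such as Spearman's rho is the usual choice here.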

Prediction of Real Level of Digital Skills
Using multivariate regression analysis, predictive analyses were conducted on the results of the competence tests regarding the knowledge of ICT use, word processing, and the use of spreadsheets. All three of these variables were included in the model as dependent variables. In turn, the independent variables were the subjective assessment of new media literacy, ways of operating new hardware and software, attitudes towards the use of new media in education, and experiences with formal education in information technology at earlier stages. The results are surprising because, in the adopted model, none of the aforementioned factors had a significant effect on the level of basic ICT skills as determined by the ECDL tests. Details of the multivariate regression analysis are shown in Table 4.
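A regression of this kind can be sketched, per dependent variable, as an ordinary least squares fit. The sketch below uses NumPy's least-squares solver on hypothetical predictor and outcome values; the study's actual model, variables, and estimation software may differ.

```python
import numpy as np

# Hypothetical predictors for six respondents:
# self-assessment, attitude towards ICT in education, prior IT education (all 1-5)
X = np.array([
    [4, 5, 3],
    [3, 4, 2],
    [5, 3, 4],
    [2, 4, 1],
    [4, 2, 3],
    [3, 5, 2],
])
y = np.array([55, 40, 70, 30, 60, 45])  # hypothetical ECDL word-processing scores (%)

# Ordinary least squares with an intercept column prepended to the design matrix
A = np.column_stack([np.ones(len(X)), X])
coef, _, _, _ = np.linalg.lstsq(A, y, rcond=None)

# R^2: share of variance in test scores explained by the predictors
y_hat = A @ coef
r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"R^2 = {r2:.3f}")
```

In the paper's terms, a near-zero R^2 (and non-significant coefficients) for each dependent variable would correspond to the finding that none of the predictors explain ECDL test results.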

Discussion
Digital competence is not a new theoretical and practical construct. The term has appeared in the literature in reference to ICT literacy ever since the commercialisation of the first personal computers [39,40]. With the development of the information society, including the rise in popularity of various e-services, this concept has become increasingly prominent in the pedagogical literature [41,42]. Although every educator is intuitively aware of the indicators that characterise this key skill, there are now many diverse approaches to defining and measuring it [43]. We can often observe the dominance of quick diagnostic surveys, which are characterised by a relatively uncomplicated methodology based on self-declarations. This is a simple way of measuring digital competences, but it is burdened with many errors. The measurement of digital competence through tests of knowledge and skills, i.e., classical methods known and valued for years in the pedagogical sciences, appears to be slightly more complicated. Both methods have their positive and negative sides, and these are presented in Table 5. The aim of the present text was to validate a selected aspect of digital competence that is consistent with the ECDL framework, and to compare the results with self-declarations among pedagogical students. The study was conducted in several stages. The first stage was characterised by rapid data collection due to the diagnostic survey used. However, the next stage, which involved the assessment of digital competence levels using the ECDL standard, was time-consuming and required the involvement of additional resources in the form of both hardware and software. The issue of time consumption is the justification given by many researchers dealing with this topic for the use of quick measurements through self-declaration.
However, the collected data showed that students overestimate their own digital skills. At the stage of self-evaluation, most of the respondents rated their knowledge and skills above their actual level, as confirmed by the results of the competence tests. Therefore, one should be aware when using self-evaluation in this type of research that the Dunning-Kruger effect [44][45][46] can significantly distort the level of assessed digital competence. Research conducted using self-assessment alone can lead to significant distortions in research findings, generating unreliable data. In many cases, it is difficult for respondents to relate to the assessment of their own digital skills. Therefore, a clear postulate arising from this research is to limit the measurement of digital competences based on self-declaration, and to develop new standards of measurement that involve authentic activities using ICT. Transposing the described situation to other areas, it is easy to imagine how the results of tests of key competences, e.g., foreign language and mathematical skills, could be distorted if the measurement were based only on vague, subjective feelings. The collected data also showed that the level of DL among students is a challenge in preparing for the teaching profession. Only one in five students of pedagogical sciences would meet the passing criteria of the ECDL tests in word processing and knowledge of ICT activities. In addition, fewer than 2% of the respondents would be able to obtain a positive result in managing a spreadsheet. This means that the effects of education in the field of information technology at earlier stages (secondary and primary school) were not achieved or were erased (e.g., forgotten due to the underuse of individual skills formed in secondary school).
The research results presented in this article therefore raise a question about the quality of education related to digital competences and about the catalogue of digital competences that secondary school graduates should possess [47,48]. Due to the complexity of the notion of digital competence, a further series of questions arises relating to the preparation of a universal set of competences characteristic of particular professional groups, e.g., teachers [49][50][51].
Predicting changes in digital competence levels is not an easy task. The collected data showed that proficiency in word processing and spreadsheets, and knowledge of ICT among future teachers, is not predicted by attitudes towards the use of new media in education, previous educational experiences, or experimenting with new software and hardware. The absence of such effects may stem from the methodological limitations of this study (especially the small research sample). Other individual variables, such as the frequency of using office suites [52], readiness to learn [53][54][55], and situational factors related to work experience [56], which are difficult to include in a single study due to constraints on the length of the instrument, may also limit the prediction of change in the selected elements of key competences.
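The null finding above, that attitudes and prior experience do not predict ECDL results, can be illustrated with an ordinary least-squares regression. This is a hedged sketch, not the study's actual model: the predictors, sample, and seed below are invented, and the outcome is deliberately generated independently of the predictors so that the fit mirrors a null result:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128  # matches the study's sample size

# Hypothetical standardised predictors: attitude towards new media,
# prior ICT learning experience, and self-assessed digital skill.
X = rng.normal(size=(n, 3))

# Hypothetical ECDL test scores generated independently of the
# predictors, mimicking the reported lack of predictive power.
y = 45.0 + rng.normal(scale=15.0, size=n)

# Fit OLS with an intercept via least squares.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

# R^2: proportion of outcome variance explained by the predictors.
y_hat = A @ beta
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

When predictors carry no information about the outcome, R-squared stays close to zero, which is the pattern consistent with the study's conclusion that these variables are not predictive factors for ECDL test results.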

Research Limitations and New Directions in Digital Literacy Research
This study consisted of two parts: the measurement of digital competences using a diagnostic survey, allowing a quick, albeit "surface", verification of this key skill, and a more complex study involving a competency test. The research was characterised by several methodological limitations that may have distorted the results. First, the self-assessment part is highly subjective and does not provide a realistic representation of the level of knowledge and skills related to the use of ICT. Second, the competency tests were conducted using real tasks involving the operation of an office suite, yet not all students had the opportunity to form these competences using the MS Office suite at earlier educational stages. In addition, the imposed time limit (45 min) for completing the 32 tasks in each test may have caused stress among the test takers. This limitation was minimised by conducting this stage of the study under friendly conditions and by repeatedly emphasising to the subjects that the test scores had no bearing on the final evaluation of their academic course activities.
The research also has theoretical limitations. The applied approach to measuring digital competence was limited to selected elements of operating an office suite and knowledge of how ICT works, which means that only a fragment of the measurement of digital competence is presented in this text. Moreover, due to the rapid development of digital technologies, it is currently difficult to construct permanent definitions and measurement tools that capture a universal core of digital competences, one that is not subject to rapid change over time.
Therefore, further research should include the development of a universal theoretical framework, together with tools that allow quick and precise measurement of digital competences. Such a measure should resemble the competence tests used for foreign language skills (e.g., the TELC A1-C2 standard).

Conclusions
Measuring digital competence using self-assessment is a quick but very imprecise process. Measuring digital competences using competence tests that require completing specific actions at a computer is, in turn, a complex, time-consuming task that requires direct reference to existing standards. Due to the convenience of data collection, many researchers prefer the first approach. The second, which goes beyond self-declaration, is time-consuming and burdened with technical requirements (e.g., software unification), but yields much more precise results.
This text compared the two measurements, showing the pros and cons of both approaches. The collected data do not allow generalisation due to the sampling procedure, but they contribute to the discussion on the need for greater attention to preparing new pedagogical staff in the use of ICT in education. Moreover, the text challenges the popularised standards of data collection based on a fast but imprecise path, fraught with many theoretical and procedural shortcomings. A deeper debate on the development of a universal digital competence framework for future educators is therefore necessary [57].
Funding: The article was written as part of the project "Teachers of the future in the information society: between risk and opportunity paradigm", funded by the Polish National Agency for Academic Exchange (NAWA) under the Bekker programme, grant number PPN/BEK/2020/1/00176.

Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Informed consent was obtained from all subjects involved in the study.
Data Availability Statement: Data are available on request by e-mail.

Conflicts of Interest:
The author declares no conflict of interest.

Table A1. EFA: exploratory factor analysis.

Kaiser-Meyer-Olkin test

Item                                               MSA
Overall MSA                                        0.751
I can create a web page                            0.730
I can work with a computer better than others      0.830
I can work on the Internet better than others      0.813
I can use a smartphone better than others          0.795
I can also use the Internet in a creative way      0.680
Handling a word processor (self-assessment)        0.776
Self-assessment on using a spreadsheet programme   0.792
Self-assessment on using a presentation software   0.776
Self-assessment on using a database editor