Digital Competence Development in Public Administration Higher Education

Today, citizens' digital readiness and competence are competitive factors. While users with a high level of digital competence tend to use technology intensively and consciously, those with a lower level are characterized by passive consumer use and lag behind technologically. It is essential to develop and monitor competence levels continuously, because digital skills, the way digital tools are used, and the effects of use are interlinked. The importance of digital competence is also evident in public administration, where it affects the quality of public services and thereby the entire population. In this social respect, digital competence is extremely important for public administration employees. In our study, we aim to investigate the input competence level of undergraduate (BA) students in public administration and the output competence level reached after one semester of targeted training. We analyze the values in several dimensions and draw conclusions. When measuring digital competence, we relied on the EU reference material, the Citizen Digital Competence Framework, which is based on digital competence areas and proficiency levels such as information and data literacy, communication and collaboration, digital content creation, safety, and problem solving. The DigCompSat self-assessment questionnaire was used for the survey. The study found a marked difference between the students' self-assessed and actual levels of competence. It can be useful for respondents to realize where they have gaps in their knowledge and competence; this is what we refer to as a 'change in attitude' in our study. Factors related to gender, age, and prior training influence the potential shifts between the segments, and these factors applied differently in each range of competence.
Concerning the implications of this research, it can be assumed that the larger shifts may reflect not so much new knowledge or skills as 'changes in attitude'. Its practical importance may lie in the possibility of orienting students.


Introduction
For a long time, it was assumed that Millennials, or the Alpha Generation born after 2000, are already digital natives in the digital ecosystem, as can be seen from a study carried out between 2018 and 2022 regarding parents' attitudes toward the formation of digital literacy in their children [1]. However, theorists began to doubt the validity of this approach. This was also supported by our practical experience during a research project on the digital development of local governments and local communities [2]. The concept of digital competence is becoming increasingly complex; it includes increasingly diverse knowledge, skills, and experiences. For example, according to Iryna Hrebenyk (2019), the main aspects of digital competence include a rather high level of functional literacy in ICT, which refers to the effective, well-founded application of digital technologies in educational and professional activities, and an understanding of digital technologies as the basis of a new paradigm in education aimed at the development of students as members of the information society. The main ways of forming digital competence, with special attention to the heads of institutions, can be identified as teaching outside the workplace, teaching inside the workplace, and mixed teaching methods [3].
That is why we considered it important to give attention to the assessment and development of students' digital skills. There are several applications popular among students that are not regulated or supervised by law. Due to the distributed nature of these applications, the question of jurisdiction is also indeterminable, so it would be difficult to interpret state supervision of their operation and the implementation of possible coercive measures. This is why some governments are currently ambivalent about some of the latest applications, like blockchain. Some governments ban certain parts of it (e.g., China, the USA), while others take risks and are working on blockchain-based solutions (e.g., Estonia, Russia, Brazil, Sweden, the United Arab Emirates). In public administration education, Budai (2018) considers private blockchain solutions applicable, where, contrary to the spirit of the blockchain, the rules are not made by the individual actors of the chain; instead, the government creates the playing field. Thus, all the benefits of the technology can be used in a limited and supervised environment with appropriate legal guarantees and with the government acting as a trusted third party [4]. This is why the University of Public Service launched a fourteen-week practice-oriented course (two hours of theory and two hours of practice per week) in its BA in Administrative Management program, which is also where we assessed the input and output results. Besides assessing current skill levels, we wanted to know whether we could achieve an impact in the field of digital skills development and, if so, what kind of impact, in which segments, and for whom we could see the greatest shift.

Theoretical Background and Literature Review
As the digital ecosystem evolves, so do the names and content of the skills needed to thrive within it. While at the beginning we considered only computer literacy [5], which covered the knowledge of computers and their applications, we began to consider digital information or info-communication literacy during the rise of networks, where the understanding, evaluation, and application of information from different digital sources are concerned. Nowadays, we consider e-competencies [6], or rather d-competencies, where these competencies are at least as much a default in terms of social prosperity as the other basic competencies.
Owing to various professional and intergovernmental organizations, there are many definitions and groupings of digital competences. As a member state of the European Union, Hungary should consider the citizen-oriented work entitled DIGCOMP: European Digital Competence Framework [7]. This study identifies five areas of competence in which citizens must become proficient: (1) information gathering and processing, (2) communication, (3) content creation, (4) security, and (5) problem-solving. These areas are then divided into further competences, presented in detail, which are now essential skills for active participation in society and economic life. Digital education will also be included in the new Hungarian National Basic Curriculum. The goal is not only to include digital competencies in the methodology, but also to create a digital platform and medium in which teachers can share their experiences. Thus, teaching can be continuously fine-tuned in light of the results, and teachers can also set an example for their students by taking advantage of the opportunities offered by digitalization. Moreover, the benefits of a well-functioning digital state for a country's residents can be clearly seen in the example of Estonia. In the world's first digital state, decreasing bureaucracy is good not only for the mood of citizens, but also for the central budget. All this, combined with an education system capable of countering social inequalities, has opened new opportunities for this small Baltic state.
According to today's dominant attempts at definition, the emphasis in the digital age is on the need to manage technology. The ascendancy of new hardware and software poses new challenges for professionals in the field of management and human resources, as corporations and companies routinely implement and incorporate digital software, for example, to improve worker productivity or to screen highly qualified candidates during the hiring process [8]. The development of digital skills is a defining element of education; as of today, 90% of new positions require excellent digital skills from potential employees [9]. It is for this reason that digital skills were included in the eight areas of key competences (see below), and then in Hungary's national core curricula for public schools, following the recommendations of the Working Group of the European Parliament and the Council on Lifelong Learning (see EUR-Lex, 2006) [10]. The Reference Framework sets out these eight key competences, discussed in turn below. (1) Communication in the native language is the ability to express and interpret concepts, thoughts, feelings, facts, and opinions in both oral and written form (listening, speaking, reading, and writing), and to interact linguistically in an appropriate and creative way in a full range of social and cultural contexts: in education and training, work, home, and leisure [10].
The European Union has a multilingual administration in which speakers of 24 different official languages are expected to communicate with each other and with the citizens of the 27 member countries. Effective administrative communication is a challenge for all countries, but especially for those with more than one official language or with national minority languages. Taking action to avoid an overly bureaucratic tone in official communication, in official documents and forms, and in the communication between citizens and their public administrations is of paramount importance in improving public services [11]. Therefore, in our investigation into public administration courses, the native-language issue is present in problem solving, communication and collaboration, digital content creation, and information and data literacy as well.
(2) Communication in foreign languages broadly shares the main skill dimensions of communication in the native language: it is based on the ability to understand, express, and interpret concepts, thoughts, feelings, facts, and opinions in both oral and written form (listening, speaking, reading, and writing) in an appropriate range of social and cultural contexts (in education and training, work, home, and leisure).
Communication in foreign languages also calls for skills such as mediation and intercultural understanding. An individual's level of proficiency will vary among the four dimensions (listening, speaking, reading, and writing) and among the different languages, and according to that individual's social and cultural background, environment, needs, and/or interests [10].
According to Jakub Zouhar [12], people without any knowledge of a foreign language can be more easily manipulated and are often accused of being insular. Indeed, such language barriers can be insuperable obstacles not only in administrative issues, but in the path of any integration process. This applies to public administration and highly influences digital competences in all their aspects, from problem-solving and safety to digital content creation, communication, and data literacy.
(3) Mathematical competence is the ability to develop and apply mathematical thinking to solve a range of problems in everyday situations. Building on a sound mastery of numeracy, the emphasis is on process, activity, and knowledge. Mathematical competence involves, to different degrees, the ability and willingness to use mathematical modes of thought (logical and spatial thinking) and presentation (formulas, models, figures, graphs, and charts). Competence in science refers to the ability and willingness to use the body of knowledge and methodology employed to explain the natural world to identify questions and draw evidence-based conclusions [10].
The Guardian's Student Guide addresses the question of whether mathematics is needed in public administration studies [13]. In most public administration programs, mathematics is not a required subject. Public administration, however, gives students a wide range of career opportunities to choose from, such as working as a lecturer, principal, politician, political analyst, administrator, human resource manager, or business manager, and basic mathematical, statistical, and computing knowledge is useful in these roles. Administrators work through service provision, policymaking, and good governance enforcement. Public administration requires expert skills in managing government affairs at both the national and local government levels. The scope of public administration studies comprises mainly administration, management, and science. Students can therefore study public administration without mathematics, although there are mathematics-related courses they are likely to encounter, such as economics, finance, business, and marketing. From our point of view of digital competences, basic math skills are needed for areas such as safety, digital content creation, and information and data literacy.
(4) Competence in technology is viewed as the application of knowledge and methodology in response to perceived human intentions or needs. Competence in science and technology involves an understanding of the changes caused by human activity and responsibility as an individual citizen [10].
During and after the COVID-19 pandemic, it was generally assumed that the increasing number of people working and studying online from home entailed improved digital and Internet skills. However, a study by Judith Kausch-Zongo and Birgit Schenk [14] showed that e-learning phases during the COVID-19 pandemic contributed little to students' technological competency and usage in public administration higher education. Moreover, the competency and usage during on-the-job training phases in public organizations clearly differ from those during the off-the-job training phases at university. Hence, the findings reveal a general gap in technological competency and usage between public administration education and actual duties in public administration. These findings justify our research regarding the minimum technical knowledge in safety, digital content creation, and information literacy.
(5) Digital competence involves the confident and critical use of information society technology (IST) for work, leisure, and communication. It is underpinned by basic skills in ICT: the use of computers to retrieve, assess, store, produce, present, and exchange information, and to communicate and participate in collaborative networks via the Internet [10].
In Poland, a project examined 142 public administration organizations and concluded that digital competence is the key success factor of e-government in the field of human resource management processes [15]. This study indicates that the inclusion of digital competences in the processes of recruiting employees from universities, evaluation, and development is significantly correlated with an organization's performance in human resource management (namely, employees' engagement and satisfaction and the efficiency of human resource management). Therefore, digital competence is of primary importance in problem-solving, communication, and collaboration, as we emphasize in our present survey.
(6) 'Learning to learn' is the ability to pursue and persist in learning, and to organize one's own learning, including through effective management of time and information, both individually and in groups. This competence includes awareness of one's learning process and needs, identifying available opportunities, and the ability to overcome obstacles to learn successfully. This competence means gaining, processing, and assimilating new knowledge and skills as well as seeking and making use of guidance.
Learning to learn engages learners to build on prior learning and life experiences to use and apply knowledge and skills in a variety of contexts: at home, at work, in education, and in training. Motivation and confidence are crucial to an individual's competence [10].
A practical interpretation of learning to learn in public administration studies is mixed methods (MMs). Depending on the specific research design [16], MMs can contribute to corroborating hypotheses, providing multidimensional interpretations of existing phenomena, and discovering the underlying causal mechanisms in public administration higher education. MMs can help students understand the complexity of public administration, contributing to making the 'public machine' function more smoothly and efficiently. In our approach and survey, this view relates to information and data literacy.
(7) Sense of initiative and entrepreneurship refers to an individual's ability to turn ideas into action. It includes creativity, innovation, and risk-taking, as well as the ability to plan and manage projects to achieve objectives. This supports individuals, both in their personal and professional lives, to be more aware of the context of their work and better able to seize opportunities. In addition, it is a foundation for more specific skills and knowledge needed by those establishing or contributing to social or commercial activity. This should include awareness of ethical values and promote good governance [10].
The findings of a German/American study show that, on the one hand, there is high demand among public administration and public policy students for entrepreneurship training and that, on the other hand, few curricula offer it [17]. Since the interrelationships between entrepreneurship and public administration/public policy education are still underdeveloped, in our research we gave special emphasis to this aspect in the problem-solving issue, and in digital content creation and communication.
(8) Cultural knowledge includes an awareness of local, national, and European cultural heritage and their place in the world. It covers basic knowledge of major cultural works, including popular contemporary culture. It is essential to understand the cultural and linguistic diversity in Europe and other regions of the world, the need to preserve it, and the importance of aesthetic factors in daily life [10].
A research project led by Janez Stare and Maja Klun [18] investigated the most important competencies that should be included in public administration higher education. The results show that the competencies related to culture-based ethics and ethical behavior were evaluated as the most important: trustworthiness, responsibility, being ethical in decisions, respectability, not taking advantage of one's personal position, communicating in one's native language, and working in accordance with authorizations. These also correlate with our investigation aspects insofar as problem-solving, communication, and collaboration are concerned.
Although the term digital competence is increasingly used today and is increasingly displacing digital literacy, we could not find an internationally accepted definition for it. As Garry Falloon (2020) puts it, 'the view of teacher digital competence (TDC) moves beyond prevailing technical and literacies conceptualizations, arguing for more holistic and broader-based understandings that recognize the increasingly complex knowledge and skills young people need to function ethically, safely, and productively in diverse, digitally-mediated environments' [19]. Given also that the spectrum of skills is constantly expanding, it is not possible to identify a universal concept. That is why the content of digital competence varies by author and approach.
According to one approach, digital competence is an evolving concept related to the development of digital technology and the political aims and expectations of citizenship in a knowledge society. While it is regarded as a core competence in policy papers, it is not yet a standardized concept in educational research [20]. Ilomäki and coauthors suggest that it is a useful boundary concept that can be used in various contexts. They analyzed 76 educational research articles in which digital competence, described by different terms, was investigated. As a result, they found that digital competence consists of a variety of skills and competences and that it has a broad scope and background, ranging from media studies and computer science to library and literacy studies. They suggest that digital competence be defined as consisting of (1) technical competence; (2) the ability to use digital technologies in a meaningful way for work, study, and everyday life; (3) the ability to evaluate digital technologies critically; and (4) the motivation to participate and commit in the digital culture.
Another interpretation presents two scopes of understanding: a complete overview of digital competence in Europe via the DigComp (2011-2013) project, and the latest version of the project, DigComp 2.0 (2016), which emphasizes the digital environment. The main conclusion could be summarized as the paramount importance that digital competence has gained in society as a result of digital transformation and of its inclusion in the Europe 2020 Strategy, both of which are reasons to prioritize its integration into education [21].
Spanish researchers [22] found that digital competence (as one of the eight key competences for lifelong learning developed by the European Commission) is a requisite for personal fulfillment and development, active citizenship, social inclusion, and employment in a knowledge society. To accompany young learners in the development of competence, and to guarantee the optimal implementation of information and communication technologies (ICTs), teachers must be digitally competent as well. Fraile and coauthors worked with a sample of 43 secondary school teachers in initial training who self-assessed their level of competence in 21 subcompetences in the five areas identified by the DigComp project, using the rubrics provided in the Common Digital Competence Framework for Teachers. Overall, pre-service teachers' conceptions about their level of digital competence were low. Students scored highest in information, which refers mostly to the operations they performed during their studies, while the second-highest scores were in safety and communication, excluding the protection of digital data and the preservation of digital identity. The lowest values were achieved in content creation and problem-solving, the dimensions most closely related to the inclusion of ICTs to transform teaching and learning processes. The knowledge and skills the teachers exhibit are largely self-taught, so the authors perceive an urgent need to purposefully incorporate relational and didactic aspects of ICT integration.
Additionally, the European Commission tried to find the common denominator of these approaches, which it identified as the highlighting of positive attitudes toward digital culture, besides having the technical knowledge required for ICT tools, the ability to solve problems, and a critical approach to information and its management [10]. This is why the European Commission created the DigComp framework project (EU Science Hub, 2018) [23], the basic objective of which was to build a unified system of all the details and components of digital competence, providing support for the launch of related developments. This unified system also makes the concept of digital competence itself clearer, more understandable, and above all, measurable [24]. The DigComp framework defines five main areas of digital competence: information, communication, content creation, security, and problem-solving [23].
DigComp has been developing since its formation, and many reference tools are already connected to it. These set out not only citizen skill levels, but also frameworks for educators (DigCompEdu), educational institutions (DigCompOrg), and entrepreneurs (EntreComp). These include the DigComp self-assessment survey (DigCompSat), which means that a framework and evaluation system with uniform content and form is now available in the European context, allowing for comparable measurements of different digital competence levels.
At the same time, it is important to note that there can be huge differences between self-assessments and the actual level of competence. A study titled 'Perception and Reality: Measuring Digital Skills Gaps in Europe, India, and Singapore' [25] also states that people tend to overestimate their own digital skills, in addition to the fact that we see permanent and age-dependent gaps both in Europe and outside Europe. The overestimation, as Figure 1 shows, is massive.
Another finding of this study is that those who have undergone previous digital skills development perform better than those without such a background.
Therefore, important questions follow: What is the real level of digital competence? What factors distort self-assessments? Is it possible to achieve significant development, and in which areas?
After considering the present state of the research regarding digital competences in the field of public administration, we formulated three research questions:

RQ1: Can a significant improvement in digital competence be achieved with a 30 h theoretical and 30 h practical university skills development course?
RQ2: Do age, gender, department, and previous IT studies have a moderating effect on the shift in digital competence?

RQ3: Is there a statistically significant difference between the potential shift in the different segments of digital competence?
When studying the eight key competences offered by EUR-Lex (2006), we were aware of the areas of digital competences, as noted in the section above, and tried to interpret the key factors from the point of view of public administration higher education. Thus, our questionnaire investigation could rely on the five areas of digital competences recommended by the EU Science Hub (2018). That is the foundation of the methodology we used.

Materials and Methods
When measuring digital competence, we relied on the EU reference material, the Citizen Digital Competence Framework [26], which defines five digital competence areas, with twenty-one digital competence elements and eight proficiency levels. The five areas of competence are as follows:

1. Information and data literacy
This includes articulating information needs; locating and retrieving digital data, information, and content; judging the relevance of the source and its content; and storing, managing, and organizing digital data, information, and content.

2. Communication and collaboration
Interaction, communication, and collaboration through digital technologies while being aware of cultural and generational diversity; participating in society through public and private digital services and participatory citizenship; managing people's digital presence, identity, and reputation.

3. Digital content creation
Creating and editing digital content; improving and integrating information and content into an existing body of knowledge while understanding how copyright and licences apply; knowing how to give understandable instructions to computer systems.

4. Safety
Protecting devices, content, personal data, and privacy in digital environments; protecting physical and psychological health and being aware of digital technologies for social well-being and social inclusion; being aware of the environmental impact of digital technologies and their use.

5. Problem-solving
Identifying needs and problems and resolving conceptual problems and problem situations in digital environments; using digital tools to innovate processes and products; keeping up to date with digital evolution [23].
The DigCompSat [27] self-assessment questionnaire was used for the survey. Its technique and content are in line with previous questionnaire-based research, including the index construction and evaluation methods used. The model was created based on a methodological approach that can measure the three components of digital competences (knowledge, skills, and attitude) in all five areas of digital competences, while also providing participants with the opportunity to develop self-awareness skills related to their own abilities.
In the DigCompSAT self-assessment questionnaire, students had to evaluate, on a four-point scale, 82 statements about digital competence components. The values obtained on the scales could be summed and examined both for the entire DigCompSAT and for the five digital competence areas using mean value indicators. Medians and averages could be formed, and accurate percentage results could be calculated from them. The tool proved reliable in the survey for both the full scale and the subscales (Cronbach's alpha > 0.8).
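The reliability check mentioned above uses the standard Cronbach's alpha formula. The following is a minimal pure-Python sketch of that computation; the toy responses are invented for illustration, not data from the study:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    items: list of rows, one row of item scores per respondent.
    alpha = (k / (k - 1)) * (1 - sum(item variances) / variance(totals))
    """
    k = len(items[0])                                  # number of items
    cols = list(zip(*items))                           # per-item score lists
    item_vars = sum(variance(c) for c in cols)         # sum of item variances
    total_var = variance([sum(row) for row in items])  # variance of scale totals
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Five hypothetical respondents answering four items on a four-point scale
responses = [
    [1, 2, 1, 2],
    [3, 3, 4, 3],
    [2, 2, 2, 3],
    [4, 4, 3, 4],
    [1, 1, 2, 1],
]
alpha = cronbach_alpha(responses)  # highly consistent toy items, alpha ≈ 0.94
```

A value above 0.8, as reported for both the full scale and the subscales, is conventionally read as good internal consistency.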
The first (input) survey was conducted between 17 January and 28 February 2022, using an online form. The second (output) measurement was conducted between 11 May and 27 May 2022. The input assessment was completed by N = 120 undergraduate students, while the output assessment was completed by N = 58 students (many did not complete the questionnaire).
The empirical samples of the two measurements were not based on random selection; in order to ensure comparability, the survey aimed to cover the entire population, and a similarity check of the two samples was performed. (There was no systematic bias or significant difference in the composition of the samples that substantially affected the results.) The analysis was based on self-assessment responses. This research did not aim to verify the self-assessment responses with tests or to directly measure digital competency.
Regarding management of the data obtained, we excluded from the database any records in the empirical sample having more than 90% missing data, and in this way organized the items into indexes in the dataset obtained. In so doing, we chose one of the two basic Likert scale analytical procedures, whereby, after converting the scale levels into values, we also considered the calculation of averages from the observed mean values when evaluating the data. Our method of converting the scale levels was to map the four-degree scale onto the values 0, 0.33, 0.66, and 1, and the analysis of the empirical sample was based on this resolution. Therefore, the average values calculated in this way fell between 0 and 1, and could be converted to percentage values and compared.
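The conversion just described can be sketched in a few lines. The 0, 0.33, 0.66, 1 mapping is taken from the text; the function name and the example answers are illustrative assumptions:

```python
# Four-point Likert levels mapped onto the values used in the study
SCALE = {1: 0.0, 2: 0.33, 3: 0.66, 4: 1.0}

def competence_percentage(answers):
    """Convert one respondent's four-point answers into a 0-100% index."""
    values = [SCALE[a] for a in answers]
    return 100 * sum(values) / len(values)

# Hypothetical answers on five items; the result falls between 0 and 100
score = competence_percentage([3, 4, 2, 3, 4])  # ≈ 73%
```

Averaging these per-respondent values over a subgroup yields the percentage figures reported in the Results section.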
Since the sampling was not random, we did not use probabilistic statistical tools when comparing the mean values. Indexes based on mean values (median, average) were constructed both for the competence components (knowledge, skills, and attitude) and for the main thematic competence areas. The values of the indexes were then examined along two or more dimensions, divided into subgroups and cross-tabulations, in which the characteristics of the students (age; major; type of studies, i.e., full-time or correspondence course; etc.) were also included as dimensions of the analysis. In evaluating the subgroups, we also considered the limitations of subgroups with a very small number of elements.
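The index formation and subgroup breakdown can be sketched as follows, again with invented records; the column names are illustrative, not the study's actual variable names.

```python
import pandas as pd

# Invented per-student records: a 0-1 competence index plus background
# attributes used as analysis dimensions.
df = pd.DataFrame({
    "gender": ["male", "female", "female", "male", "female", "male"],
    "type": ["full-time", "full-time", "correspondence",
             "correspondence", "full-time", "full-time"],
    "index": [0.69, 0.58, 0.61, 0.70, 0.59, 0.68],
})

# Mean and median index per subgroup along one dimension.
by_gender = df.groupby("gender")["index"].agg(["mean", "median"])

# Cross-tabulation over two dimensions; the count column flags very small
# subgroups whose averages should be interpreted with caution.
cross = df.pivot_table(index="gender", columns="type",
                       values="index", aggfunc=["mean", "count"])
```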

Results
As a starting point, we wanted to gain insight into students' digital competence, to see in which areas they show adequate proficiency and where they have clear deficiencies. While we conducted the input measurement for all majors, in the output measurement we focused only on those who had participated in the course. The average digital competence level of those surveyed was 63.3%, which is slightly higher than the Spanish (57.1%) and Latvian (55.8%) results measured in the DigCompSAT test measurements [26].
We examined the effect of age, training department (full-time department: 62.7%, correspondence department: 64.7%), and major (BA: 60.7%, MA: 67%) on the input values, but the largest difference was found in the gender dimension (male: 69.3%, female: 59.4%) (Figure 2). In view of the average values obtained from the DigCompSAT self-assessment questionnaires, males achieved significantly higher values than females in all areas of digital competence and in overall digital competence. As Figure 2 shows, the gender gap is particularly noticeable in the areas of information search and data management, digital content creation, and problem solving. One possible reason is that women lack confidence in these areas and may have been more prone to undervaluing themselves.
It is also instructive to examine how the results within each competence area are distributed across the entire student sample. In the distribution curves for communication and collaboration, and for information search and data management, a higher proportion of students appeared at the higher proficiency levels, while in digital content creation most students achieved a result between 30 and 70%. Very few students achieved results above 80% in digital content creation, security, and problem solving. Considering the individual areas of competence, digital content creation shows the lowest value, so it should be a priority in the future. In addition, the areas of security and problem solving could also be the focus of improvements, due to their lower values.
Breaking down the survey results, among students who took digital skills development as a compulsory subject, the input assessment showed an average result of only 60.1%. The results therefore indicate a solid middle level.
Based on our surveys, we present the following findings, grouped by research question.
RQ1: Can a significant improvement in digital competence be achieved with a thirty-hour theoretical and thirty-hour practical university skills development course?
Our answer is clearly yes. The median of the digital competence measurements increased from 0.626 to 0.652. Compared to the March admission base, this is a 2.863% shift, which is not significant, but further divisions already show a more significant shift. Among the individual competence elements, the greatest progress was found in the averages for skills (3.19%). The truly significant findings are obtained after further analysis of the data. Figure 3 provides additional information.
RQ2: Do age, gender, department, and previous IT studies have a moderating effect on the shift in digital competence?
As Figure 4 shows, among the age groups, the 25-29 and 30-34 age groups reached the highest competence levels. In the second measurement, however, both age groups show a decline, which, given the self-evaluation basis, may stem from the respondents realizing the true depth of their knowledge. The largest negative shift was measured in the 30-34 age group (39.7%); within this, digital content creation, security awareness, and problem solving stand out. As shown in Figure 5, the biggest positive shift appears in the 35-39 age group, where we measured a positive shift of 15.81% in the average level of digital competence. It should be noted that security awareness and attitude improved remarkably in this age group: by 27.29% and 26.98%, respectively. At the same time, their problem-solving skills improved by 19.15%. In our opinion, it would be worthwhile to investigate in a separate study why the realism of self-evaluations tilts around the 35-39 age group. However, it is quite clear that respondents in the 30-34 age group overestimated their competence level in the input survey, while the following age group presumably underestimated their abilities.
Regarding gender distribution, males achieve higher competence levels. In terms of change, however, females responded significantly and consistently to skills development in all dimensions. As illustrated in Figure 6, the average level of digital competence of females increased by 8.85%, while we measured an almost equal decline in males (−7.22%). Across all attributes, the shift for females was positive and that for males was negative. Here, too, the increase in skills for females is outstanding (10.2%), which is already a significant shift. There is also a strong shift in digital content creation (7.26%) and in the averages of the knowledge and the information and data management topics (6.41% and 6.32%, respectively).

Figure 6. Change by gender.
When the public administration undergraduate program included a subject on digital skills development, the average digital competence improved by 8.97%. Within this, digital content creation (a 10.65% improvement), skills (a 9.43% improvement), and the knowledge average stand out.

In the case of full-time courses, there is no significant difference in the level of competence; however, the values are slightly higher for correspondence courses. The average shift is higher for full-time students, at 3.62%, while for correspondence students it is only 0.37%. Here, too, among full-time students, the skills component improved the most (4.96%).
From the point of view of the level of competence, there is no significant difference according to the presence or absence of a background in IT. The average competence of students who had preliminary IT training increased by 4.91%, while the competence average of those who had not previously taken an IT-related course decreased by 3.57%.

RQ3: Is there a statistically significant difference between the potential shifts in the different segments of digital competence?
At first approach, our answer is no; however, if we examine the age dimension here as well, we can already see significant differences. The older age group responds better in the areas of security awareness and problem solving, while the younger generation shows greater shifts in digital content production and in the development of communication and collaboration.

Discussion
The final items to discuss include the significance of the results, their novelty, what the present research adds to the literature in the field, and potential new explanations and aspects.

• There is very little specialized literature regarding DigCompSAT testing in higher education. Our study intends to remedy this gap, specifically in the educational environment of public administration.
• During the research, we realized that values based on self-assessment can shift, and that these shifts vary by age group, area, and range of competence.
• We also realized that these digital competencies can be developed, and found some confirmation that limited results can be achieved even in such a short time.
• Our findings also showed that, in addition to self-assessment, independent survey solutions can reveal true competency levels and shifts.
• In addition, we realized that it would be better to examine the progress of a target group over longer periods of time, and more regularly.
• We can also state that the pilot version of this method will form the basis of the local competitiveness test, which will be implemented as part of a large-scale national survey.
Regarding the limitations and the future of our research, we must consider its importance in the social sciences context. There are three viewpoints that stem from similar research projects but focus on digital divides.
The first possibility to broaden our approach would be the stratification hypothesis, which posits a positive link between a person's corresponding fields of resources (economic, cultural, social, and personal) in the digital context. This includes specific areas of social and digital inclusion or exclusion, which are neither randomly distributed nor independent of social divides [28].
The second aspect is the compound digital exclusion hypothesis, which assumes a connection between different digital resources. It presumes that a deficiency in one type of digital competence leads to a deficiency in another area. Applied to public administration studies, this could be interpreted as meaning that if a student cannot benefit from or use digitization in one area, that student will not be able to benefit from and use digitization in another area either, a situation which should be avoided in public administration training as well. The underlying idea is that deficiencies in different types of digital skills are cumulative, and that the same is true for the inability to use digitalization for various purposes and to benefit from its use [29].
The third aspect is the sequential digital exclusion hypothesis, which assumes a connection between the first-level, second-level, and third-level digital divides; we plan to interpret this in the future by identifying the shifts found in our research. A student who is in danger of falling into one of the divides is also at risk of falling into another. In more concrete terms, those with poor access to digital tools probably also lag behind in the development of digital skills and in the routine use of digital tools, and are unable to benefit from their use in the same way as those with better access or higher skills [29].
Thus, our research can be placed in a social science context focusing on digital divides and their effects, through the examination of the digital competences of public administration students. Because of the role of public administration in influencing the well-being of the whole society, this is a research aspect that must not be neglected, or so we assume.

Conclusions
In line with the literature, we also established that there can be an order of magnitude difference between self-assessment and the real level of competence. It is therefore not certain that one should start from a self-assessment. At most, it could be useful for respondents to realize where they have gaps in their knowledge and competence. This is what we refer to as a 'change in attitude'. The real starting point could be independent surveys, in which students report on their knowledge in some kind of assessment system, such as an exam system, or perhaps their knowledge could be evaluated from their use of certain applications.
Therefore, even the directions of change can only be handled with an appropriately critical approach. It can be assumed that the larger-amplitude movements may reflect not necessarily knowledge or skills, but rather a change in attitude. The practical importance of this might lie in the possibility of orienting students. We can open their eyes to what they do not know, which may help them develop the need to know. At the same time, considering, for example, the field of security awareness, these are also very useful developments. As a result, even the smallest progress in security awareness can yield major benefits. Even by merely raising demands and increasing the desire for knowledge, we can prevent many problems from arising.
We also see that there are factors related to gender, age, and prior training (experience) that influence the potential shifts between the segments. It is also clearly visible that the above factors apply differently in each range of competence. Hence, the more targeted and the more regular the digital competence development we implement at the institutional level along these parameters, the greater the results we can achieve. Moreover, the repetitive nature of our monitoring can help us understand these processes.
Some of the limitations of this research, including the conceptualization problems of digital competence, are as follows. First, we only asked about specific digital skills, as we focused on the DigCompSAT methodology. Second, the survey is based on self-assessment, in which the attitude behind the answers is decisive. Third, we could not monitor the classroom attendance of the students who completed the questionnaires at the beginning and at the end of the study. Fourth, the research took place with an experimental grade and an experimental curriculum, and this was the first test of its kind.

Figure 4. Change by age groups.

Figure 5. Input and output average values by age groups. Source: own data.