Article

Digital Competence Development in Public Administration Higher Education

by Balázs Benjámin Budai 1,*, Sándor Csuhai 2 and István Tózsa 1,3
1 Department of Public Management and Information Technology, Faculty of Political Science and International Studies, University of Public Service, 82 Üllői Rd., H-1083 Budapest, Hungary
2 Eötvös József Research Centre, University of Public Service, 82 Üllői Rd., H-1083 Budapest, Hungary
3 Economic Geography and Urban Marketing Centre, John von Neumann University, 10 Izsáki Rd., H-6000 Kecskemét, Hungary
* Author to whom correspondence should be addressed.
Sustainability 2023, 15(16), 12462; https://doi.org/10.3390/su151612462
Submission received: 14 June 2023 / Revised: 31 July 2023 / Accepted: 9 August 2023 / Published: 16 August 2023

Abstract

Today, citizens’ digital readiness and competence are competitive factors. Users with a high level of digital competence are usually characterized by intensive and conscious use of technology, while those with a lower level of competence are characterized by passive consumer use and a technological lag. It is essential to continuously develop and monitor competence levels, because digital skills, the way digital tools are used, and the effects of that use are interlinked. The importance of digital competence is also evident in public administration, because it affects the quality of public services, which in turn has an impact on the entire population. In this social respect, digital competence is extremely important for public administration employees. In our study, we investigate the input competence level of undergraduate (BA) students in public administration and the output competence level reached after one semester of targeted training. We analyze the values in several dimensions and draw conclusions. When measuring digital competence, we relied on the EU reference material, the Citizen Digital Competence Framework, which is based on digital competence areas and proficiency levels covering information and data literacy, communication and collaboration, digital content creation, safety, and problem solving. The DigCompSat self-assessment questionnaire was used for the survey. The study found an order-of-magnitude difference between the students’ self-assessments and their real level of competence. It can be useful for respondents to realize where they have gaps in their knowledge and competence; this is what we refer to as a ‘change in attitude’ in our study. Factors related to gender, age, and prior training influence the potential shifts between the segments, and these factors apply differently in each range of competence. Regarding the implications of this research, it can be assumed that the larger-amplitude movements may reflect not knowledge or skills but rather ‘changes in attitude’. The practical importance of this lies in the possibility of orienting students.

1. Introduction

For a long time, it was assumed that the generations born after 2000, such as the Alpha Generation, are already digital natives in the digital ecosystem, as can be seen from a study carried out between 2018 and 2022 on parents’ attitudes toward the formation of digital literacy in their children [1]. However, the validity of this assumption has increasingly been questioned in the literature. This was also supported by our practical experience during a research project on the digital development of local governments and local communities [2]. The concept of digital competence is becoming increasingly complex; it encompasses increasingly diverse knowledge, skills, and experiences. For example, according to Iryna Hrebenyk (2019), the main aspects of digital competence include a rather high level of functional literacy in ICT, which refers to the effective, well-founded application of digital technologies in educational and professional activities, and an understanding of digital technologies as the basis of a new educational paradigm aimed at developing students as members of the information society. The main ways of forming digital competence, with special attention to the heads of institutions, can be identified as teaching outside the workplace, teaching inside the workplace, and mixed teaching methods [3].
That is why we considered it important to give attention to the assessment and development of students’ digital skills. There are several applications popular among students that are not regulated or supervised by law. Because of the distributed nature of these applications, the question of jurisdiction is often indeterminable, so state supervision of their operation and the enforcement of possible coercive measures are difficult to interpret. This is why some governments are currently ambivalent about some of the latest applications, such as blockchain. Some governments ban certain uses of it (e.g., China, the USA), while others take risks and are working on blockchain-based solutions (e.g., Estonia, Russia, Brazil, Sweden, the United Arab Emirates). In public administration education, Budai (2018) considers private blockchain solutions applicable, where, contrary to the spirit of the blockchain, the rules are not made by the individual actors of the chain; instead, the government creates the playing field. Thus, all the benefits of the technology can be used in a limited and supervised environment, with appropriate legal guarantees and with the government acting as a trusted third party [4].
This is why the University of Public Service launched a fourteen-week practice-oriented course (two hours of theory, two hours of practice per week) in its BA in Administrative Management program, which is also where we assessed the input and output results. Besides assessing current skill levels, we wanted to know whether we could achieve an impact in the field of digital skills development, and, if so, what kind of impact, in which segments, and for whom we could see the greatest shift.

Theoretical Background and Literature Review

As the digital ecosystem evolves, so do the names and content of the skills needed to thrive within it. At the beginning, we spoke only of computer literacy [5], which covered knowledge of computers and their applications; with the rise of networks, we began to speak of digital or info-communication literacy, which concerns the understanding, evaluation, and application of information from different digital sources. Nowadays, we speak of e-competencies [6], or rather d-competencies, which are at least as essential to social prosperity as the other basic competencies.
Owing to various professional and intergovernmental organizations, there are many definitions and groupings of digital competences. As a member state of the European Union, Hungary should rely, for its citizens, on the work entitled DIGCOMP, the European Digital Competence Framework [7]. This study identifies five areas of competence in which citizens must become proficient: (1) information gathering and processing, (2) communication, (3) content creation, (4) security, and (5) problem-solving. These areas are then divided into further competences, presented in detail, which are now essential skills for active participation in society and economic life. Digital education will also be included in the new Hungarian National Core Curriculum. The goal is not only to include digital competencies in the methodology, but also to create a digital platform and medium in which teachers can share their experiences. Thus, teaching can be continuously fine-tuned in light of the results, and teachers can also set an example for their students by taking advantage of the opportunities offered by digitalization. Moreover, the benefits of a well-functioning digital state for a country’s residents can be clearly seen in the example of Estonia. In the world’s first digital state, decreasing bureaucracy is good not only for the mood of citizens but also for the central budget. All this, combined with an education system capable of countering social inequalities, has opened new opportunities for this small Baltic state.
According to today’s dominant attempts at a definition, the emphasis in the digital age is on the need to manage technology. The ascendancy of new hardware and software poses new challenges for professionals in the field of management and human resources, as corporations and companies routinely implement and incorporate digital software, for example, to improve worker productivity or to screen highly qualified candidates during the hiring process [8]. The development of digital skills is a defining element of education; as of today, 90% of new positions require excellent digital skills from potential employees [9]. It is for this reason that digital skills were included among the eight areas of key competences (see below), and then in Hungary’s national core curricula for public schools, following the Recommendation of the European Parliament and the Council on key competences for lifelong learning (see EUR-Lex, 2006) [10]. The Reference Framework sets out these eight key competences:
(1) Communication in the native language;
(2) Communication in foreign languages;
(3) Mathematical competence and basic competences in science and technology;
(4) Digital competence;
(5) Learning to learn;
(6) Social and civic competences;
(7) Sense of initiative and entrepreneurship;
(8) Cultural awareness and expression.
(1) Communication in the native language is the ability to express and interpret concepts, thoughts, feelings, facts, and opinions in both oral and written form (listening, speaking, reading, and writing), and to interact linguistically in an appropriate and creative way in a full range of social and cultural contexts: in education and training, work, home, and leisure [10].
The European Union has a multilingual administration in which speakers of 24 different official languages are expected to communicate with each other and with the citizens of the 27 member countries. Effective administrative communication is a challenge for all countries, but especially for those with more than one official language or with national minority languages. Avoiding an overly bureaucratic tone in official communication, documents, and forms, and improving communication between citizens and their public administrations, are of paramount importance in improving public services [11]. Therefore, in our investigation of public administration courses, the native-language issue is present in problem solving, communication and collaboration, digital content creation, and information and data literacy as well.
(2) Communication in foreign languages broadly shares the main skill dimensions of communication in the native language: it is based on the ability to understand, express, and interpret concepts, thoughts, feelings, facts, and opinions in both oral and written form (listening, speaking, reading, and writing) in an appropriate range of social and cultural contexts (in education and training, work, home, and leisure). Communication in foreign languages also calls for skills such as mediation and intercultural understanding. An individual’s level of proficiency will vary among the four dimensions (listening, speaking, reading, and writing) and among the different languages, and according to that individual’s social and cultural background, environment, needs, and/or interests [10].
According to Jakub Zouhar [12], people without any knowledge of a foreign language can be more easily manipulated and are often accused of being insular. Indeed, such language barriers can be insuperable obstacles not only in administrative issues, but in the path of any integration process. This applies to public administration and highly influences digital competences in all its aspects, from problem-solving and safety to digital content creation, communication, and data literacy.
(3) Mathematical competence is the ability to develop and apply mathematical thinking to solve a range of problems in everyday situations. Building on a sound mastery of numeracy, the emphasis is on process, activity, and knowledge. Mathematical competence involves, to different degrees, the ability and willingness to use mathematical modes of thought (logical and spatial thinking) and presentation (formulas, models, figures, graphs, and charts). Competence in science refers to the ability and willingness to use the body of knowledge and methodology employed to explain the natural world, in order to identify questions and draw evidence-based conclusions [10].
The Guardian Student Guide addresses the question of whether mathematics is needed in public administration studies [13]. In most public administration programs, mathematics is not a required subject. Public administration, however, gives students a wide range of career paths to choose from, such as working as a lecturer, principal, politician, political analyst, administrator, human resource manager, or business manager, for which basic mathematical, statistical, and computing knowledge is useful. Administrators work through service provision, policymaking, and the enforcement of good governance. Public administration requires expert skills in managing government affairs at both the national and local government levels. The scope of public administration studies comprises mainly administration, management, and science. Students can therefore study public administration without mathematics because, as mentioned earlier, most public administration curricula and programs contain no mathematics. However, there are mathematics-related courses students are likely to encounter when studying public administration, such as economics, finance, business, and marketing. From the point of view of digital competences, basic math skills are needed for areas such as safety, digital content creation, and information and data literacy.
(4) Competence in technology is viewed as the application of knowledge and methodology in response to perceived human intentions or needs. Competence in science and technology involves an understanding of the changes caused by human activity and responsibility as an individual citizen [10].
During and after the COVID-19 pandemic, it was generally assumed that the growing number of people working and studying online from home entailed improved digital and Internet skills. However, a study by Judith Kausch-Zongo and Birgit Schenk [14] showed that the e-learning phases of the COVID-19 pandemic contributed little to students’ technological competency and usage in public administration higher education. Moreover, the competency and usage during on-the-job training phases in public organizations clearly differ from those during the off-the-job training phases at university. Hence, the findings reveal a general gap in technological competency and usage between public administration education and actual duties in public administration. These findings justify our research regarding the minimum technical knowledge required in safety, digital content creation, and information literacy.
(5) Digital competence involves the confident and critical use of information society technology (IST) for work, leisure, and communication. It is underpinned by basic skills in ICT: the use of computers to retrieve, assess, store, produce, present, and exchange information, and to communicate and participate in collaborative networks via the Internet [10].
In Poland, a project examined 142 public administration organizations and concluded that digital competence is the key success factor of e-government in the field of human resource management processes [15]. The study indicates that including digital competences in the processes of recruiting employees from universities, and in their evaluation and development, is significantly correlated with an organization’s human resource management performance (namely, employees’ engagement and satisfaction and the efficiency of human resource management). Therefore, digital competence is of primary importance in problem-solving, communication, and collaboration, as we emphasize in our present survey.
(6) ‘Learning to learn’ is the ability to pursue and persist in learning, and to organize one’s own learning, including through effective management of time and information, both individually and in groups. This competence includes awareness of one’s learning process and needs, identifying available opportunities, and the ability to overcome obstacles in order to learn successfully. It means gaining, processing, and assimilating new knowledge and skills, as well as seeking and making use of guidance. Learning to learn engages learners to build on prior learning and life experiences in order to use and apply knowledge and skills in a variety of contexts: at home, at work, in education, and in training. Motivation and confidence are crucial to an individual’s competence [10].
A practical interpretation of learning to learn in public administration studies is the use of mixed methods (MMs). Depending on the specific research design [16], MMs can contribute to corroborating hypotheses, providing multidimensional interpretations of existing phenomena, and discovering the underlying causal mechanisms in public administration higher education. MMs can help students understand the complexity of public administration, contributing to making the ‘public machine’ function more smoothly and efficiently. In our approach and survey, this view relates to information and data literacy.
(7) Sense of initiative and entrepreneurship refers to an individual’s ability to turn ideas into action. It includes creativity, innovation, and risk-taking, as well as the ability to plan and manage projects to achieve objectives. This supports individuals, both in their personal and professional lives, in being more aware of the context of their work and better able to seize opportunities. In addition, it is a foundation for the more specific skills and knowledge needed by those establishing or contributing to social or commercial activity. This should include awareness of ethical values and promote good governance [10].
The findings of a German and American study show that, on the one hand, there is high demand among public administration and public policy students for entrepreneurship training and that, on the other hand, few curricula offer it [17]. Since the interrelationships between entrepreneurship and public administration/public policy education are still underdeveloped, in our research we gave special emphasis to this aspect in the problem-solving issue, as well as in digital content creation and communication.
(8) Cultural knowledge includes an awareness of local, national, and European cultural heritage and its place in the world. It covers basic knowledge of major cultural works, including popular contemporary culture. It is essential to understand the cultural and linguistic diversity of Europe and other regions of the world, the need to preserve it, and the importance of aesthetic factors in daily life [10].
A research project led by Janez Stare and Maja Klun [18] investigated the most important competencies that should be included in public administration higher education. The results show that the competencies related to culture-based ethics and ethical behavior were evaluated as the most important: trustworthiness, responsibility, being ethical in decisions, respectability, not taking advantage of one’s personal position, communicating in one’s native language, and working in accordance with authorizations. These also correlate with our investigation aspects insofar as problem-solving, communication, and collaboration are concerned.
Although the term digital competence is increasingly used today and is gradually displacing digital literacy, we could not find an internationally accepted definition for it. As Garry Falloon (2020) puts it, ‘the view of teacher digital competence (TDC) moves beyond prevailing technical and literacies conceptualizations, arguing for more holistic and broader-based understandings that recognize the increasingly complex knowledge and skills young people need to function ethically, safely, and productively in diverse, digitally-mediated environments’ [19]. Given also that the spectrum of skills is constantly expanding, it is not possible to identify a universal concept. That is why the content of digital competence varies by author and approach.
According to one approach, digital competence is an evolving concept related to the development of digital technology and to the political aims and expectations of citizenship in a knowledge society. While it is regarded as a core competence in policy papers, it is not yet a standardized concept in educational research [20]. Ilomäki and coauthors suggest that it is a useful boundary concept that can be used in various contexts. They analyzed 76 educational research articles in which digital competence, described by different terms, was investigated. They found that digital competence consists of a variety of skills and competences and that it has a broad scope and background, ranging from media studies and computer science to library and literacy studies. They suggest that digital competence be defined as consisting of (1) technical competence; (2) the ability to use digital technologies in a meaningful way for work, study, and everyday life; (3) the ability to evaluate digital technologies critically; and (4) the motivation to participate in and commit to the digital culture.
Another interpretation presents two scopes of understanding: a complete overview of digital competence in Europe via the DigComp project (2011–2013), and the latest version of the project, DigComp 2.0 (2016), which emphasizes the digital environment. The main conclusion can be summarized as the paramount importance that digital competence has gained in society as a result of digital transformation and of its inclusion in the Europe 2020 Strategy, both of which are reasons to prioritize its integration into education [21].
Spanish researchers [22] found that digital competence (one of the eight key competences for lifelong learning developed by the European Commission) is a requisite for personal fulfillment and development, active citizenship, social inclusion, and employment in a knowledge society. To accompany young learners in the development of this competence, and to guarantee the optimal implementation of information and communication technologies (ICTs), teachers must be digitally competent as well. Fraile and coauthors worked with a sample of 43 secondary school teachers in initial training who self-assessed their level of competence in 21 subcompetences across the five areas identified by the DigComp project, using the rubrics provided in the Common Digital Competence Framework for Teachers. Overall, pre-service teachers’ conceptions about their level of digital competence were low. Students scored highest in information, which refers mostly to the operations they performed during their studies, while the second-highest scores were in safety and communication, excluding the protection of digital data and the preservation of digital identity. The lowest values were achieved in content creation and problem-solving, the dimensions most closely related to the use of ICTs to transform teaching and learning processes. The knowledge and skills the teachers exhibited were largely self-taught, so the authors perceived an urgent need to purposefully incorporate relational and didactic aspects of ICT integration.
Additionally, the European Commission tried to find the common denominator of these approaches, which it identified as the highlighting of positive attitudes toward digital culture, in addition to having the technical knowledge required for ICT tools, the ability to solve problems, and a critical approach to information and its management [10].
This is why the European Commission created the DigComp framework project (EU Science Hub, 2018) [23], the basic objective of which was to build a unified system of all the details and components of digital competence, providing support for the launch of related developments. This unified system also makes the concept of digital competence itself clearer, more understandable, and above all, measurable [24]. The DigComp framework defines five main areas of digital competence: information, communication, content creation, security, and problem-solving [23].
DigComp has been developing since its creation, and many reference tools are already connected to it. These set out not only citizen skill levels but also frameworks for educators (DigCompEdu), educational institutions (DigCompOrg), and entrepreneurs (EntreComp). They also include the DigComp self-assessment tool (DigCompSat), which means that a framework and evaluation system with uniform content and form is now available in the European context, allowing for comparable measurements of different digital competence levels.
At the same time, it is important to note that there can be huge differences between self-assessments and actual levels of competence. A study titled ‘Perception and Reality: Measuring Digital Skills Gaps in Europe, India, and Singapore’ [25] also states that people tend to overestimate their own digital skills, and that there are persistent, age-dependent gaps both within and outside Europe. The overestimation, as Figure 1 shows, is massive.
Another finding of this study is that those who have undergone previous digital skills development perform better than those without such a background.
Therefore, important questions follow: What is the real level of digital competence? What factors distort self-assessments? Is it possible to achieve significant development and in which areas?
After considering the present state of the research regarding digital competences in the field of public administration, we formulated three research questions:
RQ1: Can a significant improvement in digital competence be achieved with a 30 h theoretical and 30 h practical university skills development course?
RQ2: Do age, gender, department, and previous IT studies have a moderating effect on the shift in digital competence?
RQ3: Is there a statistically significant difference between the potential shift in the different segments of digital competence?
When studying the eight key competences set out by EUR-Lex (2006), we were aware of the areas of digital competences, as noted in the section above, and tried to interpret the key factors from the point of view of public administration higher education. Thus, our questionnaire investigation could rely on the five areas of digital competences recommended by the EU Science Hub (2018). That is the foundation of the methodology we used.

2. Materials and Methods

When measuring digital competence, we relied on the EU reference material, the Citizen Digital Competence Framework [26], which defines five digital competence areas, with twenty-one digital competence elements and eight proficiency levels. The five areas of competence follow:
1. Information and data literacy
This includes articulating information needs; locating and retrieving digital data, information, and content; judging the relevance of the source and its content; and storing, managing, and organizing digital data, information, and content.
2. Communication and collaboration
Interaction, communication, and collaboration through digital technologies while being aware of cultural and generational diversity; participating in society through public and private digital services and participatory citizenship; managing people’s digital presence, identity, and reputation.
3. Digital content creation
Creating and editing digital content; improving and integrating information and content into an existing body of knowledge while understanding how copyright and licenses apply; knowing how to give understandable instructions to computer systems.
4. Safety
Protecting devices, content, personal data, and privacy in digital environments; protecting physical and psychological health and being aware of digital technologies for social well-being and social inclusion; being aware of the environmental impact of digital technologies and their use.
5. Problem-solving
Identifying needs and problems and resolving conceptual problems and problem situations in digital environments; using digital tools to innovate processes and products; keeping up to date with digital evolution [23].
The DigCompSat [27] self-assessment questionnaire was used for the survey. Its technique and content are in line with previous questionnaire-based research, including the index construction and evaluation methods used. The model was created based on a methodological approach that can measure the three components of digital competences (knowledge, skills, and attitude) in all five areas of digital competences, while also providing participants with the opportunity to develop self-awareness skills related to their own abilities.
In the DigCompSat self-assessment questionnaire, students had to evaluate 82 statements about digital competence components on a four-point scale. The values obtained on the scales could be summed and examined for the questionnaire as a whole and for the five digital competence areas using mean value indicators; medians and averages could be formed, and percentage results calculated from them. The tool proved to be reliable in our survey, for both the full scale and the subscales (Cronbach’s alpha > 0.8).
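For readers who want to reproduce this kind of reliability check, the minimal sketch below computes Cronbach’s alpha for a response matrix. The simulated data, the matrix shape, and the function name are illustrative assumptions only; they are not the published survey data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_var.sum() / total_var)

# Hypothetical example: 120 respondents answering 82 items on a 1-4 scale.
# Random, independent answers yield an alpha near zero; the authors report
# alpha > 0.8 on the real survey data, indicating a reliable scale.
rng = np.random.default_rng(42)
responses = rng.integers(1, 5, size=(120, 82))
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```

The same matrix, restricted to the items of a single competence area, would give the subscale alphas mentioned above.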
The first (input) survey was conducted between 17 January and 28 February 2022, using an online form. The second (output) measurement was conducted between 11 and 27 May 2022. The input assessment was completed by N = 120 undergraduate students, while the output assessment was completed by N = 58 students (many did not complete the second questionnaire).
The empirical samples of the two measurements were not based on random selection; in order to ensure comparability, the survey aimed to cover the entire population, and a similarity check of the two samples was performed. (There was no systematic bias or significant difference in the composition of the samples that substantially affected the results.) The analysis was based on self-assessment responses; this research did not aim to verify the self-assessment responses with tests or to measure digital competency directly.
Regarding the management of the data obtained, we excluded from the database any records in the empirical sample with more than 90% missing data, and then organized the items of the resulting dataset into indexes. In doing so, we chose one of the two basic Likert-scale analytical procedures: after converting the scale levels into numeric values, we evaluated the data by calculating averages from the observed values. The four-point scale was mapped to the values 0, 0.33, 0.66, and 1, and the analysis of the empirical sample was based on this resolution. The averages calculated in this way therefore fell between 0 and 1 and could be converted to percentage values and compared.
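As an illustration of this conversion step, the sketch below maps raw four-point responses to the 0–0.33–0.66–1 resolution and turns the item average into a percentage. The array contents and function name are hypothetical examples, not values from the survey.

```python
import numpy as np

def to_unit_scale(responses: np.ndarray) -> np.ndarray:
    """Map raw four-point answers (1-4) to the values 0, 0.33, 0.66, 1."""
    lookup = np.array([np.nan, 0.0, 0.33, 0.66, 1.0])  # index 0 is unused
    return lookup[responses]

# Hypothetical answers of one student to five items
raw = np.array([2, 3, 4, 4, 1])
converted = to_unit_scale(raw)      # -> [0.33, 0.66, 1.0, 1.0, 0.0]
index = converted.mean()            # average falls between 0 and 1
print(f"Competence index: {index:.3f} ({index * 100:.1f}%)")
```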
Since the sampling was not random, we did not use probabilistic statistical tools when comparing the mean values. Indexes based on mean values (median, average) were constructed both for the competence components (knowledge, skills, and attitude) and for the main thematic competence areas. The values of the indexes were then examined along two or more dimensions, divided into subgroups and cross-tabulations, in which each characteristic of the students (age; major; type of studies, i.e., full-time or correspondence course; etc.) was also included as a dimension of the analysis. When evaluating the subgroups, we also considered the limitations of evaluating subgroups with a very small number of elements.
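The subgroup and cross-tabulation step can be expressed compactly with pandas. The sketch below uses invented records and column names purely to illustrate how mean and median indexes per dimension could be formed; it does not reproduce the actual dataset.

```python
import pandas as pd

# Invented per-student records: one competence index plus grouping attributes.
df = pd.DataFrame({
    "age_group":  ["18-24", "18-24", "25-29", "30-34", "35-39", "35-39"],
    "study_type": ["full-time", "correspondence", "full-time",
                   "full-time", "correspondence", "full-time"],
    "dc_index":   [0.61, 0.58, 0.71, 0.69, 0.63, 0.66],
})

# Mean, median, and group size per age group (one analysis dimension).
by_age = df.groupby("age_group")["dc_index"].agg(["mean", "median", "count"])

# Cross-tabulation of two dimensions with the mean index in each cell;
# very small cells should be interpreted cautiously, as noted above.
cross = pd.pivot_table(df, values="dc_index",
                       index="age_group", columns="study_type", aggfunc="mean")

print(by_age)
print(cross)
```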

3. Results

As a starting point, we wanted to obtain an insight into students’ digital competence, to realize in which areas they show adequate proficiency and where they have clear deficiencies. While we conducted the survey for all majors in the input measurement, in the output measurement we focused only on those who had participated in the course. The average digital competence level of those surveyed was 63.3%, which is slightly higher than the Spanish (57.1%) and Latvian (55.8%) results measured in the DigCompSat test measurements [26].
We examined the effect of age, training department (full-time: 62.7%, correspondence: 64.7%), and major (BA: 60.7%, MA: 67%) on the input values, but the largest difference was found in the gender dimension (males: 69.3%, females: 59.4%) (Figure 2).
Based on the average values obtained from the DigCompSat self-assessment questionnaires, males achieved significantly higher values than females in all areas of digital competence and in overall digital competence. As Figure 2 shows, the gender gap is particularly noticeable in the areas of information search and data management, digital content creation, and problem-solving. The reason for this may be that women lack confidence in these areas and may have been more prone to undervaluing themselves.
It is also interesting to examine how the results within each competence area are distributed across the entire student sample. In the distribution curves for communication and collaboration, and for information search and data management, a higher proportion of students appeared at higher proficiency levels, while in the creation of digital content most students achieved a result between 30 and 70%. Very few students achieved results above 80% in digital content creation, security, and problem-solving. Considering the individual areas of competence, digital content creation shows the lowest value, so it should be a priority in the future. In addition, security and problem-solving can also be the focus of improvements, due to their lower values.
Breaking down the survey results further, among students who had digital skills development as a compulsory subject, the input assessment showed an average result of only 60.1%. The results, therefore, indicate a solid middle level.
Based on our surveys, grouped by research questions, we make the following findings.
RQ1: Can a significant improvement in digital competence be achieved with a thirty-hour theoretical and thirty-hour practical university skills development course?
Our answer is clearly yes. The median of the digital competence measurements increased from 0.626 to 0.652. Compared to the March admission base, this is a 2.863% shift, which is not in itself significant, but further breakdowns reveal more substantial shifts. Among the individual competence components, the greatest progress was found in the averages for skills (3.19%). The truly significant findings emerge after further analysis of the data. Figure 3 provides additional information.
RQ2: Do age, gender, department, and previous IT studies have a moderating effect on the shift in digital competence?
As Figure 4 shows, among the age groups, the 25–29 and 30–34 age groups reached the highest competence levels. In the second measurement, on the other hand, there is a decline in both age groups, which, given that the measurement starts from self-evaluation, may stem from the respondents having realized the true depth of their knowledge.
The largest negative shift was measured in the 30–34 age group (39.7%). Within this, the declines in digital content creation, security awareness, and problem-solving stand out. As shown in Figure 5, we see the biggest positive shift in the 35–39 age group, where we measured a positive shift of 15.81% in the average level of digital competence. It should be noted that safety awareness and attitude in this age group improved remarkably: by 27.29% and 26.98%, respectively. At the same time, their problem-solving skills also improved, by 19.15%. In our opinion, it would be worthwhile to investigate in a separate study why the realism of self-evaluations changes around the 35–39 age group. However, it is quite clear that the respondents of the 30–34 age group overestimated their competence level in the input survey, while the following age group presumably underestimated their abilities.
Regarding the gender distribution, males achieved higher competence levels. In terms of change, however, females responded significantly and consistently to the skills development in all dimensions. As illustrated in Figure 6, their average level of digital competence increased by 8.85%, while we measured a decline of almost the same magnitude in males (−7.22%). Across all attributes, the shift for females was positive and that for males was negative. Here, too, the increase in skills for females is outstanding (10.2%), which is already a significant shift. There is also a strong shift in the creation of digital content (7.26%) and in the averages for knowledge and for information and data management (6.41% and 6.32%, respectively).
Where the public administration undergraduate program included a subject on digital skills development, the average digital competence improved by 8.97%. Within this, the improvements in digital content creation (10.65%), in skills (9.43%), and in average knowledge stand out.
In terms of the level of competence, there is no significant difference between the full-time and correspondence courses, although the values are slightly higher for correspondence courses. The average shift is higher for full-time students, at 3.62%, while for correspondence students it is only 0.37%. Here, too, among full-time students, the skills component improved the most (4.96%).
From the point of view of the level of competence, there is no significant difference according to the presence or absence of a background in IT. The average competence of students who had preliminary IT training increased by 4.91%, while the competence average of those who had not previously taken an IT-related course decreased by 3.57%.
RQ3: Is there a statistically significant difference between the potential shift in the different segments of digital competence?
At first glance, our answer is no; however, if we examine the age dimension here as well, we can already see significant differences. The older age groups respond better in the areas of security awareness and problem-solving, while the younger generation shows greater shifts in digital content production and in the development of communication and collaboration.

4. Discussion

The final items to discuss include the significance of the results, their novelty, what the present research adds to the literature in the field, and potential new explanations and aspects.
  • There is very little specialized literature regarding DigCompSat testing in higher education. Our study intends to remedy this gap, specifically in the education environment of public administration.
  • During the research, we realized that the values based on self-assessment can shift, and that these shifts vary by age group, area, and range of competence.
  • We also realized that these digital competencies can be developed and found some confirmation that limited results can be achieved even in such a short time.
  • Our findings also showed that, in addition to self-assessment, independent survey solutions can show true competency levels and shifts.
  • In addition, we realized that it would be better to examine the progress of a target group over longer periods of time, but more regularly.
  • We can also state that the pilot version of this method will form the basis of the local competitiveness test, which will be implemented as part of a large-scale national survey.
Regarding the limitations and the future of our research, we must consider its importance in the social sciences context. There are three viewpoints that stem from similar research projects but focus on digital divides.
The first possibility to broaden our approach would be the stratification hypothesis, where there is a positive link between a person’s corresponding fields of resources (economic, cultural, social, and personal) in the digital context. This includes specific areas of social and digital inclusion or exclusion, which are not randomly distributed nor independent from social divides [28].
The second aspect is the compound digital exclusion hypothesis assuming a connection between different digital resources. It presumes that a deficiency in one type of digital competence leads to a deficiency in another area. Applied to public administration studies, this could be interpreted as meaning that if a student cannot benefit from or use digitization in one area, that student will not be able to benefit from and use digitization in another area either—a situation which should be avoided in public administration training as well. The underlying idea is that deficiencies in different types of digital skills are cumulative, and that the same is true for the inability to use digitalization for various purposes and to benefit from their use [29].
The third aspect is the sequential digital exclusion hypothesis, assuming a connection between the first-level, second-level, and third-level digital divides, which we plan to interpret in the future by identifying the shifts we found in our research. A student who is in danger of falling into one of the divides is also at risk of falling into another one. In more concrete terms, those who have poor access to digital tools probably also lag behind in the development of digital skills and the use of digital tools in a more routine-like manner, and are unable to benefit from their use in the same way as those with better access or higher skills [29].
Thus, our research could be placed in a social science context focusing on digital divides and their effects, through the examination of digital competences of public administration students. Because of the role of public administration influencing the well-being of the whole society, it is a research aspect that must not be neglected—or so we assume.

5. Conclusions

In line with the literature, we also established that there can be an order-of-magnitude difference between self-assessments and real levels of competence. It is therefore not certain that one should start from a self-assessment. At most, it can be useful for respondents to realize where they have gaps in their knowledge and competence; this is what we refer to as a ‘change in attitude’. The real starting point could be independent surveys, in which students report on their knowledge in some kind of assessment system, such as an exam system, or in which their knowledge is evaluated from their use of certain applications.
Therefore, even the directions of change can be handled only with an appropriately critical approach. It can be assumed that the larger-amplitude movements may reflect not knowledge or skills but rather a change in attitude. The practical importance of this lies in the possibility of orienting students: we can open their eyes to what they do not know, which may help them develop the need to know it. At the same time, in a field such as security awareness, even these are very useful developments; even the smallest progress in security awareness can result in major benefits. Merely by raising demands and increasing the desire for knowledge, we can prevent many problems from arising.
We also see that there are factors related to gender, age, and prior training (experience) that influence the potential shifts between the segments, and it is clearly visible that these factors apply differently in each range of competence. Hence, the more targeted and regular the digital competence development we implement at the institutional level along these parameters, the greater the results we can achieve. Moreover, the repetitive nature of our monitoring can help us understand these processes.
This research has some limitations, including the conceptualization problems of digital competence. First, we only asked about specific digital skills, as we focused on the DigCompSat methodology. Second, the research is based on self-assessment, in which the attitude behind the answers is decisive. Third, we could not monitor the classroom attendance of the students who completed the questionnaires at the beginning and at the end of the study. Fourth, the research took place with an experimental cohort and an experimental curriculum, and this was the first test of its kind.

Author Contributions

Conceptualization, B.B.B. and I.T.; methodology, S.C.; validation, I.T. and B.B.B.; formal analysis, S.C.; investigation, B.B.B.; resources S.C.; data curation, S.C.; writing—original draft preparation, B.B.B.; writing—review and editing, I.T.; visualization, B.B.B.; supervision, I.T. All authors have read and agreed to the published version of the manuscript.

Funding

Project No. TKP2021-NKTA-51 was implemented with support provided by the Ministry of Innovation and Technology of Hungary from the National Research, Development, and Innovation Fund, financed under the TKP2021-NKTA funding scheme.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

For data protection reasons, the data are not available.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Parijkova, L. Digital Technologies in the Daily Life of the Alfa Generation—Research Gate, August 2020. Available online: https://www.researchgate.net/publication/348245098_DIGITAL_TECHNOLOGIES_IN_THE_DAILY_LIFE_OF_THE_ALFA_GENERATION (accessed on 10 July 2023).
  2. Budai, B.B. Települési Önkormányzatok az Információs Tengerben (Local Governments in the Sea of Information)—Pro Publico Bono—Magyar Közigazgatás, 2017/3, 196–215. Based on the Project: Önkormányzati Fejlesztések Figyelemmel Kísérése II. (Monitoring Local Government Developments), KÖFOP-2.3.4-VEKOP-15. 2017. Available online: https://folyoirat.ludovika.hu/index.php/ppbmk/article/view/1823 (accessed on 10 July 2023).
  3. Hrebenyk, I. Formation of Digital Competence of Leaders of Educational Institutions—Research Gate, January 2019. Open Educational E-Environment of Modern Universities. Available online: https://www.researchgate.net/publication/332983239_FORMATION_OF_DIGITAL_COMPETENCE_OF_LEADERS_OF_EDUCATIONAL_INSTITUTIONS (accessed on 10 July 2023).
  4. Budai, B.B. A Blockchain Hype és a Közigazgatási Realitás (Blockchain Hype and Public Administration Reality). E-GOV Hírlevél, Közigazgatás és Informatika. Recent Challenges in Governing Public Goods & Services. 2018. Available online: https://hirlevel.egov.hu/2018/04/08/blokklanc-es-kozigazgatas-tulzasok-es-realitas/ (accessed on 10 July 2023).
  5. Hawkins, R.; Paris, A.E. Computer Literacy and Computer Use Among College Students: Differences in Black and White. J. Negro Educ. 1997, 66, 147–158. [Google Scholar] [CrossRef]
  6. Schneckenberg, D.; Wildt, J. Understanding the Concept of eCompetence for Academic Staff. In The Challenge of eCompetence in Academic Staff Development; MacLabhrainn, I., McDonald Legg, C., Schneckenberg, D., Wildt, J., Eds.; CELT/eCompInt Publications: Galway, Ireland, 2006. [Google Scholar]
  7. EFOP 3.2.15_VEKOP-17- 2017-00001 Project on “A Köznevelés Keretrendszeréhez Kapcsolódó Mérési- Értékelési és Digitális Fejlesztések, Innovatív Oktatásszervezési Eljárások Kialakítása, Megújítása” (Development and Renewal of Measurement Evaluation and Digital Developments, Innovative Educational Organization Procedures Related to the Framework of Public Education). Oktatás 2030 Tanulástudományi Kutatócsoport. Available online: https://www.oktatas2030.hu/a-boldogulas-kulcsa-digitalis-kompetenciak/ (accessed on 10 July 2023).
  8. Novo Melo, P.; Machado, C. (Eds.) Management and Technological Challenges in the Digital Age; CRC Press: Boca Raton, FL, USA, 2018; ISBN 9781498787604. Available online: https://www.routledge.com/Management-and-Technological-Challenges-in-the-Digital-Age/Melo-Machado/p/book/9781498787604 (accessed on 10 July 2023).
  9. Gallardo-Echenique, E.E.; De Oliveira, J.M.; Marqués-Molias, L.; Esteve-Mon, F.; Wang, Y.; Baker, R. Digital Competence In The Knowledge Society. MERLOT J. Online Learn. Teach. 2015, 11, 1–17. [Google Scholar]
  10. European Parliament and Council. Recommendation of the European Parliament and of the Council of 18 December 2006 on Key Competences for Lifelong Learning. Off. J. Eur. Union 2006, 394, 10–18. Available online: https://eur-lex.europa.eu/eli/reco/2006/962/oj (accessed on 10 July 2023).
  11. Nuolijärvi, P.; Stickel, G. (Eds.) Language Use in Public Administration. Theory and Practice in the European States—European Federation of National Institutions of Language. Contributions to the EFNIL Conference in Helsinki 2015; Research Institute for Linguistics Hungarian Academy of Sciences: Budapest, Hungary, 2015; ISBN 978-963-9074-65-1. Available online: http://www.efnil.org/conferences/13th-annual-conference-helsinki/proceedings/EFNIL-Helsinki-Book-Final.pdf (accessed on 10 July 2023).
  12. Zouhar, J. On a Small Mother Tongue as a Barrier to Intercultural Policies: The Czech Language. Exedra Rev. Científica 2011, 25–34. [Google Scholar]
  13. Guardian. The Student Guide. Studying Public Administration without Mathematics. 2021. Available online: https://www.zambianguardian.com/studying-public-administration-without-mathematics/#google_vignette (accessed on 10 July 2023).
  14. Kausch-Zongo, J.; Schenk, B. General technological competency and usage in public administration education: An evaluation study considering on-the-job trainings and home studies. Smart Cities Reg. Dev. J. 2022, 6, 55–65. [Google Scholar]
  15. Wodecka-Hyjek, A.; Kafel, T.; Kusa, R. Managing digital competences in public administration. In People in Organization. Selected Challenges for Management; Skalna, I., Kusa, R., Eds.; AGH University of Science and Technology Press: Krakow, Poland, 2021. [Google Scholar] [CrossRef]
  16. Mele, V. SDA Bocconi School of Management—Theory to Practice. Methods for Studying Public Administration. 2020. Available online: https://www.sdabocconi.it/en/sda-bocconi-insight/theory-to-practice/government/methods-for-studying-public-administration?gclid=EAIaIQobChMIzcOTzt3W_QIVpo1oCR1bugkXEAAYAiAAEgJR5vD_BwE&gclsrc=aw.ds (accessed on 10 July 2023).
  17. Grimm, H.M.; Bock, C.L. Entrepreneurship in public administration and public policy programs in Germany and the United States. Teach. Public Adm. 2022, 40, 322–353. [Google Scholar] [CrossRef]
  18. Stare, J.; Klun, M. Required competencies in public administration study programs. Transylv. Rev. Adm. Sci. 2018, 55, 80–97. [Google Scholar] [CrossRef]
  19. Falloon, G. From digital literacy to digital competence: The teacher digital competency (TDC) framework. Educ. Technol. Res. Dev. 2020, 68, 2449–2472. [Google Scholar] [CrossRef]
  20. Ilomäki, L.; Paavola, S.; Lakkala, M.; Kantosalo, A. Digital competence—An emergent boundary concept for policy and educational research. Educ. Inf. Technol. 2016, 21, 655–679. [Google Scholar] [CrossRef]
  21. Pérez-Escoda, A.; Fernández-Villavicencio, N.G. Digital competence in use: From DigComp 1 to DigComp 2. In Proceedings of the Fourth International Conference on Technological Ecosystems for Enhancing Multiculturality, Salamanca, Spain, 2–4 November 2016; pp. 619–624. [Google Scholar] [CrossRef]
  22. Fraile, M.N.; Peñalva-Vélez, O.A.; Lacambra, A.M.M. Development of Digital Competence in Secondary Education Teachers’ Training. Educ. Sci. 2018, 8, 104. [Google Scholar] [CrossRef]
  23. EU Science Hub. Dig Comp Framework. 2018. Available online: https://joint-research-centre.ec.europa.eu/digcomp/digcomp-framework_en (accessed on 10 July 2023).
  24. Carretero, S.; Vuorikari, R.; Punie, Y. DigComp 2.1: The Digital Competence Framework for Citizens with Eight Proficiency Levels and Examples of Use; EUR 28558 EN; Publications Office of the European Union: Luxembourg, 2017. [CrossRef]
  25. ICDL. Perception & Reality: Measuring Digital Skills Gaps in Europe, India, and Singapore. 2018. Available online: https://www.icdleurope.org/policy-and-publications/perception-reality-measuring-digital-skills-gaps-in-europe-india-and-singapore/#f4 (accessed on 10 July 2023).
  26. Clifford, I.; Kluzer, S.; Troia, S.; Jakobsone, M.; Zandbergs, U.; Vuorikari, R.; Punie, Y.; Castaño Muñoz, J.; Centeno Mediavilla, I.C.; O’Keeffe, W.; et al. (Eds.) Digcompsat—A Self-Reflection Tool for The European Digital Competence Framework for Citizens; Publications Office of The European Union: Luxembourg, 2020; ISBN 978-92-76-27592-3. [CrossRef]
  27. Vuorikari, R.; Kluzer, S.; Punie, Y. DigComp 2.2: The Digital Competence Framework for Citizens—With New Examples of Knowledge, Skills and Attitudes; EUR 31006 EN; Publications Office of the European Union: Luxembourg, 2022; ISBN 978-92-76-48882-8. [CrossRef]
  28. Helsper, E.J. A Corresponding Fields Model for the Links Between Social and Digital Exclusion. Commun. Theory 2012, 22, 403–426. [Google Scholar] [CrossRef]
  29. van Deursen, A.J.; Helsper, E.J.; Eynon, R. Development and validation of the Internet Skills Scale (ISS). Inf. Commun. Soc. 2016, 19, 804–823. [Google Scholar] [CrossRef]
Figure 1. Self-assessment vs. actual skills in Austria and Switzerland (2017).
Figure 2. Average results by gender in digital competence areas (%).
Figure 3. Average input and output results in digital competence areas (%).
Figure 4. Change by age groups.
Figure 5. Input and output average values by age groups. Source: own data.
Figure 6. Change by gender.